Apr 16 14:52:45.429284 ip-10-0-129-76 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 14:52:46.019421 ip-10-0-129-76 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:46.019421 ip-10-0-129-76 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 14:52:46.019421 ip-10-0-129-76 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:46.019421 ip-10-0-129-76 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 14:52:46.019421 ip-10-0-129-76 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
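[Editor's note: the deprecation warnings above all point at the same remedy — moving the flag values into the KubeletConfiguration file named by --config (here /etc/kubernetes/kubelet.conf, per the FLAG dump below). A minimal sketch of the corresponding config-file fields follows; the values are illustrative assumptions taken from this node's flag dump, not a verified OpenShift-managed config, and --pod-infra-container-image has no kubelet-config equivalent — per the warning, the sandbox image is supplied by the CRI runtime (e.g. CRI-O's pause_image setting).]

```yaml
# Sketch of kubelet config-file equivalents for the deprecated flags above.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (value from the FLAG dump below)
containerRuntimeEndpoint: /var/run/crio/crio.sock
# replaces --volume-plugin-dir (path here is an assumed example)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (amounts are illustrative)
systemReserved:
  cpu: 500m
  memory: 1Gi
# replaces --minimum-container-ttl-duration, per the warning's guidance
# to use eviction thresholds instead (thresholds are illustrative)
evictionHard:
  memory.available: 100Mi
  nodefs.available: 10%
```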
Apr 16 14:52:46.024973 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.024884 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 14:52:46.028696 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028680 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:46.028696 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028696 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:46.028763 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028702 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:46.028763 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028707 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:46.028763 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028735 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:46.028763 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028739 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:46.028763 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028743 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:46.028763 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028746 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:46.028763 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028749 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:46.028763 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028752 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:46.028763 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028755 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:46.028763 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028758 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:46.028763 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028761 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:46.028763 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028763 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:46.028763 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028766 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:46.028763 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028769 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:46.029124 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028772 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:46.029124 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028774 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:46.029124 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028778 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:46.029124 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028780 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:46.029124 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028783 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:46.029124 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028787 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:46.029124 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028790 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:46.029124 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028792 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:46.029124 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028795 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:46.029124 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028798 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:46.029124 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028801 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:46.029124 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028804 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:46.029124 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028806 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:46.029124 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028809 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:46.029124 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028811 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:46.029124 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028814 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:46.029124 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028817 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:46.029124 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028819 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:46.029124 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028822 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:46.029124 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028824 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:46.029611 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028827 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:46.029611 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028830 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:46.029611 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028832 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:46.029611 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028835 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:46.029611 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028837 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:46.029611 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028840 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:46.029611 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028842 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:46.029611 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028845 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:46.029611 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028848 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:46.029611 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028850 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:46.029611 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028853 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:46.029611 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028856 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:46.029611 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028858 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:46.029611 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028862 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:46.029611 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028864 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:46.029611 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028869 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:46.029611 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028873 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:46.029611 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028876 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:46.029611 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028879 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:46.029611 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028882 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:46.030119 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028884 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:46.030119 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028887 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:46.030119 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028890 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:46.030119 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028893 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:46.030119 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028895 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:46.030119 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028898 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:46.030119 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028901 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:46.030119 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028903 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:46.030119 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028906 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:46.030119 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028909 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:46.030119 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028912 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:46.030119 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028917 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:46.030119 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028919 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:46.030119 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028922 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:46.030119 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028924 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:46.030119 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028927 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:46.030119 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028930 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:46.030119 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028932 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:46.030119 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028935 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:46.030119 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028937 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:46.030603 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028941 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:46.030603 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028943 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:46.030603 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028946 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:46.030603 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028948 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:46.030603 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028951 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:46.030603 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028955 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:46.030603 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028957 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:46.030603 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028960 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:46.030603 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028963 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:46.030603 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.028965 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:46.030603 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029379 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:46.030603 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029384 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:46.030603 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029388 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:46.030603 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029391 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:46.030603 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029394 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:46.030603 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029396 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:46.030603 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029399 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:46.030603 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029402 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:46.030603 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029404 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:46.030603 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029407 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:46.031102 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029409 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:46.031102 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029412 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:46.031102 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029415 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:46.031102 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029417 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:46.031102 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029420 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:46.031102 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029423 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:46.031102 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029425 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:46.031102 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029429 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:46.031102 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029431 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:46.031102 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029434 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:46.031102 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029436 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:46.031102 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029439 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:46.031102 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029442 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:46.031102 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029444 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:46.031102 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029447 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:46.031102 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029449 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:46.031102 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029452 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:46.031102 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029456 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:46.031102 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029459 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:46.031102 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029462 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:46.031638 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029465 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:46.031638 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029467 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:46.031638 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029470 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:46.031638 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029472 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:46.031638 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029475 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:46.031638 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029478 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:46.031638 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029480 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:46.031638 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029482 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:46.031638 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029485 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:46.031638 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029489 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:46.031638 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029492 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:46.031638 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029495 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:46.031638 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029498 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:46.031638 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029502 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:46.031638 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029505 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:46.031638 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029509 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:46.031638 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029511 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:46.031638 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029515 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:46.031638 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029518 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:46.032118 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029520 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:46.032118 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029523 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:46.032118 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029525 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:46.032118 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029528 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:46.032118 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029530 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:46.032118 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029533 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:46.032118 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029535 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:46.032118 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029538 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:46.032118 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029541 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:46.032118 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029543 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:46.032118 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029547 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:46.032118 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029550 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:46.032118 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029553 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:46.032118 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029556 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:46.032118 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029558 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:46.032118 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029561 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:46.032118 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029563 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:46.032118 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029566 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:46.032118 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029568 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:46.032582 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029571 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:46.032582 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029574 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:46.032582 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029577 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:46.032582 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029579 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:46.032582 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029582 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:46.032582 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029584 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:46.032582 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029587 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:46.032582 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029590 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:46.032582 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029593 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:46.032582 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029595 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:46.032582 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029599 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:46.032582 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029601 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:46.032582 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029603 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:46.032582 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029606 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:46.032582 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029608 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:46.032582 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029611 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:46.032582 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029613 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:46.032582 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.029616 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:46.032582 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030449 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 14:52:46.032582 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030458 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 14:52:46.032582 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030466 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 14:52:46.033109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030471 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 14:52:46.033109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030475 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 14:52:46.033109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030479 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 14:52:46.033109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030484 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 14:52:46.033109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030489 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 14:52:46.033109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030492 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 14:52:46.033109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030496 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 14:52:46.033109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030499 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 14:52:46.033109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030503 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 14:52:46.033109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030507 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 14:52:46.033109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030510 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 16 14:52:46.033109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030513 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 14:52:46.033109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030516 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 16 14:52:46.033109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030519 2576 flags.go:64] FLAG: --cloud-config=""
Apr 16 14:52:46.033109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030522 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 14:52:46.033109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030524 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 14:52:46.033109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030529 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 16 14:52:46.033109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030531 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 14:52:46.033109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030535 2576 flags.go:64] FLAG: --config-dir=""
Apr 16 14:52:46.033109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030537 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 14:52:46.033109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030541 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 14:52:46.033109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030545 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 14:52:46.033109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030556 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 14:52:46.033109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030559 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030563 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030566 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030569 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030572 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030575 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030578 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030586 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030589 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030593 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030596 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030600 2576 flags.go:64] FLAG: --enable-server="true"
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030603 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030609 2576 flags.go:64] FLAG: --event-burst="100"
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030612 2576 flags.go:64] FLAG: --event-qps="50"
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030616 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030619 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030622 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030626 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030629 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030632 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030636 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030639 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030641 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030644 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 14:52:46.033705 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030648 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 14:52:46.034338 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030650 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 14:52:46.034338 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030653 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 14:52:46.034338 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030656 2576 flags.go:64] FLAG: --feature-gates=""
Apr 16 14:52:46.034338 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030660 2576 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 14:52:46.034338 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030663 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 14:52:46.034338 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030666 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 14:52:46.034338 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030670 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 14:52:46.034338 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030673 2576 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 14:52:46.034338 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030676 2576 flags.go:64] FLAG: --help="false"
Apr 16 14:52:46.034338 ip-10-0-129-76 kubenswrapper[2576]: I0416
14:52:46.030679 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-129-76.ec2.internal" Apr 16 14:52:46.034338 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030683 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 14:52:46.034338 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030686 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 14:52:46.034338 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030689 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 14:52:46.034338 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030692 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 14:52:46.034338 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030696 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 14:52:46.034338 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030699 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 14:52:46.034338 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030702 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 14:52:46.034338 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030706 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 14:52:46.034338 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030709 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 14:52:46.034338 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030712 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 14:52:46.034338 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030715 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 14:52:46.034338 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030718 2576 flags.go:64] FLAG: --kube-reserved="" Apr 16 14:52:46.034338 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030721 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 
14:52:46.034338 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030723 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030727 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030729 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030732 2576 flags.go:64] FLAG: --lock-file="" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030735 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030738 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030741 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030747 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030750 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030753 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030756 2576 flags.go:64] FLAG: --logging-format="text" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030759 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030762 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030765 2576 flags.go:64] FLAG: --manifest-url="" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030768 2576 
flags.go:64] FLAG: --manifest-url-header="" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030772 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030776 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030780 2576 flags.go:64] FLAG: --max-pods="110" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030783 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030786 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030789 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030792 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030796 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030799 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030802 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 14:52:46.034948 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030809 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 14:52:46.035557 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030813 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 14:52:46.035557 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030816 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 14:52:46.035557 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030819 2576 flags.go:64] FLAG: --pod-cidr="" Apr 16 14:52:46.035557 
ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030822 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 14:52:46.035557 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030828 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 14:52:46.035557 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030831 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 14:52:46.035557 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030834 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 16 14:52:46.035557 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030837 2576 flags.go:64] FLAG: --port="10250" Apr 16 14:52:46.035557 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030840 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 14:52:46.035557 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030843 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0b140a6d7cd1d8931" Apr 16 14:52:46.035557 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030846 2576 flags.go:64] FLAG: --qos-reserved="" Apr 16 14:52:46.035557 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030849 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 16 14:52:46.035557 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030852 2576 flags.go:64] FLAG: --register-node="true" Apr 16 14:52:46.035557 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030855 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 16 14:52:46.035557 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030858 2576 flags.go:64] FLAG: --register-with-taints="" Apr 16 14:52:46.035557 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030861 2576 flags.go:64] FLAG: --registry-burst="10" Apr 16 14:52:46.035557 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030864 2576 flags.go:64] FLAG: --registry-qps="5" Apr 16 14:52:46.035557 ip-10-0-129-76 kubenswrapper[2576]: I0416 
14:52:46.030867 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 16 14:52:46.035557 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030870 2576 flags.go:64] FLAG: --reserved-memory="" Apr 16 14:52:46.035557 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030873 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 14:52:46.035557 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030876 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 14:52:46.035557 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030879 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 14:52:46.035557 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030882 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 14:52:46.035557 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030885 2576 flags.go:64] FLAG: --runonce="false" Apr 16 14:52:46.035557 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030888 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030891 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030894 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 16 14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030897 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030900 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030903 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030906 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030909 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 
14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030913 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030916 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030919 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030922 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030925 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030928 2576 flags.go:64] FLAG: --system-cgroups="" Apr 16 14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030931 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030936 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030939 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 16 14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030942 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030946 2576 flags.go:64] FLAG: --tls-min-version="" Apr 16 14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030949 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030952 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030955 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030958 2576 flags.go:64] FLAG: 
--topology-manager-scope="container" Apr 16 14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030961 2576 flags.go:64] FLAG: --v="2" Apr 16 14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030966 2576 flags.go:64] FLAG: --version="false" Apr 16 14:52:46.036252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030970 2576 flags.go:64] FLAG: --vmodule="" Apr 16 14:52:46.036877 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030975 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 14:52:46.036877 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.030979 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 14:52:46.036877 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031093 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:46.036877 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031097 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:46.036877 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031100 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:52:46.036877 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031103 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:52:46.036877 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031107 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 14:52:46.036877 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031110 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:52:46.036877 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031113 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:52:46.036877 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031117 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:52:46.036877 ip-10-0-129-76 kubenswrapper[2576]: W0416 
14:52:46.031120 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:52:46.036877 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031123 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:52:46.036877 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031126 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:52:46.036877 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031130 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:52:46.036877 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031133 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:46.036877 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031136 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 14:52:46.036877 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031139 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:52:46.036877 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031141 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:52:46.036877 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031144 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:52:46.036877 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031147 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:52:46.037383 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031150 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:52:46.037383 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031153 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:52:46.037383 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031155 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:52:46.037383 
ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031158 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:52:46.037383 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031160 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 14:52:46.037383 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031162 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:52:46.037383 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031165 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:52:46.037383 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031168 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:52:46.037383 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031170 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:52:46.037383 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031173 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 14:52:46.037383 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031175 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:52:46.037383 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031178 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 14:52:46.037383 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031181 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 14:52:46.037383 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031183 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:52:46.037383 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031186 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:52:46.037383 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031188 2576 feature_gate.go:328] unrecognized feature gate: 
DyanmicServiceEndpointIBMCloud Apr 16 14:52:46.037383 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031191 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:52:46.037383 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031193 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:52:46.037383 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031196 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:52:46.037383 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031199 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:52:46.037900 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031202 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 14:52:46.037900 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031205 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 14:52:46.037900 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031207 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:52:46.037900 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031210 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:52:46.037900 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031212 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 14:52:46.037900 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031215 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:52:46.037900 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031220 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:52:46.037900 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031222 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:52:46.037900 ip-10-0-129-76 
kubenswrapper[2576]: W0416 14:52:46.031225 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:52:46.037900 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031228 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:52:46.037900 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031230 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:52:46.037900 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031233 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:52:46.037900 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031235 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 14:52:46.037900 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031238 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:52:46.037900 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031240 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:52:46.037900 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031243 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 14:52:46.037900 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031245 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:46.037900 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031248 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:52:46.037900 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031251 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:52:46.037900 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031253 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:52:46.038419 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031256 2576 feature_gate.go:328] unrecognized feature gate: 
IrreconcilableMachineConfig Apr 16 14:52:46.038419 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031258 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:52:46.038419 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031261 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:46.038419 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031264 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:52:46.038419 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031266 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:52:46.038419 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031269 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 14:52:46.038419 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031271 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 14:52:46.038419 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031274 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:52:46.038419 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031276 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:52:46.038419 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031279 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:52:46.038419 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031281 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:52:46.038419 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031284 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:52:46.038419 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031287 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:46.038419 ip-10-0-129-76 kubenswrapper[2576]: 
W0416 14:52:46.031290 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:52:46.038419 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031293 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:52:46.038419 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031295 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 14:52:46.038419 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031298 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:52:46.038419 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031303 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:46.038419 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031307 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:52:46.038419 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031310 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:46.038896 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031313 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:52:46.038896 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031317 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 14:52:46.038896 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031321 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:46.038896 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031324 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:46.038896 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031327 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:46.038896 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031330 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:46.038896 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031333 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:46.038896 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.031336 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:46.038896 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.031345 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:52:46.041653 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.041628 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 14:52:46.041653 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.041650 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 14:52:46.041784 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041713 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:46.041784 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041718 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:46.041784 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041722 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:46.041784 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041725 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:46.041784 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041728 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:46.041784 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041731 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:46.041784 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041733 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:46.041784 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041736 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:46.041784 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041738 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:46.041784 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041741 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:46.041784 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041744 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:46.041784 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041747 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:46.041784 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041750 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:46.041784 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041752 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:46.041784 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041755 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:46.041784 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041758 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:46.041784 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041760 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:46.041784 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041763 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:46.041784 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041765 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:46.042285 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041769 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:46.042285 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041771 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:46.042285 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041774 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:46.042285 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041777 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:46.042285 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041780 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:46.042285 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041783 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:46.042285 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041786 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:46.042285 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041789 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:46.042285 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041792 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:46.042285 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041794 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:46.042285 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041797 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:46.042285 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041800 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:46.042285 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041803 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:46.042285 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041806 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:46.042285 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041809 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:46.042285 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041812 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:46.042285 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041814 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:46.042285 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041818 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:46.042285 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041823 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:46.042285 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041827 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:46.042808 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041830 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:46.042808 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041833 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:46.042808 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041835 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:46.042808 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041838 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:46.042808 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041841 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:46.042808 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041844 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:46.042808 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041846 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:46.042808 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041849 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:46.042808 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041852 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:46.042808 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041854 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:46.042808 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041857 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:46.042808 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041859 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:46.042808 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041862 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:46.042808 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041865 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:46.042808 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041868 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:46.042808 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041870 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:46.042808 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041873 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:46.042808 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041875 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:46.042808 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041878 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:46.042808 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041881 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:46.043317 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041883 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:46.043317 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041886 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:46.043317 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041888 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:46.043317 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041891 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:46.043317 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041894 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:46.043317 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041897 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:46.043317 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041900 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:46.043317 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041902 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:46.043317 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041905 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:46.043317 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041907 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:46.043317 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041910 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:46.043317 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041913 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:46.043317 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041915 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:46.043317 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041918 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:46.043317 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041921 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:46.043317 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041923 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:46.043317 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041926 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:46.043317 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041928 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:46.043317 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041931 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:46.043317 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041933 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:46.043806 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041937 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:46.043806 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041940 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:46.043806 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041943 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:46.043806 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041946 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:46.043806 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041949 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:46.043806 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041952 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:46.043806 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.041954 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:46.043806 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.041960 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:52:46.043806 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042081 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:46.043806 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042088 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:46.043806 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042091 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:46.043806 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042094 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:46.043806 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042097 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:46.043806 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042100 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:46.043806 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042102 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:46.044187 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042106 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:46.044187 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042111 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:46.044187 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042114 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:46.044187 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042117 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:46.044187 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042120 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:46.044187 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042123 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:46.044187 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042125 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:46.044187 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042128 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:46.044187 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042131 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:46.044187 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042134 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:46.044187 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042137 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:46.044187 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042139 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:46.044187 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042142 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:46.044187 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042144 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:46.044187 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042147 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:46.044187 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042149 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:46.044187 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042152 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:46.044187 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042154 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:46.044187 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042157 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:46.044187 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042160 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:46.044687 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042163 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:46.044687 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042165 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:46.044687 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042169 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:46.044687 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042172 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:46.044687 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042175 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:46.044687 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042177 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:46.044687 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042180 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:46.044687 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042182 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:46.044687 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042185 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:46.044687 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042188 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:46.044687 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042190 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:46.044687 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042193 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:46.044687 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042196 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:46.044687 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042198 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:46.044687 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042201 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:46.044687 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042203 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:46.044687 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042206 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:46.044687 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042209 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:46.044687 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042211 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:46.044687 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042214 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:46.045192 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042216 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:46.045192 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042219 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:46.045192 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042222 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:46.045192 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042224 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:46.045192 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042228 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:46.045192 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042232 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:46.045192 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042235 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:46.045192 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042238 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:46.045192 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042241 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:46.045192 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042244 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:46.045192 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042246 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:46.045192 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042256 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:46.045192 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042260 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:46.045192 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042263 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:46.045192 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042266 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:46.045192 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042269 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:46.045192 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042272 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:46.045192 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042275 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:46.045192 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042277 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:46.045662 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042280 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:46.045662 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042282 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:46.045662 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042285 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:46.045662 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042288 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:46.045662 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042290 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:46.045662 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042293 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:46.045662 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042296 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:46.045662 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042300 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:46.045662 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042302 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:46.045662 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042305 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:46.045662 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042308 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:46.045662 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042310 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:46.045662 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042313 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:46.045662 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042316 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:46.045662 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042318 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:46.045662 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042321 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:46.045662 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042324 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:46.045662 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042326 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:46.045662 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042329 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:46.046170 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:46.042331 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:46.046170 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.042336 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:52:46.046170 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.043083 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 14:52:46.046170 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.046086 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 14:52:46.046984 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.046972 2576 server.go:1019] "Starting client certificate rotation"
Apr 16 14:52:46.047061 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.047045 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 14:52:46.047098 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.047082 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 14:52:46.087814 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.087791 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 14:52:46.094122 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.094087 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 14:52:46.110586 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.110566 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 16 14:52:46.118050 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.118013 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 14:52:46.120828 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.120814 2576 log.go:25] "Validated CRI v1 image API"
Apr 16 14:52:46.122643 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.122627 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 14:52:46.125469 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.125448 2576 fs.go:135] Filesystem UUIDs: map[00ad8b94-858c-453c-99d9-cef1be015a46:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 af4d3ed6-9ad0-4d22-9fde-5bcb6b726c7a:/dev/nvme0n1p4]
Apr 16 14:52:46.125552 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.125469 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 14:52:46.131535 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.131417 2576 manager.go:217] Machine: {Timestamp:2026-04-16 14:52:46.129549928 +0000 UTC m=+0.548684897 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3103887 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27118a9469b324707f67af4c273766 SystemUUID:ec27118a-9469-b324-707f-67af4c273766 BootID:393d6d18-1084-445d-99da-ef5be84d387c Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:2d:bb:7e:74:55 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:2d:bb:7e:74:55 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:da:a2:58:13:95:98 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 14:52:46.132102 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.132090 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 14:52:46.132217 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.132201 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 14:52:46.133927 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.133899 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 14:52:46.134110 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.133928 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-76.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 14:52:46.134188 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.134120 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 14:52:46.134188 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.134134 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 14:52:46.134188 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.134152 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 14:52:46.135109 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.135096 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 14:52:46.138183 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.138171 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 14:52:46.138479 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.138466 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 14:52:46.141813 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.141801 2576 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 14:52:46.141883 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.141819 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 14:52:46.141883 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.141835 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 14:52:46.141883 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.141848 2576 kubelet.go:397] "Adding apiserver pod source"
Apr 16 14:52:46.141883 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.141859 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 14:52:46.143601 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.143582 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 14:52:46.143680 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.143611 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 14:52:46.148173 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.148152 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 14:52:46.150355 ip-10-0-129-76
kubenswrapper[2576]: I0416 14:52:46.150324 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 14:52:46.152211 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.152195 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 14:52:46.152301 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.152215 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 14:52:46.152301 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.152226 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 14:52:46.152301 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.152242 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 14:52:46.152301 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.152249 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 14:52:46.152301 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.152256 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 14:52:46.152301 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.152277 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 14:52:46.152301 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.152284 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 14:52:46.152301 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.152292 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 14:52:46.152301 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.152298 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 14:52:46.152590 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.152313 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 14:52:46.152590 
ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.152323 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 14:52:46.153046 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.153009 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-79wt8" Apr 16 14:52:46.153942 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:46.153920 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-76.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 14:52:46.154010 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:46.153936 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 14:52:46.154340 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.154326 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 14:52:46.154401 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.154353 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 14:52:46.157910 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.157894 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 14:52:46.157997 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.157938 2576 server.go:1295] "Started kubelet" Apr 16 14:52:46.158119 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.158090 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 14:52:46.158197 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.158073 2576 ratelimit.go:55] "Setting rate limiting for 
endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 14:52:46.158279 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.158220 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 14:52:46.158511 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.158494 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-79wt8" Apr 16 14:52:46.158831 ip-10-0-129-76 systemd[1]: Started Kubernetes Kubelet. Apr 16 14:52:46.161675 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.161658 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 14:52:46.162189 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.162174 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 16 14:52:46.164155 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.164136 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-76.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 14:52:46.164965 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:46.164136 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-76.ec2.internal.18a6ddf95f3163d7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-76.ec2.internal,UID:ip-10-0-129-76.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-76.ec2.internal,},FirstTimestamp:2026-04-16 14:52:46.157906903 +0000 UTC m=+0.577041872,LastTimestamp:2026-04-16 14:52:46.157906903 +0000 UTC 
m=+0.577041872,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-76.ec2.internal,}" Apr 16 14:52:46.169650 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.169627 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 14:52:46.169944 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:46.169910 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 14:52:46.171007 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.170994 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 14:52:46.171590 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.171572 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 14:52:46.171590 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.171571 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 14:52:46.171747 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.171600 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 14:52:46.171747 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.171574 2576 factory.go:55] Registering systemd factory Apr 16 14:52:46.171747 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.171666 2576 factory.go:223] Registration of the systemd container factory successfully Apr 16 14:52:46.171747 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.171737 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 16 14:52:46.171747 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.171748 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 16 14:52:46.171972 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:46.171828 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"ip-10-0-129-76.ec2.internal\" not found" Apr 16 14:52:46.171972 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.171892 2576 factory.go:153] Registering CRI-O factory Apr 16 14:52:46.171972 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.171902 2576 factory.go:223] Registration of the crio container factory successfully Apr 16 14:52:46.171972 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.171954 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 14:52:46.172176 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.171975 2576 factory.go:103] Registering Raw factory Apr 16 14:52:46.172176 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.171990 2576 manager.go:1196] Started watching for new ooms in manager Apr 16 14:52:46.173190 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.173162 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:46.173288 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.173192 2576 manager.go:319] Starting recovery of all containers Apr 16 14:52:46.176216 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:46.176009 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-76.ec2.internal\" not found" node="ip-10-0-129-76.ec2.internal" Apr 16 14:52:46.182889 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.182777 2576 manager.go:324] Recovery completed Apr 16 14:52:46.187215 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.187192 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:46.189810 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.189791 2576 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-129-76.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:46.189891 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.189824 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-76.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:46.189891 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.189837 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-76.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:46.190321 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.190305 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 14:52:46.190321 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.190318 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 14:52:46.190420 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.190333 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 14:52:46.193636 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.193625 2576 policy_none.go:49] "None policy: Start" Apr 16 14:52:46.193683 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.193641 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 14:52:46.193683 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.193651 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 16 14:52:46.231586 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.231570 2576 manager.go:341] "Starting Device Plugin manager" Apr 16 14:52:46.252481 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:46.231605 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 14:52:46.252481 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.231618 2576 server.go:85] "Starting device plugin registration server" Apr 16 14:52:46.252481 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.231848 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 
14:52:46.252481 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.231858 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 14:52:46.252481 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.231955 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 14:52:46.252481 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.232052 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 14:52:46.252481 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.232060 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 14:52:46.252481 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:46.232509 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 14:52:46.252481 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:46.232542 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-76.ec2.internal\" not found" Apr 16 14:52:46.300809 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.300741 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 14:52:46.302072 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.302046 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 14:52:46.302156 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.302080 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 14:52:46.302156 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.302108 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 14:52:46.302156 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.302116 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 14:52:46.302156 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:46.302151 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 14:52:46.305875 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.305854 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:46.332708 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.332683 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:46.333666 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.333648 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-76.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:46.333744 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.333675 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-76.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:46.333744 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.333684 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-76.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:46.333744 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.333707 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-76.ec2.internal" Apr 16 14:52:46.341267 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.341253 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-76.ec2.internal" Apr 16 14:52:46.341369 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:46.341274 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-76.ec2.internal\": node \"ip-10-0-129-76.ec2.internal\" not found" Apr 16 14:52:46.351588 
ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:46.351571 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-76.ec2.internal\" not found" Apr 16 14:52:46.403081 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.403054 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-76.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-76.ec2.internal"] Apr 16 14:52:46.403158 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.403128 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:46.403963 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.403949 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-76.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:46.404059 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.403976 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-76.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:46.404059 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.403990 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-76.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:46.406202 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.406190 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:46.406342 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.406326 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-76.ec2.internal" Apr 16 14:52:46.406392 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.406355 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:46.407268 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.407072 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-76.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:46.407268 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.407100 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-76.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:46.407268 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.407115 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-76.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:46.407833 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.407816 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-76.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:46.407923 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.407840 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-76.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:46.407923 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.407853 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-76.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:46.409955 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.409941 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-76.ec2.internal" Apr 16 14:52:46.410039 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.409973 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:46.410710 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.410692 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-76.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:46.410805 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.410724 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-76.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:46.410805 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.410737 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-76.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:46.425544 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:46.425520 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-76.ec2.internal\" not found" node="ip-10-0-129-76.ec2.internal" Apr 16 14:52:46.429784 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:46.429768 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-76.ec2.internal\" not found" node="ip-10-0-129-76.ec2.internal" Apr 16 14:52:46.452300 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:46.452280 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-76.ec2.internal\" not found" Apr 16 14:52:46.473870 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.473849 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/eeeae1dda6131e6cc2d2b873cb53b9f0-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-129-76.ec2.internal\" (UID: \"eeeae1dda6131e6cc2d2b873cb53b9f0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-76.ec2.internal" Apr 16 14:52:46.473930 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.473875 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eeeae1dda6131e6cc2d2b873cb53b9f0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-76.ec2.internal\" (UID: \"eeeae1dda6131e6cc2d2b873cb53b9f0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-76.ec2.internal" Apr 16 14:52:46.473930 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.473899 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fa97968d5b4634bd4f9419795593b093-config\") pod \"kube-apiserver-proxy-ip-10-0-129-76.ec2.internal\" (UID: \"fa97968d5b4634bd4f9419795593b093\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-76.ec2.internal" Apr 16 14:52:46.552642 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:46.552578 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-76.ec2.internal\" not found" Apr 16 14:52:46.574955 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.574934 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eeeae1dda6131e6cc2d2b873cb53b9f0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-76.ec2.internal\" (UID: \"eeeae1dda6131e6cc2d2b873cb53b9f0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-76.ec2.internal" Apr 16 14:52:46.575010 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.574963 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/fa97968d5b4634bd4f9419795593b093-config\") pod \"kube-apiserver-proxy-ip-10-0-129-76.ec2.internal\" (UID: \"fa97968d5b4634bd4f9419795593b093\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-76.ec2.internal" Apr 16 14:52:46.575010 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.574983 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/eeeae1dda6131e6cc2d2b873cb53b9f0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-76.ec2.internal\" (UID: \"eeeae1dda6131e6cc2d2b873cb53b9f0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-76.ec2.internal" Apr 16 14:52:46.575089 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.575038 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eeeae1dda6131e6cc2d2b873cb53b9f0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-76.ec2.internal\" (UID: \"eeeae1dda6131e6cc2d2b873cb53b9f0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-76.ec2.internal" Apr 16 14:52:46.575089 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.575058 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/eeeae1dda6131e6cc2d2b873cb53b9f0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-76.ec2.internal\" (UID: \"eeeae1dda6131e6cc2d2b873cb53b9f0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-76.ec2.internal" Apr 16 14:52:46.575089 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.575041 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fa97968d5b4634bd4f9419795593b093-config\") pod \"kube-apiserver-proxy-ip-10-0-129-76.ec2.internal\" (UID: \"fa97968d5b4634bd4f9419795593b093\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-129-76.ec2.internal" Apr 16 14:52:46.653338 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:46.653302 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-76.ec2.internal\" not found" Apr 16 14:52:46.729830 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.729810 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-76.ec2.internal" Apr 16 14:52:46.732400 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.732380 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-76.ec2.internal" Apr 16 14:52:46.753403 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:46.753367 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-76.ec2.internal\" not found" Apr 16 14:52:46.853948 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:46.853853 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-76.ec2.internal\" not found" Apr 16 14:52:46.954384 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:46.954337 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-76.ec2.internal\" not found" Apr 16 14:52:46.969883 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.969854 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:46.971696 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.971680 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-76.ec2.internal" Apr 16 14:52:46.981838 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.981816 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in 
surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 14:52:46.983564 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.983548 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-76.ec2.internal" Apr 16 14:52:46.994011 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:46.993990 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 14:52:47.047126 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.047098 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 14:52:47.047657 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.047245 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 14:52:47.047657 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.047274 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 14:52:47.047657 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.047276 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 14:52:47.142770 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.142710 2576 apiserver.go:52] "Watching apiserver" Apr 16 14:52:47.158720 ip-10-0-129-76 kubenswrapper[2576]: 
I0416 14:52:47.158695 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 14:52:47.159156 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.159131 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-tb9c9","kube-system/kube-apiserver-proxy-ip-10-0-129-76.ec2.internal","openshift-cluster-node-tuning-operator/tuned-btrdx","openshift-dns/node-resolver-mjsr6","openshift-image-registry/node-ca-rxvdm","openshift-multus/multus-additional-cni-plugins-8kcqd","openshift-multus/multus-cdg2t","openshift-network-diagnostics/network-check-target-6nklq","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-76.ec2.internal","openshift-multus/network-metrics-daemon-9p5t7","openshift-network-operator/iptables-alerter-v7nk5","openshift-ovn-kubernetes/ovnkube-node-ddt96"]
Apr 16 14:52:47.163480 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.163447 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 14:47:46 +0000 UTC" deadline="2027-09-18 11:28:19.75853961 +0000 UTC"
Apr 16 14:52:47.163480 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.163479 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12476h35m32.595064905s"
Apr 16 14:52:47.163670 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.163655 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/konnectivity-agent-tb9c9"
Apr 16 14:52:47.166566 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.166108 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 14:52:47.166566 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.166153 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.166566 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.166213 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 14:52:47.166782 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.166571 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mjsr6"
Apr 16 14:52:47.166782 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.166587 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-ptp4g\""
Apr 16 14:52:47.168918 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.168860 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:52:47.168918 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.168863 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 14:52:47.169094 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.168928 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-6795f\""
Apr 16 14:52:47.169094 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.168963 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 14:52:47.169094 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.169003 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-t52mz\""
Apr 16 14:52:47.169241 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.169091 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 14:52:47.169316 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.169300 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rxvdm"
Apr 16 14:52:47.169881 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.169868 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 14:52:47.171233 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.171202 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 14:52:47.171300 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.171207 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 14:52:47.171417 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.171402 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 14:52:47.171471 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.171455 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8kcqd"
Apr 16 14:52:47.171549 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.171536 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-r2ffp\""
Apr 16 14:52:47.173516 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.173499 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 14:52:47.173593 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.173569 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.173712 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.173691 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 14:52:47.173878 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.173859 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 14:52:47.173974 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.173886 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 14:52:47.173974 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.173931 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 14:52:47.173974 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.173932 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zggwx\""
Apr 16 14:52:47.175434 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.175417 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 14:52:47.175525 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.175490 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-6s4dj\""
Apr 16 14:52:47.175738 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.175725 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6nklq"
Apr 16 14:52:47.175790 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:47.175780 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6nklq" podUID="d6e0e8e5-d659-4175-b96f-52c250d77fd0"
Apr 16 14:52:47.177723 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.177704 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-etc-modprobe-d\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.177833 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.177736 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-run\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.177833 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.177765 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName:
\"kubernetes.io/host-path/31294a51-df01-4523-afff-845ceb6be0cc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8kcqd\" (UID: \"31294a51-df01-4523-afff-845ceb6be0cc\") " pod="openshift-multus/multus-additional-cni-plugins-8kcqd"
Apr 16 14:52:47.177833 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.177789 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-system-cni-dir\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.177968 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.177832 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a6e086d8-b850-425d-9896-6df3cec2442b-agent-certs\") pod \"konnectivity-agent-tb9c9\" (UID: \"a6e086d8-b850-425d-9896-6df3cec2442b\") " pod="kube-system/konnectivity-agent-tb9c9"
Apr 16 14:52:47.177968 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.177833 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t"
Apr 16 14:52:47.178123 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.177857 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a6e086d8-b850-425d-9896-6df3cec2442b-konnectivity-ca\") pod \"konnectivity-agent-tb9c9\" (UID: \"a6e086d8-b850-425d-9896-6df3cec2442b\") " pod="kube-system/konnectivity-agent-tb9c9"
Apr 16 14:52:47.178213 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178134 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31294a51-df01-4523-afff-845ceb6be0cc-cni-binary-copy\") pod \"multus-additional-cni-plugins-8kcqd\" (UID: \"31294a51-df01-4523-afff-845ceb6be0cc\") " pod="openshift-multus/multus-additional-cni-plugins-8kcqd"
Apr 16 14:52:47.178213 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178165 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd9ts\" (UniqueName: \"kubernetes.io/projected/31294a51-df01-4523-afff-845ceb6be0cc-kube-api-access-sd9ts\") pod \"multus-additional-cni-plugins-8kcqd\" (UID: \"31294a51-df01-4523-afff-845ceb6be0cc\") " pod="openshift-multus/multus-additional-cni-plugins-8kcqd"
Apr 16 14:52:47.178318 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178209 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-etc-sysctl-d\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.178318 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178243 2576 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/281d16c8-10bf-4c91-91f2-472d3584db2f-etc-tuned\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.178318 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178268 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/69a10374-32da-4de3-b491-3854f69f1613-tmp-dir\") pod \"node-resolver-mjsr6\" (UID: \"69a10374-32da-4de3-b491-3854f69f1613\") " pod="openshift-dns/node-resolver-mjsr6"
Apr 16 14:52:47.178318 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178294 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9mmc\" (UniqueName: \"kubernetes.io/projected/69a10374-32da-4de3-b491-3854f69f1613-kube-api-access-r9mmc\") pod \"node-resolver-mjsr6\" (UID: \"69a10374-32da-4de3-b491-3854f69f1613\") " pod="openshift-dns/node-resolver-mjsr6"
Apr 16 14:52:47.178478 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178320 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-multus-cni-dir\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.178478 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178345 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-multus-conf-dir\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.178478 ip-10-0-129-76 kubenswrapper[2576]: I0416
14:52:47.178372 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-multus-daemon-config\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.178478 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178397 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-etc-systemd\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.178478 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178420 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-sys\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.178478 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178463 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab74fce9-eb83-4941-97e9-42f6ed125bf5-host\") pod \"node-ca-rxvdm\" (UID: \"ab74fce9-eb83-4941-97e9-42f6ed125bf5\") " pod="openshift-image-registry/node-ca-rxvdm"
Apr 16 14:52:47.178749 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178496 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31294a51-df01-4523-afff-845ceb6be0cc-cnibin\") pod \"multus-additional-cni-plugins-8kcqd\" (UID: \"31294a51-df01-4523-afff-845ceb6be0cc\") " pod="openshift-multus/multus-additional-cni-plugins-8kcqd"
Apr 16 14:52:47.178749 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178525 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-cnibin\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.178749 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178549 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-cni-binary-copy\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.178749 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178572 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-etc-kubernetes\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.178749 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178598 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-etc-sysconfig\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.178749 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178621 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ab74fce9-eb83-4941-97e9-42f6ed125bf5-serviceca\") pod \"node-ca-rxvdm\" (UID: \"ab74fce9-eb83-4941-97e9-42f6ed125bf5\") "
pod="openshift-image-registry/node-ca-rxvdm"
Apr 16 14:52:47.178749 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178645 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31294a51-df01-4523-afff-845ceb6be0cc-os-release\") pod \"multus-additional-cni-plugins-8kcqd\" (UID: \"31294a51-df01-4523-afff-845ceb6be0cc\") " pod="openshift-multus/multus-additional-cni-plugins-8kcqd"
Apr 16 14:52:47.178749 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178669 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31294a51-df01-4523-afff-845ceb6be0cc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8kcqd\" (UID: \"31294a51-df01-4523-afff-845ceb6be0cc\") " pod="openshift-multus/multus-additional-cni-plugins-8kcqd"
Apr 16 14:52:47.178749 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178703 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-host-run-netns\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.178749 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178735 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-host-var-lib-cni-multus\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.179148 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178774 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName:
\"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-host-var-lib-kubelet\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.179148 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178799 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-lib-modules\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.179148 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178822 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-etc-kubernetes\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.179148 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178855 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-host\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.179148 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178882 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d67wt\" (UniqueName: \"kubernetes.io/projected/281d16c8-10bf-4c91-91f2-472d3584db2f-kube-api-access-d67wt\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.179148 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178908 2576 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/69a10374-32da-4de3-b491-3854f69f1613-hosts-file\") pod \"node-resolver-mjsr6\" (UID: \"69a10374-32da-4de3-b491-3854f69f1613\") " pod="openshift-dns/node-resolver-mjsr6"
Apr 16 14:52:47.179148 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178945 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9sn9\" (UniqueName: \"kubernetes.io/projected/ab74fce9-eb83-4941-97e9-42f6ed125bf5-kube-api-access-k9sn9\") pod \"node-ca-rxvdm\" (UID: \"ab74fce9-eb83-4941-97e9-42f6ed125bf5\") " pod="openshift-image-registry/node-ca-rxvdm"
Apr 16 14:52:47.179148 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.178977 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-hostroot\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.179148 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.179001 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/281d16c8-10bf-4c91-91f2-472d3584db2f-tmp\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.179148 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.179057 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-os-release\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.179148 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.179103 2576
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-multus-socket-dir-parent\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.179148 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.179142 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-host-var-lib-cni-bin\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.179648 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.179168 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-host-run-multus-certs\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.179648 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.179197 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t72sz\" (UniqueName: \"kubernetes.io/projected/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-kube-api-access-t72sz\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.179648 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.179230 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-etc-sysctl-conf\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") "
pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.179648 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.179256 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-var-lib-kubelet\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.179648 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.179281 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31294a51-df01-4523-afff-845ceb6be0cc-system-cni-dir\") pod \"multus-additional-cni-plugins-8kcqd\" (UID: \"31294a51-df01-4523-afff-845ceb6be0cc\") " pod="openshift-multus/multus-additional-cni-plugins-8kcqd"
Apr 16 14:52:47.179648 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.179320 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/31294a51-df01-4523-afff-845ceb6be0cc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8kcqd\" (UID: \"31294a51-df01-4523-afff-845ceb6be0cc\") " pod="openshift-multus/multus-additional-cni-plugins-8kcqd"
Apr 16 14:52:47.179648 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.179356 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-host-run-k8s-cni-cncf-io\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.179968 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.179934 2576 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 14:52:47.180072 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.179976 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-6cbql\""
Apr 16 14:52:47.180072 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.180010 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 14:52:47.180245 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.180225 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 14:52:47.180310 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.180245 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 14:52:47.182418 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.182391 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p5t7"
Apr 16 14:52:47.182519 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:47.182459 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p5t7" podUID="deecc941-e868-4306-99e5-4f30afef0f95"
Apr 16 14:52:47.182519 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.182499 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-operator/iptables-alerter-v7nk5"
Apr 16 14:52:47.184637 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.184617 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:52:47.184722 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.184663 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 14:52:47.184722 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.184701 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2h4df\""
Apr 16 14:52:47.184827 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.184816 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 14:52:47.185010 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.184991 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.187010 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.186993 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 14:52:47.187164 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.187150 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 14:52:47.187241 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.187220 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 14:52:47.187299 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.187234 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 14:52:47.187299 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.187293 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-tqtvc\""
Apr 16 14:52:47.187478 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.187465 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 14:52:47.187683 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.187670 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 14:52:47.200998 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.200979 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-mz9cc"
Apr 16 14:52:47.209979 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.209903 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-mz9cc"
Apr 16 14:52:47.272377 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.272356 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 14:52:47.280218 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280193 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-host-var-lib-kubelet\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t" Apr 16 14:52:47.280321 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280234 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-etc-kubernetes\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx" Apr 16 14:52:47.280321 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280260 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-host\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx" Apr 16 14:52:47.280321 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280285 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d67wt\" (UniqueName: \"kubernetes.io/projected/281d16c8-10bf-4c91-91f2-472d3584db2f-kube-api-access-d67wt\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx" Apr 16 14:52:47.280321 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280302 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-host-var-lib-kubelet\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t" Apr 16 14:52:47.280321 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280312 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/69a10374-32da-4de3-b491-3854f69f1613-hosts-file\") pod \"node-resolver-mjsr6\" (UID: \"69a10374-32da-4de3-b491-3854f69f1613\") " pod="openshift-dns/node-resolver-mjsr6" Apr 16 14:52:47.280524 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280376 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/69a10374-32da-4de3-b491-3854f69f1613-hosts-file\") pod \"node-resolver-mjsr6\" (UID: \"69a10374-32da-4de3-b491-3854f69f1613\") " pod="openshift-dns/node-resolver-mjsr6" Apr 16 14:52:47.280524 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280384 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-etc-kubernetes\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx" Apr 16 14:52:47.280524 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280375 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-host\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx" Apr 16 14:52:47.280524 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280385 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9sn9\" (UniqueName: 
\"kubernetes.io/projected/ab74fce9-eb83-4941-97e9-42f6ed125bf5-kube-api-access-k9sn9\") pod \"node-ca-rxvdm\" (UID: \"ab74fce9-eb83-4941-97e9-42f6ed125bf5\") " pod="openshift-image-registry/node-ca-rxvdm" Apr 16 14:52:47.280524 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280431 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t45ww\" (UniqueName: \"kubernetes.io/projected/d6e0e8e5-d659-4175-b96f-52c250d77fd0-kube-api-access-t45ww\") pod \"network-check-target-6nklq\" (UID: \"d6e0e8e5-d659-4175-b96f-52c250d77fd0\") " pod="openshift-network-diagnostics/network-check-target-6nklq" Apr 16 14:52:47.280524 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280457 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-run-ovn\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.280524 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280497 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-host-cni-netd\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.280825 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280541 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/281d16c8-10bf-4c91-91f2-472d3584db2f-tmp\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx" Apr 16 14:52:47.280825 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280560 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-multus-socket-dir-parent\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t" Apr 16 14:52:47.280825 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-host-var-lib-cni-bin\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t" Apr 16 14:52:47.280825 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280607 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9tsb\" (UniqueName: \"kubernetes.io/projected/13941107-91c6-410e-a282-6657d7c5de03-kube-api-access-s9tsb\") pod \"aws-ebs-csi-driver-node-zpz4t\" (UID: \"13941107-91c6-410e-a282-6657d7c5de03\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t" Apr 16 14:52:47.280825 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280648 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-multus-socket-dir-parent\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t" Apr 16 14:52:47.280825 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280653 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-host-var-lib-cni-bin\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t" Apr 16 14:52:47.280825 ip-10-0-129-76 kubenswrapper[2576]: I0416 
14:52:47.280690 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqjkz\" (UniqueName: \"kubernetes.io/projected/85fdf4e1-8738-483f-a40e-a9112c7098d5-kube-api-access-lqjkz\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.280825 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280725 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-etc-sysctl-conf\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx" Apr 16 14:52:47.280825 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280752 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/31294a51-df01-4523-afff-845ceb6be0cc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8kcqd\" (UID: \"31294a51-df01-4523-afff-845ceb6be0cc\") " pod="openshift-multus/multus-additional-cni-plugins-8kcqd" Apr 16 14:52:47.280825 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280778 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-host-run-k8s-cni-cncf-io\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t" Apr 16 14:52:47.280825 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280807 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/13941107-91c6-410e-a282-6657d7c5de03-device-dir\") pod \"aws-ebs-csi-driver-node-zpz4t\" (UID: 
\"13941107-91c6-410e-a282-6657d7c5de03\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t" Apr 16 14:52:47.281236 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280845 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 14:52:47.281236 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280854 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-host-run-k8s-cni-cncf-io\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t" Apr 16 14:52:47.281236 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280857 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/85fdf4e1-8738-483f-a40e-a9112c7098d5-ovnkube-config\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.281236 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280865 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-etc-sysctl-conf\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx" Apr 16 14:52:47.281236 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280907 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-run\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx" Apr 16 
14:52:47.281236 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280929 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31294a51-df01-4523-afff-845ceb6be0cc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8kcqd\" (UID: \"31294a51-df01-4523-afff-845ceb6be0cc\") " pod="openshift-multus/multus-additional-cni-plugins-8kcqd" Apr 16 14:52:47.281236 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280952 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs\") pod \"network-metrics-daemon-9p5t7\" (UID: \"deecc941-e868-4306-99e5-4f30afef0f95\") " pod="openshift-multus/network-metrics-daemon-9p5t7" Apr 16 14:52:47.281236 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280969 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-var-lib-openvswitch\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.281236 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280980 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-run\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx" Apr 16 14:52:47.281236 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.280993 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-node-log\") pod \"ovnkube-node-ddt96\" (UID: 
\"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.281236 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281098 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.281236 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281118 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/85fdf4e1-8738-483f-a40e-a9112c7098d5-ovnkube-script-lib\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.281236 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281134 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31294a51-df01-4523-afff-845ceb6be0cc-cni-binary-copy\") pod \"multus-additional-cni-plugins-8kcqd\" (UID: \"31294a51-df01-4523-afff-845ceb6be0cc\") " pod="openshift-multus/multus-additional-cni-plugins-8kcqd" Apr 16 14:52:47.281236 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281149 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-systemd-units\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.281922 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281251 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-run-systemd\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.281922 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281277 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/281d16c8-10bf-4c91-91f2-472d3584db2f-etc-tuned\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx" Apr 16 14:52:47.281922 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281307 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/69a10374-32da-4de3-b491-3854f69f1613-tmp-dir\") pod \"node-resolver-mjsr6\" (UID: \"69a10374-32da-4de3-b491-3854f69f1613\") " pod="openshift-dns/node-resolver-mjsr6" Apr 16 14:52:47.281922 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281339 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-multus-cni-dir\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t" Apr 16 14:52:47.281922 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281365 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/31294a51-df01-4523-afff-845ceb6be0cc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8kcqd\" (UID: \"31294a51-df01-4523-afff-845ceb6be0cc\") " pod="openshift-multus/multus-additional-cni-plugins-8kcqd" Apr 16 14:52:47.281922 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281371 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-multus-conf-dir\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t" Apr 16 14:52:47.281922 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281420 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-multus-cni-dir\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t" Apr 16 14:52:47.281922 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281421 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-multus-conf-dir\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t" Apr 16 14:52:47.281922 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281440 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/616aaf5a-f208-4fe5-97a1-96f1815fe9ac-iptables-alerter-script\") pod \"iptables-alerter-v7nk5\" (UID: \"616aaf5a-f208-4fe5-97a1-96f1815fe9ac\") " pod="openshift-network-operator/iptables-alerter-v7nk5" Apr 16 14:52:47.281922 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281472 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-host-cni-bin\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.281922 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281502 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-etc-systemd\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx" Apr 16 14:52:47.281922 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281529 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-cnibin\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t" Apr 16 14:52:47.281922 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281555 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/13941107-91c6-410e-a282-6657d7c5de03-sys-fs\") pod \"aws-ebs-csi-driver-node-zpz4t\" (UID: \"13941107-91c6-410e-a282-6657d7c5de03\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t" Apr 16 14:52:47.281922 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281580 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-etc-sysconfig\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx" Apr 16 14:52:47.281922 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281586 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31294a51-df01-4523-afff-845ceb6be0cc-cni-binary-copy\") pod \"multus-additional-cni-plugins-8kcqd\" (UID: \"31294a51-df01-4523-afff-845ceb6be0cc\") " pod="openshift-multus/multus-additional-cni-plugins-8kcqd" Apr 16 14:52:47.281922 ip-10-0-129-76 kubenswrapper[2576]: I0416 
14:52:47.281591 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-etc-systemd\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx" Apr 16 14:52:47.281922 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281605 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-host-var-lib-cni-multus\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t" Apr 16 14:52:47.282788 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281630 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-lib-modules\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx" Apr 16 14:52:47.282788 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281654 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-hostroot\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t" Apr 16 14:52:47.282788 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281657 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-cnibin\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t" Apr 16 14:52:47.282788 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281681 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/85fdf4e1-8738-483f-a40e-a9112c7098d5-env-overrides\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.282788 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281712 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/85fdf4e1-8738-483f-a40e-a9112c7098d5-ovn-node-metrics-cert\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.282788 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281728 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-host-var-lib-cni-multus\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t" Apr 16 14:52:47.282788 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281738 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-os-release\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t" Apr 16 14:52:47.282788 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281767 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-host-run-multus-certs\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t" Apr 16 14:52:47.282788 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281775 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-hostroot\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t" Apr 16 14:52:47.282788 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281799 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t72sz\" (UniqueName: \"kubernetes.io/projected/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-kube-api-access-t72sz\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t" Apr 16 14:52:47.282788 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281814 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-etc-sysconfig\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx" Apr 16 14:52:47.282788 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281836 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-host-kubelet\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.282788 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281860 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-host-slash\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.282788 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281891 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-etc-openvswitch\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.282788 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281904 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-lib-modules\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx" Apr 16 14:52:47.282788 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281923 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-var-lib-kubelet\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx" Apr 16 14:52:47.282788 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281953 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31294a51-df01-4523-afff-845ceb6be0cc-system-cni-dir\") pod \"multus-additional-cni-plugins-8kcqd\" (UID: \"31294a51-df01-4523-afff-845ceb6be0cc\") " pod="openshift-multus/multus-additional-cni-plugins-8kcqd" Apr 16 14:52:47.282788 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.281954 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-host-run-multus-certs\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t" Apr 16 14:52:47.283616 ip-10-0-129-76 
kubenswrapper[2576]: I0416 14:52:47.281727 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/69a10374-32da-4de3-b491-3854f69f1613-tmp-dir\") pod \"node-resolver-mjsr6\" (UID: \"69a10374-32da-4de3-b491-3854f69f1613\") " pod="openshift-dns/node-resolver-mjsr6"
Apr 16 14:52:47.283616 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.282073 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-os-release\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.283616 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.282132 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31294a51-df01-4523-afff-845ceb6be0cc-system-cni-dir\") pod \"multus-additional-cni-plugins-8kcqd\" (UID: \"31294a51-df01-4523-afff-845ceb6be0cc\") " pod="openshift-multus/multus-additional-cni-plugins-8kcqd"
Apr 16 14:52:47.283616 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.282183 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-var-lib-kubelet\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.283616 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.282191 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31294a51-df01-4523-afff-845ceb6be0cc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8kcqd\" (UID: \"31294a51-df01-4523-afff-845ceb6be0cc\") " pod="openshift-multus/multus-additional-cni-plugins-8kcqd"
Apr 16 14:52:47.283616 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.282213 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/13941107-91c6-410e-a282-6657d7c5de03-registration-dir\") pod \"aws-ebs-csi-driver-node-zpz4t\" (UID: \"13941107-91c6-410e-a282-6657d7c5de03\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t"
Apr 16 14:52:47.283616 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.282246 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/13941107-91c6-410e-a282-6657d7c5de03-etc-selinux\") pod \"aws-ebs-csi-driver-node-zpz4t\" (UID: \"13941107-91c6-410e-a282-6657d7c5de03\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t"
Apr 16 14:52:47.283616 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.282272 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-host-run-netns\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.283616 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.282312 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-etc-modprobe-d\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.283616 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.282350 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-system-cni-dir\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.283616 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.282371 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/616aaf5a-f208-4fe5-97a1-96f1815fe9ac-host-slash\") pod \"iptables-alerter-v7nk5\" (UID: \"616aaf5a-f208-4fe5-97a1-96f1815fe9ac\") " pod="openshift-network-operator/iptables-alerter-v7nk5"
Apr 16 14:52:47.283616 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.282393 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-run-openvswitch\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.283616 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.282420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a6e086d8-b850-425d-9896-6df3cec2442b-agent-certs\") pod \"konnectivity-agent-tb9c9\" (UID: \"a6e086d8-b850-425d-9896-6df3cec2442b\") " pod="kube-system/konnectivity-agent-tb9c9"
Apr 16 14:52:47.283616 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.282473 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-etc-modprobe-d\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.283616 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.282485 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a6e086d8-b850-425d-9896-6df3cec2442b-konnectivity-ca\") pod \"konnectivity-agent-tb9c9\" (UID: \"a6e086d8-b850-425d-9896-6df3cec2442b\") " pod="kube-system/konnectivity-agent-tb9c9"
Apr 16 14:52:47.283616 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.282537 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sd9ts\" (UniqueName: \"kubernetes.io/projected/31294a51-df01-4523-afff-845ceb6be0cc-kube-api-access-sd9ts\") pod \"multus-additional-cni-plugins-8kcqd\" (UID: \"31294a51-df01-4523-afff-845ceb6be0cc\") " pod="openshift-multus/multus-additional-cni-plugins-8kcqd"
Apr 16 14:52:47.283616 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.282557 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-system-cni-dir\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.284412 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.282568 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-etc-sysctl-d\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.284412 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.282593 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9mmc\" (UniqueName: \"kubernetes.io/projected/69a10374-32da-4de3-b491-3854f69f1613-kube-api-access-r9mmc\") pod \"node-resolver-mjsr6\" (UID: \"69a10374-32da-4de3-b491-3854f69f1613\") " pod="openshift-dns/node-resolver-mjsr6"
Apr 16 14:52:47.284412 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.283089 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-multus-daemon-config\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.284412 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.283120 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a6e086d8-b850-425d-9896-6df3cec2442b-konnectivity-ca\") pod \"konnectivity-agent-tb9c9\" (UID: \"a6e086d8-b850-425d-9896-6df3cec2442b\") " pod="kube-system/konnectivity-agent-tb9c9"
Apr 16 14:52:47.284412 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.283136 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khnql\" (UniqueName: \"kubernetes.io/projected/616aaf5a-f208-4fe5-97a1-96f1815fe9ac-kube-api-access-khnql\") pod \"iptables-alerter-v7nk5\" (UID: \"616aaf5a-f208-4fe5-97a1-96f1815fe9ac\") " pod="openshift-network-operator/iptables-alerter-v7nk5"
Apr 16 14:52:47.284412 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.283188 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wms4n\" (UniqueName: \"kubernetes.io/projected/deecc941-e868-4306-99e5-4f30afef0f95-kube-api-access-wms4n\") pod \"network-metrics-daemon-9p5t7\" (UID: \"deecc941-e868-4306-99e5-4f30afef0f95\") " pod="openshift-multus/network-metrics-daemon-9p5t7"
Apr 16 14:52:47.284412 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.283218 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-log-socket\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.284412 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.283239 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-etc-sysctl-d\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.284412 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.283760 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-multus-daemon-config\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.284412 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.283850 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-host-run-ovn-kubernetes\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.284412 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.283897 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-sys\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.284412 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.283920 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab74fce9-eb83-4941-97e9-42f6ed125bf5-host\") pod \"node-ca-rxvdm\" (UID: \"ab74fce9-eb83-4941-97e9-42f6ed125bf5\") " pod="openshift-image-registry/node-ca-rxvdm"
Apr 16 14:52:47.284412 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.283948 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31294a51-df01-4523-afff-845ceb6be0cc-cnibin\") pod \"multus-additional-cni-plugins-8kcqd\" (UID: \"31294a51-df01-4523-afff-845ceb6be0cc\") " pod="openshift-multus/multus-additional-cni-plugins-8kcqd"
Apr 16 14:52:47.284412 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.283971 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-cni-binary-copy\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.284412 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.283989 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab74fce9-eb83-4941-97e9-42f6ed125bf5-host\") pod \"node-ca-rxvdm\" (UID: \"ab74fce9-eb83-4941-97e9-42f6ed125bf5\") " pod="openshift-image-registry/node-ca-rxvdm"
Apr 16 14:52:47.284412 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.283972 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/281d16c8-10bf-4c91-91f2-472d3584db2f-sys\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.284412 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.283998 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-etc-kubernetes\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.285085 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.284062 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13941107-91c6-410e-a282-6657d7c5de03-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zpz4t\" (UID: \"13941107-91c6-410e-a282-6657d7c5de03\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t"
Apr 16 14:52:47.285085 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.284086 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/281d16c8-10bf-4c91-91f2-472d3584db2f-tmp\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.285085 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.284092 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/13941107-91c6-410e-a282-6657d7c5de03-socket-dir\") pod \"aws-ebs-csi-driver-node-zpz4t\" (UID: \"13941107-91c6-410e-a282-6657d7c5de03\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t"
Apr 16 14:52:47.285085 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.284089 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-etc-kubernetes\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.285085 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.284130 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ab74fce9-eb83-4941-97e9-42f6ed125bf5-serviceca\") pod \"node-ca-rxvdm\" (UID: \"ab74fce9-eb83-4941-97e9-42f6ed125bf5\") " pod="openshift-image-registry/node-ca-rxvdm"
Apr 16 14:52:47.285085 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.284166 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31294a51-df01-4523-afff-845ceb6be0cc-os-release\") pod \"multus-additional-cni-plugins-8kcqd\" (UID: \"31294a51-df01-4523-afff-845ceb6be0cc\") " pod="openshift-multus/multus-additional-cni-plugins-8kcqd"
Apr 16 14:52:47.285085 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.284124 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31294a51-df01-4523-afff-845ceb6be0cc-cnibin\") pod \"multus-additional-cni-plugins-8kcqd\" (UID: \"31294a51-df01-4523-afff-845ceb6be0cc\") " pod="openshift-multus/multus-additional-cni-plugins-8kcqd"
Apr 16 14:52:47.285085 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.284195 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31294a51-df01-4523-afff-845ceb6be0cc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8kcqd\" (UID: \"31294a51-df01-4523-afff-845ceb6be0cc\") " pod="openshift-multus/multus-additional-cni-plugins-8kcqd"
Apr 16 14:52:47.285085 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.284249 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-host-run-netns\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.285085 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.284225 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/281d16c8-10bf-4c91-91f2-472d3584db2f-etc-tuned\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.285085 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.284331 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-host-run-netns\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.285085 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.284352 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31294a51-df01-4523-afff-845ceb6be0cc-os-release\") pod \"multus-additional-cni-plugins-8kcqd\" (UID: \"31294a51-df01-4523-afff-845ceb6be0cc\") " pod="openshift-multus/multus-additional-cni-plugins-8kcqd"
Apr 16 14:52:47.285085 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.284505 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-cni-binary-copy\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.285085 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.284585 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ab74fce9-eb83-4941-97e9-42f6ed125bf5-serviceca\") pod \"node-ca-rxvdm\" (UID: \"ab74fce9-eb83-4941-97e9-42f6ed125bf5\") " pod="openshift-image-registry/node-ca-rxvdm"
Apr 16 14:52:47.285085 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.284617 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31294a51-df01-4523-afff-845ceb6be0cc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8kcqd\" (UID: \"31294a51-df01-4523-afff-845ceb6be0cc\") " pod="openshift-multus/multus-additional-cni-plugins-8kcqd"
Apr 16 14:52:47.285085 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.285042 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a6e086d8-b850-425d-9896-6df3cec2442b-agent-certs\") pod \"konnectivity-agent-tb9c9\" (UID: \"a6e086d8-b850-425d-9896-6df3cec2442b\") " pod="kube-system/konnectivity-agent-tb9c9"
Apr 16 14:52:47.288801 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.288776 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9sn9\" (UniqueName: \"kubernetes.io/projected/ab74fce9-eb83-4941-97e9-42f6ed125bf5-kube-api-access-k9sn9\") pod \"node-ca-rxvdm\" (UID: \"ab74fce9-eb83-4941-97e9-42f6ed125bf5\") " pod="openshift-image-registry/node-ca-rxvdm"
Apr 16 14:52:47.289103 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.289091 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d67wt\" (UniqueName: \"kubernetes.io/projected/281d16c8-10bf-4c91-91f2-472d3584db2f-kube-api-access-d67wt\") pod \"tuned-btrdx\" (UID: \"281d16c8-10bf-4c91-91f2-472d3584db2f\") " pod="openshift-cluster-node-tuning-operator/tuned-btrdx"
Apr 16 14:52:47.291524 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.291499 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t72sz\" (UniqueName: \"kubernetes.io/projected/a36d1747-2a52-4941-aa0e-8d1fe90b9b00-kube-api-access-t72sz\") pod \"multus-cdg2t\" (UID: \"a36d1747-2a52-4941-aa0e-8d1fe90b9b00\") " pod="openshift-multus/multus-cdg2t"
Apr 16 14:52:47.291856 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.291836 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9mmc\" (UniqueName: \"kubernetes.io/projected/69a10374-32da-4de3-b491-3854f69f1613-kube-api-access-r9mmc\") pod \"node-resolver-mjsr6\" (UID: \"69a10374-32da-4de3-b491-3854f69f1613\") " pod="openshift-dns/node-resolver-mjsr6"
Apr 16 14:52:47.292033 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.292001 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd9ts\" (UniqueName: \"kubernetes.io/projected/31294a51-df01-4523-afff-845ceb6be0cc-kube-api-access-sd9ts\") pod \"multus-additional-cni-plugins-8kcqd\" (UID: \"31294a51-df01-4523-afff-845ceb6be0cc\") " pod="openshift-multus/multus-additional-cni-plugins-8kcqd"
Apr 16 14:52:47.301604 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:47.301580 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeeeae1dda6131e6cc2d2b873cb53b9f0.slice/crio-c9c2060543091746d25cdfe8849f5bdf46a53adb23490f6d6c262ae49ec65973 WatchSource:0}: Error finding container c9c2060543091746d25cdfe8849f5bdf46a53adb23490f6d6c262ae49ec65973: Status 404 returned error can't find the container with id c9c2060543091746d25cdfe8849f5bdf46a53adb23490f6d6c262ae49ec65973
Apr 16 14:52:47.301830 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:47.301811 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa97968d5b4634bd4f9419795593b093.slice/crio-912c84ede68c7eb394df9979954d72a5d93ce1c3f705e013d6caf0aa6e3e6a96 WatchSource:0}: Error finding container 912c84ede68c7eb394df9979954d72a5d93ce1c3f705e013d6caf0aa6e3e6a96: Status 404 returned error can't find the container with id 912c84ede68c7eb394df9979954d72a5d93ce1c3f705e013d6caf0aa6e3e6a96
Apr 16 14:52:47.306360 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.306260 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-76.ec2.internal" event={"ID":"fa97968d5b4634bd4f9419795593b093","Type":"ContainerStarted","Data":"912c84ede68c7eb394df9979954d72a5d93ce1c3f705e013d6caf0aa6e3e6a96"}
Apr 16 14:52:47.306454 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.306270 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 14:52:47.309585 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.309558 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-76.ec2.internal" event={"ID":"eeeae1dda6131e6cc2d2b873cb53b9f0","Type":"ContainerStarted","Data":"c9c2060543091746d25cdfe8849f5bdf46a53adb23490f6d6c262ae49ec65973"}
Apr 16 14:52:47.385239 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385208 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9tsb\" (UniqueName: \"kubernetes.io/projected/13941107-91c6-410e-a282-6657d7c5de03-kube-api-access-s9tsb\") pod \"aws-ebs-csi-driver-node-zpz4t\" (UID: \"13941107-91c6-410e-a282-6657d7c5de03\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t"
Apr 16 14:52:47.385239 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385240 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqjkz\" (UniqueName: \"kubernetes.io/projected/85fdf4e1-8738-483f-a40e-a9112c7098d5-kube-api-access-lqjkz\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.385447 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385258 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/13941107-91c6-410e-a282-6657d7c5de03-device-dir\") pod \"aws-ebs-csi-driver-node-zpz4t\" (UID: \"13941107-91c6-410e-a282-6657d7c5de03\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t"
Apr 16 14:52:47.385447 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385381 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/85fdf4e1-8738-483f-a40e-a9112c7098d5-ovnkube-config\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.385447 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385426 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs\") pod \"network-metrics-daemon-9p5t7\" (UID: \"deecc941-e868-4306-99e5-4f30afef0f95\") " pod="openshift-multus/network-metrics-daemon-9p5t7"
Apr 16 14:52:47.385447 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385424 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/13941107-91c6-410e-a282-6657d7c5de03-device-dir\") pod \"aws-ebs-csi-driver-node-zpz4t\" (UID: \"13941107-91c6-410e-a282-6657d7c5de03\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t"
Apr 16 14:52:47.385578 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:47.385512 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:47.385578 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385542 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-var-lib-openvswitch\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.385578 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385572 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-node-log\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.385709 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:47.385612 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs podName:deecc941-e868-4306-99e5-4f30afef0f95 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:47.885579816 +0000 UTC m=+2.304714788 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs") pod "network-metrics-daemon-9p5t7" (UID: "deecc941-e868-4306-99e5-4f30afef0f95") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:47.385709 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385617 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-var-lib-openvswitch\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.385709 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385641 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.385709 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385656 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-node-log\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.385709 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385676 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.385709 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385696 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/85fdf4e1-8738-483f-a40e-a9112c7098d5-ovnkube-script-lib\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.385984 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385727 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-systemd-units\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.385984 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385742 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-run-systemd\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.385984 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385760 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/616aaf5a-f208-4fe5-97a1-96f1815fe9ac-iptables-alerter-script\") pod \"iptables-alerter-v7nk5\" (UID: \"616aaf5a-f208-4fe5-97a1-96f1815fe9ac\") " pod="openshift-network-operator/iptables-alerter-v7nk5"
Apr 16 14:52:47.385984 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385785 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-systemd-units\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.385984 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385794 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-run-systemd\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.385984 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385818 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-host-cni-bin\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.385984 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385840 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/13941107-91c6-410e-a282-6657d7c5de03-sys-fs\") pod \"aws-ebs-csi-driver-node-zpz4t\" (UID: \"13941107-91c6-410e-a282-6657d7c5de03\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t"
Apr 16 14:52:47.385984 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385860 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/85fdf4e1-8738-483f-a40e-a9112c7098d5-env-overrides\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.385984 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385885 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/85fdf4e1-8738-483f-a40e-a9112c7098d5-ovn-node-metrics-cert\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.385984 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385895 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-host-cni-bin\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.385984 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385913 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-host-kubelet\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.385984 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385919 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/13941107-91c6-410e-a282-6657d7c5de03-sys-fs\") pod \"aws-ebs-csi-driver-node-zpz4t\" (UID: \"13941107-91c6-410e-a282-6657d7c5de03\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t"
Apr 16 14:52:47.385984 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.385960 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-host-kubelet\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.386588 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386002 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/85fdf4e1-8738-483f-a40e-a9112c7098d5-ovnkube-config\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.386588 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386044 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-host-slash\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.386588 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386073 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-etc-openvswitch\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.386588 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/13941107-91c6-410e-a282-6657d7c5de03-registration-dir\") pod \"aws-ebs-csi-driver-node-zpz4t\" (UID: \"13941107-91c6-410e-a282-6657d7c5de03\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t"
Apr 16 14:52:47.386588 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386101 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-host-slash\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.386588 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386125 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/13941107-91c6-410e-a282-6657d7c5de03-etc-selinux\") pod \"aws-ebs-csi-driver-node-zpz4t\" (UID: \"13941107-91c6-410e-a282-6657d7c5de03\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t"
Apr 16 14:52:47.386588 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386150 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-host-run-netns\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.386588 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386160 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-etc-openvswitch\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.386588 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386172 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/13941107-91c6-410e-a282-6657d7c5de03-registration-dir\") pod \"aws-ebs-csi-driver-node-zpz4t\" (UID: \"13941107-91c6-410e-a282-6657d7c5de03\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t"
Apr 16 14:52:47.386588 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386175 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/616aaf5a-f208-4fe5-97a1-96f1815fe9ac-host-slash\") pod \"iptables-alerter-v7nk5\" (UID: \"616aaf5a-f208-4fe5-97a1-96f1815fe9ac\") " pod="openshift-network-operator/iptables-alerter-v7nk5"
Apr 16 14:52:47.386588 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386211 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-run-openvswitch\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.386588 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386245 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-host-run-netns\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96"
Apr 16 14:52:47.386588 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386259 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khnql\" (UniqueName: \"kubernetes.io/projected/616aaf5a-f208-4fe5-97a1-96f1815fe9ac-kube-api-access-khnql\") pod \"iptables-alerter-v7nk5\" (UID: \"616aaf5a-f208-4fe5-97a1-96f1815fe9ac\") " pod="openshift-network-operator/iptables-alerter-v7nk5"
Apr 16 14:52:47.386588 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386282 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/13941107-91c6-410e-a282-6657d7c5de03-etc-selinux\") pod \"aws-ebs-csi-driver-node-zpz4t\" (UID: \"13941107-91c6-410e-a282-6657d7c5de03\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t"
Apr 16 14:52:47.386588 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386286 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wms4n\" (UniqueName: \"kubernetes.io/projected/deecc941-e868-4306-99e5-4f30afef0f95-kube-api-access-wms4n\") pod \"network-metrics-daemon-9p5t7\" (UID: \"deecc941-e868-4306-99e5-4f30afef0f95\") " 
pod="openshift-multus/network-metrics-daemon-9p5t7" Apr 16 14:52:47.386588 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386292 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/85fdf4e1-8738-483f-a40e-a9112c7098d5-env-overrides\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.386588 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386304 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/616aaf5a-f208-4fe5-97a1-96f1815fe9ac-iptables-alerter-script\") pod \"iptables-alerter-v7nk5\" (UID: \"616aaf5a-f208-4fe5-97a1-96f1815fe9ac\") " pod="openshift-network-operator/iptables-alerter-v7nk5" Apr 16 14:52:47.387116 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386326 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-run-openvswitch\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.387116 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386314 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-log-socket\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.387116 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386294 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/616aaf5a-f208-4fe5-97a1-96f1815fe9ac-host-slash\") pod \"iptables-alerter-v7nk5\" (UID: \"616aaf5a-f208-4fe5-97a1-96f1815fe9ac\") " 
pod="openshift-network-operator/iptables-alerter-v7nk5" Apr 16 14:52:47.387116 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386346 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-log-socket\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.387116 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386337 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/85fdf4e1-8738-483f-a40e-a9112c7098d5-ovnkube-script-lib\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.387116 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386365 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-host-run-ovn-kubernetes\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.387116 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386399 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-host-run-ovn-kubernetes\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.387116 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386401 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13941107-91c6-410e-a282-6657d7c5de03-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zpz4t\" (UID: 
\"13941107-91c6-410e-a282-6657d7c5de03\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t" Apr 16 14:52:47.387116 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386442 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13941107-91c6-410e-a282-6657d7c5de03-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zpz4t\" (UID: \"13941107-91c6-410e-a282-6657d7c5de03\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t" Apr 16 14:52:47.387116 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/13941107-91c6-410e-a282-6657d7c5de03-socket-dir\") pod \"aws-ebs-csi-driver-node-zpz4t\" (UID: \"13941107-91c6-410e-a282-6657d7c5de03\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t" Apr 16 14:52:47.387116 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386504 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t45ww\" (UniqueName: \"kubernetes.io/projected/d6e0e8e5-d659-4175-b96f-52c250d77fd0-kube-api-access-t45ww\") pod \"network-check-target-6nklq\" (UID: \"d6e0e8e5-d659-4175-b96f-52c250d77fd0\") " pod="openshift-network-diagnostics/network-check-target-6nklq" Apr 16 14:52:47.387116 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386531 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-run-ovn\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.387116 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386557 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-host-cni-netd\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.387116 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386585 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-run-ovn\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.387116 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386596 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/13941107-91c6-410e-a282-6657d7c5de03-socket-dir\") pod \"aws-ebs-csi-driver-node-zpz4t\" (UID: \"13941107-91c6-410e-a282-6657d7c5de03\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t" Apr 16 14:52:47.387116 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.386629 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/85fdf4e1-8738-483f-a40e-a9112c7098d5-host-cni-netd\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.388122 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.388101 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/85fdf4e1-8738-483f-a40e-a9112c7098d5-ovn-node-metrics-cert\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.392232 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:47.392207 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:47.392232 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:47.392233 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:47.392366 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:47.392246 2576 projected.go:194] Error preparing data for projected volume kube-api-access-t45ww for pod openshift-network-diagnostics/network-check-target-6nklq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:47.392366 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:47.392304 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6e0e8e5-d659-4175-b96f-52c250d77fd0-kube-api-access-t45ww podName:d6e0e8e5-d659-4175-b96f-52c250d77fd0 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:47.892287542 +0000 UTC m=+2.311422502 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-t45ww" (UniqueName: "kubernetes.io/projected/d6e0e8e5-d659-4175-b96f-52c250d77fd0-kube-api-access-t45ww") pod "network-check-target-6nklq" (UID: "d6e0e8e5-d659-4175-b96f-52c250d77fd0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:47.394214 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.394168 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9tsb\" (UniqueName: \"kubernetes.io/projected/13941107-91c6-410e-a282-6657d7c5de03-kube-api-access-s9tsb\") pod \"aws-ebs-csi-driver-node-zpz4t\" (UID: \"13941107-91c6-410e-a282-6657d7c5de03\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t" Apr 16 14:52:47.394380 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.394361 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wms4n\" (UniqueName: \"kubernetes.io/projected/deecc941-e868-4306-99e5-4f30afef0f95-kube-api-access-wms4n\") pod \"network-metrics-daemon-9p5t7\" (UID: \"deecc941-e868-4306-99e5-4f30afef0f95\") " pod="openshift-multus/network-metrics-daemon-9p5t7" Apr 16 14:52:47.394426 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.394417 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqjkz\" (UniqueName: \"kubernetes.io/projected/85fdf4e1-8738-483f-a40e-a9112c7098d5-kube-api-access-lqjkz\") pod \"ovnkube-node-ddt96\" (UID: \"85fdf4e1-8738-483f-a40e-a9112c7098d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.395213 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.395198 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khnql\" (UniqueName: \"kubernetes.io/projected/616aaf5a-f208-4fe5-97a1-96f1815fe9ac-kube-api-access-khnql\") pod \"iptables-alerter-v7nk5\" (UID: 
\"616aaf5a-f208-4fe5-97a1-96f1815fe9ac\") " pod="openshift-network-operator/iptables-alerter-v7nk5" Apr 16 14:52:47.497097 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.497057 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tb9c9" Apr 16 14:52:47.503175 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:47.503137 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6e086d8_b850_425d_9896_6df3cec2442b.slice/crio-5976777e82717a1a6d2e05596a4276ae1557e28de70502062a37aa4f7097d4a8 WatchSource:0}: Error finding container 5976777e82717a1a6d2e05596a4276ae1557e28de70502062a37aa4f7097d4a8: Status 404 returned error can't find the container with id 5976777e82717a1a6d2e05596a4276ae1557e28de70502062a37aa4f7097d4a8 Apr 16 14:52:47.507140 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.507120 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-btrdx" Apr 16 14:52:47.513229 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:47.513204 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod281d16c8_10bf_4c91_91f2_472d3584db2f.slice/crio-96bbb8ef9daeb9263772613ab082f2a69b617b06190130fcb94931cfcd5daa23 WatchSource:0}: Error finding container 96bbb8ef9daeb9263772613ab082f2a69b617b06190130fcb94931cfcd5daa23: Status 404 returned error can't find the container with id 96bbb8ef9daeb9263772613ab082f2a69b617b06190130fcb94931cfcd5daa23 Apr 16 14:52:47.524114 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.524090 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-mjsr6" Apr 16 14:52:47.531061 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:47.531036 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69a10374_32da_4de3_b491_3854f69f1613.slice/crio-c5a4ab756b86aee255ffb2078ce41a73eea4a4626f5f8fd4de71b8099d3f253e WatchSource:0}: Error finding container c5a4ab756b86aee255ffb2078ce41a73eea4a4626f5f8fd4de71b8099d3f253e: Status 404 returned error can't find the container with id c5a4ab756b86aee255ffb2078ce41a73eea4a4626f5f8fd4de71b8099d3f253e Apr 16 14:52:47.540495 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.540478 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rxvdm" Apr 16 14:52:47.546013 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.545990 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8kcqd" Apr 16 14:52:47.546268 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:47.546247 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab74fce9_eb83_4941_97e9_42f6ed125bf5.slice/crio-49b5545d7d80a3819ff30ac02cef3a6cb0d8287a3819888fad5b662e1c1a6400 WatchSource:0}: Error finding container 49b5545d7d80a3819ff30ac02cef3a6cb0d8287a3819888fad5b662e1c1a6400: Status 404 returned error can't find the container with id 49b5545d7d80a3819ff30ac02cef3a6cb0d8287a3819888fad5b662e1c1a6400 Apr 16 14:52:47.552783 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:47.552746 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31294a51_df01_4523_afff_845ceb6be0cc.slice/crio-7685da7fab159dfaeb8bd777c7279f89f2cfec9a96ab41426feccda436b56c40 WatchSource:0}: Error finding container 
7685da7fab159dfaeb8bd777c7279f89f2cfec9a96ab41426feccda436b56c40: Status 404 returned error can't find the container with id 7685da7fab159dfaeb8bd777c7279f89f2cfec9a96ab41426feccda436b56c40 Apr 16 14:52:47.558400 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.558382 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cdg2t" Apr 16 14:52:47.565381 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:47.565358 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda36d1747_2a52_4941_aa0e_8d1fe90b9b00.slice/crio-9870891a704039b3199f3a9bce5481c1d4d2e5daf571563fb362e7765c3bda7d WatchSource:0}: Error finding container 9870891a704039b3199f3a9bce5481c1d4d2e5daf571563fb362e7765c3bda7d: Status 404 returned error can't find the container with id 9870891a704039b3199f3a9bce5481c1d4d2e5daf571563fb362e7765c3bda7d Apr 16 14:52:47.572419 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.572401 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t" Apr 16 14:52:47.578056 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:47.578016 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13941107_91c6_410e_a282_6657d7c5de03.slice/crio-6b407f3ebe80beef9d6b10b6f21d23055bda9bae56cbb5bdc952e4fc62154e0a WatchSource:0}: Error finding container 6b407f3ebe80beef9d6b10b6f21d23055bda9bae56cbb5bdc952e4fc62154e0a: Status 404 returned error can't find the container with id 6b407f3ebe80beef9d6b10b6f21d23055bda9bae56cbb5bdc952e4fc62154e0a Apr 16 14:52:47.586307 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.586289 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-v7nk5" Apr 16 14:52:47.592335 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:47.592309 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod616aaf5a_f208_4fe5_97a1_96f1815fe9ac.slice/crio-94ce3b3566317dfcfe7f057fa30c7189e0125b138118387d6ebc733a4ccfdaad WatchSource:0}: Error finding container 94ce3b3566317dfcfe7f057fa30c7189e0125b138118387d6ebc733a4ccfdaad: Status 404 returned error can't find the container with id 94ce3b3566317dfcfe7f057fa30c7189e0125b138118387d6ebc733a4ccfdaad Apr 16 14:52:47.596566 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.596547 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:52:47.602362 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:52:47.602339 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85fdf4e1_8738_483f_a40e_a9112c7098d5.slice/crio-ea943684593500df914abb55778790b878f48640ded49179faa5e495c8f8726e WatchSource:0}: Error finding container ea943684593500df914abb55778790b878f48640ded49179faa5e495c8f8726e: Status 404 returned error can't find the container with id ea943684593500df914abb55778790b878f48640ded49179faa5e495c8f8726e Apr 16 14:52:47.714798 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.714772 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:47.889358 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.889325 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs\") pod \"network-metrics-daemon-9p5t7\" (UID: \"deecc941-e868-4306-99e5-4f30afef0f95\") " 
pod="openshift-multus/network-metrics-daemon-9p5t7" Apr 16 14:52:47.889520 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:47.889471 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:47.889577 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:47.889535 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs podName:deecc941-e868-4306-99e5-4f30afef0f95 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:48.889516618 +0000 UTC m=+3.308651571 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs") pod "network-metrics-daemon-9p5t7" (UID: "deecc941-e868-4306-99e5-4f30afef0f95") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:47.990415 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:47.990332 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t45ww\" (UniqueName: \"kubernetes.io/projected/d6e0e8e5-d659-4175-b96f-52c250d77fd0-kube-api-access-t45ww\") pod \"network-check-target-6nklq\" (UID: \"d6e0e8e5-d659-4175-b96f-52c250d77fd0\") " pod="openshift-network-diagnostics/network-check-target-6nklq" Apr 16 14:52:47.990576 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:47.990503 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:47.990576 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:47.990524 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:47.990576 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:47.990537 2576 
projected.go:194] Error preparing data for projected volume kube-api-access-t45ww for pod openshift-network-diagnostics/network-check-target-6nklq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:47.990741 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:47.990594 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6e0e8e5-d659-4175-b96f-52c250d77fd0-kube-api-access-t45ww podName:d6e0e8e5-d659-4175-b96f-52c250d77fd0 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:48.990574713 +0000 UTC m=+3.409709684 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-t45ww" (UniqueName: "kubernetes.io/projected/d6e0e8e5-d659-4175-b96f-52c250d77fd0-kube-api-access-t45ww") pod "network-check-target-6nklq" (UID: "d6e0e8e5-d659-4175-b96f-52c250d77fd0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:48.178443 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:48.175394 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:48.212056 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:48.211994 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:47:47 +0000 UTC" deadline="2027-10-17 07:20:28.624089437 +0000 UTC" Apr 16 14:52:48.212056 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:48.212053 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13168h27m40.412040316s" Apr 16 14:52:48.302764 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:48.302691 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6nklq" Apr 16 14:52:48.302910 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:48.302814 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6nklq" podUID="d6e0e8e5-d659-4175-b96f-52c250d77fd0" Apr 16 14:52:48.321869 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:48.321835 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rxvdm" event={"ID":"ab74fce9-eb83-4941-97e9-42f6ed125bf5","Type":"ContainerStarted","Data":"49b5545d7d80a3819ff30ac02cef3a6cb0d8287a3819888fad5b662e1c1a6400"} Apr 16 14:52:48.339625 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:48.339562 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mjsr6" event={"ID":"69a10374-32da-4de3-b491-3854f69f1613","Type":"ContainerStarted","Data":"c5a4ab756b86aee255ffb2078ce41a73eea4a4626f5f8fd4de71b8099d3f253e"} Apr 16 14:52:48.342986 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:48.342899 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tb9c9" event={"ID":"a6e086d8-b850-425d-9896-6df3cec2442b","Type":"ContainerStarted","Data":"5976777e82717a1a6d2e05596a4276ae1557e28de70502062a37aa4f7097d4a8"} Apr 16 14:52:48.363279 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:48.363119 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-v7nk5" event={"ID":"616aaf5a-f208-4fe5-97a1-96f1815fe9ac","Type":"ContainerStarted","Data":"94ce3b3566317dfcfe7f057fa30c7189e0125b138118387d6ebc733a4ccfdaad"} Apr 16 14:52:48.383103 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:48.383071 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t" event={"ID":"13941107-91c6-410e-a282-6657d7c5de03","Type":"ContainerStarted","Data":"6b407f3ebe80beef9d6b10b6f21d23055bda9bae56cbb5bdc952e4fc62154e0a"}
Apr 16 14:52:48.408382 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:48.408334 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8kcqd" event={"ID":"31294a51-df01-4523-afff-845ceb6be0cc","Type":"ContainerStarted","Data":"7685da7fab159dfaeb8bd777c7279f89f2cfec9a96ab41426feccda436b56c40"}
Apr 16 14:52:48.414659 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:48.414628 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-btrdx" event={"ID":"281d16c8-10bf-4c91-91f2-472d3584db2f","Type":"ContainerStarted","Data":"96bbb8ef9daeb9263772613ab082f2a69b617b06190130fcb94931cfcd5daa23"}
Apr 16 14:52:48.436991 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:48.436939 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" event={"ID":"85fdf4e1-8738-483f-a40e-a9112c7098d5","Type":"ContainerStarted","Data":"ea943684593500df914abb55778790b878f48640ded49179faa5e495c8f8726e"}
Apr 16 14:52:48.455340 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:48.455286 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cdg2t" event={"ID":"a36d1747-2a52-4941-aa0e-8d1fe90b9b00","Type":"ContainerStarted","Data":"9870891a704039b3199f3a9bce5481c1d4d2e5daf571563fb362e7765c3bda7d"}
Apr 16 14:52:48.471418 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:48.471205 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 14:52:48.899305 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:48.899120 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs\") pod \"network-metrics-daemon-9p5t7\" (UID: \"deecc941-e868-4306-99e5-4f30afef0f95\") " pod="openshift-multus/network-metrics-daemon-9p5t7"
Apr 16 14:52:48.899479 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:48.899314 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:48.899479 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:48.899388 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs podName:deecc941-e868-4306-99e5-4f30afef0f95 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:50.899368673 +0000 UTC m=+5.318503650 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs") pod "network-metrics-daemon-9p5t7" (UID: "deecc941-e868-4306-99e5-4f30afef0f95") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:49.000196 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:49.000157 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t45ww\" (UniqueName: \"kubernetes.io/projected/d6e0e8e5-d659-4175-b96f-52c250d77fd0-kube-api-access-t45ww\") pod \"network-check-target-6nklq\" (UID: \"d6e0e8e5-d659-4175-b96f-52c250d77fd0\") " pod="openshift-network-diagnostics/network-check-target-6nklq"
Apr 16 14:52:49.000365 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:49.000317 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:52:49.000365 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:49.000334 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:52:49.000365 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:49.000347 2576 projected.go:194] Error preparing data for projected volume kube-api-access-t45ww for pod openshift-network-diagnostics/network-check-target-6nklq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:49.000524 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:49.000411 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6e0e8e5-d659-4175-b96f-52c250d77fd0-kube-api-access-t45ww podName:d6e0e8e5-d659-4175-b96f-52c250d77fd0 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:51.00039167 +0000 UTC m=+5.419526640 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-t45ww" (UniqueName: "kubernetes.io/projected/d6e0e8e5-d659-4175-b96f-52c250d77fd0-kube-api-access-t45ww") pod "network-check-target-6nklq" (UID: "d6e0e8e5-d659-4175-b96f-52c250d77fd0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:49.212753 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:49.212633 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:47:47 +0000 UTC" deadline="2027-11-16 20:13:34.143972532 +0000 UTC"
Apr 16 14:52:49.212753 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:49.212672 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13901h20m44.931304716s"
Apr 16 14:52:49.302752 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:49.302720 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p5t7"
Apr 16 14:52:49.302926 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:49.302863 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p5t7" podUID="deecc941-e868-4306-99e5-4f30afef0f95"
Apr 16 14:52:50.047914 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:50.047881 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 14:52:50.305451 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:50.304874 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6nklq"
Apr 16 14:52:50.305451 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:50.305007 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6nklq" podUID="d6e0e8e5-d659-4175-b96f-52c250d77fd0"
Apr 16 14:52:50.916984 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:50.916419 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs\") pod \"network-metrics-daemon-9p5t7\" (UID: \"deecc941-e868-4306-99e5-4f30afef0f95\") " pod="openshift-multus/network-metrics-daemon-9p5t7"
Apr 16 14:52:50.916984 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:50.916580 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:50.916984 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:50.916645 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs podName:deecc941-e868-4306-99e5-4f30afef0f95 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:54.916625723 +0000 UTC m=+9.335760691 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs") pod "network-metrics-daemon-9p5t7" (UID: "deecc941-e868-4306-99e5-4f30afef0f95") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:51.017333 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:51.017247 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t45ww\" (UniqueName: \"kubernetes.io/projected/d6e0e8e5-d659-4175-b96f-52c250d77fd0-kube-api-access-t45ww\") pod \"network-check-target-6nklq\" (UID: \"d6e0e8e5-d659-4175-b96f-52c250d77fd0\") " pod="openshift-network-diagnostics/network-check-target-6nklq"
Apr 16 14:52:51.017510 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:51.017432 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:52:51.017510 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:51.017460 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:52:51.017510 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:51.017473 2576 projected.go:194] Error preparing data for projected volume kube-api-access-t45ww for pod openshift-network-diagnostics/network-check-target-6nklq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:51.017674 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:51.017541 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6e0e8e5-d659-4175-b96f-52c250d77fd0-kube-api-access-t45ww podName:d6e0e8e5-d659-4175-b96f-52c250d77fd0 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:55.017520903 +0000 UTC m=+9.436655867 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-t45ww" (UniqueName: "kubernetes.io/projected/d6e0e8e5-d659-4175-b96f-52c250d77fd0-kube-api-access-t45ww") pod "network-check-target-6nklq" (UID: "d6e0e8e5-d659-4175-b96f-52c250d77fd0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:51.303293 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:51.303212 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p5t7"
Apr 16 14:52:51.303446 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:51.303369 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p5t7" podUID="deecc941-e868-4306-99e5-4f30afef0f95"
Apr 16 14:52:52.303095 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:52.303057 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6nklq"
Apr 16 14:52:52.303511 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:52.303185 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6nklq" podUID="d6e0e8e5-d659-4175-b96f-52c250d77fd0"
Apr 16 14:52:53.303211 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:53.303176 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p5t7"
Apr 16 14:52:53.303617 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:53.303324 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p5t7" podUID="deecc941-e868-4306-99e5-4f30afef0f95"
Apr 16 14:52:54.303335 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:54.303057 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6nklq"
Apr 16 14:52:54.303335 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:54.303183 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6nklq" podUID="d6e0e8e5-d659-4175-b96f-52c250d77fd0"
Apr 16 14:52:54.949377 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:54.949341 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs\") pod \"network-metrics-daemon-9p5t7\" (UID: \"deecc941-e868-4306-99e5-4f30afef0f95\") " pod="openshift-multus/network-metrics-daemon-9p5t7"
Apr 16 14:52:54.949561 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:54.949462 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:54.949561 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:54.949528 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs podName:deecc941-e868-4306-99e5-4f30afef0f95 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:02.94951111 +0000 UTC m=+17.368646066 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs") pod "network-metrics-daemon-9p5t7" (UID: "deecc941-e868-4306-99e5-4f30afef0f95") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:55.049745 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:55.049704 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t45ww\" (UniqueName: \"kubernetes.io/projected/d6e0e8e5-d659-4175-b96f-52c250d77fd0-kube-api-access-t45ww\") pod \"network-check-target-6nklq\" (UID: \"d6e0e8e5-d659-4175-b96f-52c250d77fd0\") " pod="openshift-network-diagnostics/network-check-target-6nklq"
Apr 16 14:52:55.049911 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:55.049889 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:52:55.049911 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:55.049908 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:52:55.050086 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:55.049922 2576 projected.go:194] Error preparing data for projected volume kube-api-access-t45ww for pod openshift-network-diagnostics/network-check-target-6nklq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:55.050086 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:55.049994 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6e0e8e5-d659-4175-b96f-52c250d77fd0-kube-api-access-t45ww podName:d6e0e8e5-d659-4175-b96f-52c250d77fd0 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:03.049979358 +0000 UTC m=+17.469114315 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-t45ww" (UniqueName: "kubernetes.io/projected/d6e0e8e5-d659-4175-b96f-52c250d77fd0-kube-api-access-t45ww") pod "network-check-target-6nklq" (UID: "d6e0e8e5-d659-4175-b96f-52c250d77fd0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:55.303003 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:55.302440 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p5t7"
Apr 16 14:52:55.303003 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:55.302595 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p5t7" podUID="deecc941-e868-4306-99e5-4f30afef0f95"
Apr 16 14:52:56.303392 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:56.303357 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6nklq"
Apr 16 14:52:56.303835 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:56.303470 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6nklq" podUID="d6e0e8e5-d659-4175-b96f-52c250d77fd0"
Apr 16 14:52:57.303055 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:57.303007 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p5t7"
Apr 16 14:52:57.303247 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:57.303135 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p5t7" podUID="deecc941-e868-4306-99e5-4f30afef0f95"
Apr 16 14:52:58.303401 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:58.303361 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6nklq"
Apr 16 14:52:58.303888 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:58.303497 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6nklq" podUID="d6e0e8e5-d659-4175-b96f-52c250d77fd0"
Apr 16 14:52:59.303051 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:52:59.303010 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p5t7"
Apr 16 14:52:59.303253 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:52:59.303126 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p5t7" podUID="deecc941-e868-4306-99e5-4f30afef0f95"
Apr 16 14:53:00.302361 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:00.302326 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6nklq"
Apr 16 14:53:00.302849 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:00.302481 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6nklq" podUID="d6e0e8e5-d659-4175-b96f-52c250d77fd0"
Apr 16 14:53:01.303074 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:01.303037 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p5t7"
Apr 16 14:53:01.303534 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:01.303149 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p5t7" podUID="deecc941-e868-4306-99e5-4f30afef0f95"
Apr 16 14:53:02.303167 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:02.303127 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6nklq"
Apr 16 14:53:02.303603 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:02.303265 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6nklq" podUID="d6e0e8e5-d659-4175-b96f-52c250d77fd0"
Apr 16 14:53:03.007066 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:03.006938 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs\") pod \"network-metrics-daemon-9p5t7\" (UID: \"deecc941-e868-4306-99e5-4f30afef0f95\") " pod="openshift-multus/network-metrics-daemon-9p5t7"
Apr 16 14:53:03.007306 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:03.007094 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:53:03.007306 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:03.007160 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs podName:deecc941-e868-4306-99e5-4f30afef0f95 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:19.0071438 +0000 UTC m=+33.426278776 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs") pod "network-metrics-daemon-9p5t7" (UID: "deecc941-e868-4306-99e5-4f30afef0f95") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:53:03.108255 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:03.108209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t45ww\" (UniqueName: \"kubernetes.io/projected/d6e0e8e5-d659-4175-b96f-52c250d77fd0-kube-api-access-t45ww\") pod \"network-check-target-6nklq\" (UID: \"d6e0e8e5-d659-4175-b96f-52c250d77fd0\") " pod="openshift-network-diagnostics/network-check-target-6nklq"
Apr 16 14:53:03.108416 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:03.108375 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:53:03.108416 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:03.108393 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:53:03.108416 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:03.108405 2576 projected.go:194] Error preparing data for projected volume kube-api-access-t45ww for pod openshift-network-diagnostics/network-check-target-6nklq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:53:03.108531 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:03.108461 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6e0e8e5-d659-4175-b96f-52c250d77fd0-kube-api-access-t45ww podName:d6e0e8e5-d659-4175-b96f-52c250d77fd0 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:19.108446939 +0000 UTC m=+33.527581904 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-t45ww" (UniqueName: "kubernetes.io/projected/d6e0e8e5-d659-4175-b96f-52c250d77fd0-kube-api-access-t45ww") pod "network-check-target-6nklq" (UID: "d6e0e8e5-d659-4175-b96f-52c250d77fd0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:53:03.302554 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:03.302465 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p5t7"
Apr 16 14:53:03.302720 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:03.302583 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p5t7" podUID="deecc941-e868-4306-99e5-4f30afef0f95"
Apr 16 14:53:04.303281 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:04.303243 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6nklq"
Apr 16 14:53:04.303713 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:04.303379 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6nklq" podUID="d6e0e8e5-d659-4175-b96f-52c250d77fd0"
Apr 16 14:53:05.302594 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:05.302576 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p5t7"
Apr 16 14:53:05.302690 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:05.302674 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p5t7" podUID="deecc941-e868-4306-99e5-4f30afef0f95"
Apr 16 14:53:06.303758 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:06.303588 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6nklq"
Apr 16 14:53:06.304614 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:06.303816 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6nklq" podUID="d6e0e8e5-d659-4175-b96f-52c250d77fd0"
Apr 16 14:53:06.510697 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:06.510658 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-76.ec2.internal" event={"ID":"fa97968d5b4634bd4f9419795593b093","Type":"ContainerStarted","Data":"fa4fcdae8b40be29ef85e90559d0a9104dade0fd7bb08222ba1f95e0d954a730"}
Apr 16 14:53:06.514844 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:06.514805 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log"
Apr 16 14:53:06.515504 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:06.515474 2576 generic.go:358] "Generic (PLEG): container finished" podID="85fdf4e1-8738-483f-a40e-a9112c7098d5" containerID="b9c9e0efd45b8f3e8708ad30b613219283ff62d73b26b23d514ad0f19415d1af" exitCode=1
Apr 16 14:53:06.515603 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:06.515555 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" event={"ID":"85fdf4e1-8738-483f-a40e-a9112c7098d5","Type":"ContainerStarted","Data":"2487daa027bcd2f70f22cbe1d76f45f4c2c51a1f61f17c07cce0df378d035a90"}
Apr 16 14:53:06.515603 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:06.515579 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" event={"ID":"85fdf4e1-8738-483f-a40e-a9112c7098d5","Type":"ContainerStarted","Data":"9084ecc291842e998eecff32822a14c95e8544b3afc161b89db88374fb265a61"}
Apr 16 14:53:06.515603 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:06.515593 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" event={"ID":"85fdf4e1-8738-483f-a40e-a9112c7098d5","Type":"ContainerStarted","Data":"34757c6a5092be567419245d336765d04ebfef8d7b07be863f2a211f8e95fc40"}
Apr 16 14:53:06.515760 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:06.515606 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" event={"ID":"85fdf4e1-8738-483f-a40e-a9112c7098d5","Type":"ContainerStarted","Data":"eafd108d0e3140e55001bf6be9ee41fd545481a645339273bfb5441260595a27"}
Apr 16 14:53:06.515760 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:06.515619 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" event={"ID":"85fdf4e1-8738-483f-a40e-a9112c7098d5","Type":"ContainerDied","Data":"b9c9e0efd45b8f3e8708ad30b613219283ff62d73b26b23d514ad0f19415d1af"}
Apr 16 14:53:06.515760 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:06.515633 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" event={"ID":"85fdf4e1-8738-483f-a40e-a9112c7098d5","Type":"ContainerStarted","Data":"3e7f34f9f496c10d38212f4282a64650a9fc3c72b81df72b006592e2e33ae531"}
Apr 16 14:53:06.517154 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:06.517134 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cdg2t" event={"ID":"a36d1747-2a52-4941-aa0e-8d1fe90b9b00","Type":"ContainerStarted","Data":"73dad1e6234f012860b89c74506287a225c17150814fc2f93a2df017cb7613a9"}
Apr 16 14:53:06.518338 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:06.518319 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-btrdx" event={"ID":"281d16c8-10bf-4c91-91f2-472d3584db2f","Type":"ContainerStarted","Data":"e363b0b084bd8971997110b5756f1b78572f96cb726f328a37a46cd847fd695d"}
Apr 16 14:53:06.525077 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:06.525037 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-76.ec2.internal" podStartSLOduration=20.525006868 podStartE2EDuration="20.525006868s" podCreationTimestamp="2026-04-16 14:52:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:53:06.524628625 +0000 UTC m=+20.943763600" watchObservedRunningTime="2026-04-16 14:53:06.525006868 +0000 UTC m=+20.944141844"
Apr 16 14:53:06.555318 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:06.555271 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-btrdx" podStartSLOduration=2.514817647 podStartE2EDuration="20.555254626s" podCreationTimestamp="2026-04-16 14:52:46 +0000 UTC" firstStartedPulling="2026-04-16 14:52:47.514505286 +0000 UTC m=+1.933640243" lastFinishedPulling="2026-04-16 14:53:05.554942267 +0000 UTC m=+19.974077222" observedRunningTime="2026-04-16 14:53:06.555089715 +0000 UTC m=+20.974224709" watchObservedRunningTime="2026-04-16 14:53:06.555254626 +0000 UTC m=+20.974389604"
Apr 16 14:53:06.555612 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:06.555584 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-cdg2t" podStartSLOduration=2.534287739 podStartE2EDuration="20.555576295s" podCreationTimestamp="2026-04-16 14:52:46 +0000 UTC" firstStartedPulling="2026-04-16 14:52:47.567233499 +0000 UTC m=+1.986368451" lastFinishedPulling="2026-04-16 14:53:05.588522051 +0000 UTC m=+20.007657007" observedRunningTime="2026-04-16 14:53:06.540341641 +0000 UTC m=+20.959476635" watchObservedRunningTime="2026-04-16 14:53:06.555576295 +0000 UTC m=+20.974711337"
Apr 16 14:53:07.302716 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:07.302674 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p5t7"
Apr 16 14:53:07.302894 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:07.302803 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p5t7" podUID="deecc941-e868-4306-99e5-4f30afef0f95"
Apr 16 14:53:07.521119 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:07.521079 2576 generic.go:358] "Generic (PLEG): container finished" podID="eeeae1dda6131e6cc2d2b873cb53b9f0" containerID="c5ef11700734ffe90afba461d9e5f29c992821527eb5ee527cca5c873ff585a2" exitCode=0
Apr 16 14:53:07.521568 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:07.521158 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-76.ec2.internal" event={"ID":"eeeae1dda6131e6cc2d2b873cb53b9f0","Type":"ContainerDied","Data":"c5ef11700734ffe90afba461d9e5f29c992821527eb5ee527cca5c873ff585a2"}
Apr 16 14:53:07.522525 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:07.522502 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rxvdm" event={"ID":"ab74fce9-eb83-4941-97e9-42f6ed125bf5","Type":"ContainerStarted","Data":"455439a24856b988cfc8b7b118b0162d4f68c5ad3a1c3c9aaaed15048b804396"}
Apr 16 14:53:07.523596 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:07.523575 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mjsr6" event={"ID":"69a10374-32da-4de3-b491-3854f69f1613","Type":"ContainerStarted","Data":"5f5584bb80cf04fcf4778d08b3f2b1957c51d3205d77912fce6a439f0ad9173a"}
Apr 16 14:53:07.526881 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:07.526861 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tb9c9" event={"ID":"a6e086d8-b850-425d-9896-6df3cec2442b","Type":"ContainerStarted","Data":"cfb5e1f20dadb89bf0a650ebeec68555ff33639d971d0753f5be713f118a04dc"}
Apr 16 14:53:07.528067 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:07.528044 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-v7nk5" event={"ID":"616aaf5a-f208-4fe5-97a1-96f1815fe9ac","Type":"ContainerStarted","Data":"90f1df697f9804e9e0050c046738a1fc51e71d8e2906df80712e2cdf4aa7538d"}
Apr 16 14:53:07.529248 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:07.529224 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t" event={"ID":"13941107-91c6-410e-a282-6657d7c5de03","Type":"ContainerStarted","Data":"666b8c56846b63d671f8b2bb4a24e7a0c6f9ca298bca07e50eca14356366ee38"}
Apr 16 14:53:07.530312 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:07.530291 2576 generic.go:358] "Generic (PLEG): container finished" podID="31294a51-df01-4523-afff-845ceb6be0cc" containerID="f07beb117c323436a25743baf804441308d29570404ce27cf5dd403fdaed6ecd" exitCode=0
Apr 16 14:53:07.530428 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:07.530381 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8kcqd" event={"ID":"31294a51-df01-4523-afff-845ceb6be0cc","Type":"ContainerDied","Data":"f07beb117c323436a25743baf804441308d29570404ce27cf5dd403fdaed6ecd"}
Apr 16 14:53:07.549870 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:07.549826 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-tb9c9" podStartSLOduration=3.559999501 podStartE2EDuration="21.549810155s" podCreationTimestamp="2026-04-16 14:52:46 +0000 UTC" firstStartedPulling="2026-04-16 14:52:47.504744473 +0000 UTC m=+1.923879426" lastFinishedPulling="2026-04-16 14:53:05.494555121 +0000 UTC m=+19.913690080" observedRunningTime="2026-04-16 14:53:07.549657659 +0000 UTC m=+21.968792643" watchObservedRunningTime="2026-04-16 14:53:07.549810155 +0000 UTC m=+21.968945128"
Apr 16 14:53:07.613867 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:07.613756 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-v7nk5" podStartSLOduration=3.712897639 podStartE2EDuration="21.613734237s" podCreationTimestamp="2026-04-16 14:52:46 +0000 UTC" firstStartedPulling="2026-04-16 14:52:47.593713723 +0000 UTC m=+2.012848680" lastFinishedPulling="2026-04-16 14:53:05.494550315 +0000 UTC m=+19.913685278" observedRunningTime="2026-04-16 14:53:07.613718615 +0000 UTC m=+22.032853592" watchObservedRunningTime="2026-04-16 14:53:07.613734237 +0000 UTC m=+22.032869213"
Apr 16 14:53:07.649541 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:07.649487 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-mjsr6" podStartSLOduration=3.6268843029999998 podStartE2EDuration="21.649469635s" podCreationTimestamp="2026-04-16 14:52:46 +0000 UTC" firstStartedPulling="2026-04-16 14:52:47.532475908 +0000 UTC m=+1.951610861" lastFinishedPulling="2026-04-16 14:53:05.555061225 +0000 UTC m=+19.974196193" observedRunningTime="2026-04-16 14:53:07.634601228 +0000 UTC m=+22.053736207" watchObservedRunningTime="2026-04-16 14:53:07.649469635 +0000 UTC m=+22.068604611"
Apr 16 14:53:07.649668 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:07.649591 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rxvdm" podStartSLOduration=3.703045189 podStartE2EDuration="21.649584963s" podCreationTimestamp="2026-04-16 14:52:46 +0000 UTC" firstStartedPulling="2026-04-16 14:52:47.548049941 +0000 UTC m=+1.967184901" lastFinishedPulling="2026-04-16 14:53:05.494589719 +0000 UTC m=+19.913724675" observedRunningTime="2026-04-16 14:53:07.649372717 +0000 UTC m=+22.068507702"
watchObservedRunningTime="2026-04-16 14:53:07.649584963 +0000 UTC m=+22.068719938" Apr 16 14:53:07.720788 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:07.720763 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 14:53:08.244231 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:08.244113 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T14:53:07.720784407Z","UUID":"075a5228-4921-4d21-8878-8c2c845ad224","Handler":null,"Name":"","Endpoint":""} Apr 16 14:53:08.246193 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:08.246094 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 14:53:08.246193 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:08.246129 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 14:53:08.302720 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:08.302694 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6nklq" Apr 16 14:53:08.302856 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:08.302822 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6nklq" podUID="d6e0e8e5-d659-4175-b96f-52c250d77fd0" Apr 16 14:53:08.533830 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:08.533750 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t" event={"ID":"13941107-91c6-410e-a282-6657d7c5de03","Type":"ContainerStarted","Data":"182f7e7954c144363b81cd1da59dae3a9460372177221e974d1ed4ddeb22c42c"} Apr 16 14:53:08.535602 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:08.535570 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-76.ec2.internal" event={"ID":"eeeae1dda6131e6cc2d2b873cb53b9f0","Type":"ContainerStarted","Data":"b7df0557a09f5f91c278f853aa9c50eaca9cf68f8d556c745ee1ea0e3e31cde3"} Apr 16 14:53:08.538579 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:08.538560 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log" Apr 16 14:53:08.538993 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:08.538956 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" event={"ID":"85fdf4e1-8738-483f-a40e-a9112c7098d5","Type":"ContainerStarted","Data":"251f64d5452d9e08dda20935a6beca97fee99604471fbca9d462ba22f32c3520"} Apr 16 14:53:08.549990 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:08.549948 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-76.ec2.internal" podStartSLOduration=22.549936669 podStartE2EDuration="22.549936669s" podCreationTimestamp="2026-04-16 14:52:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:53:08.54959385 +0000 UTC m=+22.968728825" 
watchObservedRunningTime="2026-04-16 14:53:08.549936669 +0000 UTC m=+22.969071643" Apr 16 14:53:09.303303 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:09.303268 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p5t7" Apr 16 14:53:09.303563 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:09.303525 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p5t7" podUID="deecc941-e868-4306-99e5-4f30afef0f95" Apr 16 14:53:09.544082 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:09.544040 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t" event={"ID":"13941107-91c6-410e-a282-6657d7c5de03","Type":"ContainerStarted","Data":"b559fa77243922ddec2d3b30a62bb86ae92fa3533a5efad952dde191c1ab2bcd"} Apr 16 14:53:09.562649 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:09.562542 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zpz4t" podStartSLOduration=2.556867166 podStartE2EDuration="23.562521616s" podCreationTimestamp="2026-04-16 14:52:46 +0000 UTC" firstStartedPulling="2026-04-16 14:52:47.580434514 +0000 UTC m=+1.999569483" lastFinishedPulling="2026-04-16 14:53:08.586088978 +0000 UTC m=+23.005223933" observedRunningTime="2026-04-16 14:53:09.56194832 +0000 UTC m=+23.981083324" watchObservedRunningTime="2026-04-16 14:53:09.562521616 +0000 UTC m=+23.981656593" Apr 16 14:53:10.303064 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:10.303010 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6nklq" Apr 16 14:53:10.303229 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:10.303142 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6nklq" podUID="d6e0e8e5-d659-4175-b96f-52c250d77fd0" Apr 16 14:53:11.303133 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:11.303091 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p5t7" Apr 16 14:53:11.303532 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:11.303225 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p5t7" podUID="deecc941-e868-4306-99e5-4f30afef0f95" Apr 16 14:53:11.841276 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:11.841037 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-tb9c9" Apr 16 14:53:11.841732 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:11.841715 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-tb9c9" Apr 16 14:53:12.303016 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:12.302988 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6nklq" Apr 16 14:53:12.303222 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:12.303099 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6nklq" podUID="d6e0e8e5-d659-4175-b96f-52c250d77fd0" Apr 16 14:53:12.551305 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:12.551270 2576 generic.go:358] "Generic (PLEG): container finished" podID="31294a51-df01-4523-afff-845ceb6be0cc" containerID="dccbc1bf6c97bb04837edd596411ddcc8e44e13493ce22f87ed630bceba592db" exitCode=0 Apr 16 14:53:12.551455 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:12.551352 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8kcqd" event={"ID":"31294a51-df01-4523-afff-845ceb6be0cc","Type":"ContainerDied","Data":"dccbc1bf6c97bb04837edd596411ddcc8e44e13493ce22f87ed630bceba592db"} Apr 16 14:53:12.554374 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:12.554358 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log" Apr 16 14:53:12.554769 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:12.554749 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" event={"ID":"85fdf4e1-8738-483f-a40e-a9112c7098d5","Type":"ContainerStarted","Data":"ff4422b5b52573572bf56c72a7249f29b3cb32d9e42fdaf35fc4b7661ec2a0ef"} Apr 16 14:53:12.554982 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:12.554966 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 
14:53:12.554982 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:12.554989 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-tb9c9" Apr 16 14:53:12.555172 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:12.555157 2576 scope.go:117] "RemoveContainer" containerID="b9c9e0efd45b8f3e8708ad30b613219283ff62d73b26b23d514ad0f19415d1af" Apr 16 14:53:12.555490 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:12.555473 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-tb9c9" Apr 16 14:53:12.573377 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:12.573357 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:53:13.303183 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:13.302932 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p5t7" Apr 16 14:53:13.303359 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:13.303280 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9p5t7" podUID="deecc941-e868-4306-99e5-4f30afef0f95" Apr 16 14:53:13.559106 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:13.558951 2576 generic.go:358] "Generic (PLEG): container finished" podID="31294a51-df01-4523-afff-845ceb6be0cc" containerID="d90deecd46f848e17b7b31809c81f349c235e7c0b776c92cd217486b97f503d0" exitCode=0 Apr 16 14:53:13.559106 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:13.559045 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8kcqd" event={"ID":"31294a51-df01-4523-afff-845ceb6be0cc","Type":"ContainerDied","Data":"d90deecd46f848e17b7b31809c81f349c235e7c0b776c92cd217486b97f503d0"} Apr 16 14:53:13.560653 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:13.560140 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9p5t7"] Apr 16 14:53:13.562898 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:13.562879 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log" Apr 16 14:53:13.563231 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:13.563208 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" event={"ID":"85fdf4e1-8738-483f-a40e-a9112c7098d5","Type":"ContainerStarted","Data":"e5170f0744b39542d15a331f75fd22b112f82edaa200bc99de8a6480b422d38a"} Apr 16 14:53:13.563399 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:13.563385 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p5t7" Apr 16 14:53:13.563510 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:13.563493 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p5t7" podUID="deecc941-e868-4306-99e5-4f30afef0f95" Apr 16 14:53:13.563604 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:13.563590 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 14:53:13.563817 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:13.563793 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:53:13.571833 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:13.571810 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6nklq"] Apr 16 14:53:13.571956 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:13.571913 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6nklq" Apr 16 14:53:13.572043 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:13.572003 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6nklq" podUID="d6e0e8e5-d659-4175-b96f-52c250d77fd0" Apr 16 14:53:13.580263 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:13.580239 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:53:13.604286 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:13.604230 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" podStartSLOduration=9.340277364 podStartE2EDuration="27.604213217s" podCreationTimestamp="2026-04-16 14:52:46 +0000 UTC" firstStartedPulling="2026-04-16 14:52:47.603893699 +0000 UTC m=+2.023028653" lastFinishedPulling="2026-04-16 14:53:05.867829545 +0000 UTC m=+20.286964506" observedRunningTime="2026-04-16 14:53:13.603846842 +0000 UTC m=+28.022981818" watchObservedRunningTime="2026-04-16 14:53:13.604213217 +0000 UTC m=+28.023348193" Apr 16 14:53:14.567849 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:14.567818 2576 generic.go:358] "Generic (PLEG): container finished" podID="31294a51-df01-4523-afff-845ceb6be0cc" containerID="eb992bd82b0ec5b788919b58a75b24b8173a767e157f9f60bd0ea01a8d46835f" exitCode=0 Apr 16 14:53:14.568345 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:14.567895 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8kcqd" event={"ID":"31294a51-df01-4523-afff-845ceb6be0cc","Type":"ContainerDied","Data":"eb992bd82b0ec5b788919b58a75b24b8173a767e157f9f60bd0ea01a8d46835f"} Apr 16 14:53:14.568345 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:14.567998 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 14:53:15.045917 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:15.045888 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:53:15.302652 ip-10-0-129-76 
kubenswrapper[2576]: I0416 14:53:15.302571 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6nklq" Apr 16 14:53:15.302803 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:15.302692 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6nklq" podUID="d6e0e8e5-d659-4175-b96f-52c250d77fd0" Apr 16 14:53:15.302803 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:15.302748 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p5t7" Apr 16 14:53:15.302916 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:15.302857 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9p5t7" podUID="deecc941-e868-4306-99e5-4f30afef0f95" Apr 16 14:53:16.595904 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:16.595844 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" podUID="85fdf4e1-8738-483f-a40e-a9112c7098d5" containerName="ovnkube-controller" probeResult="failure" output="" Apr 16 14:53:17.302327 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:17.302294 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6nklq" Apr 16 14:53:17.302481 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:17.302294 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p5t7" Apr 16 14:53:17.302481 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:17.302407 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6nklq" podUID="d6e0e8e5-d659-4175-b96f-52c250d77fd0" Apr 16 14:53:17.302569 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:17.302473 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9p5t7" podUID="deecc941-e868-4306-99e5-4f30afef0f95" Apr 16 14:53:18.884255 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:18.883998 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-76.ec2.internal" event="NodeReady" Apr 16 14:53:18.884686 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:18.884333 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 14:53:18.923945 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:18.923909 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rsgph"] Apr 16 14:53:18.951752 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:18.951706 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-94b8j"] Apr 16 14:53:18.951925 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:18.951909 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rsgph" Apr 16 14:53:18.954281 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:18.954252 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 14:53:18.954420 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:18.954285 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 14:53:18.954420 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:18.954382 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vzfs6\"" Apr 16 14:53:18.971674 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:18.971643 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rsgph"] Apr 16 14:53:18.971674 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:18.971677 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-94b8j"] Apr 16 
14:53:18.971861 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:18.971792 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-94b8j" Apr 16 14:53:18.974177 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:18.974140 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 14:53:18.974316 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:18.974282 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 14:53:18.974392 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:18.974333 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-8jkdg\"" Apr 16 14:53:18.974392 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:18.974355 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 14:53:19.032369 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.032330 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/574ca2b9-aeca-4a60-8152-838c7e3d1902-config-volume\") pod \"dns-default-rsgph\" (UID: \"574ca2b9-aeca-4a60-8152-838c7e3d1902\") " pod="openshift-dns/dns-default-rsgph" Apr 16 14:53:19.032369 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.032375 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/574ca2b9-aeca-4a60-8152-838c7e3d1902-tmp-dir\") pod \"dns-default-rsgph\" (UID: \"574ca2b9-aeca-4a60-8152-838c7e3d1902\") " pod="openshift-dns/dns-default-rsgph" Apr 16 14:53:19.032604 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.032402 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgzf9\" (UniqueName: \"kubernetes.io/projected/574ca2b9-aeca-4a60-8152-838c7e3d1902-kube-api-access-hgzf9\") pod \"dns-default-rsgph\" (UID: \"574ca2b9-aeca-4a60-8152-838c7e3d1902\") " pod="openshift-dns/dns-default-rsgph" Apr 16 14:53:19.032604 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.032519 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs\") pod \"network-metrics-daemon-9p5t7\" (UID: \"deecc941-e868-4306-99e5-4f30afef0f95\") " pod="openshift-multus/network-metrics-daemon-9p5t7" Apr 16 14:53:19.032604 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.032552 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls\") pod \"dns-default-rsgph\" (UID: \"574ca2b9-aeca-4a60-8152-838c7e3d1902\") " pod="openshift-dns/dns-default-rsgph" Apr 16 14:53:19.032756 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:19.032661 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:53:19.032756 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:19.032724 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs podName:deecc941-e868-4306-99e5-4f30afef0f95 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:51.032704198 +0000 UTC m=+65.451839155 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs") pod "network-metrics-daemon-9p5t7" (UID: "deecc941-e868-4306-99e5-4f30afef0f95") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:53:19.132997 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.132960 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert\") pod \"ingress-canary-94b8j\" (UID: \"f82e5609-2a2d-49f8-aae5-da767543bb3d\") " pod="openshift-ingress-canary/ingress-canary-94b8j" Apr 16 14:53:19.133201 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.133034 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls\") pod \"dns-default-rsgph\" (UID: \"574ca2b9-aeca-4a60-8152-838c7e3d1902\") " pod="openshift-dns/dns-default-rsgph" Apr 16 14:53:19.133201 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.133116 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/574ca2b9-aeca-4a60-8152-838c7e3d1902-config-volume\") pod \"dns-default-rsgph\" (UID: \"574ca2b9-aeca-4a60-8152-838c7e3d1902\") " pod="openshift-dns/dns-default-rsgph" Apr 16 14:53:19.133201 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:19.133141 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:19.133201 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.133158 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c47k5\" (UniqueName: \"kubernetes.io/projected/f82e5609-2a2d-49f8-aae5-da767543bb3d-kube-api-access-c47k5\") pod 
\"ingress-canary-94b8j\" (UID: \"f82e5609-2a2d-49f8-aae5-da767543bb3d\") " pod="openshift-ingress-canary/ingress-canary-94b8j" Apr 16 14:53:19.133377 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:19.133204 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls podName:574ca2b9-aeca-4a60-8152-838c7e3d1902 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:19.633183639 +0000 UTC m=+34.052318593 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls") pod "dns-default-rsgph" (UID: "574ca2b9-aeca-4a60-8152-838c7e3d1902") : secret "dns-default-metrics-tls" not found Apr 16 14:53:19.133377 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.133225 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/574ca2b9-aeca-4a60-8152-838c7e3d1902-tmp-dir\") pod \"dns-default-rsgph\" (UID: \"574ca2b9-aeca-4a60-8152-838c7e3d1902\") " pod="openshift-dns/dns-default-rsgph" Apr 16 14:53:19.133377 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.133257 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hgzf9\" (UniqueName: \"kubernetes.io/projected/574ca2b9-aeca-4a60-8152-838c7e3d1902-kube-api-access-hgzf9\") pod \"dns-default-rsgph\" (UID: \"574ca2b9-aeca-4a60-8152-838c7e3d1902\") " pod="openshift-dns/dns-default-rsgph" Apr 16 14:53:19.133377 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.133293 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t45ww\" (UniqueName: \"kubernetes.io/projected/d6e0e8e5-d659-4175-b96f-52c250d77fd0-kube-api-access-t45ww\") pod \"network-check-target-6nklq\" (UID: \"d6e0e8e5-d659-4175-b96f-52c250d77fd0\") " pod="openshift-network-diagnostics/network-check-target-6nklq" Apr 16 
14:53:19.133635 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:19.133441 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:53:19.133635 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:19.133463 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:53:19.133635 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:19.133477 2576 projected.go:194] Error preparing data for projected volume kube-api-access-t45ww for pod openshift-network-diagnostics/network-check-target-6nklq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:53:19.133635 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:19.133525 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6e0e8e5-d659-4175-b96f-52c250d77fd0-kube-api-access-t45ww podName:d6e0e8e5-d659-4175-b96f-52c250d77fd0 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:51.133510707 +0000 UTC m=+65.552645673 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-t45ww" (UniqueName: "kubernetes.io/projected/d6e0e8e5-d659-4175-b96f-52c250d77fd0-kube-api-access-t45ww") pod "network-check-target-6nklq" (UID: "d6e0e8e5-d659-4175-b96f-52c250d77fd0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:53:19.133635 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.133616 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/574ca2b9-aeca-4a60-8152-838c7e3d1902-tmp-dir\") pod \"dns-default-rsgph\" (UID: \"574ca2b9-aeca-4a60-8152-838c7e3d1902\") " pod="openshift-dns/dns-default-rsgph" Apr 16 14:53:19.133821 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.133761 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/574ca2b9-aeca-4a60-8152-838c7e3d1902-config-volume\") pod \"dns-default-rsgph\" (UID: \"574ca2b9-aeca-4a60-8152-838c7e3d1902\") " pod="openshift-dns/dns-default-rsgph" Apr 16 14:53:19.143674 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.143616 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgzf9\" (UniqueName: \"kubernetes.io/projected/574ca2b9-aeca-4a60-8152-838c7e3d1902-kube-api-access-hgzf9\") pod \"dns-default-rsgph\" (UID: \"574ca2b9-aeca-4a60-8152-838c7e3d1902\") " pod="openshift-dns/dns-default-rsgph" Apr 16 14:53:19.234052 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.234000 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c47k5\" (UniqueName: \"kubernetes.io/projected/f82e5609-2a2d-49f8-aae5-da767543bb3d-kube-api-access-c47k5\") pod \"ingress-canary-94b8j\" (UID: \"f82e5609-2a2d-49f8-aae5-da767543bb3d\") " pod="openshift-ingress-canary/ingress-canary-94b8j" Apr 16 14:53:19.234244 
ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.234111 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert\") pod \"ingress-canary-94b8j\" (UID: \"f82e5609-2a2d-49f8-aae5-da767543bb3d\") " pod="openshift-ingress-canary/ingress-canary-94b8j" Apr 16 14:53:19.234244 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:19.234221 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:19.234332 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:19.234280 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert podName:f82e5609-2a2d-49f8-aae5-da767543bb3d nodeName:}" failed. No retries permitted until 2026-04-16 14:53:19.734256012 +0000 UTC m=+34.153390975 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert") pod "ingress-canary-94b8j" (UID: "f82e5609-2a2d-49f8-aae5-da767543bb3d") : secret "canary-serving-cert" not found Apr 16 14:53:19.242537 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.242515 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c47k5\" (UniqueName: \"kubernetes.io/projected/f82e5609-2a2d-49f8-aae5-da767543bb3d-kube-api-access-c47k5\") pod \"ingress-canary-94b8j\" (UID: \"f82e5609-2a2d-49f8-aae5-da767543bb3d\") " pod="openshift-ingress-canary/ingress-canary-94b8j" Apr 16 14:53:19.302877 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.302840 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p5t7" Apr 16 14:53:19.303068 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.302881 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6nklq" Apr 16 14:53:19.307040 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.306999 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 14:53:19.307510 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.307486 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kc8hc\"" Apr 16 14:53:19.307636 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.307522 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-j5fbx\"" Apr 16 14:53:19.307636 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.307552 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 14:53:19.307737 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.307667 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 14:53:19.638442 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.638410 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls\") pod \"dns-default-rsgph\" (UID: \"574ca2b9-aeca-4a60-8152-838c7e3d1902\") " pod="openshift-dns/dns-default-rsgph" Apr 16 14:53:19.638677 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:19.638528 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:19.638677 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:19.638596 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls 
podName:574ca2b9-aeca-4a60-8152-838c7e3d1902 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:20.638576788 +0000 UTC m=+35.057711760 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls") pod "dns-default-rsgph" (UID: "574ca2b9-aeca-4a60-8152-838c7e3d1902") : secret "dns-default-metrics-tls" not found Apr 16 14:53:19.739047 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:19.739001 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert\") pod \"ingress-canary-94b8j\" (UID: \"f82e5609-2a2d-49f8-aae5-da767543bb3d\") " pod="openshift-ingress-canary/ingress-canary-94b8j" Apr 16 14:53:19.739242 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:19.739173 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:19.739313 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:19.739248 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert podName:f82e5609-2a2d-49f8-aae5-da767543bb3d nodeName:}" failed. No retries permitted until 2026-04-16 14:53:20.739229324 +0000 UTC m=+35.158364286 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert") pod "ingress-canary-94b8j" (UID: "f82e5609-2a2d-49f8-aae5-da767543bb3d") : secret "canary-serving-cert" not found Apr 16 14:53:20.647081 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:20.647014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls\") pod \"dns-default-rsgph\" (UID: \"574ca2b9-aeca-4a60-8152-838c7e3d1902\") " pod="openshift-dns/dns-default-rsgph" Apr 16 14:53:20.647739 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:20.647170 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:20.647739 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:20.647247 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls podName:574ca2b9-aeca-4a60-8152-838c7e3d1902 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:22.647223756 +0000 UTC m=+37.066358722 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls") pod "dns-default-rsgph" (UID: "574ca2b9-aeca-4a60-8152-838c7e3d1902") : secret "dns-default-metrics-tls" not found Apr 16 14:53:20.748402 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:20.748316 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert\") pod \"ingress-canary-94b8j\" (UID: \"f82e5609-2a2d-49f8-aae5-da767543bb3d\") " pod="openshift-ingress-canary/ingress-canary-94b8j" Apr 16 14:53:20.748572 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:20.748482 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:20.748572 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:20.748560 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert podName:f82e5609-2a2d-49f8-aae5-da767543bb3d nodeName:}" failed. No retries permitted until 2026-04-16 14:53:22.74854153 +0000 UTC m=+37.167676496 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert") pod "ingress-canary-94b8j" (UID: "f82e5609-2a2d-49f8-aae5-da767543bb3d") : secret "canary-serving-cert" not found Apr 16 14:53:21.584374 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:21.584340 2576 generic.go:358] "Generic (PLEG): container finished" podID="31294a51-df01-4523-afff-845ceb6be0cc" containerID="08792ca0750c569012c641ccc7e2067db1adac5cf11d60d184644725c2711b2a" exitCode=0 Apr 16 14:53:21.584548 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:21.584416 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8kcqd" event={"ID":"31294a51-df01-4523-afff-845ceb6be0cc","Type":"ContainerDied","Data":"08792ca0750c569012c641ccc7e2067db1adac5cf11d60d184644725c2711b2a"} Apr 16 14:53:22.591179 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:22.591145 2576 generic.go:358] "Generic (PLEG): container finished" podID="31294a51-df01-4523-afff-845ceb6be0cc" containerID="bd92fd8ed15a5fbb9db0726b09143a73ed34e0cfcd16d93cd7c77d0eeb9d2db8" exitCode=0 Apr 16 14:53:22.591787 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:22.591186 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8kcqd" event={"ID":"31294a51-df01-4523-afff-845ceb6be0cc","Type":"ContainerDied","Data":"bd92fd8ed15a5fbb9db0726b09143a73ed34e0cfcd16d93cd7c77d0eeb9d2db8"} Apr 16 14:53:22.660574 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:22.660541 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls\") pod \"dns-default-rsgph\" (UID: \"574ca2b9-aeca-4a60-8152-838c7e3d1902\") " pod="openshift-dns/dns-default-rsgph" Apr 16 14:53:22.660733 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:22.660666 2576 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:22.660797 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:22.660740 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls podName:574ca2b9-aeca-4a60-8152-838c7e3d1902 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:26.66071938 +0000 UTC m=+41.079854334 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls") pod "dns-default-rsgph" (UID: "574ca2b9-aeca-4a60-8152-838c7e3d1902") : secret "dns-default-metrics-tls" not found Apr 16 14:53:22.761668 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:22.761628 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert\") pod \"ingress-canary-94b8j\" (UID: \"f82e5609-2a2d-49f8-aae5-da767543bb3d\") " pod="openshift-ingress-canary/ingress-canary-94b8j" Apr 16 14:53:22.761831 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:22.761766 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:22.761901 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:22.761841 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert podName:f82e5609-2a2d-49f8-aae5-da767543bb3d nodeName:}" failed. No retries permitted until 2026-04-16 14:53:26.761820028 +0000 UTC m=+41.180954981 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert") pod "ingress-canary-94b8j" (UID: "f82e5609-2a2d-49f8-aae5-da767543bb3d") : secret "canary-serving-cert" not found Apr 16 14:53:23.595540 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:23.595374 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8kcqd" event={"ID":"31294a51-df01-4523-afff-845ceb6be0cc","Type":"ContainerStarted","Data":"a4d0afcb8683c27786626d42b750f19ba29fe6b852469673d202b6e10e1378b7"} Apr 16 14:53:23.617515 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:23.617468 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-8kcqd" podStartSLOduration=4.689832432 podStartE2EDuration="37.617454085s" podCreationTimestamp="2026-04-16 14:52:46 +0000 UTC" firstStartedPulling="2026-04-16 14:52:47.55442198 +0000 UTC m=+1.973556935" lastFinishedPulling="2026-04-16 14:53:20.482043631 +0000 UTC m=+34.901178588" observedRunningTime="2026-04-16 14:53:23.616131665 +0000 UTC m=+38.035266640" watchObservedRunningTime="2026-04-16 14:53:23.617454085 +0000 UTC m=+38.036589060" Apr 16 14:53:26.687863 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:26.687823 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls\") pod \"dns-default-rsgph\" (UID: \"574ca2b9-aeca-4a60-8152-838c7e3d1902\") " pod="openshift-dns/dns-default-rsgph" Apr 16 14:53:26.688269 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:26.687939 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:26.688269 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:26.687989 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls podName:574ca2b9-aeca-4a60-8152-838c7e3d1902 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:34.68797626 +0000 UTC m=+49.107111223 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls") pod "dns-default-rsgph" (UID: "574ca2b9-aeca-4a60-8152-838c7e3d1902") : secret "dns-default-metrics-tls" not found Apr 16 14:53:26.788434 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:26.788401 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert\") pod \"ingress-canary-94b8j\" (UID: \"f82e5609-2a2d-49f8-aae5-da767543bb3d\") " pod="openshift-ingress-canary/ingress-canary-94b8j" Apr 16 14:53:26.788591 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:26.788533 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:26.788591 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:26.788582 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert podName:f82e5609-2a2d-49f8-aae5-da767543bb3d nodeName:}" failed. No retries permitted until 2026-04-16 14:53:34.788569271 +0000 UTC m=+49.207704224 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert") pod "ingress-canary-94b8j" (UID: "f82e5609-2a2d-49f8-aae5-da767543bb3d") : secret "canary-serving-cert" not found Apr 16 14:53:34.743651 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:34.743608 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls\") pod \"dns-default-rsgph\" (UID: \"574ca2b9-aeca-4a60-8152-838c7e3d1902\") " pod="openshift-dns/dns-default-rsgph" Apr 16 14:53:34.744102 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:34.743718 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:34.744102 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:34.743771 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls podName:574ca2b9-aeca-4a60-8152-838c7e3d1902 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:50.743756065 +0000 UTC m=+65.162891018 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls") pod "dns-default-rsgph" (UID: "574ca2b9-aeca-4a60-8152-838c7e3d1902") : secret "dns-default-metrics-tls" not found Apr 16 14:53:34.843907 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:34.843870 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert\") pod \"ingress-canary-94b8j\" (UID: \"f82e5609-2a2d-49f8-aae5-da767543bb3d\") " pod="openshift-ingress-canary/ingress-canary-94b8j" Apr 16 14:53:34.844066 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:34.844051 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:34.844132 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:34.844123 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert podName:f82e5609-2a2d-49f8-aae5-da767543bb3d nodeName:}" failed. No retries permitted until 2026-04-16 14:53:50.844104764 +0000 UTC m=+65.263239719 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert") pod "ingress-canary-94b8j" (UID: "f82e5609-2a2d-49f8-aae5-da767543bb3d") : secret "canary-serving-cert" not found Apr 16 14:53:46.583939 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:46.583911 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ddt96" Apr 16 14:53:50.749477 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:50.749434 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls\") pod \"dns-default-rsgph\" (UID: \"574ca2b9-aeca-4a60-8152-838c7e3d1902\") " pod="openshift-dns/dns-default-rsgph" Apr 16 14:53:50.749944 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:50.749566 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:50.749944 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:50.749642 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls podName:574ca2b9-aeca-4a60-8152-838c7e3d1902 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:22.749619544 +0000 UTC m=+97.168754500 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls") pod "dns-default-rsgph" (UID: "574ca2b9-aeca-4a60-8152-838c7e3d1902") : secret "dns-default-metrics-tls" not found Apr 16 14:53:50.850515 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:50.850475 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert\") pod \"ingress-canary-94b8j\" (UID: \"f82e5609-2a2d-49f8-aae5-da767543bb3d\") " pod="openshift-ingress-canary/ingress-canary-94b8j" Apr 16 14:53:50.850660 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:50.850625 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:50.850705 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:50.850687 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert podName:f82e5609-2a2d-49f8-aae5-da767543bb3d nodeName:}" failed. No retries permitted until 2026-04-16 14:54:22.850671046 +0000 UTC m=+97.269806002 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert") pod "ingress-canary-94b8j" (UID: "f82e5609-2a2d-49f8-aae5-da767543bb3d") : secret "canary-serving-cert" not found Apr 16 14:53:51.052113 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:51.052037 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs\") pod \"network-metrics-daemon-9p5t7\" (UID: \"deecc941-e868-4306-99e5-4f30afef0f95\") " pod="openshift-multus/network-metrics-daemon-9p5t7" Apr 16 14:53:51.054442 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:51.054425 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 14:53:51.062208 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:51.062190 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:53:51.062287 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:53:51.062248 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs podName:deecc941-e868-4306-99e5-4f30afef0f95 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:55.062232443 +0000 UTC m=+129.481367396 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs") pod "network-metrics-daemon-9p5t7" (UID: "deecc941-e868-4306-99e5-4f30afef0f95") : secret "metrics-daemon-secret" not found Apr 16 14:53:51.152573 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:51.152532 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t45ww\" (UniqueName: \"kubernetes.io/projected/d6e0e8e5-d659-4175-b96f-52c250d77fd0-kube-api-access-t45ww\") pod \"network-check-target-6nklq\" (UID: \"d6e0e8e5-d659-4175-b96f-52c250d77fd0\") " pod="openshift-network-diagnostics/network-check-target-6nklq" Apr 16 14:53:51.154822 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:51.154807 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 14:53:51.165573 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:51.165551 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 14:53:51.177084 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:51.177062 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t45ww\" (UniqueName: \"kubernetes.io/projected/d6e0e8e5-d659-4175-b96f-52c250d77fd0-kube-api-access-t45ww\") pod \"network-check-target-6nklq\" (UID: \"d6e0e8e5-d659-4175-b96f-52c250d77fd0\") " pod="openshift-network-diagnostics/network-check-target-6nklq" Apr 16 14:53:51.424558 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:51.424523 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-j5fbx\"" Apr 16 14:53:51.433392 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:51.433373 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6nklq" Apr 16 14:53:51.615319 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:51.615262 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6nklq"] Apr 16 14:53:51.620608 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:53:51.620581 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6e0e8e5_d659_4175_b96f_52c250d77fd0.slice/crio-98dbabddefbc99e2f5aaea0d6520e88535ba7513cea7482c1ba7c297bd95b619 WatchSource:0}: Error finding container 98dbabddefbc99e2f5aaea0d6520e88535ba7513cea7482c1ba7c297bd95b619: Status 404 returned error can't find the container with id 98dbabddefbc99e2f5aaea0d6520e88535ba7513cea7482c1ba7c297bd95b619 Apr 16 14:53:51.646630 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:51.646596 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6nklq" event={"ID":"d6e0e8e5-d659-4175-b96f-52c250d77fd0","Type":"ContainerStarted","Data":"98dbabddefbc99e2f5aaea0d6520e88535ba7513cea7482c1ba7c297bd95b619"} Apr 16 14:53:54.655976 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:54.655943 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6nklq" event={"ID":"d6e0e8e5-d659-4175-b96f-52c250d77fd0","Type":"ContainerStarted","Data":"0678ba2a2f30b8a07e6b948d4bcca816ddd7aa30a35b06c929451c382b00ca85"} Apr 16 14:53:54.656372 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:54.656100 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-6nklq" Apr 16 14:53:54.670877 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:53:54.670834 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-6nklq" 
podStartSLOduration=65.962450443 podStartE2EDuration="1m8.670823054s" podCreationTimestamp="2026-04-16 14:52:46 +0000 UTC" firstStartedPulling="2026-04-16 14:53:51.622484664 +0000 UTC m=+66.041619616" lastFinishedPulling="2026-04-16 14:53:54.330857259 +0000 UTC m=+68.749992227" observedRunningTime="2026-04-16 14:53:54.669628145 +0000 UTC m=+69.088763139" watchObservedRunningTime="2026-04-16 14:53:54.670823054 +0000 UTC m=+69.089958028" Apr 16 14:54:22.765347 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:54:22.765226 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls\") pod \"dns-default-rsgph\" (UID: \"574ca2b9-aeca-4a60-8152-838c7e3d1902\") " pod="openshift-dns/dns-default-rsgph" Apr 16 14:54:22.765805 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:54:22.765354 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:54:22.765805 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:54:22.765425 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls podName:574ca2b9-aeca-4a60-8152-838c7e3d1902 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:26.765407554 +0000 UTC m=+161.184542511 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls") pod "dns-default-rsgph" (UID: "574ca2b9-aeca-4a60-8152-838c7e3d1902") : secret "dns-default-metrics-tls" not found
Apr 16 14:54:22.866487 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:54:22.866453 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert\") pod \"ingress-canary-94b8j\" (UID: \"f82e5609-2a2d-49f8-aae5-da767543bb3d\") " pod="openshift-ingress-canary/ingress-canary-94b8j"
Apr 16 14:54:22.866596 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:54:22.866575 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:54:22.866651 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:54:22.866642 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert podName:f82e5609-2a2d-49f8-aae5-da767543bb3d nodeName:}" failed. No retries permitted until 2026-04-16 14:55:26.866627139 +0000 UTC m=+161.285762091 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert") pod "ingress-canary-94b8j" (UID: "f82e5609-2a2d-49f8-aae5-da767543bb3d") : secret "canary-serving-cert" not found
Apr 16 14:54:25.660143 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:54:25.660110 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-6nklq"
Apr 16 14:54:55.082709 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:54:55.082671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs\") pod \"network-metrics-daemon-9p5t7\" (UID: \"deecc941-e868-4306-99e5-4f30afef0f95\") " pod="openshift-multus/network-metrics-daemon-9p5t7"
Apr 16 14:54:55.083203 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:54:55.082799 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 14:54:55.083203 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:54:55.082859 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs podName:deecc941-e868-4306-99e5-4f30afef0f95 nodeName:}" failed. No retries permitted until 2026-04-16 14:56:57.082846453 +0000 UTC m=+251.501981405 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs") pod "network-metrics-daemon-9p5t7" (UID: "deecc941-e868-4306-99e5-4f30afef0f95") : secret "metrics-daemon-secret" not found
Apr 16 14:55:00.626652 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.626616 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-9pd4w"]
Apr 16 14:55:00.629402 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.629386 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9pd4w"
Apr 16 14:55:00.633738 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.633714 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 14:55:00.633738 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.633729 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 14:55:00.633922 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.633788 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 16 14:55:00.633922 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.633808 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-gq7gc\""
Apr 16 14:55:00.633922 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.633809 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 16 14:55:00.639472 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.639451 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-9pd4w"]
Apr 16 14:55:00.716267 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.716236 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-9pd4w\" (UID: \"9b07e8fb-9184-409a-ac6c-ab62ef5c0a79\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9pd4w"
Apr 16 14:55:00.716383 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.716299 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-9pd4w\" (UID: \"9b07e8fb-9184-409a-ac6c-ab62ef5c0a79\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9pd4w"
Apr 16 14:55:00.716383 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.716343 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dclsd\" (UniqueName: \"kubernetes.io/projected/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-kube-api-access-dclsd\") pod \"cluster-monitoring-operator-6667474d89-9pd4w\" (UID: \"9b07e8fb-9184-409a-ac6c-ab62ef5c0a79\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9pd4w"
Apr 16 14:55:00.729242 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.729213 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-hgj2w"]
Apr 16 14:55:00.731855 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.731837 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-hgj2w"
Apr 16 14:55:00.734173 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.734146 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-nvc4r\""
Apr 16 14:55:00.734323 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.734303 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:55:00.734385 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.734373 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 16 14:55:00.742281 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.742253 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-5dhmf"]
Apr 16 14:55:00.745077 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.745061 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-67bddfdd5b-xjz7q"]
Apr 16 14:55:00.745226 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.745210 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf"
Apr 16 14:55:00.749611 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.749589 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gkt5f"]
Apr 16 14:55:00.750164 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.750139 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:00.751727 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.751708 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-6sdf6\""
Apr 16 14:55:00.751821 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.751808 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:55:00.751881 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.751862 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 16 14:55:00.751941 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.751911 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 16 14:55:00.752136 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.752118 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 16 14:55:00.752662 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.752644 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-hgj2w"]
Apr 16 14:55:00.752739 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.752720 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gkt5f"
Apr 16 14:55:00.755608 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.755586 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 14:55:00.755687 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.755636 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 16 14:55:00.755981 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.755962 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 14:55:00.756098 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.756014 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:55:00.756098 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.756085 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 16 14:55:00.756204 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.756099 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-xpnpw\""
Apr 16 14:55:00.756204 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.756152 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-hzw2z\""
Apr 16 14:55:00.756204 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.756106 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 16 14:55:00.756542 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.756521 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 14:55:00.760014 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.759997 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 16 14:55:00.764168 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.764150 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 14:55:00.766298 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.766278 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-5dhmf"]
Apr 16 14:55:00.767274 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.767259 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gkt5f"]
Apr 16 14:55:00.768316 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.768295 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-67bddfdd5b-xjz7q"]
Apr 16 14:55:00.816741 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.816718 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d238297c-7c93-4211-8678-2ecfa5f39967-serving-cert\") pod \"console-operator-d87b8d5fc-5dhmf\" (UID: \"d238297c-7c93-4211-8678-2ecfa5f39967\") " pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf"
Apr 16 14:55:00.816859 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.816774 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sbsj\" (UniqueName: \"kubernetes.io/projected/d238297c-7c93-4211-8678-2ecfa5f39967-kube-api-access-9sbsj\") pod \"console-operator-d87b8d5fc-5dhmf\" (UID: \"d238297c-7c93-4211-8678-2ecfa5f39967\") " pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf"
Apr 16 14:55:00.816859 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.816816 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-9pd4w\" (UID: \"9b07e8fb-9184-409a-ac6c-ab62ef5c0a79\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9pd4w"
Apr 16 14:55:00.816859 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.816843 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dclsd\" (UniqueName: \"kubernetes.io/projected/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-kube-api-access-dclsd\") pod \"cluster-monitoring-operator-6667474d89-9pd4w\" (UID: \"9b07e8fb-9184-409a-ac6c-ab62ef5c0a79\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9pd4w"
Apr 16 14:55:00.816972 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.816867 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp9xq\" (UniqueName: \"kubernetes.io/projected/62bdf3db-4656-4d51-9053-16e6c9a90d0a-kube-api-access-lp9xq\") pod \"volume-data-source-validator-7d955d5dd4-hgj2w\" (UID: \"62bdf3db-4656-4d51-9053-16e6c9a90d0a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-hgj2w"
Apr 16 14:55:00.816972 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.816967 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d238297c-7c93-4211-8678-2ecfa5f39967-config\") pod \"console-operator-d87b8d5fc-5dhmf\" (UID: \"d238297c-7c93-4211-8678-2ecfa5f39967\") " pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf"
Apr 16 14:55:00.817063 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.816986 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d238297c-7c93-4211-8678-2ecfa5f39967-trusted-ca\") pod \"console-operator-d87b8d5fc-5dhmf\" (UID: \"d238297c-7c93-4211-8678-2ecfa5f39967\") " pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf"
Apr 16 14:55:00.817063 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.817043 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-9pd4w\" (UID: \"9b07e8fb-9184-409a-ac6c-ab62ef5c0a79\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9pd4w"
Apr 16 14:55:00.817165 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:00.817153 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 14:55:00.817218 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:00.817209 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-cluster-monitoring-operator-tls podName:9b07e8fb-9184-409a-ac6c-ab62ef5c0a79 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:01.317195842 +0000 UTC m=+135.736330799 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-9pd4w" (UID: "9b07e8fb-9184-409a-ac6c-ab62ef5c0a79") : secret "cluster-monitoring-operator-tls" not found
Apr 16 14:55:00.817467 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.817451 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-9pd4w\" (UID: \"9b07e8fb-9184-409a-ac6c-ab62ef5c0a79\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9pd4w"
Apr 16 14:55:00.828379 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.828352 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dclsd\" (UniqueName: \"kubernetes.io/projected/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-kube-api-access-dclsd\") pod \"cluster-monitoring-operator-6667474d89-9pd4w\" (UID: \"9b07e8fb-9184-409a-ac6c-ab62ef5c0a79\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9pd4w"
Apr 16 14:55:00.918035 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.917994 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6d80c27-15aa-4aea-8508-8913412eba90-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-gkt5f\" (UID: \"e6d80c27-15aa-4aea-8508-8913412eba90\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gkt5f"
Apr 16 14:55:00.918166 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.918039 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-bound-sa-token\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:00.918166 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.918055 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xk7b\" (UniqueName: \"kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-kube-api-access-5xk7b\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:00.918166 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.918110 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9sbsj\" (UniqueName: \"kubernetes.io/projected/d238297c-7c93-4211-8678-2ecfa5f39967-kube-api-access-9sbsj\") pod \"console-operator-d87b8d5fc-5dhmf\" (UID: \"d238297c-7c93-4211-8678-2ecfa5f39967\") " pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf"
Apr 16 14:55:00.918166 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.918143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lp9xq\" (UniqueName: \"kubernetes.io/projected/62bdf3db-4656-4d51-9053-16e6c9a90d0a-kube-api-access-lp9xq\") pod \"volume-data-source-validator-7d955d5dd4-hgj2w\" (UID: \"62bdf3db-4656-4d51-9053-16e6c9a90d0a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-hgj2w"
Apr 16 14:55:00.918166 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.918163 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-registry-tls\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:00.918400 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.918186 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d238297c-7c93-4211-8678-2ecfa5f39967-config\") pod \"console-operator-d87b8d5fc-5dhmf\" (UID: \"d238297c-7c93-4211-8678-2ecfa5f39967\") " pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf"
Apr 16 14:55:00.918400 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.918205 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d238297c-7c93-4211-8678-2ecfa5f39967-trusted-ca\") pod \"console-operator-d87b8d5fc-5dhmf\" (UID: \"d238297c-7c93-4211-8678-2ecfa5f39967\") " pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf"
Apr 16 14:55:00.918400 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.918223 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5830445f-51ad-4827-b195-39ebf2152864-image-registry-private-configuration\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:00.918400 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.918240 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5830445f-51ad-4827-b195-39ebf2152864-ca-trust-extracted\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:00.918400 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.918305 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86dwg\" (UniqueName: \"kubernetes.io/projected/e6d80c27-15aa-4aea-8508-8913412eba90-kube-api-access-86dwg\") pod \"kube-storage-version-migrator-operator-756bb7d76f-gkt5f\" (UID: \"e6d80c27-15aa-4aea-8508-8913412eba90\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gkt5f"
Apr 16 14:55:00.918400 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.918394 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5830445f-51ad-4827-b195-39ebf2152864-registry-certificates\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:00.918703 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.918465 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d238297c-7c93-4211-8678-2ecfa5f39967-serving-cert\") pod \"console-operator-d87b8d5fc-5dhmf\" (UID: \"d238297c-7c93-4211-8678-2ecfa5f39967\") " pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf"
Apr 16 14:55:00.918703 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.918495 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6d80c27-15aa-4aea-8508-8913412eba90-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-gkt5f\" (UID: \"e6d80c27-15aa-4aea-8508-8913412eba90\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gkt5f"
Apr 16 14:55:00.918703 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.918519 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5830445f-51ad-4827-b195-39ebf2152864-trusted-ca\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:00.918703 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.918565 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5830445f-51ad-4827-b195-39ebf2152864-installation-pull-secrets\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:00.918869 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.918852 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d238297c-7c93-4211-8678-2ecfa5f39967-config\") pod \"console-operator-d87b8d5fc-5dhmf\" (UID: \"d238297c-7c93-4211-8678-2ecfa5f39967\") " pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf"
Apr 16 14:55:00.919293 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.919276 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d238297c-7c93-4211-8678-2ecfa5f39967-trusted-ca\") pod \"console-operator-d87b8d5fc-5dhmf\" (UID: \"d238297c-7c93-4211-8678-2ecfa5f39967\") " pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf"
Apr 16 14:55:00.920622 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.920601 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d238297c-7c93-4211-8678-2ecfa5f39967-serving-cert\") pod \"console-operator-d87b8d5fc-5dhmf\" (UID: \"d238297c-7c93-4211-8678-2ecfa5f39967\") " pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf"
Apr 16 14:55:00.925934 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.925909 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp9xq\" (UniqueName: \"kubernetes.io/projected/62bdf3db-4656-4d51-9053-16e6c9a90d0a-kube-api-access-lp9xq\") pod \"volume-data-source-validator-7d955d5dd4-hgj2w\" (UID: \"62bdf3db-4656-4d51-9053-16e6c9a90d0a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-hgj2w"
Apr 16 14:55:00.926009 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:00.925979 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sbsj\" (UniqueName: \"kubernetes.io/projected/d238297c-7c93-4211-8678-2ecfa5f39967-kube-api-access-9sbsj\") pod \"console-operator-d87b8d5fc-5dhmf\" (UID: \"d238297c-7c93-4211-8678-2ecfa5f39967\") " pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf"
Apr 16 14:55:01.019522 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.019494 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-registry-tls\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:01.019635 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.019532 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5830445f-51ad-4827-b195-39ebf2152864-image-registry-private-configuration\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:01.019635 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.019552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5830445f-51ad-4827-b195-39ebf2152864-ca-trust-extracted\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:01.019635 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.019576 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86dwg\" (UniqueName: \"kubernetes.io/projected/e6d80c27-15aa-4aea-8508-8913412eba90-kube-api-access-86dwg\") pod \"kube-storage-version-migrator-operator-756bb7d76f-gkt5f\" (UID: \"e6d80c27-15aa-4aea-8508-8913412eba90\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gkt5f"
Apr 16 14:55:01.019635 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:01.019622 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:55:01.019816 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:01.019638 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67bddfdd5b-xjz7q: secret "image-registry-tls" not found
Apr 16 14:55:01.019816 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.019627 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5830445f-51ad-4827-b195-39ebf2152864-registry-certificates\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:01.019816 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:01.019693 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-registry-tls podName:5830445f-51ad-4827-b195-39ebf2152864 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:01.519677495 +0000 UTC m=+135.938812448 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-registry-tls") pod "image-registry-67bddfdd5b-xjz7q" (UID: "5830445f-51ad-4827-b195-39ebf2152864") : secret "image-registry-tls" not found
Apr 16 14:55:01.019816 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.019757 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6d80c27-15aa-4aea-8508-8913412eba90-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-gkt5f\" (UID: \"e6d80c27-15aa-4aea-8508-8913412eba90\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gkt5f"
Apr 16 14:55:01.019816 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.019788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5830445f-51ad-4827-b195-39ebf2152864-trusted-ca\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:01.020073 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.019818 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5830445f-51ad-4827-b195-39ebf2152864-installation-pull-secrets\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:01.020073 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.019882 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6d80c27-15aa-4aea-8508-8913412eba90-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-gkt5f\" (UID: \"e6d80c27-15aa-4aea-8508-8913412eba90\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gkt5f"
Apr 16 14:55:01.020073 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.019905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-bound-sa-token\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:01.020073 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.019929 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xk7b\" (UniqueName: \"kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-kube-api-access-5xk7b\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:01.020479 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.020456 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6d80c27-15aa-4aea-8508-8913412eba90-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-gkt5f\" (UID: \"e6d80c27-15aa-4aea-8508-8913412eba90\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gkt5f"
Apr 16 14:55:01.020614 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.020591 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5830445f-51ad-4827-b195-39ebf2152864-ca-trust-extracted\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:01.020910 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.020886 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5830445f-51ad-4827-b195-39ebf2152864-registry-certificates\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:01.021139 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.021121 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5830445f-51ad-4827-b195-39ebf2152864-trusted-ca\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:01.021882 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.021856 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6d80c27-15aa-4aea-8508-8913412eba90-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-gkt5f\" (UID: \"e6d80c27-15aa-4aea-8508-8913412eba90\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gkt5f"
Apr 16 14:55:01.022079 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.022062 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5830445f-51ad-4827-b195-39ebf2152864-image-registry-private-configuration\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:01.022195 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.022179 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5830445f-51ad-4827-b195-39ebf2152864-installation-pull-secrets\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:01.028062 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.028042 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86dwg\" (UniqueName: \"kubernetes.io/projected/e6d80c27-15aa-4aea-8508-8913412eba90-kube-api-access-86dwg\") pod \"kube-storage-version-migrator-operator-756bb7d76f-gkt5f\" (UID: \"e6d80c27-15aa-4aea-8508-8913412eba90\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gkt5f"
Apr 16 14:55:01.028556 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.028538 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xk7b\" (UniqueName: \"kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-kube-api-access-5xk7b\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:01.028642 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.028595 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-bound-sa-token\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:01.040576 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.040564 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-hgj2w" Apr 16 14:55:01.059807 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.059784 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf" Apr 16 14:55:01.072089 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.072068 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gkt5f" Apr 16 14:55:01.170444 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.170211 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-hgj2w"] Apr 16 14:55:01.172576 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:55:01.172535 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62bdf3db_4656_4d51_9053_16e6c9a90d0a.slice/crio-851611a796ee1526ab31399173f971bdaf17f96532c267d6e8aad0b2959d4abe WatchSource:0}: Error finding container 851611a796ee1526ab31399173f971bdaf17f96532c267d6e8aad0b2959d4abe: Status 404 returned error can't find the container with id 851611a796ee1526ab31399173f971bdaf17f96532c267d6e8aad0b2959d4abe Apr 16 14:55:01.193312 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.193289 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-5dhmf"] Apr 16 14:55:01.196215 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:55:01.196181 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd238297c_7c93_4211_8678_2ecfa5f39967.slice/crio-f4da33505f3e2a70bbc9a6ee84328f4b13dcbd59a049385c4e36a0df3afe7890 WatchSource:0}: Error finding container 
f4da33505f3e2a70bbc9a6ee84328f4b13dcbd59a049385c4e36a0df3afe7890: Status 404 returned error can't find the container with id f4da33505f3e2a70bbc9a6ee84328f4b13dcbd59a049385c4e36a0df3afe7890 Apr 16 14:55:01.218584 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.218561 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gkt5f"] Apr 16 14:55:01.220940 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:55:01.220914 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d80c27_15aa_4aea_8508_8913412eba90.slice/crio-865650c9826b239163fed3435f228f452ebe58827f45c5ffda8c23d645d862d5 WatchSource:0}: Error finding container 865650c9826b239163fed3435f228f452ebe58827f45c5ffda8c23d645d862d5: Status 404 returned error can't find the container with id 865650c9826b239163fed3435f228f452ebe58827f45c5ffda8c23d645d862d5 Apr 16 14:55:01.322198 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.322158 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-9pd4w\" (UID: \"9b07e8fb-9184-409a-ac6c-ab62ef5c0a79\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9pd4w" Apr 16 14:55:01.322337 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:01.322324 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:55:01.322399 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:01.322392 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-cluster-monitoring-operator-tls podName:9b07e8fb-9184-409a-ac6c-ab62ef5c0a79 nodeName:}" 
failed. No retries permitted until 2026-04-16 14:55:02.32237661 +0000 UTC m=+136.741511575 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-9pd4w" (UID: "9b07e8fb-9184-409a-ac6c-ab62ef5c0a79") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:55:01.524867 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.524766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-registry-tls\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q" Apr 16 14:55:01.525041 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:01.524918 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:55:01.525041 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:01.524938 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67bddfdd5b-xjz7q: secret "image-registry-tls" not found Apr 16 14:55:01.525041 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:01.524991 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-registry-tls podName:5830445f-51ad-4827-b195-39ebf2152864 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:02.524975916 +0000 UTC m=+136.944110868 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-registry-tls") pod "image-registry-67bddfdd5b-xjz7q" (UID: "5830445f-51ad-4827-b195-39ebf2152864") : secret "image-registry-tls" not found Apr 16 14:55:01.779283 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.779162 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf" event={"ID":"d238297c-7c93-4211-8678-2ecfa5f39967","Type":"ContainerStarted","Data":"f4da33505f3e2a70bbc9a6ee84328f4b13dcbd59a049385c4e36a0df3afe7890"} Apr 16 14:55:01.781167 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.781076 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gkt5f" event={"ID":"e6d80c27-15aa-4aea-8508-8913412eba90","Type":"ContainerStarted","Data":"865650c9826b239163fed3435f228f452ebe58827f45c5ffda8c23d645d862d5"} Apr 16 14:55:01.783060 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:01.783001 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-hgj2w" event={"ID":"62bdf3db-4656-4d51-9053-16e6c9a90d0a","Type":"ContainerStarted","Data":"851611a796ee1526ab31399173f971bdaf17f96532c267d6e8aad0b2959d4abe"} Apr 16 14:55:02.330179 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:02.330141 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-9pd4w\" (UID: \"9b07e8fb-9184-409a-ac6c-ab62ef5c0a79\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9pd4w" Apr 16 14:55:02.330366 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:02.330282 2576 secret.go:189] Couldn't get 
secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:55:02.330366 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:02.330363 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-cluster-monitoring-operator-tls podName:9b07e8fb-9184-409a-ac6c-ab62ef5c0a79 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:04.330342804 +0000 UTC m=+138.749477771 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-9pd4w" (UID: "9b07e8fb-9184-409a-ac6c-ab62ef5c0a79") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:55:02.531777 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:02.531741 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-registry-tls\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q" Apr 16 14:55:02.531917 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:02.531858 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:55:02.531917 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:02.531870 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67bddfdd5b-xjz7q: secret "image-registry-tls" not found Apr 16 14:55:02.531993 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:02.531920 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-registry-tls 
podName:5830445f-51ad-4827-b195-39ebf2152864 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:04.531906844 +0000 UTC m=+138.951041797 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-registry-tls") pod "image-registry-67bddfdd5b-xjz7q" (UID: "5830445f-51ad-4827-b195-39ebf2152864") : secret "image-registry-tls" not found Apr 16 14:55:03.788287 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:03.788191 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gkt5f" event={"ID":"e6d80c27-15aa-4aea-8508-8913412eba90","Type":"ContainerStarted","Data":"ce5587578751d427f6989f623e1f4f48d66096c653904823e523f33658be619a"} Apr 16 14:55:03.789530 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:03.789501 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-hgj2w" event={"ID":"62bdf3db-4656-4d51-9053-16e6c9a90d0a","Type":"ContainerStarted","Data":"365242fda095b6c70bf0aad2e584e21da329ea4067f86a554cb0a04ad5e58d64"} Apr 16 14:55:03.790909 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:03.790889 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/0.log" Apr 16 14:55:03.790985 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:03.790930 2576 generic.go:358] "Generic (PLEG): container finished" podID="d238297c-7c93-4211-8678-2ecfa5f39967" containerID="a432cdf6af63097b3fb8d3434e22944116e8dfc69d22e4c0a701de7c949d486a" exitCode=255 Apr 16 14:55:03.790985 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:03.790958 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf" 
event={"ID":"d238297c-7c93-4211-8678-2ecfa5f39967","Type":"ContainerDied","Data":"a432cdf6af63097b3fb8d3434e22944116e8dfc69d22e4c0a701de7c949d486a"} Apr 16 14:55:03.791214 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:03.791199 2576 scope.go:117] "RemoveContainer" containerID="a432cdf6af63097b3fb8d3434e22944116e8dfc69d22e4c0a701de7c949d486a" Apr 16 14:55:03.804418 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:03.804371 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gkt5f" podStartSLOduration=1.48319877 podStartE2EDuration="3.804357059s" podCreationTimestamp="2026-04-16 14:55:00 +0000 UTC" firstStartedPulling="2026-04-16 14:55:01.224118511 +0000 UTC m=+135.643253464" lastFinishedPulling="2026-04-16 14:55:03.5452768 +0000 UTC m=+137.964411753" observedRunningTime="2026-04-16 14:55:03.803631524 +0000 UTC m=+138.222766512" watchObservedRunningTime="2026-04-16 14:55:03.804357059 +0000 UTC m=+138.223492031" Apr 16 14:55:03.842686 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:03.842636 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-hgj2w" podStartSLOduration=1.477558016 podStartE2EDuration="3.842617904s" podCreationTimestamp="2026-04-16 14:55:00 +0000 UTC" firstStartedPulling="2026-04-16 14:55:01.17475851 +0000 UTC m=+135.593893463" lastFinishedPulling="2026-04-16 14:55:03.539818393 +0000 UTC m=+137.958953351" observedRunningTime="2026-04-16 14:55:03.842036181 +0000 UTC m=+138.261171177" watchObservedRunningTime="2026-04-16 14:55:03.842617904 +0000 UTC m=+138.261752883" Apr 16 14:55:04.346508 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:04.346474 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-9pd4w\" (UID: \"9b07e8fb-9184-409a-ac6c-ab62ef5c0a79\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9pd4w" Apr 16 14:55:04.346678 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:04.346622 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:55:04.346729 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:04.346686 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-cluster-monitoring-operator-tls podName:9b07e8fb-9184-409a-ac6c-ab62ef5c0a79 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:08.346671675 +0000 UTC m=+142.765806632 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-9pd4w" (UID: "9b07e8fb-9184-409a-ac6c-ab62ef5c0a79") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:55:04.548917 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:04.548877 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-registry-tls\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q" Apr 16 14:55:04.549100 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:04.549016 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:55:04.549100 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:04.549045 2576 projected.go:194] Error preparing data for 
projected volume registry-tls for pod openshift-image-registry/image-registry-67bddfdd5b-xjz7q: secret "image-registry-tls" not found Apr 16 14:55:04.549100 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:04.549095 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-registry-tls podName:5830445f-51ad-4827-b195-39ebf2152864 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:08.549080857 +0000 UTC m=+142.968215815 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-registry-tls") pod "image-registry-67bddfdd5b-xjz7q" (UID: "5830445f-51ad-4827-b195-39ebf2152864") : secret "image-registry-tls" not found Apr 16 14:55:04.795299 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:04.795271 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log" Apr 16 14:55:04.795687 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:04.795669 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/0.log" Apr 16 14:55:04.795732 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:04.795702 2576 generic.go:358] "Generic (PLEG): container finished" podID="d238297c-7c93-4211-8678-2ecfa5f39967" containerID="07e763db492531ae663b568bcb17a042d3453ed6254aa61cd25cf5abd51e6c07" exitCode=255 Apr 16 14:55:04.795771 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:04.795733 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf" event={"ID":"d238297c-7c93-4211-8678-2ecfa5f39967","Type":"ContainerDied","Data":"07e763db492531ae663b568bcb17a042d3453ed6254aa61cd25cf5abd51e6c07"} Apr 16 14:55:04.795807 ip-10-0-129-76 
kubenswrapper[2576]: I0416 14:55:04.795781 2576 scope.go:117] "RemoveContainer" containerID="a432cdf6af63097b3fb8d3434e22944116e8dfc69d22e4c0a701de7c949d486a" Apr 16 14:55:04.796266 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:04.796247 2576 scope.go:117] "RemoveContainer" containerID="07e763db492531ae663b568bcb17a042d3453ed6254aa61cd25cf5abd51e6c07" Apr 16 14:55:04.796454 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:04.796434 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-5dhmf_openshift-console-operator(d238297c-7c93-4211-8678-2ecfa5f39967)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf" podUID="d238297c-7c93-4211-8678-2ecfa5f39967" Apr 16 14:55:05.798597 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:05.798558 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log" Apr 16 14:55:05.798942 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:05.798883 2576 scope.go:117] "RemoveContainer" containerID="07e763db492531ae663b568bcb17a042d3453ed6254aa61cd25cf5abd51e6c07" Apr 16 14:55:05.799074 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:05.799056 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-5dhmf_openshift-console-operator(d238297c-7c93-4211-8678-2ecfa5f39967)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf" podUID="d238297c-7c93-4211-8678-2ecfa5f39967" Apr 16 14:55:06.127863 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:06.127781 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-ppzzq"] 
Apr 16 14:55:06.130914 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:06.130897 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-ppzzq"
Apr 16 14:55:06.133377 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:06.133353 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-b79ww\""
Apr 16 14:55:06.133519 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:06.133353 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 16 14:55:06.133519 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:06.133413 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 16 14:55:06.133519 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:06.133362 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 16 14:55:06.133519 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:06.133493 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 16 14:55:06.139708 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:06.139688 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-ppzzq"]
Apr 16 14:55:06.262610 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:06.262578 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a3b99c2e-7af1-4bc8-9d92-60fe721033cd-signing-cabundle\") pod \"service-ca-bfc587fb7-ppzzq\" (UID: \"a3b99c2e-7af1-4bc8-9d92-60fe721033cd\") " pod="openshift-service-ca/service-ca-bfc587fb7-ppzzq"
Apr 16 14:55:06.262766 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:06.262628 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a3b99c2e-7af1-4bc8-9d92-60fe721033cd-signing-key\") pod \"service-ca-bfc587fb7-ppzzq\" (UID: \"a3b99c2e-7af1-4bc8-9d92-60fe721033cd\") " pod="openshift-service-ca/service-ca-bfc587fb7-ppzzq"
Apr 16 14:55:06.262808 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:06.262765 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcwbt\" (UniqueName: \"kubernetes.io/projected/a3b99c2e-7af1-4bc8-9d92-60fe721033cd-kube-api-access-xcwbt\") pod \"service-ca-bfc587fb7-ppzzq\" (UID: \"a3b99c2e-7af1-4bc8-9d92-60fe721033cd\") " pod="openshift-service-ca/service-ca-bfc587fb7-ppzzq"
Apr 16 14:55:06.363189 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:06.363147 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a3b99c2e-7af1-4bc8-9d92-60fe721033cd-signing-key\") pod \"service-ca-bfc587fb7-ppzzq\" (UID: \"a3b99c2e-7af1-4bc8-9d92-60fe721033cd\") " pod="openshift-service-ca/service-ca-bfc587fb7-ppzzq"
Apr 16 14:55:06.363376 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:06.363250 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcwbt\" (UniqueName: \"kubernetes.io/projected/a3b99c2e-7af1-4bc8-9d92-60fe721033cd-kube-api-access-xcwbt\") pod \"service-ca-bfc587fb7-ppzzq\" (UID: \"a3b99c2e-7af1-4bc8-9d92-60fe721033cd\") " pod="openshift-service-ca/service-ca-bfc587fb7-ppzzq"
Apr 16 14:55:06.363376 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:06.363274 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a3b99c2e-7af1-4bc8-9d92-60fe721033cd-signing-cabundle\") pod \"service-ca-bfc587fb7-ppzzq\" (UID: \"a3b99c2e-7af1-4bc8-9d92-60fe721033cd\") " pod="openshift-service-ca/service-ca-bfc587fb7-ppzzq"
Apr 16 14:55:06.364426 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:06.364403 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a3b99c2e-7af1-4bc8-9d92-60fe721033cd-signing-cabundle\") pod \"service-ca-bfc587fb7-ppzzq\" (UID: \"a3b99c2e-7af1-4bc8-9d92-60fe721033cd\") " pod="openshift-service-ca/service-ca-bfc587fb7-ppzzq"
Apr 16 14:55:06.365632 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:06.365606 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a3b99c2e-7af1-4bc8-9d92-60fe721033cd-signing-key\") pod \"service-ca-bfc587fb7-ppzzq\" (UID: \"a3b99c2e-7af1-4bc8-9d92-60fe721033cd\") " pod="openshift-service-ca/service-ca-bfc587fb7-ppzzq"
Apr 16 14:55:06.371128 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:06.371106 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcwbt\" (UniqueName: \"kubernetes.io/projected/a3b99c2e-7af1-4bc8-9d92-60fe721033cd-kube-api-access-xcwbt\") pod \"service-ca-bfc587fb7-ppzzq\" (UID: \"a3b99c2e-7af1-4bc8-9d92-60fe721033cd\") " pod="openshift-service-ca/service-ca-bfc587fb7-ppzzq"
Apr 16 14:55:06.382200 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:06.382155 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mjsr6_69a10374-32da-4de3-b491-3854f69f1613/dns-node-resolver/0.log"
Apr 16 14:55:06.439562 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:06.439509 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-ppzzq"
Apr 16 14:55:06.554801 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:06.554767 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-ppzzq"]
Apr 16 14:55:06.558388 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:55:06.558364 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3b99c2e_7af1_4bc8_9d92_60fe721033cd.slice/crio-2c05204172534e3b2054d0697b9b25e5e2950d4d020ec48d71abff3bc1e90f86 WatchSource:0}: Error finding container 2c05204172534e3b2054d0697b9b25e5e2950d4d020ec48d71abff3bc1e90f86: Status 404 returned error can't find the container with id 2c05204172534e3b2054d0697b9b25e5e2950d4d020ec48d71abff3bc1e90f86
Apr 16 14:55:06.801908 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:06.801877 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-ppzzq" event={"ID":"a3b99c2e-7af1-4bc8-9d92-60fe721033cd","Type":"ContainerStarted","Data":"2c05204172534e3b2054d0697b9b25e5e2950d4d020ec48d71abff3bc1e90f86"}
Apr 16 14:55:07.578919 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:07.578890 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rxvdm_ab74fce9-eb83-4941-97e9-42f6ed125bf5/node-ca/0.log"
Apr 16 14:55:08.381036 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:08.380937 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-9pd4w\" (UID: \"9b07e8fb-9184-409a-ac6c-ab62ef5c0a79\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9pd4w"
Apr 16 14:55:08.381377 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:08.381102 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 14:55:08.381377 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:08.381162 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-cluster-monitoring-operator-tls podName:9b07e8fb-9184-409a-ac6c-ab62ef5c0a79 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:16.381146429 +0000 UTC m=+150.800281396 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-9pd4w" (UID: "9b07e8fb-9184-409a-ac6c-ab62ef5c0a79") : secret "cluster-monitoring-operator-tls" not found
Apr 16 14:55:08.582760 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:08.582714 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-registry-tls\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q"
Apr 16 14:55:08.582908 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:08.582869 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:55:08.582908 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:08.582891 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67bddfdd5b-xjz7q: secret "image-registry-tls" not found
Apr 16 14:55:08.582992 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:08.582944 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-registry-tls podName:5830445f-51ad-4827-b195-39ebf2152864 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:16.582928613 +0000 UTC m=+151.002063566 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-registry-tls") pod "image-registry-67bddfdd5b-xjz7q" (UID: "5830445f-51ad-4827-b195-39ebf2152864") : secret "image-registry-tls" not found
Apr 16 14:55:08.808451 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:08.808418 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-ppzzq" event={"ID":"a3b99c2e-7af1-4bc8-9d92-60fe721033cd","Type":"ContainerStarted","Data":"32f44a601e906f5ab22eae0d2ef8c17482944bf6f405f0b07b65551ffb62629c"}
Apr 16 14:55:08.825609 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:08.825559 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-ppzzq" podStartSLOduration=1.327621226 podStartE2EDuration="2.825543933s" podCreationTimestamp="2026-04-16 14:55:06 +0000 UTC" firstStartedPulling="2026-04-16 14:55:06.560147715 +0000 UTC m=+140.979282672" lastFinishedPulling="2026-04-16 14:55:08.058070423 +0000 UTC m=+142.477205379" observedRunningTime="2026-04-16 14:55:08.824961063 +0000 UTC m=+143.244096037" watchObservedRunningTime="2026-04-16 14:55:08.825543933 +0000 UTC m=+143.244678908"
Apr 16 14:55:11.060689 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:11.060653 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf"
Apr 16 14:55:11.060689 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:11.060692 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf"
Apr 16 14:55:11.061100 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:11.061041 2576 scope.go:117] "RemoveContainer" containerID="07e763db492531ae663b568bcb17a042d3453ed6254aa61cd25cf5abd51e6c07"
Apr 16 14:55:11.061236 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:11.061218 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-5dhmf_openshift-console-operator(d238297c-7c93-4211-8678-2ecfa5f39967)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf" podUID="d238297c-7c93-4211-8678-2ecfa5f39967"
Apr 16 14:55:16.442791 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:16.442754 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-9pd4w\" (UID: \"9b07e8fb-9184-409a-ac6c-ab62ef5c0a79\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9pd4w"
Apr 16 14:55:16.443182 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:16.442898 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 14:55:16.443182 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:16.442969 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-cluster-monitoring-operator-tls podName:9b07e8fb-9184-409a-ac6c-ab62ef5c0a79 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:32.442952701 +0000 UTC m=+166.862087653 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-9pd4w" (UID: "9b07e8fb-9184-409a-ac6c-ab62ef5c0a79") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:55:16.643853 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:16.643823 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-registry-tls\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q" Apr 16 14:55:16.645914 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:16.645887 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-registry-tls\") pod \"image-registry-67bddfdd5b-xjz7q\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q" Apr 16 14:55:16.666795 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:16.666767 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q" Apr 16 14:55:16.788043 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:16.787987 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-67bddfdd5b-xjz7q"] Apr 16 14:55:16.791165 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:55:16.791139 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5830445f_51ad_4827_b195_39ebf2152864.slice/crio-f50108335a7182f3e521aaa533e7776660282106e13d7c89671cfeaeecfbf57b WatchSource:0}: Error finding container f50108335a7182f3e521aaa533e7776660282106e13d7c89671cfeaeecfbf57b: Status 404 returned error can't find the container with id f50108335a7182f3e521aaa533e7776660282106e13d7c89671cfeaeecfbf57b Apr 16 14:55:16.832892 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:16.832865 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q" event={"ID":"5830445f-51ad-4827-b195-39ebf2152864","Type":"ContainerStarted","Data":"f50108335a7182f3e521aaa533e7776660282106e13d7c89671cfeaeecfbf57b"} Apr 16 14:55:17.836548 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:17.836513 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q" event={"ID":"5830445f-51ad-4827-b195-39ebf2152864","Type":"ContainerStarted","Data":"050c07704a38de6db719dd7cf427357c3fdc0a1439a1718ac53cc67256c5c394"} Apr 16 14:55:17.836944 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:17.836625 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q" Apr 16 14:55:17.856050 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:17.855992 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q" 
podStartSLOduration=17.855978374 podStartE2EDuration="17.855978374s" podCreationTimestamp="2026-04-16 14:55:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:55:17.855570539 +0000 UTC m=+152.274705515" watchObservedRunningTime="2026-04-16 14:55:17.855978374 +0000 UTC m=+152.275113349" Apr 16 14:55:21.963518 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:21.963462 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-rsgph" podUID="574ca2b9-aeca-4a60-8152-838c7e3d1902" Apr 16 14:55:21.981829 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:21.981800 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-94b8j" podUID="f82e5609-2a2d-49f8-aae5-da767543bb3d" Apr 16 14:55:22.303278 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:22.303208 2576 scope.go:117] "RemoveContainer" containerID="07e763db492531ae663b568bcb17a042d3453ed6254aa61cd25cf5abd51e6c07" Apr 16 14:55:22.315239 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:22.315205 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-9p5t7" podUID="deecc941-e868-4306-99e5-4f30afef0f95" Apr 16 14:55:22.851291 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:22.851263 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log" Apr 16 14:55:22.851454 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:22.851387 2576 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rsgph" Apr 16 14:55:22.851454 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:22.851388 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf" event={"ID":"d238297c-7c93-4211-8678-2ecfa5f39967","Type":"ContainerStarted","Data":"5eb61ca58a5a4190e9a07437847961b5adef46538a7e757deab869ddd9fbd2bf"} Apr 16 14:55:22.851747 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:22.851734 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf" Apr 16 14:55:22.867792 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:22.867746 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf" podStartSLOduration=20.524253015 podStartE2EDuration="22.867732167s" podCreationTimestamp="2026-04-16 14:55:00 +0000 UTC" firstStartedPulling="2026-04-16 14:55:01.198198485 +0000 UTC m=+135.617333438" lastFinishedPulling="2026-04-16 14:55:03.541677634 +0000 UTC m=+137.960812590" observedRunningTime="2026-04-16 14:55:22.867280699 +0000 UTC m=+157.286415686" watchObservedRunningTime="2026-04-16 14:55:22.867732167 +0000 UTC m=+157.286867137" Apr 16 14:55:23.542733 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:23.542703 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-5dhmf" Apr 16 14:55:26.812344 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.812265 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls\") pod \"dns-default-rsgph\" (UID: \"574ca2b9-aeca-4a60-8152-838c7e3d1902\") " pod="openshift-dns/dns-default-rsgph" Apr 16 14:55:26.814641 ip-10-0-129-76 kubenswrapper[2576]: 
I0416 14:55:26.814618 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/574ca2b9-aeca-4a60-8152-838c7e3d1902-metrics-tls\") pod \"dns-default-rsgph\" (UID: \"574ca2b9-aeca-4a60-8152-838c7e3d1902\") " pod="openshift-dns/dns-default-rsgph" Apr 16 14:55:26.843302 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.843272 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-swmsg"] Apr 16 14:55:26.845864 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.845842 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-swmsg" Apr 16 14:55:26.849086 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.849061 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 14:55:26.849195 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.849158 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 14:55:26.849195 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.849162 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-t2gx7\"" Apr 16 14:55:26.859743 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.859675 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-67bddfdd5b-xjz7q"] Apr 16 14:55:26.860755 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.860732 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-swmsg"] Apr 16 14:55:26.870908 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.870888 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-insights/insights-runtime-extractor-b8j7n"] Apr 16 14:55:26.874253 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.874225 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-b8j7n" Apr 16 14:55:26.876814 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.876792 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 14:55:26.876920 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.876802 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 14:55:26.876920 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.876898 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8pzvz\"" Apr 16 14:55:26.877455 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.877262 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 14:55:26.877556 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.877540 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 14:55:26.888283 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.888262 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-b8j7n"] Apr 16 14:55:26.897417 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.897254 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-654d7bdccf-4mwkc"] Apr 16 14:55:26.900101 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.900085 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:26.912720 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.912693 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59cfc831-0e33-47bf-91f5-3c4c514090ec-registry-tls\") pod \"image-registry-654d7bdccf-4mwkc\" (UID: \"59cfc831-0e33-47bf-91f5-3c4c514090ec\") " pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:26.912812 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.912727 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dd701306-bfea-4f3a-a4b0-47ea87d026f6-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-swmsg\" (UID: \"dd701306-bfea-4f3a-a4b0-47ea87d026f6\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-swmsg" Apr 16 14:55:26.912812 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.912761 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkhsq\" (UniqueName: \"kubernetes.io/projected/59cfc831-0e33-47bf-91f5-3c4c514090ec-kube-api-access-hkhsq\") pod \"image-registry-654d7bdccf-4mwkc\" (UID: \"59cfc831-0e33-47bf-91f5-3c4c514090ec\") " pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:26.912812 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.912787 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-654d7bdccf-4mwkc"] Apr 16 14:55:26.912915 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.912811 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8be3d31a-0693-4554-ab6c-0e45affa2eee-crio-socket\") pod \"insights-runtime-extractor-b8j7n\" (UID: 
\"8be3d31a-0693-4554-ab6c-0e45affa2eee\") " pod="openshift-insights/insights-runtime-extractor-b8j7n" Apr 16 14:55:26.912915 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.912835 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/59cfc831-0e33-47bf-91f5-3c4c514090ec-image-registry-private-configuration\") pod \"image-registry-654d7bdccf-4mwkc\" (UID: \"59cfc831-0e33-47bf-91f5-3c4c514090ec\") " pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:26.912915 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.912858 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8be3d31a-0693-4554-ab6c-0e45affa2eee-data-volume\") pod \"insights-runtime-extractor-b8j7n\" (UID: \"8be3d31a-0693-4554-ab6c-0e45affa2eee\") " pod="openshift-insights/insights-runtime-extractor-b8j7n" Apr 16 14:55:26.912915 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.912873 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8be3d31a-0693-4554-ab6c-0e45affa2eee-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-b8j7n\" (UID: \"8be3d31a-0693-4554-ab6c-0e45affa2eee\") " pod="openshift-insights/insights-runtime-extractor-b8j7n" Apr 16 14:55:26.912915 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.912888 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/59cfc831-0e33-47bf-91f5-3c4c514090ec-ca-trust-extracted\") pod \"image-registry-654d7bdccf-4mwkc\" (UID: \"59cfc831-0e33-47bf-91f5-3c4c514090ec\") " pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:26.912915 ip-10-0-129-76 kubenswrapper[2576]: I0416 
14:55:26.912905 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tstcq\" (UniqueName: \"kubernetes.io/projected/8be3d31a-0693-4554-ab6c-0e45affa2eee-kube-api-access-tstcq\") pod \"insights-runtime-extractor-b8j7n\" (UID: \"8be3d31a-0693-4554-ab6c-0e45affa2eee\") " pod="openshift-insights/insights-runtime-extractor-b8j7n" Apr 16 14:55:26.913111 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.912972 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59cfc831-0e33-47bf-91f5-3c4c514090ec-bound-sa-token\") pod \"image-registry-654d7bdccf-4mwkc\" (UID: \"59cfc831-0e33-47bf-91f5-3c4c514090ec\") " pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:26.913111 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.913014 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dd701306-bfea-4f3a-a4b0-47ea87d026f6-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-swmsg\" (UID: \"dd701306-bfea-4f3a-a4b0-47ea87d026f6\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-swmsg" Apr 16 14:55:26.913111 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.913046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert\") pod \"ingress-canary-94b8j\" (UID: \"f82e5609-2a2d-49f8-aae5-da767543bb3d\") " pod="openshift-ingress-canary/ingress-canary-94b8j" Apr 16 14:55:26.913111 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.913072 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/59cfc831-0e33-47bf-91f5-3c4c514090ec-registry-certificates\") pod \"image-registry-654d7bdccf-4mwkc\" (UID: \"59cfc831-0e33-47bf-91f5-3c4c514090ec\") " pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:26.913111 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.913092 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59cfc831-0e33-47bf-91f5-3c4c514090ec-trusted-ca\") pod \"image-registry-654d7bdccf-4mwkc\" (UID: \"59cfc831-0e33-47bf-91f5-3c4c514090ec\") " pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:26.913256 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.913120 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59cfc831-0e33-47bf-91f5-3c4c514090ec-installation-pull-secrets\") pod \"image-registry-654d7bdccf-4mwkc\" (UID: \"59cfc831-0e33-47bf-91f5-3c4c514090ec\") " pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:26.913256 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.913135 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8be3d31a-0693-4554-ab6c-0e45affa2eee-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b8j7n\" (UID: \"8be3d31a-0693-4554-ab6c-0e45affa2eee\") " pod="openshift-insights/insights-runtime-extractor-b8j7n" Apr 16 14:55:26.915495 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:26.915466 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82e5609-2a2d-49f8-aae5-da767543bb3d-cert\") pod \"ingress-canary-94b8j\" (UID: \"f82e5609-2a2d-49f8-aae5-da767543bb3d\") " pod="openshift-ingress-canary/ingress-canary-94b8j" Apr 16 
14:55:27.013620 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.013587 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59cfc831-0e33-47bf-91f5-3c4c514090ec-registry-tls\") pod \"image-registry-654d7bdccf-4mwkc\" (UID: \"59cfc831-0e33-47bf-91f5-3c4c514090ec\") " pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:27.013620 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.013619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dd701306-bfea-4f3a-a4b0-47ea87d026f6-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-swmsg\" (UID: \"dd701306-bfea-4f3a-a4b0-47ea87d026f6\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-swmsg" Apr 16 14:55:27.013875 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.013651 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkhsq\" (UniqueName: \"kubernetes.io/projected/59cfc831-0e33-47bf-91f5-3c4c514090ec-kube-api-access-hkhsq\") pod \"image-registry-654d7bdccf-4mwkc\" (UID: \"59cfc831-0e33-47bf-91f5-3c4c514090ec\") " pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:27.013875 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.013671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8be3d31a-0693-4554-ab6c-0e45affa2eee-crio-socket\") pod \"insights-runtime-extractor-b8j7n\" (UID: \"8be3d31a-0693-4554-ab6c-0e45affa2eee\") " pod="openshift-insights/insights-runtime-extractor-b8j7n" Apr 16 14:55:27.013875 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.013698 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/59cfc831-0e33-47bf-91f5-3c4c514090ec-image-registry-private-configuration\") pod \"image-registry-654d7bdccf-4mwkc\" (UID: \"59cfc831-0e33-47bf-91f5-3c4c514090ec\") " pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:27.013875 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.013716 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8be3d31a-0693-4554-ab6c-0e45affa2eee-data-volume\") pod \"insights-runtime-extractor-b8j7n\" (UID: \"8be3d31a-0693-4554-ab6c-0e45affa2eee\") " pod="openshift-insights/insights-runtime-extractor-b8j7n" Apr 16 14:55:27.013875 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.013735 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8be3d31a-0693-4554-ab6c-0e45affa2eee-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-b8j7n\" (UID: \"8be3d31a-0693-4554-ab6c-0e45affa2eee\") " pod="openshift-insights/insights-runtime-extractor-b8j7n" Apr 16 14:55:27.013875 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.013758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/59cfc831-0e33-47bf-91f5-3c4c514090ec-ca-trust-extracted\") pod \"image-registry-654d7bdccf-4mwkc\" (UID: \"59cfc831-0e33-47bf-91f5-3c4c514090ec\") " pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:27.013875 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.013782 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tstcq\" (UniqueName: \"kubernetes.io/projected/8be3d31a-0693-4554-ab6c-0e45affa2eee-kube-api-access-tstcq\") pod \"insights-runtime-extractor-b8j7n\" (UID: \"8be3d31a-0693-4554-ab6c-0e45affa2eee\") " pod="openshift-insights/insights-runtime-extractor-b8j7n" Apr 16 14:55:27.014263 
ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.013878 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8be3d31a-0693-4554-ab6c-0e45affa2eee-crio-socket\") pod \"insights-runtime-extractor-b8j7n\" (UID: \"8be3d31a-0693-4554-ab6c-0e45affa2eee\") " pod="openshift-insights/insights-runtime-extractor-b8j7n" Apr 16 14:55:27.014263 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.014061 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59cfc831-0e33-47bf-91f5-3c4c514090ec-bound-sa-token\") pod \"image-registry-654d7bdccf-4mwkc\" (UID: \"59cfc831-0e33-47bf-91f5-3c4c514090ec\") " pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:27.014263 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.014113 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dd701306-bfea-4f3a-a4b0-47ea87d026f6-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-swmsg\" (UID: \"dd701306-bfea-4f3a-a4b0-47ea87d026f6\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-swmsg" Apr 16 14:55:27.014263 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.014173 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59cfc831-0e33-47bf-91f5-3c4c514090ec-registry-certificates\") pod \"image-registry-654d7bdccf-4mwkc\" (UID: \"59cfc831-0e33-47bf-91f5-3c4c514090ec\") " pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:27.014263 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.014199 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/59cfc831-0e33-47bf-91f5-3c4c514090ec-ca-trust-extracted\") pod \"image-registry-654d7bdccf-4mwkc\" (UID: \"59cfc831-0e33-47bf-91f5-3c4c514090ec\") " pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:27.014263 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.014211 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59cfc831-0e33-47bf-91f5-3c4c514090ec-trusted-ca\") pod \"image-registry-654d7bdccf-4mwkc\" (UID: \"59cfc831-0e33-47bf-91f5-3c4c514090ec\") " pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:27.014547 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.014279 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59cfc831-0e33-47bf-91f5-3c4c514090ec-installation-pull-secrets\") pod \"image-registry-654d7bdccf-4mwkc\" (UID: \"59cfc831-0e33-47bf-91f5-3c4c514090ec\") " pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:27.014547 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.014308 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8be3d31a-0693-4554-ab6c-0e45affa2eee-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b8j7n\" (UID: \"8be3d31a-0693-4554-ab6c-0e45affa2eee\") " pod="openshift-insights/insights-runtime-extractor-b8j7n" Apr 16 14:55:27.014547 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.014359 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8be3d31a-0693-4554-ab6c-0e45affa2eee-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-b8j7n\" (UID: \"8be3d31a-0693-4554-ab6c-0e45affa2eee\") " pod="openshift-insights/insights-runtime-extractor-b8j7n" Apr 16 
14:55:27.014547 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.014423 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dd701306-bfea-4f3a-a4b0-47ea87d026f6-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-swmsg\" (UID: \"dd701306-bfea-4f3a-a4b0-47ea87d026f6\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-swmsg" Apr 16 14:55:27.015064 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.014988 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59cfc831-0e33-47bf-91f5-3c4c514090ec-registry-certificates\") pod \"image-registry-654d7bdccf-4mwkc\" (UID: \"59cfc831-0e33-47bf-91f5-3c4c514090ec\") " pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:27.015458 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.015418 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8be3d31a-0693-4554-ab6c-0e45affa2eee-data-volume\") pod \"insights-runtime-extractor-b8j7n\" (UID: \"8be3d31a-0693-4554-ab6c-0e45affa2eee\") " pod="openshift-insights/insights-runtime-extractor-b8j7n" Apr 16 14:55:27.015572 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.015488 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59cfc831-0e33-47bf-91f5-3c4c514090ec-trusted-ca\") pod \"image-registry-654d7bdccf-4mwkc\" (UID: \"59cfc831-0e33-47bf-91f5-3c4c514090ec\") " pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:27.016593 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.016570 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59cfc831-0e33-47bf-91f5-3c4c514090ec-registry-tls\") pod \"image-registry-654d7bdccf-4mwkc\" (UID: 
\"59cfc831-0e33-47bf-91f5-3c4c514090ec\") " pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:27.016758 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.016693 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8be3d31a-0693-4554-ab6c-0e45affa2eee-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b8j7n\" (UID: \"8be3d31a-0693-4554-ab6c-0e45affa2eee\") " pod="openshift-insights/insights-runtime-extractor-b8j7n" Apr 16 14:55:27.017089 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.017012 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dd701306-bfea-4f3a-a4b0-47ea87d026f6-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-swmsg\" (UID: \"dd701306-bfea-4f3a-a4b0-47ea87d026f6\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-swmsg" Apr 16 14:55:27.017316 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.017294 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/59cfc831-0e33-47bf-91f5-3c4c514090ec-image-registry-private-configuration\") pod \"image-registry-654d7bdccf-4mwkc\" (UID: \"59cfc831-0e33-47bf-91f5-3c4c514090ec\") " pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:27.017377 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.017309 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59cfc831-0e33-47bf-91f5-3c4c514090ec-installation-pull-secrets\") pod \"image-registry-654d7bdccf-4mwkc\" (UID: \"59cfc831-0e33-47bf-91f5-3c4c514090ec\") " pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:27.022736 ip-10-0-129-76 kubenswrapper[2576]: I0416 
14:55:27.022713 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59cfc831-0e33-47bf-91f5-3c4c514090ec-bound-sa-token\") pod \"image-registry-654d7bdccf-4mwkc\" (UID: \"59cfc831-0e33-47bf-91f5-3c4c514090ec\") " pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:27.022814 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.022737 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkhsq\" (UniqueName: \"kubernetes.io/projected/59cfc831-0e33-47bf-91f5-3c4c514090ec-kube-api-access-hkhsq\") pod \"image-registry-654d7bdccf-4mwkc\" (UID: \"59cfc831-0e33-47bf-91f5-3c4c514090ec\") " pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:27.022852 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.022838 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tstcq\" (UniqueName: \"kubernetes.io/projected/8be3d31a-0693-4554-ab6c-0e45affa2eee-kube-api-access-tstcq\") pod \"insights-runtime-extractor-b8j7n\" (UID: \"8be3d31a-0693-4554-ab6c-0e45affa2eee\") " pod="openshift-insights/insights-runtime-extractor-b8j7n" Apr 16 14:55:27.054035 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.053994 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vzfs6\"" Apr 16 14:55:27.062252 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.062233 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rsgph" Apr 16 14:55:27.154815 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.154781 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-swmsg" Apr 16 14:55:27.178284 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.178241 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rsgph"] Apr 16 14:55:27.180933 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:55:27.180904 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod574ca2b9_aeca_4a60_8152_838c7e3d1902.slice/crio-8633f955017ea5a40c843b4772deb14c42a3b8a835fcae3408da85a71c9c819b WatchSource:0}: Error finding container 8633f955017ea5a40c843b4772deb14c42a3b8a835fcae3408da85a71c9c819b: Status 404 returned error can't find the container with id 8633f955017ea5a40c843b4772deb14c42a3b8a835fcae3408da85a71c9c819b Apr 16 14:55:27.184173 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.184155 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-b8j7n" Apr 16 14:55:27.208369 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.207923 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:27.285578 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.285521 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-swmsg"] Apr 16 14:55:27.289654 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:55:27.289621 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd701306_bfea_4f3a_a4b0_47ea87d026f6.slice/crio-41469a4443adfc056fcc6f8f02b3a14014144d0e7ba51bc1ec3f56e8daf1558d WatchSource:0}: Error finding container 41469a4443adfc056fcc6f8f02b3a14014144d0e7ba51bc1ec3f56e8daf1558d: Status 404 returned error can't find the container with id 41469a4443adfc056fcc6f8f02b3a14014144d0e7ba51bc1ec3f56e8daf1558d Apr 16 14:55:27.316665 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.316603 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-b8j7n"] Apr 16 14:55:27.319554 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:55:27.319528 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8be3d31a_0693_4554_ab6c_0e45affa2eee.slice/crio-1473e654368233e2660468d25c601f4613afda9139381bb6b5123c1e5e4a3c1a WatchSource:0}: Error finding container 1473e654368233e2660468d25c601f4613afda9139381bb6b5123c1e5e4a3c1a: Status 404 returned error can't find the container with id 1473e654368233e2660468d25c601f4613afda9139381bb6b5123c1e5e4a3c1a Apr 16 14:55:27.334964 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.334942 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-654d7bdccf-4mwkc"] Apr 16 14:55:27.338004 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:55:27.337983 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59cfc831_0e33_47bf_91f5_3c4c514090ec.slice/crio-5b484da545b9aa26b5b6cbfae3b10ba2ed02882f70f89488fd8ba0fe3711880f WatchSource:0}: Error finding container 5b484da545b9aa26b5b6cbfae3b10ba2ed02882f70f89488fd8ba0fe3711880f: Status 404 returned error can't find the container with id 5b484da545b9aa26b5b6cbfae3b10ba2ed02882f70f89488fd8ba0fe3711880f Apr 16 14:55:27.865713 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.865672 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-swmsg" event={"ID":"dd701306-bfea-4f3a-a4b0-47ea87d026f6","Type":"ContainerStarted","Data":"41469a4443adfc056fcc6f8f02b3a14014144d0e7ba51bc1ec3f56e8daf1558d"} Apr 16 14:55:27.867067 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.867016 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rsgph" event={"ID":"574ca2b9-aeca-4a60-8152-838c7e3d1902","Type":"ContainerStarted","Data":"8633f955017ea5a40c843b4772deb14c42a3b8a835fcae3408da85a71c9c819b"} Apr 16 14:55:27.869076 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.868987 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" event={"ID":"59cfc831-0e33-47bf-91f5-3c4c514090ec","Type":"ContainerStarted","Data":"f9fff732014aafa17f7385e44097f088b24ed9d93b4562b2367b4adb03c4388d"} Apr 16 14:55:27.869076 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.869045 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" event={"ID":"59cfc831-0e33-47bf-91f5-3c4c514090ec","Type":"ContainerStarted","Data":"5b484da545b9aa26b5b6cbfae3b10ba2ed02882f70f89488fd8ba0fe3711880f"} Apr 16 14:55:27.869449 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.869321 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:27.870643 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.870619 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b8j7n" event={"ID":"8be3d31a-0693-4554-ab6c-0e45affa2eee","Type":"ContainerStarted","Data":"b1ad9405c0fe5e22d1735ae6e20e62df1acda0c28ea9ddebf87367c53dae6608"} Apr 16 14:55:27.870753 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.870648 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b8j7n" event={"ID":"8be3d31a-0693-4554-ab6c-0e45affa2eee","Type":"ContainerStarted","Data":"1473e654368233e2660468d25c601f4613afda9139381bb6b5123c1e5e4a3c1a"} Apr 16 14:55:27.890188 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:27.889838 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" podStartSLOduration=1.889824721 podStartE2EDuration="1.889824721s" podCreationTimestamp="2026-04-16 14:55:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:55:27.889382113 +0000 UTC m=+162.308517089" watchObservedRunningTime="2026-04-16 14:55:27.889824721 +0000 UTC m=+162.308959697" Apr 16 14:55:29.877578 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:29.877486 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b8j7n" event={"ID":"8be3d31a-0693-4554-ab6c-0e45affa2eee","Type":"ContainerStarted","Data":"eebbc76abb88b322b013427027488c40e83c75a22d70398e4213ca6d9acd207d"} Apr 16 14:55:29.878962 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:29.878929 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-swmsg" 
event={"ID":"dd701306-bfea-4f3a-a4b0-47ea87d026f6","Type":"ContainerStarted","Data":"1da3ceca8b746ceddbe1f71c307c54cc5a138f9950215785b08f3e08239c3131"} Apr 16 14:55:29.880630 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:29.880595 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rsgph" event={"ID":"574ca2b9-aeca-4a60-8152-838c7e3d1902","Type":"ContainerStarted","Data":"e97c382a41bceeeb49c376bd942e6d3a86dcbd4198789ea3d1bc51b191d4d670"} Apr 16 14:55:29.880728 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:29.880630 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rsgph" event={"ID":"574ca2b9-aeca-4a60-8152-838c7e3d1902","Type":"ContainerStarted","Data":"81032bccde0e5e58c4bf3bc0aecf90d7ea389d4dbc2a9dbca0950afa1a91acfa"} Apr 16 14:55:29.880773 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:29.880742 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-rsgph" Apr 16 14:55:29.898485 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:29.898443 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-swmsg" podStartSLOduration=1.882711178 podStartE2EDuration="3.89843172s" podCreationTimestamp="2026-04-16 14:55:26 +0000 UTC" firstStartedPulling="2026-04-16 14:55:27.29197971 +0000 UTC m=+161.711114667" lastFinishedPulling="2026-04-16 14:55:29.307700251 +0000 UTC m=+163.726835209" observedRunningTime="2026-04-16 14:55:29.89732717 +0000 UTC m=+164.316462145" watchObservedRunningTime="2026-04-16 14:55:29.89843172 +0000 UTC m=+164.317566741" Apr 16 14:55:29.918608 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:29.918562 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rsgph" podStartSLOduration=129.794271037 podStartE2EDuration="2m11.918545684s" podCreationTimestamp="2026-04-16 14:53:18 +0000 UTC" 
firstStartedPulling="2026-04-16 14:55:27.182787404 +0000 UTC m=+161.601922361" lastFinishedPulling="2026-04-16 14:55:29.307062056 +0000 UTC m=+163.726197008" observedRunningTime="2026-04-16 14:55:29.917982427 +0000 UTC m=+164.337117403" watchObservedRunningTime="2026-04-16 14:55:29.918545684 +0000 UTC m=+164.337680660" Apr 16 14:55:30.885189 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:30.885147 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b8j7n" event={"ID":"8be3d31a-0693-4554-ab6c-0e45affa2eee","Type":"ContainerStarted","Data":"4590b75b382f4e90ccdc9e0ea1692dc018622b954bf43e3b12ac80aad5098f6e"} Apr 16 14:55:30.902757 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:30.902712 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-b8j7n" podStartSLOduration=1.6360743439999998 podStartE2EDuration="4.902699657s" podCreationTimestamp="2026-04-16 14:55:26 +0000 UTC" firstStartedPulling="2026-04-16 14:55:27.373858048 +0000 UTC m=+161.792993007" lastFinishedPulling="2026-04-16 14:55:30.640483363 +0000 UTC m=+165.059618320" observedRunningTime="2026-04-16 14:55:30.90127916 +0000 UTC m=+165.320414135" watchObservedRunningTime="2026-04-16 14:55:30.902699657 +0000 UTC m=+165.321834677" Apr 16 14:55:32.461974 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:32.461934 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-9pd4w\" (UID: \"9b07e8fb-9184-409a-ac6c-ab62ef5c0a79\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9pd4w" Apr 16 14:55:32.464405 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:32.464379 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/9b07e8fb-9184-409a-ac6c-ab62ef5c0a79-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-9pd4w\" (UID: \"9b07e8fb-9184-409a-ac6c-ab62ef5c0a79\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9pd4w" Apr 16 14:55:32.738194 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:32.738091 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9pd4w" Apr 16 14:55:32.850638 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:32.850604 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-9pd4w"] Apr 16 14:55:32.854662 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:55:32.854632 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b07e8fb_9184_409a_ac6c_ab62ef5c0a79.slice/crio-c645069fbfcc0b1eacc5808448da6d16bca33438a89fee5e09490cd635450115 WatchSource:0}: Error finding container c645069fbfcc0b1eacc5808448da6d16bca33438a89fee5e09490cd635450115: Status 404 returned error can't find the container with id c645069fbfcc0b1eacc5808448da6d16bca33438a89fee5e09490cd635450115 Apr 16 14:55:32.895095 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:32.895068 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9pd4w" event={"ID":"9b07e8fb-9184-409a-ac6c-ab62ef5c0a79","Type":"ContainerStarted","Data":"c645069fbfcc0b1eacc5808448da6d16bca33438a89fee5e09490cd635450115"} Apr 16 14:55:34.902739 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:34.902704 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9pd4w" event={"ID":"9b07e8fb-9184-409a-ac6c-ab62ef5c0a79","Type":"ContainerStarted","Data":"7efbd1e48b4fb59eaad7361143d04b6236f634d16bb46a82522c5fe24e8c9808"} 
Apr 16 14:55:34.920458 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:34.920411 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9pd4w" podStartSLOduration=33.437262597 podStartE2EDuration="34.92039732s" podCreationTimestamp="2026-04-16 14:55:00 +0000 UTC" firstStartedPulling="2026-04-16 14:55:32.85704947 +0000 UTC m=+167.276184424" lastFinishedPulling="2026-04-16 14:55:34.340184191 +0000 UTC m=+168.759319147" observedRunningTime="2026-04-16 14:55:34.919544902 +0000 UTC m=+169.338679896" watchObservedRunningTime="2026-04-16 14:55:34.92039732 +0000 UTC m=+169.339532341" Apr 16 14:55:35.303280 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:35.303202 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-94b8j" Apr 16 14:55:35.306117 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:35.306098 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-8jkdg\"" Apr 16 14:55:35.314409 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:35.314395 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-94b8j" Apr 16 14:55:35.427813 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:35.427762 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-94b8j"] Apr 16 14:55:35.435722 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:55:35.435693 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf82e5609_2a2d_49f8_aae5_da767543bb3d.slice/crio-0f7ac4ba390a1c171d1a15a0f960a643e488839284826d5ff083bde5df29cfe4 WatchSource:0}: Error finding container 0f7ac4ba390a1c171d1a15a0f960a643e488839284826d5ff083bde5df29cfe4: Status 404 returned error can't find the container with id 0f7ac4ba390a1c171d1a15a0f960a643e488839284826d5ff083bde5df29cfe4 Apr 16 14:55:35.906479 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:35.906445 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-94b8j" event={"ID":"f82e5609-2a2d-49f8-aae5-da767543bb3d","Type":"ContainerStarted","Data":"0f7ac4ba390a1c171d1a15a0f960a643e488839284826d5ff083bde5df29cfe4"} Apr 16 14:55:36.864399 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:36.864367 2576 patch_prober.go:28] interesting pod/image-registry-67bddfdd5b-xjz7q container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 14:55:36.864558 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:36.864424 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q" podUID="5830445f-51ad-4827-b195-39ebf2152864" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:55:37.302336 ip-10-0-129-76 kubenswrapper[2576]: I0416 
14:55:37.302293 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p5t7" Apr 16 14:55:37.912388 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:37.912355 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-94b8j" event={"ID":"f82e5609-2a2d-49f8-aae5-da767543bb3d","Type":"ContainerStarted","Data":"8fc971ca43a8ea6cbacb8f9a7d6bb796668b0aea0934b12f6d68225abdc0f482"} Apr 16 14:55:37.931698 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:37.931654 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-94b8j" podStartSLOduration=138.495648315 podStartE2EDuration="2m19.931636202s" podCreationTimestamp="2026-04-16 14:53:18 +0000 UTC" firstStartedPulling="2026-04-16 14:55:35.437637402 +0000 UTC m=+169.856772354" lastFinishedPulling="2026-04-16 14:55:36.873625277 +0000 UTC m=+171.292760241" observedRunningTime="2026-04-16 14:55:37.929967127 +0000 UTC m=+172.349102102" watchObservedRunningTime="2026-04-16 14:55:37.931636202 +0000 UTC m=+172.350771180" Apr 16 14:55:39.887539 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:39.887507 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rsgph" Apr 16 14:55:46.863540 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:46.863501 2576 patch_prober.go:28] interesting pod/image-registry-67bddfdd5b-xjz7q container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 14:55:46.863958 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:46.863556 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q" 
podUID="5830445f-51ad-4827-b195-39ebf2152864" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:55:47.212683 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:47.212650 2576 patch_prober.go:28] interesting pod/image-registry-654d7bdccf-4mwkc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 14:55:47.212836 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:47.212717 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" podUID="59cfc831-0e33-47bf-91f5-3c4c514090ec" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:55:48.877633 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:48.877606 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-654d7bdccf-4mwkc" Apr 16 14:55:51.879390 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:51.879346 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q" podUID="5830445f-51ad-4827-b195-39ebf2152864" containerName="registry" containerID="cri-o://050c07704a38de6db719dd7cf427357c3fdc0a1439a1718ac53cc67256c5c394" gracePeriod=30 Apr 16 14:55:52.111035 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.110997 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q" Apr 16 14:55:52.210938 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.210905 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5830445f-51ad-4827-b195-39ebf2152864-ca-trust-extracted\") pod \"5830445f-51ad-4827-b195-39ebf2152864\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " Apr 16 14:55:52.211145 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.210957 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5830445f-51ad-4827-b195-39ebf2152864-image-registry-private-configuration\") pod \"5830445f-51ad-4827-b195-39ebf2152864\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " Apr 16 14:55:52.211145 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.210984 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-registry-tls\") pod \"5830445f-51ad-4827-b195-39ebf2152864\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " Apr 16 14:55:52.211145 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.211040 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xk7b\" (UniqueName: \"kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-kube-api-access-5xk7b\") pod \"5830445f-51ad-4827-b195-39ebf2152864\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " Apr 16 14:55:52.211145 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.211106 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5830445f-51ad-4827-b195-39ebf2152864-registry-certificates\") pod \"5830445f-51ad-4827-b195-39ebf2152864\" (UID: 
\"5830445f-51ad-4827-b195-39ebf2152864\") " Apr 16 14:55:52.211145 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.211139 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-bound-sa-token\") pod \"5830445f-51ad-4827-b195-39ebf2152864\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " Apr 16 14:55:52.211398 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.211176 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5830445f-51ad-4827-b195-39ebf2152864-trusted-ca\") pod \"5830445f-51ad-4827-b195-39ebf2152864\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " Apr 16 14:55:52.211398 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.211275 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5830445f-51ad-4827-b195-39ebf2152864-installation-pull-secrets\") pod \"5830445f-51ad-4827-b195-39ebf2152864\" (UID: \"5830445f-51ad-4827-b195-39ebf2152864\") " Apr 16 14:55:52.211679 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.211649 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5830445f-51ad-4827-b195-39ebf2152864-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5830445f-51ad-4827-b195-39ebf2152864" (UID: "5830445f-51ad-4827-b195-39ebf2152864"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:55:52.211804 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.211699 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5830445f-51ad-4827-b195-39ebf2152864-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5830445f-51ad-4827-b195-39ebf2152864" (UID: "5830445f-51ad-4827-b195-39ebf2152864"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:55:52.213451 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.213418 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-kube-api-access-5xk7b" (OuterVolumeSpecName: "kube-api-access-5xk7b") pod "5830445f-51ad-4827-b195-39ebf2152864" (UID: "5830445f-51ad-4827-b195-39ebf2152864"). InnerVolumeSpecName "kube-api-access-5xk7b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:55:52.213543 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.213519 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5830445f-51ad-4827-b195-39ebf2152864-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "5830445f-51ad-4827-b195-39ebf2152864" (UID: "5830445f-51ad-4827-b195-39ebf2152864"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:55:52.213684 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.213653 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5830445f-51ad-4827-b195-39ebf2152864" (UID: "5830445f-51ad-4827-b195-39ebf2152864"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:55:52.213766 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.213683 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5830445f-51ad-4827-b195-39ebf2152864-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5830445f-51ad-4827-b195-39ebf2152864" (UID: "5830445f-51ad-4827-b195-39ebf2152864"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:55:52.213915 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.213901 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5830445f-51ad-4827-b195-39ebf2152864" (UID: "5830445f-51ad-4827-b195-39ebf2152864"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:55:52.220342 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.220320 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5830445f-51ad-4827-b195-39ebf2152864-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5830445f-51ad-4827-b195-39ebf2152864" (UID: "5830445f-51ad-4827-b195-39ebf2152864"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:55:52.312068 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.312042 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-bound-sa-token\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 14:55:52.312068 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.312065 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5830445f-51ad-4827-b195-39ebf2152864-trusted-ca\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 14:55:52.312269 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.312074 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5830445f-51ad-4827-b195-39ebf2152864-installation-pull-secrets\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 14:55:52.312269 ip-10-0-129-76 kubenswrapper[2576]: I0416 
14:55:52.312086 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5830445f-51ad-4827-b195-39ebf2152864-ca-trust-extracted\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 14:55:52.312269 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.312095 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5830445f-51ad-4827-b195-39ebf2152864-image-registry-private-configuration\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 14:55:52.312269 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.312104 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-registry-tls\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 14:55:52.312269 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.312113 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5xk7b\" (UniqueName: \"kubernetes.io/projected/5830445f-51ad-4827-b195-39ebf2152864-kube-api-access-5xk7b\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 14:55:52.312269 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.312122 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5830445f-51ad-4827-b195-39ebf2152864-registry-certificates\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 14:55:52.950329 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.950291 2576 generic.go:358] "Generic (PLEG): container finished" podID="5830445f-51ad-4827-b195-39ebf2152864" containerID="050c07704a38de6db719dd7cf427357c3fdc0a1439a1718ac53cc67256c5c394" exitCode=0 Apr 16 14:55:52.950873 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.950847 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q" event={"ID":"5830445f-51ad-4827-b195-39ebf2152864","Type":"ContainerDied","Data":"050c07704a38de6db719dd7cf427357c3fdc0a1439a1718ac53cc67256c5c394"} Apr 16 14:55:52.950988 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.950975 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q" event={"ID":"5830445f-51ad-4827-b195-39ebf2152864","Type":"ContainerDied","Data":"f50108335a7182f3e521aaa533e7776660282106e13d7c89671cfeaeecfbf57b"} Apr 16 14:55:52.951118 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.951088 2576 scope.go:117] "RemoveContainer" containerID="050c07704a38de6db719dd7cf427357c3fdc0a1439a1718ac53cc67256c5c394" Apr 16 14:55:52.951273 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.951256 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67bddfdd5b-xjz7q" Apr 16 14:55:52.961707 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.961685 2576 scope.go:117] "RemoveContainer" containerID="050c07704a38de6db719dd7cf427357c3fdc0a1439a1718ac53cc67256c5c394" Apr 16 14:55:52.961985 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:52.961956 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"050c07704a38de6db719dd7cf427357c3fdc0a1439a1718ac53cc67256c5c394\": container with ID starting with 050c07704a38de6db719dd7cf427357c3fdc0a1439a1718ac53cc67256c5c394 not found: ID does not exist" containerID="050c07704a38de6db719dd7cf427357c3fdc0a1439a1718ac53cc67256c5c394" Apr 16 14:55:52.962119 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.961996 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"050c07704a38de6db719dd7cf427357c3fdc0a1439a1718ac53cc67256c5c394"} err="failed to get container status 
\"050c07704a38de6db719dd7cf427357c3fdc0a1439a1718ac53cc67256c5c394\": rpc error: code = NotFound desc = could not find container \"050c07704a38de6db719dd7cf427357c3fdc0a1439a1718ac53cc67256c5c394\": container with ID starting with 050c07704a38de6db719dd7cf427357c3fdc0a1439a1718ac53cc67256c5c394 not found: ID does not exist" Apr 16 14:55:52.986312 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.986285 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-67bddfdd5b-xjz7q"] Apr 16 14:55:52.997696 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:52.997674 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-67bddfdd5b-xjz7q"] Apr 16 14:55:53.249380 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.249294 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-b4xpk"] Apr 16 14:55:53.249651 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.249637 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5830445f-51ad-4827-b195-39ebf2152864" containerName="registry" Apr 16 14:55:53.249721 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.249654 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5830445f-51ad-4827-b195-39ebf2152864" containerName="registry" Apr 16 14:55:53.249773 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.249723 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5830445f-51ad-4827-b195-39ebf2152864" containerName="registry" Apr 16 14:55:53.258384 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.258349 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-b4xpk" Apr 16 14:55:53.260956 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.260729 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 14:55:53.260956 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.260767 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 14:55:53.260956 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.260886 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-dwlb5\"" Apr 16 14:55:53.261288 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.261260 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 14:55:53.266702 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.266421 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-b4xpk"] Apr 16 14:55:53.277471 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.277451 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-7xvvx"] Apr 16 14:55:53.280452 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.280433 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" Apr 16 14:55:53.289643 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.289620 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 14:55:53.289750 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.289648 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 14:55:53.289909 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.289895 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-8hbtn\"" Apr 16 14:55:53.293298 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.293277 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 14:55:53.308133 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.308114 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-7xvvx"] Apr 16 14:55:53.320392 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.320372 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s7n9\" (UniqueName: \"kubernetes.io/projected/460ad184-72b0-4b47-b454-93b01b7a7648-kube-api-access-4s7n9\") pod \"kube-state-metrics-7479c89684-7xvvx\" (UID: \"460ad184-72b0-4b47-b454-93b01b7a7648\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" Apr 16 14:55:53.320484 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.320405 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2cfd5d54-0a18-4111-bb16-ee0e795d6f34-openshift-state-metrics-tls\") pod 
\"openshift-state-metrics-5669946b84-b4xpk\" (UID: \"2cfd5d54-0a18-4111-bb16-ee0e795d6f34\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-b4xpk" Apr 16 14:55:53.320484 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.320436 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/460ad184-72b0-4b47-b454-93b01b7a7648-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-7xvvx\" (UID: \"460ad184-72b0-4b47-b454-93b01b7a7648\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" Apr 16 14:55:53.320593 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.320512 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mchzb\" (UniqueName: \"kubernetes.io/projected/2cfd5d54-0a18-4111-bb16-ee0e795d6f34-kube-api-access-mchzb\") pod \"openshift-state-metrics-5669946b84-b4xpk\" (UID: \"2cfd5d54-0a18-4111-bb16-ee0e795d6f34\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-b4xpk" Apr 16 14:55:53.320593 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.320558 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2cfd5d54-0a18-4111-bb16-ee0e795d6f34-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-b4xpk\" (UID: \"2cfd5d54-0a18-4111-bb16-ee0e795d6f34\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-b4xpk" Apr 16 14:55:53.320696 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.320598 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/460ad184-72b0-4b47-b454-93b01b7a7648-kube-state-metrics-custom-resource-state-configmap\") pod 
\"kube-state-metrics-7479c89684-7xvvx\" (UID: \"460ad184-72b0-4b47-b454-93b01b7a7648\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" Apr 16 14:55:53.320696 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.320654 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/460ad184-72b0-4b47-b454-93b01b7a7648-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-7xvvx\" (UID: \"460ad184-72b0-4b47-b454-93b01b7a7648\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" Apr 16 14:55:53.320786 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.320716 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/460ad184-72b0-4b47-b454-93b01b7a7648-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-7xvvx\" (UID: \"460ad184-72b0-4b47-b454-93b01b7a7648\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" Apr 16 14:55:53.320786 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.320744 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2cfd5d54-0a18-4111-bb16-ee0e795d6f34-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-b4xpk\" (UID: \"2cfd5d54-0a18-4111-bb16-ee0e795d6f34\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-b4xpk" Apr 16 14:55:53.320956 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.320797 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/460ad184-72b0-4b47-b454-93b01b7a7648-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-7xvvx\" (UID: \"460ad184-72b0-4b47-b454-93b01b7a7648\") 
" pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" Apr 16 14:55:53.330319 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.330291 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-qpqgz"] Apr 16 14:55:53.333523 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.333504 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.336264 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.336247 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 14:55:53.336419 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.336260 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mkbd4\"" Apr 16 14:55:53.336651 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.336634 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 14:55:53.339491 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.339461 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 14:55:53.421693 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.421664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/460ad184-72b0-4b47-b454-93b01b7a7648-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-7xvvx\" (UID: \"460ad184-72b0-4b47-b454-93b01b7a7648\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" Apr 16 14:55:53.421847 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.421713 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-metrics-client-ca\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.421847 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.421744 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4s7n9\" (UniqueName: \"kubernetes.io/projected/460ad184-72b0-4b47-b454-93b01b7a7648-kube-api-access-4s7n9\") pod \"kube-state-metrics-7479c89684-7xvvx\" (UID: \"460ad184-72b0-4b47-b454-93b01b7a7648\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" Apr 16 14:55:53.421847 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.421770 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2cfd5d54-0a18-4111-bb16-ee0e795d6f34-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-b4xpk\" (UID: \"2cfd5d54-0a18-4111-bb16-ee0e795d6f34\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-b4xpk" Apr 16 14:55:53.421847 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.421798 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/460ad184-72b0-4b47-b454-93b01b7a7648-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-7xvvx\" (UID: \"460ad184-72b0-4b47-b454-93b01b7a7648\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" Apr 16 14:55:53.421847 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.421832 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mchzb\" (UniqueName: \"kubernetes.io/projected/2cfd5d54-0a18-4111-bb16-ee0e795d6f34-kube-api-access-mchzb\") pod \"openshift-state-metrics-5669946b84-b4xpk\" (UID: \"2cfd5d54-0a18-4111-bb16-ee0e795d6f34\") " 
pod="openshift-monitoring/openshift-state-metrics-5669946b84-b4xpk" Apr 16 14:55:53.422155 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.421858 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-node-exporter-wtmp\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.422155 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.421889 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2cfd5d54-0a18-4111-bb16-ee0e795d6f34-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-b4xpk\" (UID: \"2cfd5d54-0a18-4111-bb16-ee0e795d6f34\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-b4xpk" Apr 16 14:55:53.422155 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:53.421909 2576 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 16 14:55:53.422155 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.421922 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-node-exporter-textfile\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.422155 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.421954 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/460ad184-72b0-4b47-b454-93b01b7a7648-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-7xvvx\" (UID: 
\"460ad184-72b0-4b47-b454-93b01b7a7648\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" Apr 16 14:55:53.422155 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:53.421988 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2cfd5d54-0a18-4111-bb16-ee0e795d6f34-openshift-state-metrics-tls podName:2cfd5d54-0a18-4111-bb16-ee0e795d6f34 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:53.921968499 +0000 UTC m=+188.341103468 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/2cfd5d54-0a18-4111-bb16-ee0e795d6f34-openshift-state-metrics-tls") pod "openshift-state-metrics-5669946b84-b4xpk" (UID: "2cfd5d54-0a18-4111-bb16-ee0e795d6f34") : secret "openshift-state-metrics-tls" not found Apr 16 14:55:53.422155 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.422059 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6z65\" (UniqueName: \"kubernetes.io/projected/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-kube-api-access-c6z65\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.422155 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.422111 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/460ad184-72b0-4b47-b454-93b01b7a7648-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-7xvvx\" (UID: \"460ad184-72b0-4b47-b454-93b01b7a7648\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" Apr 16 14:55:53.422558 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.422158 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-sys\") pod 
\"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.422558 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.422199 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-node-exporter-tls\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.422558 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.422230 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/460ad184-72b0-4b47-b454-93b01b7a7648-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-7xvvx\" (UID: \"460ad184-72b0-4b47-b454-93b01b7a7648\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" Apr 16 14:55:53.422558 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.422272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2cfd5d54-0a18-4111-bb16-ee0e795d6f34-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-b4xpk\" (UID: \"2cfd5d54-0a18-4111-bb16-ee0e795d6f34\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-b4xpk" Apr 16 14:55:53.422558 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.422303 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.422558 ip-10-0-129-76 
kubenswrapper[2576]: I0416 14:55:53.422353 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-root\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.422558 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.422389 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-node-exporter-accelerators-collector-config\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.422558 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.422474 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/460ad184-72b0-4b47-b454-93b01b7a7648-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-7xvvx\" (UID: \"460ad184-72b0-4b47-b454-93b01b7a7648\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" Apr 16 14:55:53.422961 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.422589 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/460ad184-72b0-4b47-b454-93b01b7a7648-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-7xvvx\" (UID: \"460ad184-72b0-4b47-b454-93b01b7a7648\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" Apr 16 14:55:53.422961 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:53.422688 2576 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 16 14:55:53.422961 
ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:53.422733 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/460ad184-72b0-4b47-b454-93b01b7a7648-kube-state-metrics-tls podName:460ad184-72b0-4b47-b454-93b01b7a7648 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:53.922717455 +0000 UTC m=+188.341852414 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/460ad184-72b0-4b47-b454-93b01b7a7648-kube-state-metrics-tls") pod "kube-state-metrics-7479c89684-7xvvx" (UID: "460ad184-72b0-4b47-b454-93b01b7a7648") : secret "kube-state-metrics-tls" not found Apr 16 14:55:53.422961 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.422845 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/460ad184-72b0-4b47-b454-93b01b7a7648-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-7xvvx\" (UID: \"460ad184-72b0-4b47-b454-93b01b7a7648\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" Apr 16 14:55:53.423194 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.423139 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2cfd5d54-0a18-4111-bb16-ee0e795d6f34-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-b4xpk\" (UID: \"2cfd5d54-0a18-4111-bb16-ee0e795d6f34\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-b4xpk" Apr 16 14:55:53.425334 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.425310 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2cfd5d54-0a18-4111-bb16-ee0e795d6f34-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-b4xpk\" (UID: \"2cfd5d54-0a18-4111-bb16-ee0e795d6f34\") " 
pod="openshift-monitoring/openshift-state-metrics-5669946b84-b4xpk" Apr 16 14:55:53.425428 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.425387 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/460ad184-72b0-4b47-b454-93b01b7a7648-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-7xvvx\" (UID: \"460ad184-72b0-4b47-b454-93b01b7a7648\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" Apr 16 14:55:53.430468 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.430446 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mchzb\" (UniqueName: \"kubernetes.io/projected/2cfd5d54-0a18-4111-bb16-ee0e795d6f34-kube-api-access-mchzb\") pod \"openshift-state-metrics-5669946b84-b4xpk\" (UID: \"2cfd5d54-0a18-4111-bb16-ee0e795d6f34\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-b4xpk" Apr 16 14:55:53.431843 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.431822 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s7n9\" (UniqueName: \"kubernetes.io/projected/460ad184-72b0-4b47-b454-93b01b7a7648-kube-api-access-4s7n9\") pod \"kube-state-metrics-7479c89684-7xvvx\" (UID: \"460ad184-72b0-4b47-b454-93b01b7a7648\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" Apr 16 14:55:53.522987 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.522910 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-metrics-client-ca\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.522987 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.522983 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-node-exporter-wtmp\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.523162 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.523014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-node-exporter-textfile\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.523162 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.523073 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6z65\" (UniqueName: \"kubernetes.io/projected/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-kube-api-access-c6z65\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.523162 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.523109 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-sys\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.523162 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.523134 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-node-exporter-tls\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.523305 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.523172 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.523305 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.523178 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-node-exporter-wtmp\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.523438 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.523418 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-node-exporter-textfile\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.523499 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.523483 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-sys\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.523541 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:53.523504 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 14:55:53.523541 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.523520 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-root\") pod \"node-exporter-qpqgz\" (UID: 
\"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.523637 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.523536 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-metrics-client-ca\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.523637 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.523554 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-node-exporter-accelerators-collector-config\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.523637 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:53.523564 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-node-exporter-tls podName:528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd nodeName:}" failed. No retries permitted until 2026-04-16 14:55:54.0235465 +0000 UTC m=+188.442681466 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-node-exporter-tls") pod "node-exporter-qpqgz" (UID: "528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd") : secret "node-exporter-tls" not found Apr 16 14:55:53.523637 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.523598 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-root\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.524481 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.524457 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-node-exporter-accelerators-collector-config\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.525508 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.525490 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.535090 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.535071 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6z65\" (UniqueName: \"kubernetes.io/projected/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-kube-api-access-c6z65\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:53.927177 ip-10-0-129-76 kubenswrapper[2576]: I0416 
14:55:53.927142 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/460ad184-72b0-4b47-b454-93b01b7a7648-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-7xvvx\" (UID: \"460ad184-72b0-4b47-b454-93b01b7a7648\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" Apr 16 14:55:53.927335 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.927204 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2cfd5d54-0a18-4111-bb16-ee0e795d6f34-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-b4xpk\" (UID: \"2cfd5d54-0a18-4111-bb16-ee0e795d6f34\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-b4xpk" Apr 16 14:55:53.927335 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:53.927287 2576 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 16 14:55:53.927409 ip-10-0-129-76 kubenswrapper[2576]: E0416 14:55:53.927358 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/460ad184-72b0-4b47-b454-93b01b7a7648-kube-state-metrics-tls podName:460ad184-72b0-4b47-b454-93b01b7a7648 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:54.927341546 +0000 UTC m=+189.346476519 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/460ad184-72b0-4b47-b454-93b01b7a7648-kube-state-metrics-tls") pod "kube-state-metrics-7479c89684-7xvvx" (UID: "460ad184-72b0-4b47-b454-93b01b7a7648") : secret "kube-state-metrics-tls" not found Apr 16 14:55:53.929597 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:53.929570 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2cfd5d54-0a18-4111-bb16-ee0e795d6f34-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-b4xpk\" (UID: \"2cfd5d54-0a18-4111-bb16-ee0e795d6f34\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-b4xpk" Apr 16 14:55:54.028183 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.028142 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-node-exporter-tls\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:54.030405 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.030383 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd-node-exporter-tls\") pod \"node-exporter-qpqgz\" (UID: \"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd\") " pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:54.170293 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.170262 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-b4xpk" Apr 16 14:55:54.242485 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.242453 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-qpqgz" Apr 16 14:55:54.251612 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:55:54.251554 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod528cc92a_aa51_4bd0_9c9c_c21ddf5d16bd.slice/crio-f5a0a99ab3e03cb55c0e6e91492cbd515fd3bba5aca72b2658913b416f870851 WatchSource:0}: Error finding container f5a0a99ab3e03cb55c0e6e91492cbd515fd3bba5aca72b2658913b416f870851: Status 404 returned error can't find the container with id f5a0a99ab3e03cb55c0e6e91492cbd515fd3bba5aca72b2658913b416f870851 Apr 16 14:55:54.299887 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.299861 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-b4xpk"] Apr 16 14:55:54.304059 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:55:54.304002 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cfd5d54_0a18_4111_bb16_ee0e795d6f34.slice/crio-f5f306e3d605851bd2c420a8144201c9998ee7f6fbce50eac12de2de013cde77 WatchSource:0}: Error finding container f5f306e3d605851bd2c420a8144201c9998ee7f6fbce50eac12de2de013cde77: Status 404 returned error can't find the container with id f5f306e3d605851bd2c420a8144201c9998ee7f6fbce50eac12de2de013cde77 Apr 16 14:55:54.306764 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.306741 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5830445f-51ad-4827-b195-39ebf2152864" path="/var/lib/kubelet/pods/5830445f-51ad-4827-b195-39ebf2152864/volumes" Apr 16 14:55:54.339782 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.339759 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:55:54.344965 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.344946 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.347503 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.347458 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 14:55:54.347503 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.347481 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 14:55:54.347672 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.347463 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 14:55:54.348095 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.347936 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 14:55:54.348095 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.347956 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 14:55:54.348095 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.347962 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 14:55:54.348095 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.348013 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 14:55:54.348095 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.348039 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-fr6wd\"" Apr 16 14:55:54.348342 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.348178 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 14:55:54.348429 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.348413 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 14:55:54.358045 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.358009 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:55:54.431367 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.431252 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/27e1b5f9-340a-4274-9148-50c65175772e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.431367 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.431306 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/27e1b5f9-340a-4274-9148-50c65175772e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.431367 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.431332 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/27e1b5f9-340a-4274-9148-50c65175772e-web-config\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.431367 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.431361 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-9vrxf\" (UniqueName: \"kubernetes.io/projected/27e1b5f9-340a-4274-9148-50c65175772e-kube-api-access-9vrxf\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.431704 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.431390 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/27e1b5f9-340a-4274-9148-50c65175772e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.431704 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.431430 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/27e1b5f9-340a-4274-9148-50c65175772e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.431704 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.431488 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/27e1b5f9-340a-4274-9148-50c65175772e-config-volume\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.431704 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.431523 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/27e1b5f9-340a-4274-9148-50c65175772e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.431704 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.431564 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/27e1b5f9-340a-4274-9148-50c65175772e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.431704 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.431684 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/27e1b5f9-340a-4274-9148-50c65175772e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.431997 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.431790 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/27e1b5f9-340a-4274-9148-50c65175772e-config-out\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.431997 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.431835 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/27e1b5f9-340a-4274-9148-50c65175772e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.431997 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.431859 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/27e1b5f9-340a-4274-9148-50c65175772e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.532276 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.532240 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/27e1b5f9-340a-4274-9148-50c65175772e-config-out\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.532276 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.532277 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/27e1b5f9-340a-4274-9148-50c65175772e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.532519 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.532299 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27e1b5f9-340a-4274-9148-50c65175772e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.532519 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.532467 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/27e1b5f9-340a-4274-9148-50c65175772e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.532519 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.532502 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/27e1b5f9-340a-4274-9148-50c65175772e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.532675 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.532526 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/27e1b5f9-340a-4274-9148-50c65175772e-web-config\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.532675 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.532632 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vrxf\" (UniqueName: \"kubernetes.io/projected/27e1b5f9-340a-4274-9148-50c65175772e-kube-api-access-9vrxf\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.532773 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.532677 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/27e1b5f9-340a-4274-9148-50c65175772e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.532773 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.532706 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/27e1b5f9-340a-4274-9148-50c65175772e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" 
Apr 16 14:55:54.532773 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.532733 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/27e1b5f9-340a-4274-9148-50c65175772e-config-volume\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.532773 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.532765 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/27e1b5f9-340a-4274-9148-50c65175772e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.532943 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.532789 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/27e1b5f9-340a-4274-9148-50c65175772e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.532943 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.532848 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/27e1b5f9-340a-4274-9148-50c65175772e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.533727 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.533111 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/27e1b5f9-340a-4274-9148-50c65175772e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.533727 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.533275 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27e1b5f9-340a-4274-9148-50c65175772e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.533727 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.533686 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/27e1b5f9-340a-4274-9148-50c65175772e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.535430 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.535387 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/27e1b5f9-340a-4274-9148-50c65175772e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.535738 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.535705 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/27e1b5f9-340a-4274-9148-50c65175772e-config-out\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.536318 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.536294 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/27e1b5f9-340a-4274-9148-50c65175772e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.536488 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.536470 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/27e1b5f9-340a-4274-9148-50c65175772e-config-volume\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.536882 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.536842 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/27e1b5f9-340a-4274-9148-50c65175772e-web-config\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.536980 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.536885 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/27e1b5f9-340a-4274-9148-50c65175772e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.536980 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.536909 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/27e1b5f9-340a-4274-9148-50c65175772e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.537779 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.537755 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/27e1b5f9-340a-4274-9148-50c65175772e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.537969 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.537955 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/27e1b5f9-340a-4274-9148-50c65175772e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.541248 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.541231 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vrxf\" (UniqueName: \"kubernetes.io/projected/27e1b5f9-340a-4274-9148-50c65175772e-kube-api-access-9vrxf\") pod \"alertmanager-main-0\" (UID: \"27e1b5f9-340a-4274-9148-50c65175772e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.654788 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.654752 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:54.804734 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.804691 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:55:54.810458 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:55:54.810427 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27e1b5f9_340a_4274_9148_50c65175772e.slice/crio-5bfbc5f0ac87d23516a7c5e88902196066bcf15b053f57a32c047c3b724c2c75 WatchSource:0}: Error finding container 5bfbc5f0ac87d23516a7c5e88902196066bcf15b053f57a32c047c3b724c2c75: Status 404 returned error can't find the container with id 5bfbc5f0ac87d23516a7c5e88902196066bcf15b053f57a32c047c3b724c2c75 Apr 16 14:55:54.936441 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.936356 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/460ad184-72b0-4b47-b454-93b01b7a7648-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-7xvvx\" (UID: \"460ad184-72b0-4b47-b454-93b01b7a7648\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" Apr 16 14:55:54.939208 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.939174 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/460ad184-72b0-4b47-b454-93b01b7a7648-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-7xvvx\" (UID: \"460ad184-72b0-4b47-b454-93b01b7a7648\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" Apr 16 14:55:54.964986 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.964909 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"27e1b5f9-340a-4274-9148-50c65175772e","Type":"ContainerStarted","Data":"5bfbc5f0ac87d23516a7c5e88902196066bcf15b053f57a32c047c3b724c2c75"} Apr 16 14:55:54.966436 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.966407 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qpqgz" event={"ID":"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd","Type":"ContainerStarted","Data":"f5a0a99ab3e03cb55c0e6e91492cbd515fd3bba5aca72b2658913b416f870851"} Apr 16 14:55:54.968365 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.968337 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-b4xpk" event={"ID":"2cfd5d54-0a18-4111-bb16-ee0e795d6f34","Type":"ContainerStarted","Data":"fff3ae177e157ccc2a7dde033675f76d0818c9891c41b66efdac7eaa59eb6af0"} Apr 16 14:55:54.968541 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.968516 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-b4xpk" event={"ID":"2cfd5d54-0a18-4111-bb16-ee0e795d6f34","Type":"ContainerStarted","Data":"a9e0ada04f894ae21e783a5c73edfdc59cb1e45c9e65c197291b007e09015169"} Apr 16 14:55:54.968541 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:54.968542 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-b4xpk" event={"ID":"2cfd5d54-0a18-4111-bb16-ee0e795d6f34","Type":"ContainerStarted","Data":"f5f306e3d605851bd2c420a8144201c9998ee7f6fbce50eac12de2de013cde77"} Apr 16 14:55:55.090861 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:55.090823 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx"
Apr 16 14:55:55.415551 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:55.415520 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-7xvvx"]
Apr 16 14:55:55.972498 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:55.972465 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-b4xpk" event={"ID":"2cfd5d54-0a18-4111-bb16-ee0e795d6f34","Type":"ContainerStarted","Data":"adf797457dd15bc4044b03a6e55b1e6b4e0ca9f45e7e7a9e4a69961087d44b76"}
Apr 16 14:55:55.973620 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:55.973580 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" event={"ID":"460ad184-72b0-4b47-b454-93b01b7a7648","Type":"ContainerStarted","Data":"3e32e2cef12096b70795dc1077609779884796bdea49eb9a323ce094741c5e5f"}
Apr 16 14:55:55.975484 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:55.975462 2576 generic.go:358] "Generic (PLEG): container finished" podID="528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd" containerID="4add6062ac31b3082917ee97167c0bed7a8cf079b007864fc2637b095edb8348" exitCode=0
Apr 16 14:55:55.975572 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:55.975492 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qpqgz" event={"ID":"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd","Type":"ContainerDied","Data":"4add6062ac31b3082917ee97167c0bed7a8cf079b007864fc2637b095edb8348"}
Apr 16 14:55:55.991970 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:55.991922 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-b4xpk" podStartSLOduration=2.120124869 podStartE2EDuration="2.991905525s" podCreationTimestamp="2026-04-16 14:55:53 +0000 UTC" firstStartedPulling="2026-04-16 14:55:54.420833214 +0000 UTC m=+188.839968170" lastFinishedPulling="2026-04-16 14:55:55.292613862 +0000 UTC m=+189.711748826" observedRunningTime="2026-04-16 14:55:55.990446356 +0000 UTC m=+190.409581335" watchObservedRunningTime="2026-04-16 14:55:55.991905525 +0000 UTC m=+190.411040501"
Apr 16 14:55:56.243935 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.243821 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"]
Apr 16 14:55:56.248069 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.248039 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.250804 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.250781 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-nx9fb\""
Apr 16 14:55:56.250929 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.250874 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 16 14:55:56.251055 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.251013 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 16 14:55:56.251146 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.251129 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 16 14:55:56.251530 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.251246 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-15t1nb7n1ddsr\""
Apr 16 14:55:56.251530 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.251297 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 16 14:55:56.251530 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.251368 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 16 14:55:56.258849 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.258826 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"]
Apr 16 14:55:56.350170 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.350123 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/64e3ae53-70bf-4635-8294-d5d8634be750-secret-thanos-querier-tls\") pod \"thanos-querier-6bf54c7b45-tvjxb\" (UID: \"64e3ae53-70bf-4635-8294-d5d8634be750\") " pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.350170 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.350171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/64e3ae53-70bf-4635-8294-d5d8634be750-metrics-client-ca\") pod \"thanos-querier-6bf54c7b45-tvjxb\" (UID: \"64e3ae53-70bf-4635-8294-d5d8634be750\") " pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.350390 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.350210 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgc67\" (UniqueName: \"kubernetes.io/projected/64e3ae53-70bf-4635-8294-d5d8634be750-kube-api-access-lgc67\") pod \"thanos-querier-6bf54c7b45-tvjxb\" (UID: \"64e3ae53-70bf-4635-8294-d5d8634be750\") " pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.350390 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.350276 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/64e3ae53-70bf-4635-8294-d5d8634be750-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6bf54c7b45-tvjxb\" (UID: \"64e3ae53-70bf-4635-8294-d5d8634be750\") " pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.350390 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.350344 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/64e3ae53-70bf-4635-8294-d5d8634be750-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6bf54c7b45-tvjxb\" (UID: \"64e3ae53-70bf-4635-8294-d5d8634be750\") " pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.350559 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.350397 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/64e3ae53-70bf-4635-8294-d5d8634be750-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6bf54c7b45-tvjxb\" (UID: \"64e3ae53-70bf-4635-8294-d5d8634be750\") " pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.350559 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.350474 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/64e3ae53-70bf-4635-8294-d5d8634be750-secret-grpc-tls\") pod \"thanos-querier-6bf54c7b45-tvjxb\" (UID: \"64e3ae53-70bf-4635-8294-d5d8634be750\") " pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.350559 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.350512 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/64e3ae53-70bf-4635-8294-d5d8634be750-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6bf54c7b45-tvjxb\" (UID: \"64e3ae53-70bf-4635-8294-d5d8634be750\") " pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.451383 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.451349 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/64e3ae53-70bf-4635-8294-d5d8634be750-secret-thanos-querier-tls\") pod \"thanos-querier-6bf54c7b45-tvjxb\" (UID: \"64e3ae53-70bf-4635-8294-d5d8634be750\") " pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.451383 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.451397 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/64e3ae53-70bf-4635-8294-d5d8634be750-metrics-client-ca\") pod \"thanos-querier-6bf54c7b45-tvjxb\" (UID: \"64e3ae53-70bf-4635-8294-d5d8634be750\") " pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.451683 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.451431 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgc67\" (UniqueName: \"kubernetes.io/projected/64e3ae53-70bf-4635-8294-d5d8634be750-kube-api-access-lgc67\") pod \"thanos-querier-6bf54c7b45-tvjxb\" (UID: \"64e3ae53-70bf-4635-8294-d5d8634be750\") " pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.451683 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.451476 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/64e3ae53-70bf-4635-8294-d5d8634be750-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6bf54c7b45-tvjxb\" (UID: \"64e3ae53-70bf-4635-8294-d5d8634be750\") " pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.451683 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.451511 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/64e3ae53-70bf-4635-8294-d5d8634be750-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6bf54c7b45-tvjxb\" (UID: \"64e3ae53-70bf-4635-8294-d5d8634be750\") " pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.451683 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.451558 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/64e3ae53-70bf-4635-8294-d5d8634be750-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6bf54c7b45-tvjxb\" (UID: \"64e3ae53-70bf-4635-8294-d5d8634be750\") " pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.451683 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.451632 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/64e3ae53-70bf-4635-8294-d5d8634be750-secret-grpc-tls\") pod \"thanos-querier-6bf54c7b45-tvjxb\" (UID: \"64e3ae53-70bf-4635-8294-d5d8634be750\") " pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.451683 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.451667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/64e3ae53-70bf-4635-8294-d5d8634be750-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6bf54c7b45-tvjxb\" (UID: \"64e3ae53-70bf-4635-8294-d5d8634be750\") " pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.452525 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.452473 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/64e3ae53-70bf-4635-8294-d5d8634be750-metrics-client-ca\") pod \"thanos-querier-6bf54c7b45-tvjxb\" (UID: \"64e3ae53-70bf-4635-8294-d5d8634be750\") " pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.454545 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.454515 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/64e3ae53-70bf-4635-8294-d5d8634be750-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6bf54c7b45-tvjxb\" (UID: \"64e3ae53-70bf-4635-8294-d5d8634be750\") " pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.455170 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.454873 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/64e3ae53-70bf-4635-8294-d5d8634be750-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6bf54c7b45-tvjxb\" (UID: \"64e3ae53-70bf-4635-8294-d5d8634be750\") " pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.455428 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.455388 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/64e3ae53-70bf-4635-8294-d5d8634be750-secret-grpc-tls\") pod \"thanos-querier-6bf54c7b45-tvjxb\" (UID: \"64e3ae53-70bf-4635-8294-d5d8634be750\") " pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.455734 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.455630 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/64e3ae53-70bf-4635-8294-d5d8634be750-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6bf54c7b45-tvjxb\" (UID: \"64e3ae53-70bf-4635-8294-d5d8634be750\") " pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.456093 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.456071 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/64e3ae53-70bf-4635-8294-d5d8634be750-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6bf54c7b45-tvjxb\" (UID: \"64e3ae53-70bf-4635-8294-d5d8634be750\") " pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.457861 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.457838 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/64e3ae53-70bf-4635-8294-d5d8634be750-secret-thanos-querier-tls\") pod \"thanos-querier-6bf54c7b45-tvjxb\" (UID: \"64e3ae53-70bf-4635-8294-d5d8634be750\") " pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.459983 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.459962 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgc67\" (UniqueName: \"kubernetes.io/projected/64e3ae53-70bf-4635-8294-d5d8634be750-kube-api-access-lgc67\") pod \"thanos-querier-6bf54c7b45-tvjxb\" (UID: \"64e3ae53-70bf-4635-8294-d5d8634be750\") " pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.561130 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.561044 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"
Apr 16 14:55:56.858346 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.858320 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb"]
Apr 16 14:55:56.860817 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:55:56.860796 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64e3ae53_70bf_4635_8294_d5d8634be750.slice/crio-4f22a12296066631394755a579089cb0371971e0bdda666133c215cecd069160 WatchSource:0}: Error finding container 4f22a12296066631394755a579089cb0371971e0bdda666133c215cecd069160: Status 404 returned error can't find the container with id 4f22a12296066631394755a579089cb0371971e0bdda666133c215cecd069160
Apr 16 14:55:56.979760 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.979554 2576 generic.go:358] "Generic (PLEG): container finished" podID="27e1b5f9-340a-4274-9148-50c65175772e" containerID="406d0ecc7d32aaf7435f477daa5726de728c4302b0e83b801feb6cac80e18adf" exitCode=0
Apr 16 14:55:56.979760 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.979659 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"27e1b5f9-340a-4274-9148-50c65175772e","Type":"ContainerDied","Data":"406d0ecc7d32aaf7435f477daa5726de728c4302b0e83b801feb6cac80e18adf"}
Apr 16 14:55:56.983315 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.983290 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qpqgz" event={"ID":"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd","Type":"ContainerStarted","Data":"b5a622b2145ff79ba23ba4a716795076860da9e8b81f5c49a958d6b07828a71d"}
Apr 16 14:55:56.983422 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.983335 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qpqgz" event={"ID":"528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd","Type":"ContainerStarted","Data":"770f903793406ca5aa19b8fefbd2eab207d0fc6085763239b1b0bad75bc89baf"}
Apr 16 14:55:56.984768 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.984751 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" event={"ID":"460ad184-72b0-4b47-b454-93b01b7a7648","Type":"ContainerStarted","Data":"2bb9a964b6afef42baebea65b46e4fe8383d518bfcdbd4a6096eeb39f71525cc"}
Apr 16 14:55:56.984845 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.984780 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" event={"ID":"460ad184-72b0-4b47-b454-93b01b7a7648","Type":"ContainerStarted","Data":"1796e0e3088d4034d5ddc6e63c4a29b1a62c2bfdce048a5acf9663d1b067c7d1"}
Apr 16 14:55:56.985894 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:56.985848 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb" event={"ID":"64e3ae53-70bf-4635-8294-d5d8634be750","Type":"ContainerStarted","Data":"4f22a12296066631394755a579089cb0371971e0bdda666133c215cecd069160"}
Apr 16 14:55:57.051576 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.051523 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-qpqgz" podStartSLOduration=3.015335202 podStartE2EDuration="4.051506632s" podCreationTimestamp="2026-04-16 14:55:53 +0000 UTC" firstStartedPulling="2026-04-16 14:55:54.254177671 +0000 UTC m=+188.673312636" lastFinishedPulling="2026-04-16 14:55:55.290349099 +0000 UTC m=+189.709484066" observedRunningTime="2026-04-16 14:55:57.049909276 +0000 UTC m=+191.469044269" watchObservedRunningTime="2026-04-16 14:55:57.051506632 +0000 UTC m=+191.470641873"
Apr 16 14:55:57.670764 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.670730 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6b4dc95984-98mbp"]
Apr 16 14:55:57.674203 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.674181 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp"
Apr 16 14:55:57.676605 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.676582 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 16 14:55:57.677709 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.677686 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-kqzsr\""
Apr 16 14:55:57.677821 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.677716 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 14:55:57.677821 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.677717 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 16 14:55:57.677821 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.677802 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-62bamau3hhp8t\""
Apr 16 14:55:57.678066 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.678046 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 16 14:55:57.686340 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.686308 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6b4dc95984-98mbp"]
Apr 16 14:55:57.763231 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.763199 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcxbm\" (UniqueName: \"kubernetes.io/projected/4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c-kube-api-access-tcxbm\") pod \"metrics-server-6b4dc95984-98mbp\" (UID: \"4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c\") " pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp"
Apr 16 14:55:57.763398 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.763249 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c-secret-metrics-server-tls\") pod \"metrics-server-6b4dc95984-98mbp\" (UID: \"4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c\") " pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp"
Apr 16 14:55:57.763398 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.763280 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c-metrics-server-audit-profiles\") pod \"metrics-server-6b4dc95984-98mbp\" (UID: \"4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c\") " pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp"
Apr 16 14:55:57.763398 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.763326 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c-audit-log\") pod \"metrics-server-6b4dc95984-98mbp\" (UID: \"4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c\") " pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp"
Apr 16 14:55:57.763398 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.763357 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6b4dc95984-98mbp\" (UID: \"4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c\") " pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp"
Apr 16 14:55:57.763398 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.763392 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c-secret-metrics-server-client-certs\") pod \"metrics-server-6b4dc95984-98mbp\" (UID: \"4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c\") " pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp"
Apr 16 14:55:57.763655 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.763455 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c-client-ca-bundle\") pod \"metrics-server-6b4dc95984-98mbp\" (UID: \"4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c\") " pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp"
Apr 16 14:55:57.864481 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.864437 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c-client-ca-bundle\") pod \"metrics-server-6b4dc95984-98mbp\" (UID: \"4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c\") " pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp"
Apr 16 14:55:57.864642 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.864506 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tcxbm\" (UniqueName: \"kubernetes.io/projected/4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c-kube-api-access-tcxbm\") pod \"metrics-server-6b4dc95984-98mbp\" (UID: \"4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c\") " pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp"
Apr 16 14:55:57.864642 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.864546 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c-secret-metrics-server-tls\") pod \"metrics-server-6b4dc95984-98mbp\" (UID: \"4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c\") " pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp"
Apr 16 14:55:57.864642 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.864578 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c-metrics-server-audit-profiles\") pod \"metrics-server-6b4dc95984-98mbp\" (UID: \"4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c\") " pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp"
Apr 16 14:55:57.864642 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.864630 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c-audit-log\") pod \"metrics-server-6b4dc95984-98mbp\" (UID: \"4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c\") " pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp"
Apr 16 14:55:57.864871 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.864835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6b4dc95984-98mbp\" (UID: \"4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c\") " pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp"
Apr 16 14:55:57.864940 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.864900 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c-secret-metrics-server-client-certs\") pod \"metrics-server-6b4dc95984-98mbp\" (UID: \"4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c\") " pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp"
Apr 16 14:55:57.865145 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.865122 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c-audit-log\") pod \"metrics-server-6b4dc95984-98mbp\" (UID: \"4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c\") " pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp"
Apr 16 14:55:57.865983 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.865958 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c-metrics-server-audit-profiles\") pod \"metrics-server-6b4dc95984-98mbp\" (UID: \"4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c\") " pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp"
Apr 16 14:55:57.866284 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.866259 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6b4dc95984-98mbp\" (UID: \"4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c\") " pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp"
Apr 16 14:55:57.867312 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.867293 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c-client-ca-bundle\") pod \"metrics-server-6b4dc95984-98mbp\" (UID: \"4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c\") " pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp"
Apr 16 14:55:57.867496 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.867473 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c-secret-metrics-server-tls\") pod \"metrics-server-6b4dc95984-98mbp\" (UID: \"4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c\") " pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp"
Apr 16 14:55:57.868603 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.868579 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c-secret-metrics-server-client-certs\") pod \"metrics-server-6b4dc95984-98mbp\" (UID: \"4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c\") " pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp"
Apr 16 14:55:57.873651 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.873629 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcxbm\" (UniqueName: \"kubernetes.io/projected/4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c-kube-api-access-tcxbm\") pod \"metrics-server-6b4dc95984-98mbp\" (UID: \"4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c\") " pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp"
Apr 16 14:55:57.986459 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.986365 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp"
Apr 16 14:55:57.993386 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:57.993352 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" event={"ID":"460ad184-72b0-4b47-b454-93b01b7a7648","Type":"ContainerStarted","Data":"cac4bb6e06af86dbed9ced86186612202a1918537f3ee114e8d67ca47a5d9a4d"}
Apr 16 14:55:58.013918 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:58.013880 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-7xvvx" podStartSLOduration=3.6496464189999998 podStartE2EDuration="5.013863195s" podCreationTimestamp="2026-04-16 14:55:53 +0000 UTC" firstStartedPulling="2026-04-16 14:55:55.422440076 +0000 UTC m=+189.841575029" lastFinishedPulling="2026-04-16 14:55:56.786656853 +0000 UTC m=+191.205791805" observedRunningTime="2026-04-16 14:55:58.012979586 +0000 UTC m=+192.432114562" watchObservedRunningTime="2026-04-16 14:55:58.013863195 +0000 UTC m=+192.432998172"
Apr 16 14:55:58.031201 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:58.031175 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-vm2nt"]
Apr 16 14:55:58.045499 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:58.045472 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-vm2nt"]
Apr 16 14:55:58.045610 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:58.045589 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-vm2nt"
Apr 16 14:55:58.048810 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:58.048781 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 14:55:58.049110 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:58.049056 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-tzm57\""
Apr 16 14:55:58.167709 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:58.167678 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b40e3801-d782-46b4-9a70-170dbfac4af1-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-vm2nt\" (UID: \"b40e3801-d782-46b4-9a70-170dbfac4af1\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-vm2nt"
Apr 16 14:55:58.268684 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:58.268597 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b40e3801-d782-46b4-9a70-170dbfac4af1-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-vm2nt\" (UID: \"b40e3801-d782-46b4-9a70-170dbfac4af1\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-vm2nt"
Apr 16 14:55:58.271360 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:58.271334 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b40e3801-d782-46b4-9a70-170dbfac4af1-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-vm2nt\" (UID: \"b40e3801-d782-46b4-9a70-170dbfac4af1\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-vm2nt"
Apr 16 14:55:58.359747 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:58.359715 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-vm2nt"
Apr 16 14:55:59.011735 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:59.006966 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb" event={"ID":"64e3ae53-70bf-4635-8294-d5d8634be750","Type":"ContainerStarted","Data":"ad4f09c56e50d53cdd61b367616881f88f96d3f82ca9e97637415784820a33d2"}
Apr 16 14:55:59.023232 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:59.022294 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"27e1b5f9-340a-4274-9148-50c65175772e","Type":"ContainerStarted","Data":"ecd80eeeda10b7a42fb4abe1434421f8d8e8b02388beee6ebc07d55f76d819db"}
Apr 16 14:55:59.047872 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:59.047846 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-vm2nt"]
Apr 16 14:55:59.050986 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:55:59.050963 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb40e3801_d782_46b4_9a70_170dbfac4af1.slice/crio-3189fcbae46d317647d196bc392682497b4deb14361774e0f488d1e6bb292280 WatchSource:0}: Error finding container 3189fcbae46d317647d196bc392682497b4deb14361774e0f488d1e6bb292280: Status 404 returned error can't find the container with id 3189fcbae46d317647d196bc392682497b4deb14361774e0f488d1e6bb292280
Apr 16 14:55:59.065069 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:55:59.065043 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6b4dc95984-98mbp"]
Apr 16 14:55:59.067723 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:55:59.067704 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a4f7a19_49a1_4031_ae9b_2947eb0b2a2c.slice/crio-cc7235f12129ef76986456efbb97dff550540a8abed328862560217ffec54221 WatchSource:0}: Error finding container cc7235f12129ef76986456efbb97dff550540a8abed328862560217ffec54221: Status 404 returned error can't find the container with id cc7235f12129ef76986456efbb97dff550540a8abed328862560217ffec54221
Apr 16 14:56:00.028311 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:00.028274 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"27e1b5f9-340a-4274-9148-50c65175772e","Type":"ContainerStarted","Data":"8039e93e211a23df90473f2239f7e59f0f8a705a7a9bdf90ab302f9443e9caf9"}
Apr 16 14:56:00.028795 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:00.028320 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"27e1b5f9-340a-4274-9148-50c65175772e","Type":"ContainerStarted","Data":"04f040bc2f3d3110348da20ed36e1d64d07ea41ed62d6247900308c1f0733105"}
Apr 16 14:56:00.028795 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:00.028337 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"27e1b5f9-340a-4274-9148-50c65175772e","Type":"ContainerStarted","Data":"60b5b09acdd0eddd24138d0a1455efd1ab7c4c8bd26fa5de56843f2eba495933"}
Apr 16 14:56:00.028795 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:00.028353 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"27e1b5f9-340a-4274-9148-50c65175772e","Type":"ContainerStarted","Data":"f8ae6df271b74a7a62325342961953a267cab9340f122e56562b3019f0bcb1f5"}
Apr 16 14:56:00.029629 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:00.029590 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp" event={"ID":"4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c","Type":"ContainerStarted","Data":"cc7235f12129ef76986456efbb97dff550540a8abed328862560217ffec54221"}
Apr 16 14:56:00.031091 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:00.031054 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-vm2nt" event={"ID":"b40e3801-d782-46b4-9a70-170dbfac4af1","Type":"ContainerStarted","Data":"3189fcbae46d317647d196bc392682497b4deb14361774e0f488d1e6bb292280"}
Apr 16 14:56:00.033304 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:00.033278 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb" event={"ID":"64e3ae53-70bf-4635-8294-d5d8634be750","Type":"ContainerStarted","Data":"e18d789ba0207a14878d28e29765bc3c771ff88644f30df654797f7d6a1a5100"}
Apr 16 14:56:00.033416 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:00.033307 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb" event={"ID":"64e3ae53-70bf-4635-8294-d5d8634be750","Type":"ContainerStarted","Data":"fbbc512e01c93dac6b4bd1c1b1008389e9015d59eaa1e7f288c978c3b0e6d177"}
Apr 16 14:56:01.038080 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:01.037893 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp" event={"ID":"4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c","Type":"ContainerStarted","Data":"60beff0332dbeb05c010e573e2b097f0805fd555f21f20b862cd8c118833dc38"}
Apr 16 14:56:01.039447 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:01.039412 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-vm2nt" event={"ID":"b40e3801-d782-46b4-9a70-170dbfac4af1","Type":"ContainerStarted","Data":"bfeaca90fb1b8170e349b805995c9c126e97e3d923e568db019564a6a1b921e1"}
Apr 16 14:56:01.039613 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:01.039598 2576 kubelet.go:2658]
"SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-vm2nt" Apr 16 14:56:01.041973 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:01.041945 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb" event={"ID":"64e3ae53-70bf-4635-8294-d5d8634be750","Type":"ContainerStarted","Data":"712d06441596fef69f32f83964892a23fb3b12d6b98a4a2faefe0c747365c64f"} Apr 16 14:56:01.045313 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:01.045268 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"27e1b5f9-340a-4274-9148-50c65175772e","Type":"ContainerStarted","Data":"ae883c03f28973218dd2d3d5a30b5175c33ad9fbd3c9261855ccd74d724ae5b1"} Apr 16 14:56:01.045538 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:01.045513 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-vm2nt" Apr 16 14:56:01.060105 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:01.060053 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp" podStartSLOduration=2.205029778 podStartE2EDuration="4.06001564s" podCreationTimestamp="2026-04-16 14:55:57 +0000 UTC" firstStartedPulling="2026-04-16 14:55:59.070103295 +0000 UTC m=+193.489238248" lastFinishedPulling="2026-04-16 14:56:00.925089142 +0000 UTC m=+195.344224110" observedRunningTime="2026-04-16 14:56:01.059231808 +0000 UTC m=+195.478366806" watchObservedRunningTime="2026-04-16 14:56:01.06001564 +0000 UTC m=+195.479150617" Apr 16 14:56:01.086516 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:01.086464 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.994868822 podStartE2EDuration="7.086447746s" podCreationTimestamp="2026-04-16 14:55:54 +0000 UTC" 
firstStartedPulling="2026-04-16 14:55:54.81341854 +0000 UTC m=+189.232553506" lastFinishedPulling="2026-04-16 14:55:59.904997463 +0000 UTC m=+194.324132430" observedRunningTime="2026-04-16 14:56:01.083927534 +0000 UTC m=+195.503062511" watchObservedRunningTime="2026-04-16 14:56:01.086447746 +0000 UTC m=+195.505582721" Apr 16 14:56:01.101734 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:01.101682 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-vm2nt" podStartSLOduration=1.231636489 podStartE2EDuration="3.101665432s" podCreationTimestamp="2026-04-16 14:55:58 +0000 UTC" firstStartedPulling="2026-04-16 14:55:59.053475471 +0000 UTC m=+193.472610435" lastFinishedPulling="2026-04-16 14:56:00.923504409 +0000 UTC m=+195.342639378" observedRunningTime="2026-04-16 14:56:01.100287526 +0000 UTC m=+195.519422501" watchObservedRunningTime="2026-04-16 14:56:01.101665432 +0000 UTC m=+195.520800409" Apr 16 14:56:02.050807 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:02.050756 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb" event={"ID":"64e3ae53-70bf-4635-8294-d5d8634be750","Type":"ContainerStarted","Data":"4021e51f9dbeaba5338c869c7e0ce841d421d92e20f698237ec5763fe166c60e"} Apr 16 14:56:02.051310 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:02.050818 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb" event={"ID":"64e3ae53-70bf-4635-8294-d5d8634be750","Type":"ContainerStarted","Data":"84258f8198051d0c28b65687355982bac912a7a023a425ba1aeadd66f2ed3dea"} Apr 16 14:56:02.073088 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:02.073042 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb" podStartSLOduration=3.032516041 podStartE2EDuration="6.073006804s" podCreationTimestamp="2026-04-16 14:55:56 +0000 
UTC" firstStartedPulling="2026-04-16 14:55:56.862678627 +0000 UTC m=+191.281813581" lastFinishedPulling="2026-04-16 14:55:59.903169391 +0000 UTC m=+194.322304344" observedRunningTime="2026-04-16 14:56:02.071212029 +0000 UTC m=+196.490347015" watchObservedRunningTime="2026-04-16 14:56:02.073006804 +0000 UTC m=+196.492141780" Apr 16 14:56:03.054014 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:03.053973 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb" Apr 16 14:56:04.062296 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:04.062271 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6bf54c7b45-tvjxb" Apr 16 14:56:17.987592 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:17.987559 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp" Apr 16 14:56:17.987968 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:17.987605 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp" Apr 16 14:56:20.105262 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:20.105227 2576 generic.go:358] "Generic (PLEG): container finished" podID="e6d80c27-15aa-4aea-8508-8913412eba90" containerID="ce5587578751d427f6989f623e1f4f48d66096c653904823e523f33658be619a" exitCode=0 Apr 16 14:56:20.105623 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:20.105277 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gkt5f" event={"ID":"e6d80c27-15aa-4aea-8508-8913412eba90","Type":"ContainerDied","Data":"ce5587578751d427f6989f623e1f4f48d66096c653904823e523f33658be619a"} Apr 16 14:56:20.105623 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:20.105546 2576 scope.go:117] "RemoveContainer" 
containerID="ce5587578751d427f6989f623e1f4f48d66096c653904823e523f33658be619a" Apr 16 14:56:21.109270 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:21.109235 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gkt5f" event={"ID":"e6d80c27-15aa-4aea-8508-8913412eba90","Type":"ContainerStarted","Data":"3ac88d7b2c9b8d96d9db3fcd2db9ae6edc1c224853d1db2b9c6a823de0f20686"} Apr 16 14:56:24.362942 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:24.362912 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rsgph_574ca2b9-aeca-4a60-8152-838c7e3d1902/dns/0.log" Apr 16 14:56:24.368036 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:24.368002 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rsgph_574ca2b9-aeca-4a60-8152-838c7e3d1902/kube-rbac-proxy/0.log" Apr 16 14:56:24.526650 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:24.526605 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mjsr6_69a10374-32da-4de3-b491-3854f69f1613/dns-node-resolver/0.log" Apr 16 14:56:37.992701 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:37.992672 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp" Apr 16 14:56:37.997003 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:37.996980 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6b4dc95984-98mbp" Apr 16 14:56:57.181344 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:57.181260 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs\") pod \"network-metrics-daemon-9p5t7\" (UID: \"deecc941-e868-4306-99e5-4f30afef0f95\") " 
pod="openshift-multus/network-metrics-daemon-9p5t7" Apr 16 14:56:57.183534 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:57.183513 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deecc941-e868-4306-99e5-4f30afef0f95-metrics-certs\") pod \"network-metrics-daemon-9p5t7\" (UID: \"deecc941-e868-4306-99e5-4f30afef0f95\") " pod="openshift-multus/network-metrics-daemon-9p5t7" Apr 16 14:56:57.405329 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:57.405300 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kc8hc\"" Apr 16 14:56:57.414051 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:57.414014 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9p5t7" Apr 16 14:56:57.538490 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:57.538449 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9p5t7"] Apr 16 14:56:57.542326 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:56:57.542299 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeecc941_e868_4306_99e5_4f30afef0f95.slice/crio-92ce93b70b2262da58fc66417aa534ab50d3a7070bd38ffa37a21aa1b6d6e999 WatchSource:0}: Error finding container 92ce93b70b2262da58fc66417aa534ab50d3a7070bd38ffa37a21aa1b6d6e999: Status 404 returned error can't find the container with id 92ce93b70b2262da58fc66417aa534ab50d3a7070bd38ffa37a21aa1b6d6e999 Apr 16 14:56:58.220138 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:58.220100 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9p5t7" event={"ID":"deecc941-e868-4306-99e5-4f30afef0f95","Type":"ContainerStarted","Data":"92ce93b70b2262da58fc66417aa534ab50d3a7070bd38ffa37a21aa1b6d6e999"} Apr 16 14:56:59.226069 
ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:59.226016 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9p5t7" event={"ID":"deecc941-e868-4306-99e5-4f30afef0f95","Type":"ContainerStarted","Data":"b7fcb1f6d9b4938554b4ce1afc12bc952612e94ef4aa58dcb642da641f0db7e9"} Apr 16 14:56:59.226069 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:59.226067 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9p5t7" event={"ID":"deecc941-e868-4306-99e5-4f30afef0f95","Type":"ContainerStarted","Data":"34e845bb28b1257b8ab641422d9eb7c7fd184344b87c7c5fb17781ba6f1a6fe4"} Apr 16 14:56:59.243655 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:56:59.243598 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-9p5t7" podStartSLOduration=252.250588766 podStartE2EDuration="4m13.243582305s" podCreationTimestamp="2026-04-16 14:52:46 +0000 UTC" firstStartedPulling="2026-04-16 14:56:57.544582422 +0000 UTC m=+251.963717374" lastFinishedPulling="2026-04-16 14:56:58.53757596 +0000 UTC m=+252.956710913" observedRunningTime="2026-04-16 14:56:59.241525502 +0000 UTC m=+253.660660478" watchObservedRunningTime="2026-04-16 14:56:59.243582305 +0000 UTC m=+253.662717281" Apr 16 14:57:17.529804 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.529765 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-864cbb5958-7xfqt"] Apr 16 14:57:17.534583 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.534561 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.537205 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.537180 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 14:57:17.537310 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.537206 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 14:57:17.537310 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.537192 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 14:57:17.537473 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.537457 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 14:57:17.537620 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.537606 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-47s4w\"" Apr 16 14:57:17.537787 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.537769 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 14:57:17.543310 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.543293 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 14:57:17.550427 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.550404 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-864cbb5958-7xfqt"] Apr 16 14:57:17.649995 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.649962 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0fe47cdc-0294-4716-8f99-0a0ff91f3da3-metrics-client-ca\") pod \"telemeter-client-864cbb5958-7xfqt\" (UID: \"0fe47cdc-0294-4716-8f99-0a0ff91f3da3\") " pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.650207 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.650012 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0fe47cdc-0294-4716-8f99-0a0ff91f3da3-telemeter-client-tls\") pod \"telemeter-client-864cbb5958-7xfqt\" (UID: \"0fe47cdc-0294-4716-8f99-0a0ff91f3da3\") " pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.650207 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.650055 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxhpg\" (UniqueName: \"kubernetes.io/projected/0fe47cdc-0294-4716-8f99-0a0ff91f3da3-kube-api-access-rxhpg\") pod \"telemeter-client-864cbb5958-7xfqt\" (UID: \"0fe47cdc-0294-4716-8f99-0a0ff91f3da3\") " pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.650207 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.650075 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0fe47cdc-0294-4716-8f99-0a0ff91f3da3-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-864cbb5958-7xfqt\" (UID: \"0fe47cdc-0294-4716-8f99-0a0ff91f3da3\") " pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.650207 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.650104 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/0fe47cdc-0294-4716-8f99-0a0ff91f3da3-serving-certs-ca-bundle\") pod \"telemeter-client-864cbb5958-7xfqt\" (UID: \"0fe47cdc-0294-4716-8f99-0a0ff91f3da3\") " pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.650207 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.650159 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/0fe47cdc-0294-4716-8f99-0a0ff91f3da3-secret-telemeter-client\") pod \"telemeter-client-864cbb5958-7xfqt\" (UID: \"0fe47cdc-0294-4716-8f99-0a0ff91f3da3\") " pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.650207 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.650184 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/0fe47cdc-0294-4716-8f99-0a0ff91f3da3-federate-client-tls\") pod \"telemeter-client-864cbb5958-7xfqt\" (UID: \"0fe47cdc-0294-4716-8f99-0a0ff91f3da3\") " pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.650412 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.650221 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fe47cdc-0294-4716-8f99-0a0ff91f3da3-telemeter-trusted-ca-bundle\") pod \"telemeter-client-864cbb5958-7xfqt\" (UID: \"0fe47cdc-0294-4716-8f99-0a0ff91f3da3\") " pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.750939 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.750892 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0fe47cdc-0294-4716-8f99-0a0ff91f3da3-metrics-client-ca\") pod \"telemeter-client-864cbb5958-7xfqt\" (UID: 
\"0fe47cdc-0294-4716-8f99-0a0ff91f3da3\") " pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.750939 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.750953 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0fe47cdc-0294-4716-8f99-0a0ff91f3da3-telemeter-client-tls\") pod \"telemeter-client-864cbb5958-7xfqt\" (UID: \"0fe47cdc-0294-4716-8f99-0a0ff91f3da3\") " pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.751209 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.750972 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxhpg\" (UniqueName: \"kubernetes.io/projected/0fe47cdc-0294-4716-8f99-0a0ff91f3da3-kube-api-access-rxhpg\") pod \"telemeter-client-864cbb5958-7xfqt\" (UID: \"0fe47cdc-0294-4716-8f99-0a0ff91f3da3\") " pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.751209 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.750991 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0fe47cdc-0294-4716-8f99-0a0ff91f3da3-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-864cbb5958-7xfqt\" (UID: \"0fe47cdc-0294-4716-8f99-0a0ff91f3da3\") " pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.751209 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.751013 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fe47cdc-0294-4716-8f99-0a0ff91f3da3-serving-certs-ca-bundle\") pod \"telemeter-client-864cbb5958-7xfqt\" (UID: \"0fe47cdc-0294-4716-8f99-0a0ff91f3da3\") " pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.751209 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.751069 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/0fe47cdc-0294-4716-8f99-0a0ff91f3da3-secret-telemeter-client\") pod \"telemeter-client-864cbb5958-7xfqt\" (UID: \"0fe47cdc-0294-4716-8f99-0a0ff91f3da3\") " pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.751396 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.751209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/0fe47cdc-0294-4716-8f99-0a0ff91f3da3-federate-client-tls\") pod \"telemeter-client-864cbb5958-7xfqt\" (UID: \"0fe47cdc-0294-4716-8f99-0a0ff91f3da3\") " pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.751396 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.751267 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fe47cdc-0294-4716-8f99-0a0ff91f3da3-telemeter-trusted-ca-bundle\") pod \"telemeter-client-864cbb5958-7xfqt\" (UID: \"0fe47cdc-0294-4716-8f99-0a0ff91f3da3\") " pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.751775 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.751749 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0fe47cdc-0294-4716-8f99-0a0ff91f3da3-metrics-client-ca\") pod \"telemeter-client-864cbb5958-7xfqt\" (UID: \"0fe47cdc-0294-4716-8f99-0a0ff91f3da3\") " pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.751946 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.751780 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fe47cdc-0294-4716-8f99-0a0ff91f3da3-serving-certs-ca-bundle\") pod 
\"telemeter-client-864cbb5958-7xfqt\" (UID: \"0fe47cdc-0294-4716-8f99-0a0ff91f3da3\") " pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.752175 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.752150 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fe47cdc-0294-4716-8f99-0a0ff91f3da3-telemeter-trusted-ca-bundle\") pod \"telemeter-client-864cbb5958-7xfqt\" (UID: \"0fe47cdc-0294-4716-8f99-0a0ff91f3da3\") " pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.753593 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.753571 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0fe47cdc-0294-4716-8f99-0a0ff91f3da3-telemeter-client-tls\") pod \"telemeter-client-864cbb5958-7xfqt\" (UID: \"0fe47cdc-0294-4716-8f99-0a0ff91f3da3\") " pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.753688 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.753662 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/0fe47cdc-0294-4716-8f99-0a0ff91f3da3-federate-client-tls\") pod \"telemeter-client-864cbb5958-7xfqt\" (UID: \"0fe47cdc-0294-4716-8f99-0a0ff91f3da3\") " pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.753883 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.753865 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0fe47cdc-0294-4716-8f99-0a0ff91f3da3-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-864cbb5958-7xfqt\" (UID: \"0fe47cdc-0294-4716-8f99-0a0ff91f3da3\") " pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.753962 ip-10-0-129-76 kubenswrapper[2576]: I0416 
14:57:17.753944 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/0fe47cdc-0294-4716-8f99-0a0ff91f3da3-secret-telemeter-client\") pod \"telemeter-client-864cbb5958-7xfqt\" (UID: \"0fe47cdc-0294-4716-8f99-0a0ff91f3da3\") " pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.759842 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.759814 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxhpg\" (UniqueName: \"kubernetes.io/projected/0fe47cdc-0294-4716-8f99-0a0ff91f3da3-kube-api-access-rxhpg\") pod \"telemeter-client-864cbb5958-7xfqt\" (UID: \"0fe47cdc-0294-4716-8f99-0a0ff91f3da3\") " pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.843777 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.843693 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" Apr 16 14:57:17.971589 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:17.971560 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-864cbb5958-7xfqt"] Apr 16 14:57:17.972863 ip-10-0-129-76 kubenswrapper[2576]: W0416 14:57:17.972839 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fe47cdc_0294_4716_8f99_0a0ff91f3da3.slice/crio-493d5f5d9f72cfd031a61d5b18ddf47f349611ae1746efa182a61bbd372d972f WatchSource:0}: Error finding container 493d5f5d9f72cfd031a61d5b18ddf47f349611ae1746efa182a61bbd372d972f: Status 404 returned error can't find the container with id 493d5f5d9f72cfd031a61d5b18ddf47f349611ae1746efa182a61bbd372d972f Apr 16 14:57:18.284775 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:18.284739 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" 
event={"ID":"0fe47cdc-0294-4716-8f99-0a0ff91f3da3","Type":"ContainerStarted","Data":"493d5f5d9f72cfd031a61d5b18ddf47f349611ae1746efa182a61bbd372d972f"} Apr 16 14:57:20.293406 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:20.293371 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" event={"ID":"0fe47cdc-0294-4716-8f99-0a0ff91f3da3","Type":"ContainerStarted","Data":"dea5981c3298e162588a3b0ead68c1e1cbf763c968cbc737ed8c0f67fd1ea95e"} Apr 16 14:57:20.293406 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:20.293412 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" event={"ID":"0fe47cdc-0294-4716-8f99-0a0ff91f3da3","Type":"ContainerStarted","Data":"9a9158444812620d743c2409b0f6ec2870210e6f7d3466a32327051218b2ca96"} Apr 16 14:57:20.293982 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:20.293421 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" event={"ID":"0fe47cdc-0294-4716-8f99-0a0ff91f3da3","Type":"ContainerStarted","Data":"95b1e626dd0af63f9a46f224a6e21f8a474fc79b0bd1cc90c42ac178eefbd08a"} Apr 16 14:57:20.314512 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:20.314451 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-864cbb5958-7xfqt" podStartSLOduration=1.8591795819999999 podStartE2EDuration="3.314433617s" podCreationTimestamp="2026-04-16 14:57:17 +0000 UTC" firstStartedPulling="2026-04-16 14:57:17.974628202 +0000 UTC m=+272.393763158" lastFinishedPulling="2026-04-16 14:57:19.42988224 +0000 UTC m=+273.849017193" observedRunningTime="2026-04-16 14:57:20.31381107 +0000 UTC m=+274.732946058" watchObservedRunningTime="2026-04-16 14:57:20.314433617 +0000 UTC m=+274.733568592" Apr 16 14:57:46.187644 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:46.187614 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log" Apr 16 14:57:46.188221 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:46.187614 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log" Apr 16 14:57:46.190487 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:46.190464 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log" Apr 16 14:57:46.190613 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:46.190600 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log" Apr 16 14:57:46.197774 ip-10-0-129-76 kubenswrapper[2576]: I0416 14:57:46.197757 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 15:01:59.732033 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:01:59.731929 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-6zlvj"] Apr 16 15:01:59.735253 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:01:59.735230 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zlvj" Apr 16 15:01:59.737801 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:01:59.737781 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 15:01:59.755447 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:01:59.755424 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6zlvj"] Apr 16 15:01:59.898782 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:01:59.898751 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9da141d8-7c00-4479-bb5d-0cc7c31814ff-original-pull-secret\") pod \"global-pull-secret-syncer-6zlvj\" (UID: \"9da141d8-7c00-4479-bb5d-0cc7c31814ff\") " pod="kube-system/global-pull-secret-syncer-6zlvj" Apr 16 15:01:59.898943 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:01:59.898788 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9da141d8-7c00-4479-bb5d-0cc7c31814ff-kubelet-config\") pod \"global-pull-secret-syncer-6zlvj\" (UID: \"9da141d8-7c00-4479-bb5d-0cc7c31814ff\") " pod="kube-system/global-pull-secret-syncer-6zlvj" Apr 16 15:01:59.898943 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:01:59.898868 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9da141d8-7c00-4479-bb5d-0cc7c31814ff-dbus\") pod \"global-pull-secret-syncer-6zlvj\" (UID: \"9da141d8-7c00-4479-bb5d-0cc7c31814ff\") " pod="kube-system/global-pull-secret-syncer-6zlvj" Apr 16 15:02:00.000157 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:00.000068 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/9da141d8-7c00-4479-bb5d-0cc7c31814ff-dbus\") pod \"global-pull-secret-syncer-6zlvj\" (UID: \"9da141d8-7c00-4479-bb5d-0cc7c31814ff\") " pod="kube-system/global-pull-secret-syncer-6zlvj" Apr 16 15:02:00.000157 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:00.000123 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9da141d8-7c00-4479-bb5d-0cc7c31814ff-original-pull-secret\") pod \"global-pull-secret-syncer-6zlvj\" (UID: \"9da141d8-7c00-4479-bb5d-0cc7c31814ff\") " pod="kube-system/global-pull-secret-syncer-6zlvj" Apr 16 15:02:00.000157 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:00.000147 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9da141d8-7c00-4479-bb5d-0cc7c31814ff-kubelet-config\") pod \"global-pull-secret-syncer-6zlvj\" (UID: \"9da141d8-7c00-4479-bb5d-0cc7c31814ff\") " pod="kube-system/global-pull-secret-syncer-6zlvj" Apr 16 15:02:00.000425 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:00.000254 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9da141d8-7c00-4479-bb5d-0cc7c31814ff-dbus\") pod \"global-pull-secret-syncer-6zlvj\" (UID: \"9da141d8-7c00-4479-bb5d-0cc7c31814ff\") " pod="kube-system/global-pull-secret-syncer-6zlvj" Apr 16 15:02:00.000425 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:00.000273 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9da141d8-7c00-4479-bb5d-0cc7c31814ff-kubelet-config\") pod \"global-pull-secret-syncer-6zlvj\" (UID: \"9da141d8-7c00-4479-bb5d-0cc7c31814ff\") " pod="kube-system/global-pull-secret-syncer-6zlvj" Apr 16 15:02:00.002604 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:00.002579 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9da141d8-7c00-4479-bb5d-0cc7c31814ff-original-pull-secret\") pod \"global-pull-secret-syncer-6zlvj\" (UID: \"9da141d8-7c00-4479-bb5d-0cc7c31814ff\") " pod="kube-system/global-pull-secret-syncer-6zlvj" Apr 16 15:02:00.044785 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:00.044760 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zlvj" Apr 16 15:02:00.162359 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:00.162278 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6zlvj"] Apr 16 15:02:00.165302 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:02:00.165273 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9da141d8_7c00_4479_bb5d_0cc7c31814ff.slice/crio-f6976fd94f182503b06346d2d2a8edd1e7a8df52d7bdbca641b1b82611ef835e WatchSource:0}: Error finding container f6976fd94f182503b06346d2d2a8edd1e7a8df52d7bdbca641b1b82611ef835e: Status 404 returned error can't find the container with id f6976fd94f182503b06346d2d2a8edd1e7a8df52d7bdbca641b1b82611ef835e Apr 16 15:02:00.167210 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:00.167194 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:02:01.105598 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:01.105556 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6zlvj" event={"ID":"9da141d8-7c00-4479-bb5d-0cc7c31814ff","Type":"ContainerStarted","Data":"f6976fd94f182503b06346d2d2a8edd1e7a8df52d7bdbca641b1b82611ef835e"} Apr 16 15:02:04.118094 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:04.118048 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6zlvj" 
event={"ID":"9da141d8-7c00-4479-bb5d-0cc7c31814ff","Type":"ContainerStarted","Data":"63970f3b3c81bb957f3f72799b8f9e381eec6389ca4d4b4d7d84201441ae5022"} Apr 16 15:02:04.132313 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:04.132269 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-6zlvj" podStartSLOduration=1.34847725 podStartE2EDuration="5.132254242s" podCreationTimestamp="2026-04-16 15:01:59 +0000 UTC" firstStartedPulling="2026-04-16 15:02:00.167326865 +0000 UTC m=+554.586461818" lastFinishedPulling="2026-04-16 15:02:03.951103846 +0000 UTC m=+558.370238810" observedRunningTime="2026-04-16 15:02:04.131098307 +0000 UTC m=+558.550233281" watchObservedRunningTime="2026-04-16 15:02:04.132254242 +0000 UTC m=+558.551389217" Apr 16 15:02:46.217195 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:46.217160 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log" Apr 16 15:02:46.218426 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:46.218403 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log" Apr 16 15:02:46.219802 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:46.219777 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log" Apr 16 15:02:46.220775 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:46.220758 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log" Apr 16 15:02:57.151581 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:57.151504 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms"] Apr 16 15:02:57.153930 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:57.153910 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms" Apr 16 15:02:57.160440 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:57.160061 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 15:02:57.160440 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:57.160070 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 15:02:57.160440 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:57.160097 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-j4fh5\"" Apr 16 15:02:57.165123 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:57.165101 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms"] Apr 16 15:02:57.182809 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:57.182783 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/03dd53ad-63ec-4a6f-a604-e9f557652a74-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms\" (UID: \"03dd53ad-63ec-4a6f-a604-e9f557652a74\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms" Apr 16 15:02:57.182940 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:57.182825 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/03dd53ad-63ec-4a6f-a604-e9f557652a74-bundle\") pod 
\"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms\" (UID: \"03dd53ad-63ec-4a6f-a604-e9f557652a74\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms" Apr 16 15:02:57.182940 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:57.182922 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9tmp\" (UniqueName: \"kubernetes.io/projected/03dd53ad-63ec-4a6f-a604-e9f557652a74-kube-api-access-l9tmp\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms\" (UID: \"03dd53ad-63ec-4a6f-a604-e9f557652a74\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms" Apr 16 15:02:57.283345 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:57.283313 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/03dd53ad-63ec-4a6f-a604-e9f557652a74-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms\" (UID: \"03dd53ad-63ec-4a6f-a604-e9f557652a74\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms" Apr 16 15:02:57.283510 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:57.283378 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9tmp\" (UniqueName: \"kubernetes.io/projected/03dd53ad-63ec-4a6f-a604-e9f557652a74-kube-api-access-l9tmp\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms\" (UID: \"03dd53ad-63ec-4a6f-a604-e9f557652a74\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms" Apr 16 15:02:57.283510 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:57.283411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/03dd53ad-63ec-4a6f-a604-e9f557652a74-util\") pod 
\"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms\" (UID: \"03dd53ad-63ec-4a6f-a604-e9f557652a74\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms" Apr 16 15:02:57.283685 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:57.283665 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/03dd53ad-63ec-4a6f-a604-e9f557652a74-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms\" (UID: \"03dd53ad-63ec-4a6f-a604-e9f557652a74\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms" Apr 16 15:02:57.283766 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:57.283734 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/03dd53ad-63ec-4a6f-a604-e9f557652a74-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms\" (UID: \"03dd53ad-63ec-4a6f-a604-e9f557652a74\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms" Apr 16 15:02:57.292199 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:57.292178 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9tmp\" (UniqueName: \"kubernetes.io/projected/03dd53ad-63ec-4a6f-a604-e9f557652a74-kube-api-access-l9tmp\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms\" (UID: \"03dd53ad-63ec-4a6f-a604-e9f557652a74\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms" Apr 16 15:02:57.462612 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:57.462527 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms" Apr 16 15:02:57.585922 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:57.585894 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms"] Apr 16 15:02:57.588672 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:02:57.588640 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03dd53ad_63ec_4a6f_a604_e9f557652a74.slice/crio-8aa0a24f9063aa2c0e8443319590498b35bcfa5ceee6c5e6522ee012aff9a569 WatchSource:0}: Error finding container 8aa0a24f9063aa2c0e8443319590498b35bcfa5ceee6c5e6522ee012aff9a569: Status 404 returned error can't find the container with id 8aa0a24f9063aa2c0e8443319590498b35bcfa5ceee6c5e6522ee012aff9a569 Apr 16 15:02:58.280632 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:02:58.280589 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms" event={"ID":"03dd53ad-63ec-4a6f-a604-e9f557652a74","Type":"ContainerStarted","Data":"8aa0a24f9063aa2c0e8443319590498b35bcfa5ceee6c5e6522ee012aff9a569"} Apr 16 15:03:03.297550 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:03.297514 2576 generic.go:358] "Generic (PLEG): container finished" podID="03dd53ad-63ec-4a6f-a604-e9f557652a74" containerID="f9224a02f94221ce07bfe45ddcac490d78bdd41d98279b5ddaff2e0bc9dd09ce" exitCode=0 Apr 16 15:03:03.297949 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:03.297601 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms" event={"ID":"03dd53ad-63ec-4a6f-a604-e9f557652a74","Type":"ContainerDied","Data":"f9224a02f94221ce07bfe45ddcac490d78bdd41d98279b5ddaff2e0bc9dd09ce"} Apr 16 15:03:06.315525 ip-10-0-129-76 kubenswrapper[2576]: I0416 
15:03:06.315488 2576 generic.go:358] "Generic (PLEG): container finished" podID="03dd53ad-63ec-4a6f-a604-e9f557652a74" containerID="e79113d3523039d6ec228d2029a52e3362895de69f3fb85879595abacc6eae83" exitCode=0 Apr 16 15:03:06.315525 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:06.315525 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms" event={"ID":"03dd53ad-63ec-4a6f-a604-e9f557652a74","Type":"ContainerDied","Data":"e79113d3523039d6ec228d2029a52e3362895de69f3fb85879595abacc6eae83"} Apr 16 15:03:12.335120 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:12.335081 2576 generic.go:358] "Generic (PLEG): container finished" podID="03dd53ad-63ec-4a6f-a604-e9f557652a74" containerID="2203e94b357c2370e27306f8c97404f8f8b0449cb31f36860c30fbb848da97db" exitCode=0 Apr 16 15:03:12.335552 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:12.335152 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms" event={"ID":"03dd53ad-63ec-4a6f-a604-e9f557652a74","Type":"ContainerDied","Data":"2203e94b357c2370e27306f8c97404f8f8b0449cb31f36860c30fbb848da97db"} Apr 16 15:03:13.459556 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:13.459527 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms" Apr 16 15:03:13.526300 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:13.526262 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9tmp\" (UniqueName: \"kubernetes.io/projected/03dd53ad-63ec-4a6f-a604-e9f557652a74-kube-api-access-l9tmp\") pod \"03dd53ad-63ec-4a6f-a604-e9f557652a74\" (UID: \"03dd53ad-63ec-4a6f-a604-e9f557652a74\") " Apr 16 15:03:13.526300 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:13.526305 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/03dd53ad-63ec-4a6f-a604-e9f557652a74-util\") pod \"03dd53ad-63ec-4a6f-a604-e9f557652a74\" (UID: \"03dd53ad-63ec-4a6f-a604-e9f557652a74\") " Apr 16 15:03:13.526763 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:13.526323 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/03dd53ad-63ec-4a6f-a604-e9f557652a74-bundle\") pod \"03dd53ad-63ec-4a6f-a604-e9f557652a74\" (UID: \"03dd53ad-63ec-4a6f-a604-e9f557652a74\") " Apr 16 15:03:13.526989 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:13.526959 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03dd53ad-63ec-4a6f-a604-e9f557652a74-bundle" (OuterVolumeSpecName: "bundle") pod "03dd53ad-63ec-4a6f-a604-e9f557652a74" (UID: "03dd53ad-63ec-4a6f-a604-e9f557652a74"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:03:13.528482 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:13.528454 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03dd53ad-63ec-4a6f-a604-e9f557652a74-kube-api-access-l9tmp" (OuterVolumeSpecName: "kube-api-access-l9tmp") pod "03dd53ad-63ec-4a6f-a604-e9f557652a74" (UID: "03dd53ad-63ec-4a6f-a604-e9f557652a74"). InnerVolumeSpecName "kube-api-access-l9tmp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:03:13.530559 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:13.530534 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03dd53ad-63ec-4a6f-a604-e9f557652a74-util" (OuterVolumeSpecName: "util") pod "03dd53ad-63ec-4a6f-a604-e9f557652a74" (UID: "03dd53ad-63ec-4a6f-a604-e9f557652a74"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:03:13.627380 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:13.627287 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l9tmp\" (UniqueName: \"kubernetes.io/projected/03dd53ad-63ec-4a6f-a604-e9f557652a74-kube-api-access-l9tmp\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:03:13.627380 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:13.627321 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/03dd53ad-63ec-4a6f-a604-e9f557652a74-util\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:03:13.627380 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:13.627331 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/03dd53ad-63ec-4a6f-a604-e9f557652a74-bundle\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:03:14.342504 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:14.342475 2576 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms" Apr 16 15:03:14.342659 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:14.342476 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cqc7ms" event={"ID":"03dd53ad-63ec-4a6f-a604-e9f557652a74","Type":"ContainerDied","Data":"8aa0a24f9063aa2c0e8443319590498b35bcfa5ceee6c5e6522ee012aff9a569"} Apr 16 15:03:14.342659 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:14.342590 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8aa0a24f9063aa2c0e8443319590498b35bcfa5ceee6c5e6522ee012aff9a569" Apr 16 15:03:18.921690 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:18.921656 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4mm9r"] Apr 16 15:03:18.922182 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:18.921996 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03dd53ad-63ec-4a6f-a604-e9f557652a74" containerName="extract" Apr 16 15:03:18.922182 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:18.922007 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="03dd53ad-63ec-4a6f-a604-e9f557652a74" containerName="extract" Apr 16 15:03:18.922182 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:18.922016 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03dd53ad-63ec-4a6f-a604-e9f557652a74" containerName="pull" Apr 16 15:03:18.922182 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:18.922038 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="03dd53ad-63ec-4a6f-a604-e9f557652a74" containerName="pull" Apr 16 15:03:18.922182 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:18.922051 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="03dd53ad-63ec-4a6f-a604-e9f557652a74" containerName="util" Apr 16 15:03:18.922182 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:18.922056 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="03dd53ad-63ec-4a6f-a604-e9f557652a74" containerName="util" Apr 16 15:03:18.922182 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:18.922099 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="03dd53ad-63ec-4a6f-a604-e9f557652a74" containerName="extract" Apr 16 15:03:18.955841 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:18.955808 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4mm9r"] Apr 16 15:03:18.956000 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:18.955947 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4mm9r" Apr 16 15:03:18.959013 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:18.958989 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 15:03:18.959343 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:18.959322 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 15:03:18.959343 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:18.959336 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-f2khb\"" Apr 16 15:03:18.959456 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:18.959430 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 15:03:19.078161 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:19.078126 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/secret/1b691560-c977-4730-bebf-484cd95818ae-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4mm9r\" (UID: \"1b691560-c977-4730-bebf-484cd95818ae\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4mm9r" Apr 16 15:03:19.078331 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:19.078178 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxcpw\" (UniqueName: \"kubernetes.io/projected/1b691560-c977-4730-bebf-484cd95818ae-kube-api-access-vxcpw\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4mm9r\" (UID: \"1b691560-c977-4730-bebf-484cd95818ae\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4mm9r" Apr 16 15:03:19.179623 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:19.179523 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/1b691560-c977-4730-bebf-484cd95818ae-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4mm9r\" (UID: \"1b691560-c977-4730-bebf-484cd95818ae\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4mm9r" Apr 16 15:03:19.179623 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:19.179590 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxcpw\" (UniqueName: \"kubernetes.io/projected/1b691560-c977-4730-bebf-484cd95818ae-kube-api-access-vxcpw\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4mm9r\" (UID: \"1b691560-c977-4730-bebf-484cd95818ae\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4mm9r" Apr 16 15:03:19.181904 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:19.181880 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/1b691560-c977-4730-bebf-484cd95818ae-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4mm9r\" (UID: 
\"1b691560-c977-4730-bebf-484cd95818ae\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4mm9r" Apr 16 15:03:19.188924 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:19.188901 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxcpw\" (UniqueName: \"kubernetes.io/projected/1b691560-c977-4730-bebf-484cd95818ae-kube-api-access-vxcpw\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4mm9r\" (UID: \"1b691560-c977-4730-bebf-484cd95818ae\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4mm9r" Apr 16 15:03:19.266936 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:19.266899 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4mm9r" Apr 16 15:03:19.396624 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:19.396604 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4mm9r"] Apr 16 15:03:19.399101 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:03:19.399072 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b691560_c977_4730_bebf_484cd95818ae.slice/crio-09b30b0a179960078ede6c2031c32aa6418738c42532fa058362cd619f97094c WatchSource:0}: Error finding container 09b30b0a179960078ede6c2031c32aa6418738c42532fa058362cd619f97094c: Status 404 returned error can't find the container with id 09b30b0a179960078ede6c2031c32aa6418738c42532fa058362cd619f97094c Apr 16 15:03:20.363679 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:20.363647 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4mm9r" event={"ID":"1b691560-c977-4730-bebf-484cd95818ae","Type":"ContainerStarted","Data":"09b30b0a179960078ede6c2031c32aa6418738c42532fa058362cd619f97094c"} Apr 16 15:03:23.821950 ip-10-0-129-76 kubenswrapper[2576]: I0416 
15:03:23.821912 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-rgpnn"] Apr 16 15:03:23.838467 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:23.838421 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-rgpnn"] Apr 16 15:03:23.838700 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:23.838598 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-rgpnn" Apr 16 15:03:23.841008 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:23.840983 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 15:03:23.841207 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:23.840991 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 15:03:23.841207 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:23.841012 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-sbr5p\"" Apr 16 15:03:23.923308 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:23.923271 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3fa5a664-5cad-45a9-9723-30307ed9a974-certificates\") pod \"keda-operator-ffbb595cb-rgpnn\" (UID: \"3fa5a664-5cad-45a9-9723-30307ed9a974\") " pod="openshift-keda/keda-operator-ffbb595cb-rgpnn" Apr 16 15:03:23.923498 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:23.923400 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhqrl\" (UniqueName: \"kubernetes.io/projected/3fa5a664-5cad-45a9-9723-30307ed9a974-kube-api-access-jhqrl\") pod \"keda-operator-ffbb595cb-rgpnn\" (UID: \"3fa5a664-5cad-45a9-9723-30307ed9a974\") " 
pod="openshift-keda/keda-operator-ffbb595cb-rgpnn" Apr 16 15:03:23.923565 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:23.923526 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/3fa5a664-5cad-45a9-9723-30307ed9a974-cabundle0\") pod \"keda-operator-ffbb595cb-rgpnn\" (UID: \"3fa5a664-5cad-45a9-9723-30307ed9a974\") " pod="openshift-keda/keda-operator-ffbb595cb-rgpnn" Apr 16 15:03:24.024852 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:24.024816 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/3fa5a664-5cad-45a9-9723-30307ed9a974-cabundle0\") pod \"keda-operator-ffbb595cb-rgpnn\" (UID: \"3fa5a664-5cad-45a9-9723-30307ed9a974\") " pod="openshift-keda/keda-operator-ffbb595cb-rgpnn" Apr 16 15:03:24.025220 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:24.024872 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3fa5a664-5cad-45a9-9723-30307ed9a974-certificates\") pod \"keda-operator-ffbb595cb-rgpnn\" (UID: \"3fa5a664-5cad-45a9-9723-30307ed9a974\") " pod="openshift-keda/keda-operator-ffbb595cb-rgpnn" Apr 16 15:03:24.025220 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:24.024916 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhqrl\" (UniqueName: \"kubernetes.io/projected/3fa5a664-5cad-45a9-9723-30307ed9a974-kube-api-access-jhqrl\") pod \"keda-operator-ffbb595cb-rgpnn\" (UID: \"3fa5a664-5cad-45a9-9723-30307ed9a974\") " pod="openshift-keda/keda-operator-ffbb595cb-rgpnn" Apr 16 15:03:24.025220 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:03:24.025000 2576 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 16 15:03:24.025220 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:03:24.025039 
2576 secret.go:281] references non-existent secret key: ca.crt
Apr 16 15:03:24.025220 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:03:24.025050 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 15:03:24.025220 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:03:24.025064 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-rgpnn: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 16 15:03:24.025220 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:03:24.025122 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3fa5a664-5cad-45a9-9723-30307ed9a974-certificates podName:3fa5a664-5cad-45a9-9723-30307ed9a974 nodeName:}" failed. No retries permitted until 2026-04-16 15:03:24.525104304 +0000 UTC m=+638.944239258 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3fa5a664-5cad-45a9-9723-30307ed9a974-certificates") pod "keda-operator-ffbb595cb-rgpnn" (UID: "3fa5a664-5cad-45a9-9723-30307ed9a974") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 16 15:03:24.025684 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:24.025664 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/3fa5a664-5cad-45a9-9723-30307ed9a974-cabundle0\") pod \"keda-operator-ffbb595cb-rgpnn\" (UID: \"3fa5a664-5cad-45a9-9723-30307ed9a974\") " pod="openshift-keda/keda-operator-ffbb595cb-rgpnn"
Apr 16 15:03:24.034291 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:24.034268 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhqrl\" (UniqueName: \"kubernetes.io/projected/3fa5a664-5cad-45a9-9723-30307ed9a974-kube-api-access-jhqrl\") pod \"keda-operator-ffbb595cb-rgpnn\"
(UID: \"3fa5a664-5cad-45a9-9723-30307ed9a974\") " pod="openshift-keda/keda-operator-ffbb595cb-rgpnn"
Apr 16 15:03:24.199241 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:24.199201 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6"]
Apr 16 15:03:24.224219 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:24.224170 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6"]
Apr 16 15:03:24.224392 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:24.224328 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6"
Apr 16 15:03:24.226969 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:24.226943 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 16 15:03:24.326849 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:24.326810 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fd7a8a36-66c2-48e9-a13d-ef22f519b182-certificates\") pod \"keda-metrics-apiserver-7c9f485588-cxdj6\" (UID: \"fd7a8a36-66c2-48e9-a13d-ef22f519b182\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6"
Apr 16 15:03:24.327052 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:24.326914 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvwb4\" (UniqueName: \"kubernetes.io/projected/fd7a8a36-66c2-48e9-a13d-ef22f519b182-kube-api-access-lvwb4\") pod \"keda-metrics-apiserver-7c9f485588-cxdj6\" (UID: \"fd7a8a36-66c2-48e9-a13d-ef22f519b182\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6"
Apr 16 15:03:24.327052 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:24.327035 2576 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/fd7a8a36-66c2-48e9-a13d-ef22f519b182-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-cxdj6\" (UID: \"fd7a8a36-66c2-48e9-a13d-ef22f519b182\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6"
Apr 16 15:03:24.379897 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:24.379849 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4mm9r" event={"ID":"1b691560-c977-4730-bebf-484cd95818ae","Type":"ContainerStarted","Data":"c88e1a83ea5aa3d714ebfc64473248a5e5901630fb4052d0a192152c6327df84"}
Apr 16 15:03:24.380082 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:24.379979 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4mm9r"
Apr 16 15:03:24.402805 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:24.402747 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4mm9r" podStartSLOduration=2.49306099 podStartE2EDuration="6.40272845s" podCreationTimestamp="2026-04-16 15:03:18 +0000 UTC" firstStartedPulling="2026-04-16 15:03:19.400698864 +0000 UTC m=+633.819833817" lastFinishedPulling="2026-04-16 15:03:23.310366324 +0000 UTC m=+637.729501277" observedRunningTime="2026-04-16 15:03:24.400438526 +0000 UTC m=+638.819573501" watchObservedRunningTime="2026-04-16 15:03:24.40272845 +0000 UTC m=+638.821863425"
Apr 16 15:03:24.428316 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:24.428281 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lvwb4\" (UniqueName: \"kubernetes.io/projected/fd7a8a36-66c2-48e9-a13d-ef22f519b182-kube-api-access-lvwb4\") pod \"keda-metrics-apiserver-7c9f485588-cxdj6\" (UID: \"fd7a8a36-66c2-48e9-a13d-ef22f519b182\") "
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6"
Apr 16 15:03:24.428501 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:24.428362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/fd7a8a36-66c2-48e9-a13d-ef22f519b182-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-cxdj6\" (UID: \"fd7a8a36-66c2-48e9-a13d-ef22f519b182\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6"
Apr 16 15:03:24.428501 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:24.428425 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fd7a8a36-66c2-48e9-a13d-ef22f519b182-certificates\") pod \"keda-metrics-apiserver-7c9f485588-cxdj6\" (UID: \"fd7a8a36-66c2-48e9-a13d-ef22f519b182\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6"
Apr 16 15:03:24.428618 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:03:24.428517 2576 secret.go:281] references non-existent secret key: tls.crt
Apr 16 15:03:24.428618 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:03:24.428527 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 15:03:24.428618 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:03:24.428541 2576 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found
Apr 16 15:03:24.428618 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:03:24.428554 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 16 15:03:24.428618 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:03:24.428594 2576 nestedpendingoperations.go:348] Operation for
"{volumeName:kubernetes.io/projected/fd7a8a36-66c2-48e9-a13d-ef22f519b182-certificates podName:fd7a8a36-66c2-48e9-a13d-ef22f519b182 nodeName:}" failed. No retries permitted until 2026-04-16 15:03:24.928580928 +0000 UTC m=+639.347715886 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/fd7a8a36-66c2-48e9-a13d-ef22f519b182-certificates") pod "keda-metrics-apiserver-7c9f485588-cxdj6" (UID: "fd7a8a36-66c2-48e9-a13d-ef22f519b182") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 16 15:03:24.429272 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:24.429243 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/fd7a8a36-66c2-48e9-a13d-ef22f519b182-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-cxdj6\" (UID: \"fd7a8a36-66c2-48e9-a13d-ef22f519b182\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6"
Apr 16 15:03:24.439587 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:24.439559 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvwb4\" (UniqueName: \"kubernetes.io/projected/fd7a8a36-66c2-48e9-a13d-ef22f519b182-kube-api-access-lvwb4\") pod \"keda-metrics-apiserver-7c9f485588-cxdj6\" (UID: \"fd7a8a36-66c2-48e9-a13d-ef22f519b182\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6"
Apr 16 15:03:24.529575 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:24.529476 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3fa5a664-5cad-45a9-9723-30307ed9a974-certificates\") pod \"keda-operator-ffbb595cb-rgpnn\" (UID: \"3fa5a664-5cad-45a9-9723-30307ed9a974\") " pod="openshift-keda/keda-operator-ffbb595cb-rgpnn"
Apr 16 15:03:24.529754 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:03:24.529596 2576 secret.go:281] references non-existent secret
key: ca.crt
Apr 16 15:03:24.529754 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:03:24.529610 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 15:03:24.529754 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:03:24.529621 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-rgpnn: references non-existent secret key: ca.crt
Apr 16 15:03:24.529754 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:03:24.529671 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3fa5a664-5cad-45a9-9723-30307ed9a974-certificates podName:3fa5a664-5cad-45a9-9723-30307ed9a974 nodeName:}" failed. No retries permitted until 2026-04-16 15:03:25.529654831 +0000 UTC m=+639.948789788 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3fa5a664-5cad-45a9-9723-30307ed9a974-certificates") pod "keda-operator-ffbb595cb-rgpnn" (UID: "3fa5a664-5cad-45a9-9723-30307ed9a974") : references non-existent secret key: ca.crt
Apr 16 15:03:24.933383 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:24.933349 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fd7a8a36-66c2-48e9-a13d-ef22f519b182-certificates\") pod \"keda-metrics-apiserver-7c9f485588-cxdj6\" (UID: \"fd7a8a36-66c2-48e9-a13d-ef22f519b182\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6"
Apr 16 15:03:24.933846 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:03:24.933500 2576 secret.go:281] references non-existent secret key: tls.crt
Apr 16 15:03:24.933846 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:03:24.933519 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 15:03:24.933846 ip-10-0-129-76 kubenswrapper[2576]: E0416
15:03:24.933537 2576 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found
Apr 16 15:03:24.933846 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:03:24.933558 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 16 15:03:24.933846 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:03:24.933650 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd7a8a36-66c2-48e9-a13d-ef22f519b182-certificates podName:fd7a8a36-66c2-48e9-a13d-ef22f519b182 nodeName:}" failed. No retries permitted until 2026-04-16 15:03:25.933618185 +0000 UTC m=+640.352753145 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/fd7a8a36-66c2-48e9-a13d-ef22f519b182-certificates") pod "keda-metrics-apiserver-7c9f485588-cxdj6" (UID: "fd7a8a36-66c2-48e9-a13d-ef22f519b182") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 16 15:03:25.539884 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:25.539849 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3fa5a664-5cad-45a9-9723-30307ed9a974-certificates\") pod \"keda-operator-ffbb595cb-rgpnn\" (UID: \"3fa5a664-5cad-45a9-9723-30307ed9a974\") " pod="openshift-keda/keda-operator-ffbb595cb-rgpnn"
Apr 16 15:03:25.540083 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:03:25.539998 2576 secret.go:281] references non-existent secret key: ca.crt
Apr 16 15:03:25.540083 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:03:25.540017 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 15:03:25.540083 ip-10-0-129-76
kubenswrapper[2576]: E0416 15:03:25.540047 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-rgpnn: references non-existent secret key: ca.crt
Apr 16 15:03:25.540211 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:03:25.540106 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3fa5a664-5cad-45a9-9723-30307ed9a974-certificates podName:3fa5a664-5cad-45a9-9723-30307ed9a974 nodeName:}" failed. No retries permitted until 2026-04-16 15:03:27.540088204 +0000 UTC m=+641.959223173 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3fa5a664-5cad-45a9-9723-30307ed9a974-certificates") pod "keda-operator-ffbb595cb-rgpnn" (UID: "3fa5a664-5cad-45a9-9723-30307ed9a974") : references non-existent secret key: ca.crt
Apr 16 15:03:25.944628 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:25.944590 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fd7a8a36-66c2-48e9-a13d-ef22f519b182-certificates\") pod \"keda-metrics-apiserver-7c9f485588-cxdj6\" (UID: \"fd7a8a36-66c2-48e9-a13d-ef22f519b182\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6"
Apr 16 15:03:25.945195 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:03:25.944760 2576 secret.go:281] references non-existent secret key: tls.crt
Apr 16 15:03:25.945195 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:03:25.944788 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 15:03:25.945195 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:03:25.944813 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6: references non-existent secret key: tls.crt
Apr 16 15:03:25.945195 ip-10-0-129-76 kubenswrapper[2576]: E0416
15:03:25.944879 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd7a8a36-66c2-48e9-a13d-ef22f519b182-certificates podName:fd7a8a36-66c2-48e9-a13d-ef22f519b182 nodeName:}" failed. No retries permitted until 2026-04-16 15:03:27.944859912 +0000 UTC m=+642.363994869 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/fd7a8a36-66c2-48e9-a13d-ef22f519b182-certificates") pod "keda-metrics-apiserver-7c9f485588-cxdj6" (UID: "fd7a8a36-66c2-48e9-a13d-ef22f519b182") : references non-existent secret key: tls.crt
Apr 16 15:03:27.558942 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:27.558899 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3fa5a664-5cad-45a9-9723-30307ed9a974-certificates\") pod \"keda-operator-ffbb595cb-rgpnn\" (UID: \"3fa5a664-5cad-45a9-9723-30307ed9a974\") " pod="openshift-keda/keda-operator-ffbb595cb-rgpnn"
Apr 16 15:03:27.561397 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:27.561377 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3fa5a664-5cad-45a9-9723-30307ed9a974-certificates\") pod \"keda-operator-ffbb595cb-rgpnn\" (UID: \"3fa5a664-5cad-45a9-9723-30307ed9a974\") " pod="openshift-keda/keda-operator-ffbb595cb-rgpnn"
Apr 16 15:03:27.752097 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:27.752011 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-rgpnn"
Apr 16 15:03:27.886268 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:27.886232 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-rgpnn"]
Apr 16 15:03:27.889471 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:03:27.889440 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fa5a664_5cad_45a9_9723_30307ed9a974.slice/crio-9368ad64d980e2545620aa0fcf3e8fce6947700acc41db6d090d319658e86809 WatchSource:0}: Error finding container 9368ad64d980e2545620aa0fcf3e8fce6947700acc41db6d090d319658e86809: Status 404 returned error can't find the container with id 9368ad64d980e2545620aa0fcf3e8fce6947700acc41db6d090d319658e86809
Apr 16 15:03:27.962351 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:27.962314 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fd7a8a36-66c2-48e9-a13d-ef22f519b182-certificates\") pod \"keda-metrics-apiserver-7c9f485588-cxdj6\" (UID: \"fd7a8a36-66c2-48e9-a13d-ef22f519b182\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6"
Apr 16 15:03:27.964697 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:27.964677 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fd7a8a36-66c2-48e9-a13d-ef22f519b182-certificates\") pod \"keda-metrics-apiserver-7c9f485588-cxdj6\" (UID: \"fd7a8a36-66c2-48e9-a13d-ef22f519b182\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6"
Apr 16 15:03:28.138673 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:28.138583 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6"
Apr 16 15:03:28.259068 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:28.259042 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6"]
Apr 16 15:03:28.261647 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:03:28.261619 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd7a8a36_66c2_48e9_a13d_ef22f519b182.slice/crio-2b11e6761d5a610b0c8b9528518b693724dbb0563b032ed9424e734c68b2451b WatchSource:0}: Error finding container 2b11e6761d5a610b0c8b9528518b693724dbb0563b032ed9424e734c68b2451b: Status 404 returned error can't find the container with id 2b11e6761d5a610b0c8b9528518b693724dbb0563b032ed9424e734c68b2451b
Apr 16 15:03:28.395626 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:28.395532 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-rgpnn" event={"ID":"3fa5a664-5cad-45a9-9723-30307ed9a974","Type":"ContainerStarted","Data":"9368ad64d980e2545620aa0fcf3e8fce6947700acc41db6d090d319658e86809"}
Apr 16 15:03:28.396565 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:28.396532 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6" event={"ID":"fd7a8a36-66c2-48e9-a13d-ef22f519b182","Type":"ContainerStarted","Data":"2b11e6761d5a610b0c8b9528518b693724dbb0563b032ed9424e734c68b2451b"}
Apr 16 15:03:32.412456 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:32.412412 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-rgpnn" event={"ID":"3fa5a664-5cad-45a9-9723-30307ed9a974","Type":"ContainerStarted","Data":"66bf0842795ea6a9141a713acfdf74dc2cf0d885cab597ae8895a68e7ae0e2ce"}
Apr 16 15:03:32.412456 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:32.412464 2576 kubelet.go:2658] "SyncLoop (probe)"
probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-rgpnn"
Apr 16 15:03:32.413802 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:32.413771 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6" event={"ID":"fd7a8a36-66c2-48e9-a13d-ef22f519b182","Type":"ContainerStarted","Data":"a129e25cf3569546c2310a7f633c48ac3dba9d9ee12abea6243f095d7ffb642f"}
Apr 16 15:03:32.413944 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:32.413885 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6"
Apr 16 15:03:32.437466 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:32.437418 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-rgpnn" podStartSLOduration=5.575758401 podStartE2EDuration="9.437405496s" podCreationTimestamp="2026-04-16 15:03:23 +0000 UTC" firstStartedPulling="2026-04-16 15:03:27.890801275 +0000 UTC m=+642.309936228" lastFinishedPulling="2026-04-16 15:03:31.752448369 +0000 UTC m=+646.171583323" observedRunningTime="2026-04-16 15:03:32.434396821 +0000 UTC m=+646.853531823" watchObservedRunningTime="2026-04-16 15:03:32.437405496 +0000 UTC m=+646.856540474"
Apr 16 15:03:32.453142 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:32.453094 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6" podStartSLOduration=4.969784106 podStartE2EDuration="8.45308154s" podCreationTimestamp="2026-04-16 15:03:24 +0000 UTC" firstStartedPulling="2026-04-16 15:03:28.263061014 +0000 UTC m=+642.682195967" lastFinishedPulling="2026-04-16 15:03:31.74635843 +0000 UTC m=+646.165493401" observedRunningTime="2026-04-16 15:03:32.451533052 +0000 UTC m=+646.870668061" watchObservedRunningTime="2026-04-16 15:03:32.45308154 +0000 UTC m=+646.872216515"
Apr 16 15:03:43.421578 ip-10-0-129-76
kubenswrapper[2576]: I0416 15:03:43.421548 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cxdj6"
Apr 16 15:03:45.386178 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:45.386147 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4mm9r"
Apr 16 15:03:53.419634 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:03:53.419600 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-rgpnn"
Apr 16 15:04:31.813450 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:31.813366 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-2qzz9"]
Apr 16 15:04:31.816523 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:31.816501 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7669bdc57-2qzz9"
Apr 16 15:04:31.827707 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:31.827672 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 15:04:31.827707 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:31.827708 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 15:04:31.828465 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:31.827921 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 16 15:04:31.829388 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:31.828954 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-kmvk9\""
Apr 16 15:04:31.830891 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:31.830869 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api"
pods=["kserve/kserve-controller-manager-7669bdc57-2qzz9"]
Apr 16 15:04:31.831083 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:31.831062 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrhtx\" (UniqueName: \"kubernetes.io/projected/1560df80-a1e6-491a-a76e-f51c0298a049-kube-api-access-mrhtx\") pod \"kserve-controller-manager-7669bdc57-2qzz9\" (UID: \"1560df80-a1e6-491a-a76e-f51c0298a049\") " pod="kserve/kserve-controller-manager-7669bdc57-2qzz9"
Apr 16 15:04:31.831177 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:31.831122 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1560df80-a1e6-491a-a76e-f51c0298a049-cert\") pod \"kserve-controller-manager-7669bdc57-2qzz9\" (UID: \"1560df80-a1e6-491a-a76e-f51c0298a049\") " pod="kserve/kserve-controller-manager-7669bdc57-2qzz9"
Apr 16 15:04:31.850512 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:31.850486 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-7zsmf"]
Apr 16 15:04:31.853808 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:31.853783 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-7zsmf"
Apr 16 15:04:31.856047 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:31.856008 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 16 15:04:31.856172 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:31.856103 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-9zz8m\""
Apr 16 15:04:31.862429 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:31.862401 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-7zsmf"]
Apr 16 15:04:31.932400 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:31.932360 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/51f30a24-dd29-4fa2-a9bf-e8c4a58aab7a-data\") pod \"seaweedfs-86cc847c5c-7zsmf\" (UID: \"51f30a24-dd29-4fa2-a9bf-e8c4a58aab7a\") " pod="kserve/seaweedfs-86cc847c5c-7zsmf"
Apr 16 15:04:31.932563 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:31.932484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrhtx\" (UniqueName: \"kubernetes.io/projected/1560df80-a1e6-491a-a76e-f51c0298a049-kube-api-access-mrhtx\") pod \"kserve-controller-manager-7669bdc57-2qzz9\" (UID: \"1560df80-a1e6-491a-a76e-f51c0298a049\") " pod="kserve/kserve-controller-manager-7669bdc57-2qzz9"
Apr 16 15:04:31.932563 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:31.932527 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6tpv\" (UniqueName: \"kubernetes.io/projected/51f30a24-dd29-4fa2-a9bf-e8c4a58aab7a-kube-api-access-z6tpv\") pod \"seaweedfs-86cc847c5c-7zsmf\" (UID: \"51f30a24-dd29-4fa2-a9bf-e8c4a58aab7a\") " pod="kserve/seaweedfs-86cc847c5c-7zsmf"
Apr 16 15:04:31.932641 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:31.932573
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1560df80-a1e6-491a-a76e-f51c0298a049-cert\") pod \"kserve-controller-manager-7669bdc57-2qzz9\" (UID: \"1560df80-a1e6-491a-a76e-f51c0298a049\") " pod="kserve/kserve-controller-manager-7669bdc57-2qzz9"
Apr 16 15:04:31.935090 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:31.935061 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1560df80-a1e6-491a-a76e-f51c0298a049-cert\") pod \"kserve-controller-manager-7669bdc57-2qzz9\" (UID: \"1560df80-a1e6-491a-a76e-f51c0298a049\") " pod="kserve/kserve-controller-manager-7669bdc57-2qzz9"
Apr 16 15:04:31.941356 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:31.941324 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrhtx\" (UniqueName: \"kubernetes.io/projected/1560df80-a1e6-491a-a76e-f51c0298a049-kube-api-access-mrhtx\") pod \"kserve-controller-manager-7669bdc57-2qzz9\" (UID: \"1560df80-a1e6-491a-a76e-f51c0298a049\") " pod="kserve/kserve-controller-manager-7669bdc57-2qzz9"
Apr 16 15:04:32.033081 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:32.033033 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/51f30a24-dd29-4fa2-a9bf-e8c4a58aab7a-data\") pod \"seaweedfs-86cc847c5c-7zsmf\" (UID: \"51f30a24-dd29-4fa2-a9bf-e8c4a58aab7a\") " pod="kserve/seaweedfs-86cc847c5c-7zsmf"
Apr 16 15:04:32.033262 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:32.033138 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6tpv\" (UniqueName: \"kubernetes.io/projected/51f30a24-dd29-4fa2-a9bf-e8c4a58aab7a-kube-api-access-z6tpv\") pod \"seaweedfs-86cc847c5c-7zsmf\" (UID: \"51f30a24-dd29-4fa2-a9bf-e8c4a58aab7a\") " pod="kserve/seaweedfs-86cc847c5c-7zsmf"
Apr 16 15:04:32.033450 ip-10-0-129-76
kubenswrapper[2576]: I0416 15:04:32.033429 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/51f30a24-dd29-4fa2-a9bf-e8c4a58aab7a-data\") pod \"seaweedfs-86cc847c5c-7zsmf\" (UID: \"51f30a24-dd29-4fa2-a9bf-e8c4a58aab7a\") " pod="kserve/seaweedfs-86cc847c5c-7zsmf"
Apr 16 15:04:32.040536 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:32.040511 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6tpv\" (UniqueName: \"kubernetes.io/projected/51f30a24-dd29-4fa2-a9bf-e8c4a58aab7a-kube-api-access-z6tpv\") pod \"seaweedfs-86cc847c5c-7zsmf\" (UID: \"51f30a24-dd29-4fa2-a9bf-e8c4a58aab7a\") " pod="kserve/seaweedfs-86cc847c5c-7zsmf"
Apr 16 15:04:32.139466 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:32.139388 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7669bdc57-2qzz9"
Apr 16 15:04:32.165362 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:32.165331 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-7zsmf"
Apr 16 15:04:32.282261 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:32.282189 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-2qzz9"]
Apr 16 15:04:32.285101 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:04:32.285072 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1560df80_a1e6_491a_a76e_f51c0298a049.slice/crio-42b68fdaed0727ec11f18f68dceec0335771fae38b3dff5004df5e6a4caf9d9b WatchSource:0}: Error finding container 42b68fdaed0727ec11f18f68dceec0335771fae38b3dff5004df5e6a4caf9d9b: Status 404 returned error can't find the container with id 42b68fdaed0727ec11f18f68dceec0335771fae38b3dff5004df5e6a4caf9d9b
Apr 16 15:04:32.308353 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:32.308334 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-7zsmf"]
Apr 16 15:04:32.309369 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:04:32.309343 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51f30a24_dd29_4fa2_a9bf_e8c4a58aab7a.slice/crio-1dacd92e11b0299f74e525aff725b76ed68a818a20e9dff8d4d12ebe1383bebc WatchSource:0}: Error finding container 1dacd92e11b0299f74e525aff725b76ed68a818a20e9dff8d4d12ebe1383bebc: Status 404 returned error can't find the container with id 1dacd92e11b0299f74e525aff725b76ed68a818a20e9dff8d4d12ebe1383bebc
Apr 16 15:04:32.621039 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:32.620991 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7669bdc57-2qzz9" event={"ID":"1560df80-a1e6-491a-a76e-f51c0298a049","Type":"ContainerStarted","Data":"42b68fdaed0727ec11f18f68dceec0335771fae38b3dff5004df5e6a4caf9d9b"}
Apr 16 15:04:32.622060 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:32.622015 2576 kubelet.go:2569]
"SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-7zsmf" event={"ID":"51f30a24-dd29-4fa2-a9bf-e8c4a58aab7a","Type":"ContainerStarted","Data":"1dacd92e11b0299f74e525aff725b76ed68a818a20e9dff8d4d12ebe1383bebc"} Apr 16 15:04:36.640034 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:36.639983 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-7zsmf" event={"ID":"51f30a24-dd29-4fa2-a9bf-e8c4a58aab7a","Type":"ContainerStarted","Data":"bc9c17f61094da69840024f549ec0d7df3cdb4f3ac937373cec70a553d89d9e9"} Apr 16 15:04:36.640493 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:36.640066 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-7zsmf" Apr 16 15:04:36.641313 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:36.641292 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7669bdc57-2qzz9" event={"ID":"1560df80-a1e6-491a-a76e-f51c0298a049","Type":"ContainerStarted","Data":"13e30fa384bedf2c226dca8328e3c653934974a17a8e21c5317ba92d82e0818d"} Apr 16 15:04:36.641473 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:36.641461 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7669bdc57-2qzz9" Apr 16 15:04:36.656343 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:36.656286 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-7zsmf" podStartSLOduration=2.024087746 podStartE2EDuration="5.656273155s" podCreationTimestamp="2026-04-16 15:04:31 +0000 UTC" firstStartedPulling="2026-04-16 15:04:32.310750029 +0000 UTC m=+706.729884985" lastFinishedPulling="2026-04-16 15:04:35.942935427 +0000 UTC m=+710.362070394" observedRunningTime="2026-04-16 15:04:36.655446925 +0000 UTC m=+711.074581900" watchObservedRunningTime="2026-04-16 15:04:36.656273155 +0000 UTC m=+711.075408129" Apr 16 15:04:36.669793 ip-10-0-129-76 
kubenswrapper[2576]: I0416 15:04:36.669742 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7669bdc57-2qzz9" podStartSLOduration=2.11096947 podStartE2EDuration="5.669726281s" podCreationTimestamp="2026-04-16 15:04:31 +0000 UTC" firstStartedPulling="2026-04-16 15:04:32.286869514 +0000 UTC m=+706.706004474" lastFinishedPulling="2026-04-16 15:04:35.84562633 +0000 UTC m=+710.264761285" observedRunningTime="2026-04-16 15:04:36.669222429 +0000 UTC m=+711.088357406" watchObservedRunningTime="2026-04-16 15:04:36.669726281 +0000 UTC m=+711.088861257" Apr 16 15:04:42.647128 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:04:42.647093 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-7zsmf" Apr 16 15:05:07.082875 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.082839 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-2qzz9"] Apr 16 15:05:07.083450 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.083112 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-7669bdc57-2qzz9" podUID="1560df80-a1e6-491a-a76e-f51c0298a049" containerName="manager" containerID="cri-o://13e30fa384bedf2c226dca8328e3c653934974a17a8e21c5317ba92d82e0818d" gracePeriod=10 Apr 16 15:05:07.088835 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.088807 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7669bdc57-2qzz9" Apr 16 15:05:07.104707 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.104674 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-5tbbk"] Apr 16 15:05:07.108676 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.108648 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7669bdc57-5tbbk" Apr 16 15:05:07.115818 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.115791 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-5tbbk"] Apr 16 15:05:07.262514 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.262315 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-962m2\" (UniqueName: \"kubernetes.io/projected/ce99f83c-1d37-49b5-930d-a5d043dcc6e3-kube-api-access-962m2\") pod \"kserve-controller-manager-7669bdc57-5tbbk\" (UID: \"ce99f83c-1d37-49b5-930d-a5d043dcc6e3\") " pod="kserve/kserve-controller-manager-7669bdc57-5tbbk" Apr 16 15:05:07.262514 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.262394 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce99f83c-1d37-49b5-930d-a5d043dcc6e3-cert\") pod \"kserve-controller-manager-7669bdc57-5tbbk\" (UID: \"ce99f83c-1d37-49b5-930d-a5d043dcc6e3\") " pod="kserve/kserve-controller-manager-7669bdc57-5tbbk" Apr 16 15:05:07.323995 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.323971 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7669bdc57-2qzz9" Apr 16 15:05:07.363766 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.363688 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-962m2\" (UniqueName: \"kubernetes.io/projected/ce99f83c-1d37-49b5-930d-a5d043dcc6e3-kube-api-access-962m2\") pod \"kserve-controller-manager-7669bdc57-5tbbk\" (UID: \"ce99f83c-1d37-49b5-930d-a5d043dcc6e3\") " pod="kserve/kserve-controller-manager-7669bdc57-5tbbk" Apr 16 15:05:07.363766 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.363740 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce99f83c-1d37-49b5-930d-a5d043dcc6e3-cert\") pod \"kserve-controller-manager-7669bdc57-5tbbk\" (UID: \"ce99f83c-1d37-49b5-930d-a5d043dcc6e3\") " pod="kserve/kserve-controller-manager-7669bdc57-5tbbk" Apr 16 15:05:07.366179 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.366154 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce99f83c-1d37-49b5-930d-a5d043dcc6e3-cert\") pod \"kserve-controller-manager-7669bdc57-5tbbk\" (UID: \"ce99f83c-1d37-49b5-930d-a5d043dcc6e3\") " pod="kserve/kserve-controller-manager-7669bdc57-5tbbk" Apr 16 15:05:07.373443 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.373419 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-962m2\" (UniqueName: \"kubernetes.io/projected/ce99f83c-1d37-49b5-930d-a5d043dcc6e3-kube-api-access-962m2\") pod \"kserve-controller-manager-7669bdc57-5tbbk\" (UID: \"ce99f83c-1d37-49b5-930d-a5d043dcc6e3\") " pod="kserve/kserve-controller-manager-7669bdc57-5tbbk" Apr 16 15:05:07.464607 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.464573 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrhtx\" (UniqueName: 
\"kubernetes.io/projected/1560df80-a1e6-491a-a76e-f51c0298a049-kube-api-access-mrhtx\") pod \"1560df80-a1e6-491a-a76e-f51c0298a049\" (UID: \"1560df80-a1e6-491a-a76e-f51c0298a049\") " Apr 16 15:05:07.464774 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.464613 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1560df80-a1e6-491a-a76e-f51c0298a049-cert\") pod \"1560df80-a1e6-491a-a76e-f51c0298a049\" (UID: \"1560df80-a1e6-491a-a76e-f51c0298a049\") " Apr 16 15:05:07.466845 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.466814 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1560df80-a1e6-491a-a76e-f51c0298a049-kube-api-access-mrhtx" (OuterVolumeSpecName: "kube-api-access-mrhtx") pod "1560df80-a1e6-491a-a76e-f51c0298a049" (UID: "1560df80-a1e6-491a-a76e-f51c0298a049"). InnerVolumeSpecName "kube-api-access-mrhtx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:05:07.466845 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.466821 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1560df80-a1e6-491a-a76e-f51c0298a049-cert" (OuterVolumeSpecName: "cert") pod "1560df80-a1e6-491a-a76e-f51c0298a049" (UID: "1560df80-a1e6-491a-a76e-f51c0298a049"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:05:07.473135 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.473115 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7669bdc57-5tbbk" Apr 16 15:05:07.565815 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.565777 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mrhtx\" (UniqueName: \"kubernetes.io/projected/1560df80-a1e6-491a-a76e-f51c0298a049-kube-api-access-mrhtx\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:05:07.565815 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.565816 2576 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1560df80-a1e6-491a-a76e-f51c0298a049-cert\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:05:07.595474 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.595451 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-5tbbk"] Apr 16 15:05:07.598109 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:05:07.598082 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce99f83c_1d37_49b5_930d_a5d043dcc6e3.slice/crio-69e39f6ccf9edbdd45e913f0736a0c2029cc215cf23b4d53164d4bb22b9f8d05 WatchSource:0}: Error finding container 69e39f6ccf9edbdd45e913f0736a0c2029cc215cf23b4d53164d4bb22b9f8d05: Status 404 returned error can't find the container with id 69e39f6ccf9edbdd45e913f0736a0c2029cc215cf23b4d53164d4bb22b9f8d05 Apr 16 15:05:07.746790 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.746748 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7669bdc57-5tbbk" event={"ID":"ce99f83c-1d37-49b5-930d-a5d043dcc6e3","Type":"ContainerStarted","Data":"69e39f6ccf9edbdd45e913f0736a0c2029cc215cf23b4d53164d4bb22b9f8d05"} Apr 16 15:05:07.747881 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.747856 2576 generic.go:358] "Generic (PLEG): container finished" podID="1560df80-a1e6-491a-a76e-f51c0298a049" 
containerID="13e30fa384bedf2c226dca8328e3c653934974a17a8e21c5317ba92d82e0818d" exitCode=0 Apr 16 15:05:07.748003 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.747893 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7669bdc57-2qzz9" event={"ID":"1560df80-a1e6-491a-a76e-f51c0298a049","Type":"ContainerDied","Data":"13e30fa384bedf2c226dca8328e3c653934974a17a8e21c5317ba92d82e0818d"} Apr 16 15:05:07.748003 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.747911 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7669bdc57-2qzz9" event={"ID":"1560df80-a1e6-491a-a76e-f51c0298a049","Type":"ContainerDied","Data":"42b68fdaed0727ec11f18f68dceec0335771fae38b3dff5004df5e6a4caf9d9b"} Apr 16 15:05:07.748003 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.747925 2576 scope.go:117] "RemoveContainer" containerID="13e30fa384bedf2c226dca8328e3c653934974a17a8e21c5317ba92d82e0818d" Apr 16 15:05:07.748003 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.747923 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7669bdc57-2qzz9" Apr 16 15:05:07.757596 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.757579 2576 scope.go:117] "RemoveContainer" containerID="13e30fa384bedf2c226dca8328e3c653934974a17a8e21c5317ba92d82e0818d" Apr 16 15:05:07.757883 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:05:07.757864 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13e30fa384bedf2c226dca8328e3c653934974a17a8e21c5317ba92d82e0818d\": container with ID starting with 13e30fa384bedf2c226dca8328e3c653934974a17a8e21c5317ba92d82e0818d not found: ID does not exist" containerID="13e30fa384bedf2c226dca8328e3c653934974a17a8e21c5317ba92d82e0818d" Apr 16 15:05:07.757931 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.757892 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13e30fa384bedf2c226dca8328e3c653934974a17a8e21c5317ba92d82e0818d"} err="failed to get container status \"13e30fa384bedf2c226dca8328e3c653934974a17a8e21c5317ba92d82e0818d\": rpc error: code = NotFound desc = could not find container \"13e30fa384bedf2c226dca8328e3c653934974a17a8e21c5317ba92d82e0818d\": container with ID starting with 13e30fa384bedf2c226dca8328e3c653934974a17a8e21c5317ba92d82e0818d not found: ID does not exist" Apr 16 15:05:07.770342 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.770314 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-2qzz9"] Apr 16 15:05:07.774266 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:07.774242 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-2qzz9"] Apr 16 15:05:08.307331 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:08.307302 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1560df80-a1e6-491a-a76e-f51c0298a049" 
path="/var/lib/kubelet/pods/1560df80-a1e6-491a-a76e-f51c0298a049/volumes" Apr 16 15:05:08.753813 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:08.753783 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7669bdc57-5tbbk" event={"ID":"ce99f83c-1d37-49b5-930d-a5d043dcc6e3","Type":"ContainerStarted","Data":"583f07a903a1c628628060e84d6e5f63ee0b222cba30c486560db025ec05cde2"} Apr 16 15:05:08.753999 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:08.753929 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7669bdc57-5tbbk" Apr 16 15:05:08.770428 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:08.770384 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7669bdc57-5tbbk" podStartSLOduration=1.262010104 podStartE2EDuration="1.770368859s" podCreationTimestamp="2026-04-16 15:05:07 +0000 UTC" firstStartedPulling="2026-04-16 15:05:07.599384425 +0000 UTC m=+742.018519378" lastFinishedPulling="2026-04-16 15:05:08.107743164 +0000 UTC m=+742.526878133" observedRunningTime="2026-04-16 15:05:08.768121666 +0000 UTC m=+743.187256642" watchObservedRunningTime="2026-04-16 15:05:08.770368859 +0000 UTC m=+743.189503834" Apr 16 15:05:39.762337 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:39.762308 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7669bdc57-5tbbk" Apr 16 15:05:40.605638 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:40.605606 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-2jxbh"] Apr 16 15:05:40.606064 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:40.606011 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1560df80-a1e6-491a-a76e-f51c0298a049" containerName="manager" Apr 16 15:05:40.606064 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:40.606040 2576 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1560df80-a1e6-491a-a76e-f51c0298a049" containerName="manager" Apr 16 15:05:40.606270 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:40.606120 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1560df80-a1e6-491a-a76e-f51c0298a049" containerName="manager" Apr 16 15:05:40.609209 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:40.609189 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-2jxbh" Apr 16 15:05:40.611521 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:40.611499 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-wnkjg\"" Apr 16 15:05:40.611614 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:40.611532 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 15:05:40.620992 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:40.620970 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-2jxbh"] Apr 16 15:05:40.751493 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:40.751460 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jpjl\" (UniqueName: \"kubernetes.io/projected/f31ce00d-1f54-4549-befe-7b377443d8b2-kube-api-access-7jpjl\") pod \"model-serving-api-86f7b4b499-2jxbh\" (UID: \"f31ce00d-1f54-4549-befe-7b377443d8b2\") " pod="kserve/model-serving-api-86f7b4b499-2jxbh" Apr 16 15:05:40.751663 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:40.751520 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f31ce00d-1f54-4549-befe-7b377443d8b2-tls-certs\") pod \"model-serving-api-86f7b4b499-2jxbh\" (UID: \"f31ce00d-1f54-4549-befe-7b377443d8b2\") " pod="kserve/model-serving-api-86f7b4b499-2jxbh" Apr 
16 15:05:40.852812 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:40.852774 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f31ce00d-1f54-4549-befe-7b377443d8b2-tls-certs\") pod \"model-serving-api-86f7b4b499-2jxbh\" (UID: \"f31ce00d-1f54-4549-befe-7b377443d8b2\") " pod="kserve/model-serving-api-86f7b4b499-2jxbh" Apr 16 15:05:40.853291 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:40.852875 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jpjl\" (UniqueName: \"kubernetes.io/projected/f31ce00d-1f54-4549-befe-7b377443d8b2-kube-api-access-7jpjl\") pod \"model-serving-api-86f7b4b499-2jxbh\" (UID: \"f31ce00d-1f54-4549-befe-7b377443d8b2\") " pod="kserve/model-serving-api-86f7b4b499-2jxbh" Apr 16 15:05:40.855331 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:40.855308 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f31ce00d-1f54-4549-befe-7b377443d8b2-tls-certs\") pod \"model-serving-api-86f7b4b499-2jxbh\" (UID: \"f31ce00d-1f54-4549-befe-7b377443d8b2\") " pod="kserve/model-serving-api-86f7b4b499-2jxbh" Apr 16 15:05:40.861216 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:40.861149 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jpjl\" (UniqueName: \"kubernetes.io/projected/f31ce00d-1f54-4549-befe-7b377443d8b2-kube-api-access-7jpjl\") pod \"model-serving-api-86f7b4b499-2jxbh\" (UID: \"f31ce00d-1f54-4549-befe-7b377443d8b2\") " pod="kserve/model-serving-api-86f7b4b499-2jxbh" Apr 16 15:05:40.921238 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:40.921196 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-2jxbh" Apr 16 15:05:41.044058 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:41.043950 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-2jxbh"] Apr 16 15:05:41.046248 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:05:41.046220 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf31ce00d_1f54_4549_befe_7b377443d8b2.slice/crio-20f5a8d5e43965c80afe3c39c6a17c3bcc9565bd3a0777a949d5971271b6c3f7 WatchSource:0}: Error finding container 20f5a8d5e43965c80afe3c39c6a17c3bcc9565bd3a0777a949d5971271b6c3f7: Status 404 returned error can't find the container with id 20f5a8d5e43965c80afe3c39c6a17c3bcc9565bd3a0777a949d5971271b6c3f7 Apr 16 15:05:41.871870 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:41.871832 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-2jxbh" event={"ID":"f31ce00d-1f54-4549-befe-7b377443d8b2","Type":"ContainerStarted","Data":"20f5a8d5e43965c80afe3c39c6a17c3bcc9565bd3a0777a949d5971271b6c3f7"} Apr 16 15:05:43.879266 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:43.879230 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-2jxbh" event={"ID":"f31ce00d-1f54-4549-befe-7b377443d8b2","Type":"ContainerStarted","Data":"b1eda7ca53f93655f8d12f4c7bca6ac07e4eebc29b754071f7046be32d68f0a7"} Apr 16 15:05:43.879658 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:43.879355 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-2jxbh" Apr 16 15:05:43.900740 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:43.900689 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-2jxbh" podStartSLOduration=2.087142271 podStartE2EDuration="3.900675431s" podCreationTimestamp="2026-04-16 
15:05:40 +0000 UTC" firstStartedPulling="2026-04-16 15:05:41.047961498 +0000 UTC m=+775.467096455" lastFinishedPulling="2026-04-16 15:05:42.861494659 +0000 UTC m=+777.280629615" observedRunningTime="2026-04-16 15:05:43.899462063 +0000 UTC m=+778.318597039" watchObservedRunningTime="2026-04-16 15:05:43.900675431 +0000 UTC m=+778.319810406" Apr 16 15:05:54.886232 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:05:54.886186 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-2jxbh" Apr 16 15:06:24.180655 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:24.180621 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-rvntz"] Apr 16 15:06:24.183814 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:24.183797 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-rvntz" Apr 16 15:06:24.185922 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:24.185900 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 16 15:06:24.186057 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:24.185900 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 16 15:06:24.190863 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:24.190836 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-rvntz"] Apr 16 15:06:24.234082 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:24.234045 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0e9169df-66a6-46a5-b2de-67e2ef9a9606-data\") pod \"seaweedfs-tls-serving-7fd5766db9-rvntz\" (UID: \"0e9169df-66a6-46a5-b2de-67e2ef9a9606\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-rvntz" Apr 16 15:06:24.234266 ip-10-0-129-76 
kubenswrapper[2576]: I0416 15:06:24.234105 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/0e9169df-66a6-46a5-b2de-67e2ef9a9606-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-rvntz\" (UID: \"0e9169df-66a6-46a5-b2de-67e2ef9a9606\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-rvntz" Apr 16 15:06:24.234266 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:24.234142 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs2zr\" (UniqueName: \"kubernetes.io/projected/0e9169df-66a6-46a5-b2de-67e2ef9a9606-kube-api-access-fs2zr\") pod \"seaweedfs-tls-serving-7fd5766db9-rvntz\" (UID: \"0e9169df-66a6-46a5-b2de-67e2ef9a9606\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-rvntz" Apr 16 15:06:24.335476 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:24.335441 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0e9169df-66a6-46a5-b2de-67e2ef9a9606-data\") pod \"seaweedfs-tls-serving-7fd5766db9-rvntz\" (UID: \"0e9169df-66a6-46a5-b2de-67e2ef9a9606\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-rvntz" Apr 16 15:06:24.335637 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:24.335517 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/0e9169df-66a6-46a5-b2de-67e2ef9a9606-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-rvntz\" (UID: \"0e9169df-66a6-46a5-b2de-67e2ef9a9606\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-rvntz" Apr 16 15:06:24.335637 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:24.335552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fs2zr\" (UniqueName: 
\"kubernetes.io/projected/0e9169df-66a6-46a5-b2de-67e2ef9a9606-kube-api-access-fs2zr\") pod \"seaweedfs-tls-serving-7fd5766db9-rvntz\" (UID: \"0e9169df-66a6-46a5-b2de-67e2ef9a9606\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-rvntz" Apr 16 15:06:24.335713 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:06:24.335668 2576 projected.go:264] Couldn't get secret kserve/seaweedfs-tls-serving: secret "seaweedfs-tls-serving" not found Apr 16 15:06:24.335713 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:06:24.335690 2576 projected.go:194] Error preparing data for projected volume seaweedfs-tls-serving for pod kserve/seaweedfs-tls-serving-7fd5766db9-rvntz: secret "seaweedfs-tls-serving" not found Apr 16 15:06:24.335775 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:06:24.335757 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e9169df-66a6-46a5-b2de-67e2ef9a9606-seaweedfs-tls-serving podName:0e9169df-66a6-46a5-b2de-67e2ef9a9606 nodeName:}" failed. No retries permitted until 2026-04-16 15:06:24.835737322 +0000 UTC m=+819.254872291 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "seaweedfs-tls-serving" (UniqueName: "kubernetes.io/projected/0e9169df-66a6-46a5-b2de-67e2ef9a9606-seaweedfs-tls-serving") pod "seaweedfs-tls-serving-7fd5766db9-rvntz" (UID: "0e9169df-66a6-46a5-b2de-67e2ef9a9606") : secret "seaweedfs-tls-serving" not found Apr 16 15:06:24.335868 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:24.335849 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0e9169df-66a6-46a5-b2de-67e2ef9a9606-data\") pod \"seaweedfs-tls-serving-7fd5766db9-rvntz\" (UID: \"0e9169df-66a6-46a5-b2de-67e2ef9a9606\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-rvntz" Apr 16 15:06:24.345476 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:24.345447 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs2zr\" (UniqueName: \"kubernetes.io/projected/0e9169df-66a6-46a5-b2de-67e2ef9a9606-kube-api-access-fs2zr\") pod \"seaweedfs-tls-serving-7fd5766db9-rvntz\" (UID: \"0e9169df-66a6-46a5-b2de-67e2ef9a9606\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-rvntz" Apr 16 15:06:24.840317 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:24.840275 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/0e9169df-66a6-46a5-b2de-67e2ef9a9606-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-rvntz\" (UID: \"0e9169df-66a6-46a5-b2de-67e2ef9a9606\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-rvntz" Apr 16 15:06:24.842649 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:24.842625 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/0e9169df-66a6-46a5-b2de-67e2ef9a9606-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-rvntz\" (UID: \"0e9169df-66a6-46a5-b2de-67e2ef9a9606\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-rvntz" Apr 
16 15:06:25.094750 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:25.094665 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-rvntz" Apr 16 15:06:25.214355 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:25.214332 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-rvntz"] Apr 16 15:06:25.216856 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:06:25.216827 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e9169df_66a6_46a5_b2de_67e2ef9a9606.slice/crio-d5d608082c0120ec20eec2e923eb4b4c9e2fb2909e54d585496144bb0690e219 WatchSource:0}: Error finding container d5d608082c0120ec20eec2e923eb4b4c9e2fb2909e54d585496144bb0690e219: Status 404 returned error can't find the container with id d5d608082c0120ec20eec2e923eb4b4c9e2fb2909e54d585496144bb0690e219 Apr 16 15:06:26.032824 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:26.032785 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-rvntz" event={"ID":"0e9169df-66a6-46a5-b2de-67e2ef9a9606","Type":"ContainerStarted","Data":"d32808e683b7bdd853acb3b65f5837066ecac24e3b8670909f5a1086fe34a436"} Apr 16 15:06:26.032824 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:26.032827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-rvntz" event={"ID":"0e9169df-66a6-46a5-b2de-67e2ef9a9606","Type":"ContainerStarted","Data":"d5d608082c0120ec20eec2e923eb4b4c9e2fb2909e54d585496144bb0690e219"} Apr 16 15:06:26.049078 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:26.049007 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-rvntz" podStartSLOduration=1.798326033 podStartE2EDuration="2.048994635s" podCreationTimestamp="2026-04-16 15:06:24 +0000 UTC" firstStartedPulling="2026-04-16 
15:06:25.218108905 +0000 UTC m=+819.637243857" lastFinishedPulling="2026-04-16 15:06:25.468777506 +0000 UTC m=+819.887912459" observedRunningTime="2026-04-16 15:06:26.047754966 +0000 UTC m=+820.466889940" watchObservedRunningTime="2026-04-16 15:06:26.048994635 +0000 UTC m=+820.468129646" Apr 16 15:06:42.628374 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:42.628335 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k"] Apr 16 15:06:42.631754 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:42.631735 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" Apr 16 15:06:42.634165 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:42.634141 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tsb5d\"" Apr 16 15:06:42.639156 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:42.639129 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k"] Apr 16 15:06:42.710292 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:42.710253 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/968414fb-53db-4c60-8030-db726b7449fe-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-5d8576b666-c782k\" (UID: \"968414fb-53db-4c60-8030-db726b7449fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" Apr 16 15:06:42.811317 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:42.811283 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/968414fb-53db-4c60-8030-db726b7449fe-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-5d8576b666-c782k\" 
(UID: \"968414fb-53db-4c60-8030-db726b7449fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" Apr 16 15:06:42.811679 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:42.811655 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/968414fb-53db-4c60-8030-db726b7449fe-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-5d8576b666-c782k\" (UID: \"968414fb-53db-4c60-8030-db726b7449fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" Apr 16 15:06:42.943691 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:42.943651 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" Apr 16 15:06:43.270594 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:43.270529 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k"] Apr 16 15:06:43.272961 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:06:43.272927 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod968414fb_53db_4c60_8030_db726b7449fe.slice/crio-6d3392c5cc85b41beb087455d86fa5e109f6e1b8e8354d4ba9b1d293e666d1f9 WatchSource:0}: Error finding container 6d3392c5cc85b41beb087455d86fa5e109f6e1b8e8354d4ba9b1d293e666d1f9: Status 404 returned error can't find the container with id 6d3392c5cc85b41beb087455d86fa5e109f6e1b8e8354d4ba9b1d293e666d1f9 Apr 16 15:06:44.095054 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:44.094999 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" event={"ID":"968414fb-53db-4c60-8030-db726b7449fe","Type":"ContainerStarted","Data":"6d3392c5cc85b41beb087455d86fa5e109f6e1b8e8354d4ba9b1d293e666d1f9"} Apr 16 15:06:47.106954 ip-10-0-129-76 
kubenswrapper[2576]: I0416 15:06:47.106918 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" event={"ID":"968414fb-53db-4c60-8030-db726b7449fe","Type":"ContainerStarted","Data":"28c2b1a698296e1a779ee068f0580fcf7c31ab739040425ee7f3ddd11e6be735"} Apr 16 15:06:51.123211 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:51.123175 2576 generic.go:358] "Generic (PLEG): container finished" podID="968414fb-53db-4c60-8030-db726b7449fe" containerID="28c2b1a698296e1a779ee068f0580fcf7c31ab739040425ee7f3ddd11e6be735" exitCode=0 Apr 16 15:06:51.123603 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:06:51.123226 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" event={"ID":"968414fb-53db-4c60-8030-db726b7449fe","Type":"ContainerDied","Data":"28c2b1a698296e1a779ee068f0580fcf7c31ab739040425ee7f3ddd11e6be735"} Apr 16 15:07:04.360045 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:07:04.360001 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:07:05.185666 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:07:05.185628 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" event={"ID":"968414fb-53db-4c60-8030-db726b7449fe","Type":"ContainerStarted","Data":"754265a328ae7b0cf64b686122fac35ac90859ef01f9af3f2aefdc7d5719f072"} Apr 16 15:07:08.198345 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:07:08.198308 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" event={"ID":"968414fb-53db-4c60-8030-db726b7449fe","Type":"ContainerStarted","Data":"a484c29487f6f5b247e6f05b838b77f3da350fe78ae7f4bf63e8fbc8defd9c78"} Apr 16 15:07:08.198718 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:07:08.198503 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" Apr 16 15:07:08.199952 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:07:08.199925 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 15:07:08.214912 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:07:08.214868 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podStartSLOduration=2.003827552 podStartE2EDuration="26.21485526s" podCreationTimestamp="2026-04-16 15:06:42 +0000 UTC" firstStartedPulling="2026-04-16 15:06:43.275081921 +0000 UTC m=+837.694216878" lastFinishedPulling="2026-04-16 15:07:07.486109625 +0000 UTC m=+861.905244586" observedRunningTime="2026-04-16 15:07:08.212811588 +0000 UTC m=+862.631946563" watchObservedRunningTime="2026-04-16 15:07:08.21485526 +0000 UTC m=+862.633990235" Apr 16 15:07:09.207868 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:07:09.205445 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" Apr 16 15:07:09.208347 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:07:09.208316 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 15:07:09.208690 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:07:09.208654 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" 
containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:07:10.207631 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:07:10.207585 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 15:07:10.207948 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:07:10.207919 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:07:20.208334 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:07:20.208291 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 15:07:20.208805 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:07:20.208776 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:07:30.208121 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:07:30.207988 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 15:07:30.208569 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:07:30.208469 
2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:07:40.207869 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:07:40.207823 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 15:07:40.208385 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:07:40.208359 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:07:46.246261 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:07:46.246231 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log" Apr 16 15:07:46.246660 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:07:46.246408 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log" Apr 16 15:07:46.248419 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:07:46.248402 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log" Apr 16 15:07:46.248514 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:07:46.248460 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log" Apr 16 15:07:50.207793 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:07:50.207750 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 15:07:50.208288 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:07:50.208263 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:08:00.207632 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:00.207582 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 15:08:00.208130 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:00.207956 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:08:10.208576 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:10.208525 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 15:08:10.209177 ip-10-0-129-76 
kubenswrapper[2576]: I0416 15:08:10.208984 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:08:20.208216 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:20.208185 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" Apr 16 15:08:20.208724 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:20.208653 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" Apr 16 15:08:27.759755 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:27.759720 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k"] Apr 16 15:08:27.761992 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:27.760123 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="kserve-container" containerID="cri-o://754265a328ae7b0cf64b686122fac35ac90859ef01f9af3f2aefdc7d5719f072" gracePeriod=30 Apr 16 15:08:27.761992 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:27.760210 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="agent" containerID="cri-o://a484c29487f6f5b247e6f05b838b77f3da350fe78ae7f4bf63e8fbc8defd9c78" gracePeriod=30 Apr 16 15:08:27.858079 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:27.858011 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2"] 
Apr 16 15:08:27.861703 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:27.861682 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" Apr 16 15:08:27.871653 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:27.871624 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2"] Apr 16 15:08:27.948441 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:27.948405 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2fdb6f45-ea50-4d34-96de-52cf6cb55b98-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2\" (UID: \"2fdb6f45-ea50-4d34-96de-52cf6cb55b98\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" Apr 16 15:08:28.049851 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:28.049752 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2fdb6f45-ea50-4d34-96de-52cf6cb55b98-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2\" (UID: \"2fdb6f45-ea50-4d34-96de-52cf6cb55b98\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" Apr 16 15:08:28.050203 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:28.050181 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2fdb6f45-ea50-4d34-96de-52cf6cb55b98-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2\" (UID: \"2fdb6f45-ea50-4d34-96de-52cf6cb55b98\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" Apr 16 15:08:28.174410 ip-10-0-129-76 kubenswrapper[2576]: I0416 
15:08:28.174377 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" Apr 16 15:08:28.297539 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:28.297508 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2"] Apr 16 15:08:28.301312 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:08:28.301249 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fdb6f45_ea50_4d34_96de_52cf6cb55b98.slice/crio-95275c75c4bd79cc3272596f7fa6e58533a777be43b1d0fe8c3dc82ca169ba30 WatchSource:0}: Error finding container 95275c75c4bd79cc3272596f7fa6e58533a777be43b1d0fe8c3dc82ca169ba30: Status 404 returned error can't find the container with id 95275c75c4bd79cc3272596f7fa6e58533a777be43b1d0fe8c3dc82ca169ba30 Apr 16 15:08:28.483255 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:28.483220 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" event={"ID":"2fdb6f45-ea50-4d34-96de-52cf6cb55b98","Type":"ContainerStarted","Data":"c26508d8e51a856e476be2bb6e1f74b6fcb47d37e27366fdb7352e9eaf4b6fe6"} Apr 16 15:08:28.483431 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:28.483262 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" event={"ID":"2fdb6f45-ea50-4d34-96de-52cf6cb55b98","Type":"ContainerStarted","Data":"95275c75c4bd79cc3272596f7fa6e58533a777be43b1d0fe8c3dc82ca169ba30"} Apr 16 15:08:30.207966 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:30.207923 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 15:08:30.209525 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:30.209494 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:08:32.498706 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:32.498605 2576 generic.go:358] "Generic (PLEG): container finished" podID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerID="c26508d8e51a856e476be2bb6e1f74b6fcb47d37e27366fdb7352e9eaf4b6fe6" exitCode=0 Apr 16 15:08:32.498706 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:32.498682 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" event={"ID":"2fdb6f45-ea50-4d34-96de-52cf6cb55b98","Type":"ContainerDied","Data":"c26508d8e51a856e476be2bb6e1f74b6fcb47d37e27366fdb7352e9eaf4b6fe6"} Apr 16 15:08:32.500659 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:32.500638 2576 generic.go:358] "Generic (PLEG): container finished" podID="968414fb-53db-4c60-8030-db726b7449fe" containerID="754265a328ae7b0cf64b686122fac35ac90859ef01f9af3f2aefdc7d5719f072" exitCode=0 Apr 16 15:08:32.500760 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:32.500695 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" event={"ID":"968414fb-53db-4c60-8030-db726b7449fe","Type":"ContainerDied","Data":"754265a328ae7b0cf64b686122fac35ac90859ef01f9af3f2aefdc7d5719f072"} Apr 16 15:08:33.505978 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:33.505945 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" 
event={"ID":"2fdb6f45-ea50-4d34-96de-52cf6cb55b98","Type":"ContainerStarted","Data":"8a136d8197a6229c18ef2ab928f824edf71131702b67467ca7d997db1a21cdcd"} Apr 16 15:08:33.506383 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:33.505986 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" event={"ID":"2fdb6f45-ea50-4d34-96de-52cf6cb55b98","Type":"ContainerStarted","Data":"48c6920ad0c550b192e5bb24c3db0c2eb65644e8283c9e8d6d546b7d9c7461e7"} Apr 16 15:08:33.506383 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:33.506281 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" Apr 16 15:08:33.507659 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:33.507631 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Apr 16 15:08:33.524429 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:33.524388 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podStartSLOduration=6.524371883 podStartE2EDuration="6.524371883s" podCreationTimestamp="2026-04-16 15:08:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:08:33.521855995 +0000 UTC m=+947.940990971" watchObservedRunningTime="2026-04-16 15:08:33.524371883 +0000 UTC m=+947.943506860" Apr 16 15:08:34.509771 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:34.509734 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" Apr 16 15:08:34.510184 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:34.509921 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Apr 16 15:08:34.510686 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:34.510664 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:08:35.513611 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:35.513573 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Apr 16 15:08:35.514054 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:35.513966 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:08:40.208158 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:40.208112 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 15:08:40.208996 ip-10-0-129-76 kubenswrapper[2576]: I0416 
15:08:40.208968 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:08:45.513753 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:45.513698 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Apr 16 15:08:45.514229 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:45.514205 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:08:50.207794 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:50.207755 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 15:08:50.208213 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:50.207874 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" Apr 16 15:08:50.209356 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:50.209329 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 
15:08:50.209464 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:50.209424 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" Apr 16 15:08:55.514356 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:55.514301 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Apr 16 15:08:55.514887 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:55.514736 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:08:58.402577 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:58.402554 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" Apr 16 15:08:58.418844 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:58.418532 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/968414fb-53db-4c60-8030-db726b7449fe-kserve-provision-location\") pod \"968414fb-53db-4c60-8030-db726b7449fe\" (UID: \"968414fb-53db-4c60-8030-db726b7449fe\") " Apr 16 15:08:58.419456 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:58.419414 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/968414fb-53db-4c60-8030-db726b7449fe-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "968414fb-53db-4c60-8030-db726b7449fe" (UID: "968414fb-53db-4c60-8030-db726b7449fe"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:08:58.519376 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:58.519338 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/968414fb-53db-4c60-8030-db726b7449fe-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:08:58.597908 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:58.597828 2576 generic.go:358] "Generic (PLEG): container finished" podID="968414fb-53db-4c60-8030-db726b7449fe" containerID="a484c29487f6f5b247e6f05b838b77f3da350fe78ae7f4bf63e8fbc8defd9c78" exitCode=0 Apr 16 15:08:58.598068 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:58.597908 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" Apr 16 15:08:58.598068 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:58.597906 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" event={"ID":"968414fb-53db-4c60-8030-db726b7449fe","Type":"ContainerDied","Data":"a484c29487f6f5b247e6f05b838b77f3da350fe78ae7f4bf63e8fbc8defd9c78"} Apr 16 15:08:58.598068 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:58.597946 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k" event={"ID":"968414fb-53db-4c60-8030-db726b7449fe","Type":"ContainerDied","Data":"6d3392c5cc85b41beb087455d86fa5e109f6e1b8e8354d4ba9b1d293e666d1f9"} Apr 16 15:08:58.598068 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:58.597966 2576 scope.go:117] "RemoveContainer" containerID="a484c29487f6f5b247e6f05b838b77f3da350fe78ae7f4bf63e8fbc8defd9c78" Apr 16 15:08:58.606505 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:58.606488 2576 scope.go:117] "RemoveContainer" 
containerID="754265a328ae7b0cf64b686122fac35ac90859ef01f9af3f2aefdc7d5719f072" Apr 16 15:08:58.613806 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:58.613788 2576 scope.go:117] "RemoveContainer" containerID="28c2b1a698296e1a779ee068f0580fcf7c31ab739040425ee7f3ddd11e6be735" Apr 16 15:08:58.619390 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:58.619364 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k"] Apr 16 15:08:58.622250 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:58.622231 2576 scope.go:117] "RemoveContainer" containerID="a484c29487f6f5b247e6f05b838b77f3da350fe78ae7f4bf63e8fbc8defd9c78" Apr 16 15:08:58.622507 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:08:58.622488 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a484c29487f6f5b247e6f05b838b77f3da350fe78ae7f4bf63e8fbc8defd9c78\": container with ID starting with a484c29487f6f5b247e6f05b838b77f3da350fe78ae7f4bf63e8fbc8defd9c78 not found: ID does not exist" containerID="a484c29487f6f5b247e6f05b838b77f3da350fe78ae7f4bf63e8fbc8defd9c78" Apr 16 15:08:58.622548 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:58.622517 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a484c29487f6f5b247e6f05b838b77f3da350fe78ae7f4bf63e8fbc8defd9c78"} err="failed to get container status \"a484c29487f6f5b247e6f05b838b77f3da350fe78ae7f4bf63e8fbc8defd9c78\": rpc error: code = NotFound desc = could not find container \"a484c29487f6f5b247e6f05b838b77f3da350fe78ae7f4bf63e8fbc8defd9c78\": container with ID starting with a484c29487f6f5b247e6f05b838b77f3da350fe78ae7f4bf63e8fbc8defd9c78 not found: ID does not exist" Apr 16 15:08:58.622548 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:58.622538 2576 scope.go:117] "RemoveContainer" containerID="754265a328ae7b0cf64b686122fac35ac90859ef01f9af3f2aefdc7d5719f072" Apr 16 15:08:58.622807 
ip-10-0-129-76 kubenswrapper[2576]: E0416 15:08:58.622791 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"754265a328ae7b0cf64b686122fac35ac90859ef01f9af3f2aefdc7d5719f072\": container with ID starting with 754265a328ae7b0cf64b686122fac35ac90859ef01f9af3f2aefdc7d5719f072 not found: ID does not exist" containerID="754265a328ae7b0cf64b686122fac35ac90859ef01f9af3f2aefdc7d5719f072" Apr 16 15:08:58.622846 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:58.622815 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"754265a328ae7b0cf64b686122fac35ac90859ef01f9af3f2aefdc7d5719f072"} err="failed to get container status \"754265a328ae7b0cf64b686122fac35ac90859ef01f9af3f2aefdc7d5719f072\": rpc error: code = NotFound desc = could not find container \"754265a328ae7b0cf64b686122fac35ac90859ef01f9af3f2aefdc7d5719f072\": container with ID starting with 754265a328ae7b0cf64b686122fac35ac90859ef01f9af3f2aefdc7d5719f072 not found: ID does not exist" Apr 16 15:08:58.622846 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:58.622831 2576 scope.go:117] "RemoveContainer" containerID="28c2b1a698296e1a779ee068f0580fcf7c31ab739040425ee7f3ddd11e6be735" Apr 16 15:08:58.623077 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:08:58.623055 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28c2b1a698296e1a779ee068f0580fcf7c31ab739040425ee7f3ddd11e6be735\": container with ID starting with 28c2b1a698296e1a779ee068f0580fcf7c31ab739040425ee7f3ddd11e6be735 not found: ID does not exist" containerID="28c2b1a698296e1a779ee068f0580fcf7c31ab739040425ee7f3ddd11e6be735" Apr 16 15:08:58.623077 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:08:58.623068 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d8576b666-c782k"] Apr 16 15:08:58.623217 ip-10-0-129-76 
kubenswrapper[2576]: I0416 15:08:58.623080 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28c2b1a698296e1a779ee068f0580fcf7c31ab739040425ee7f3ddd11e6be735"} err="failed to get container status \"28c2b1a698296e1a779ee068f0580fcf7c31ab739040425ee7f3ddd11e6be735\": rpc error: code = NotFound desc = could not find container \"28c2b1a698296e1a779ee068f0580fcf7c31ab739040425ee7f3ddd11e6be735\": container with ID starting with 28c2b1a698296e1a779ee068f0580fcf7c31ab739040425ee7f3ddd11e6be735 not found: ID does not exist" Apr 16 15:09:00.307633 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:09:00.307602 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="968414fb-53db-4c60-8030-db726b7449fe" path="/var/lib/kubelet/pods/968414fb-53db-4c60-8030-db726b7449fe/volumes" Apr 16 15:09:05.513715 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:09:05.513670 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Apr 16 15:09:05.514231 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:09:05.514212 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:09:15.514013 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:09:15.513962 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Apr 16 15:09:15.514437 
ip-10-0-129-76 kubenswrapper[2576]: I0416 15:09:15.514410 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:09:25.513733 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:09:25.513683 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Apr 16 15:09:25.514170 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:09:25.514067 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:09:35.513929 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:09:35.513878 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Apr 16 15:09:35.514452 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:09:35.514363 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:09:45.514201 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:09:45.514164 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" Apr 16 15:09:45.514686 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:09:45.514452 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" Apr 16 15:09:52.932692 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:09:52.932659 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2"] Apr 16 15:09:52.933122 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:09:52.933047 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="kserve-container" containerID="cri-o://48c6920ad0c550b192e5bb24c3db0c2eb65644e8283c9e8d6d546b7d9c7461e7" gracePeriod=30 Apr 16 15:09:52.933215 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:09:52.933158 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="agent" containerID="cri-o://8a136d8197a6229c18ef2ab928f824edf71131702b67467ca7d997db1a21cdcd" gracePeriod=30 Apr 16 15:09:55.514347 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:09:55.514289 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Apr 16 15:09:55.514752 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:09:55.514632 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" 
podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:09:57.816183 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:09:57.816152 2576 generic.go:358] "Generic (PLEG): container finished" podID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerID="48c6920ad0c550b192e5bb24c3db0c2eb65644e8283c9e8d6d546b7d9c7461e7" exitCode=0 Apr 16 15:09:57.816528 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:09:57.816195 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" event={"ID":"2fdb6f45-ea50-4d34-96de-52cf6cb55b98","Type":"ContainerDied","Data":"48c6920ad0c550b192e5bb24c3db0c2eb65644e8283c9e8d6d546b7d9c7461e7"} Apr 16 15:10:03.015225 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:03.015185 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf"] Apr 16 15:10:03.015687 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:03.015637 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="agent" Apr 16 15:10:03.015687 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:03.015651 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="agent" Apr 16 15:10:03.015687 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:03.015667 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="storage-initializer" Apr 16 15:10:03.015687 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:03.015673 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="storage-initializer" Apr 16 15:10:03.015687 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:03.015685 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="kserve-container" Apr 16 15:10:03.015687 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:03.015691 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="kserve-container" Apr 16 15:10:03.015981 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:03.015752 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="kserve-container" Apr 16 15:10:03.015981 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:03.015763 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="968414fb-53db-4c60-8030-db726b7449fe" containerName="agent" Apr 16 15:10:03.018992 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:03.018975 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" Apr 16 15:10:03.027662 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:03.027594 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf"] Apr 16 15:10:03.071708 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:03.071663 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed0fc457-d488-4087-bfa9-5cc3853f0a2a-kserve-provision-location\") pod \"isvc-logger-predictor-7c7767ff77-fcpnf\" (UID: \"ed0fc457-d488-4087-bfa9-5cc3853f0a2a\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" Apr 16 15:10:03.172228 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:03.172186 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed0fc457-d488-4087-bfa9-5cc3853f0a2a-kserve-provision-location\") pod \"isvc-logger-predictor-7c7767ff77-fcpnf\" (UID: 
\"ed0fc457-d488-4087-bfa9-5cc3853f0a2a\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" Apr 16 15:10:03.172601 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:03.172578 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed0fc457-d488-4087-bfa9-5cc3853f0a2a-kserve-provision-location\") pod \"isvc-logger-predictor-7c7767ff77-fcpnf\" (UID: \"ed0fc457-d488-4087-bfa9-5cc3853f0a2a\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" Apr 16 15:10:03.332664 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:03.332563 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" Apr 16 15:10:03.465345 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:03.465315 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf"] Apr 16 15:10:03.468633 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:10:03.468596 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded0fc457_d488_4087_bfa9_5cc3853f0a2a.slice/crio-6f98b9bd7c8284b804dad84b4130efd82f2368b30bef9a36524f05d5511da76a WatchSource:0}: Error finding container 6f98b9bd7c8284b804dad84b4130efd82f2368b30bef9a36524f05d5511da76a: Status 404 returned error can't find the container with id 6f98b9bd7c8284b804dad84b4130efd82f2368b30bef9a36524f05d5511da76a Apr 16 15:10:03.838464 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:03.838429 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" event={"ID":"ed0fc457-d488-4087-bfa9-5cc3853f0a2a","Type":"ContainerStarted","Data":"d376a4f81676efa352efad64bbed8ab4c736b98f9df14f780ec21903310a0fa3"} Apr 16 15:10:03.838464 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:03.838468 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" event={"ID":"ed0fc457-d488-4087-bfa9-5cc3853f0a2a","Type":"ContainerStarted","Data":"6f98b9bd7c8284b804dad84b4130efd82f2368b30bef9a36524f05d5511da76a"} Apr 16 15:10:05.513662 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:05.513619 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Apr 16 15:10:05.514088 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:05.513933 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:10:07.853231 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:07.853198 2576 generic.go:358] "Generic (PLEG): container finished" podID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerID="d376a4f81676efa352efad64bbed8ab4c736b98f9df14f780ec21903310a0fa3" exitCode=0 Apr 16 15:10:07.853641 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:07.853271 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" event={"ID":"ed0fc457-d488-4087-bfa9-5cc3853f0a2a","Type":"ContainerDied","Data":"d376a4f81676efa352efad64bbed8ab4c736b98f9df14f780ec21903310a0fa3"} Apr 16 15:10:08.859595 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:08.859555 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" event={"ID":"ed0fc457-d488-4087-bfa9-5cc3853f0a2a","Type":"ContainerStarted","Data":"76715681552d50fc86e221c6a1d03886fc58d98fa0cad5ab1ef4de652b4a44ea"} Apr 16 15:10:08.859595 
ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:08.859598 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" event={"ID":"ed0fc457-d488-4087-bfa9-5cc3853f0a2a","Type":"ContainerStarted","Data":"f87e07dd9cc46e35f88813cd2e59dabca2c0f842fd7a4f0aa98107ef8e87ada8"} Apr 16 15:10:08.860171 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:08.859896 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" Apr 16 15:10:08.861266 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:08.861243 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 15:10:08.876448 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:08.876395 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podStartSLOduration=6.87637659 podStartE2EDuration="6.87637659s" podCreationTimestamp="2026-04-16 15:10:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:10:08.875087261 +0000 UTC m=+1043.294222237" watchObservedRunningTime="2026-04-16 15:10:08.87637659 +0000 UTC m=+1043.295511566" Apr 16 15:10:09.862823 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:09.862792 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" Apr 16 15:10:09.863251 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:09.862918 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 15:10:09.863827 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:09.863800 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:10:10.866687 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:10.866645 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 15:10:10.867096 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:10.866995 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:10:15.514175 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:15.514124 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Apr 16 15:10:15.514623 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:15.514285 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" Apr 16 15:10:15.514623 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:15.514431 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:10:15.514623 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:15.514536 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" Apr 16 15:10:20.866857 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:20.866807 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 15:10:20.867388 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:20.867363 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:10:23.083431 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:23.083409 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" Apr 16 15:10:23.153759 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:23.153720 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2fdb6f45-ea50-4d34-96de-52cf6cb55b98-kserve-provision-location\") pod \"2fdb6f45-ea50-4d34-96de-52cf6cb55b98\" (UID: \"2fdb6f45-ea50-4d34-96de-52cf6cb55b98\") " Apr 16 15:10:23.154078 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:23.154052 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fdb6f45-ea50-4d34-96de-52cf6cb55b98-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2fdb6f45-ea50-4d34-96de-52cf6cb55b98" (UID: "2fdb6f45-ea50-4d34-96de-52cf6cb55b98"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:10:23.255241 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:23.255160 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2fdb6f45-ea50-4d34-96de-52cf6cb55b98-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:10:23.913972 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:23.913931 2576 generic.go:358] "Generic (PLEG): container finished" podID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerID="8a136d8197a6229c18ef2ab928f824edf71131702b67467ca7d997db1a21cdcd" exitCode=0 Apr 16 15:10:23.914175 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:23.914016 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" Apr 16 15:10:23.914175 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:23.914045 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" event={"ID":"2fdb6f45-ea50-4d34-96de-52cf6cb55b98","Type":"ContainerDied","Data":"8a136d8197a6229c18ef2ab928f824edf71131702b67467ca7d997db1a21cdcd"} Apr 16 15:10:23.914175 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:23.914082 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2" event={"ID":"2fdb6f45-ea50-4d34-96de-52cf6cb55b98","Type":"ContainerDied","Data":"95275c75c4bd79cc3272596f7fa6e58533a777be43b1d0fe8c3dc82ca169ba30"} Apr 16 15:10:23.914175 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:23.914098 2576 scope.go:117] "RemoveContainer" containerID="8a136d8197a6229c18ef2ab928f824edf71131702b67467ca7d997db1a21cdcd" Apr 16 15:10:23.922490 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:23.922473 2576 scope.go:117] "RemoveContainer" containerID="48c6920ad0c550b192e5bb24c3db0c2eb65644e8283c9e8d6d546b7d9c7461e7" Apr 16 15:10:23.929781 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:23.929763 2576 scope.go:117] "RemoveContainer" containerID="c26508d8e51a856e476be2bb6e1f74b6fcb47d37e27366fdb7352e9eaf4b6fe6" Apr 16 15:10:23.935642 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:23.935614 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2"] Apr 16 15:10:23.938449 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:23.938103 2576 scope.go:117] "RemoveContainer" containerID="8a136d8197a6229c18ef2ab928f824edf71131702b67467ca7d997db1a21cdcd" Apr 16 15:10:23.938449 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:10:23.938399 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"8a136d8197a6229c18ef2ab928f824edf71131702b67467ca7d997db1a21cdcd\": container with ID starting with 8a136d8197a6229c18ef2ab928f824edf71131702b67467ca7d997db1a21cdcd not found: ID does not exist" containerID="8a136d8197a6229c18ef2ab928f824edf71131702b67467ca7d997db1a21cdcd" Apr 16 15:10:23.938449 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:23.938425 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a136d8197a6229c18ef2ab928f824edf71131702b67467ca7d997db1a21cdcd"} err="failed to get container status \"8a136d8197a6229c18ef2ab928f824edf71131702b67467ca7d997db1a21cdcd\": rpc error: code = NotFound desc = could not find container \"8a136d8197a6229c18ef2ab928f824edf71131702b67467ca7d997db1a21cdcd\": container with ID starting with 8a136d8197a6229c18ef2ab928f824edf71131702b67467ca7d997db1a21cdcd not found: ID does not exist" Apr 16 15:10:23.938449 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:23.938443 2576 scope.go:117] "RemoveContainer" containerID="48c6920ad0c550b192e5bb24c3db0c2eb65644e8283c9e8d6d546b7d9c7461e7" Apr 16 15:10:23.938698 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:10:23.938677 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48c6920ad0c550b192e5bb24c3db0c2eb65644e8283c9e8d6d546b7d9c7461e7\": container with ID starting with 48c6920ad0c550b192e5bb24c3db0c2eb65644e8283c9e8d6d546b7d9c7461e7 not found: ID does not exist" containerID="48c6920ad0c550b192e5bb24c3db0c2eb65644e8283c9e8d6d546b7d9c7461e7" Apr 16 15:10:23.938759 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:23.938705 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48c6920ad0c550b192e5bb24c3db0c2eb65644e8283c9e8d6d546b7d9c7461e7"} err="failed to get container status \"48c6920ad0c550b192e5bb24c3db0c2eb65644e8283c9e8d6d546b7d9c7461e7\": rpc error: code = NotFound 
desc = could not find container \"48c6920ad0c550b192e5bb24c3db0c2eb65644e8283c9e8d6d546b7d9c7461e7\": container with ID starting with 48c6920ad0c550b192e5bb24c3db0c2eb65644e8283c9e8d6d546b7d9c7461e7 not found: ID does not exist" Apr 16 15:10:23.938759 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:23.938724 2576 scope.go:117] "RemoveContainer" containerID="c26508d8e51a856e476be2bb6e1f74b6fcb47d37e27366fdb7352e9eaf4b6fe6" Apr 16 15:10:23.938976 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:10:23.938951 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c26508d8e51a856e476be2bb6e1f74b6fcb47d37e27366fdb7352e9eaf4b6fe6\": container with ID starting with c26508d8e51a856e476be2bb6e1f74b6fcb47d37e27366fdb7352e9eaf4b6fe6 not found: ID does not exist" containerID="c26508d8e51a856e476be2bb6e1f74b6fcb47d37e27366fdb7352e9eaf4b6fe6" Apr 16 15:10:23.939050 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:23.938979 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c26508d8e51a856e476be2bb6e1f74b6fcb47d37e27366fdb7352e9eaf4b6fe6"} err="failed to get container status \"c26508d8e51a856e476be2bb6e1f74b6fcb47d37e27366fdb7352e9eaf4b6fe6\": rpc error: code = NotFound desc = could not find container \"c26508d8e51a856e476be2bb6e1f74b6fcb47d37e27366fdb7352e9eaf4b6fe6\": container with ID starting with c26508d8e51a856e476be2bb6e1f74b6fcb47d37e27366fdb7352e9eaf4b6fe6 not found: ID does not exist" Apr 16 15:10:23.939258 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:23.939236 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-564b94bff5-tjgs2"] Apr 16 15:10:24.307788 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:24.307695 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" 
path="/var/lib/kubelet/pods/2fdb6f45-ea50-4d34-96de-52cf6cb55b98/volumes" Apr 16 15:10:30.866840 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:30.866748 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 15:10:30.867253 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:30.867175 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:10:40.867600 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:40.867553 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 15:10:40.868177 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:40.867995 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 15:10:50.867435 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:50.867382 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 15:10:50.867826 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:10:50.867779 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:11:00.867343 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:00.867290 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 16 15:11:00.867850 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:00.867824 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:11:10.867351 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:10.867296 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 16 15:11:10.867814 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:10.867736 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:11:20.867856 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:20.867822 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf"
Apr 16 15:11:20.868304 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:20.868005 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf"
Apr 16 15:11:28.251842 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:28.251810 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf"]
Apr 16 15:11:28.252444 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:28.252228 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="kserve-container" containerID="cri-o://f87e07dd9cc46e35f88813cd2e59dabca2c0f842fd7a4f0aa98107ef8e87ada8" gracePeriod=30
Apr 16 15:11:28.252444 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:28.252318 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="agent" containerID="cri-o://76715681552d50fc86e221c6a1d03886fc58d98fa0cad5ab1ef4de652b4a44ea" gracePeriod=30
Apr 16 15:11:28.272112 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:28.272088 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj"]
Apr 16 15:11:28.272500 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:28.272484 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="storage-initializer"
Apr 16 15:11:28.272580 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:28.272503 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="storage-initializer"
Apr 16 15:11:28.272580 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:28.272533 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="kserve-container"
Apr 16 15:11:28.272580 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:28.272542 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="kserve-container"
Apr 16 15:11:28.272580 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:28.272552 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="agent"
Apr 16 15:11:28.272580 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:28.272561 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="agent"
Apr 16 15:11:28.272822 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:28.272670 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="agent"
Apr 16 15:11:28.272822 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:28.272684 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2fdb6f45-ea50-4d34-96de-52cf6cb55b98" containerName="kserve-container"
Apr 16 15:11:28.276074 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:28.276057 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj"
Apr 16 15:11:28.285086 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:28.285067 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj"]
Apr 16 15:11:28.412706 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:28.412676 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89b44339-93d2-467e-9380-e295fd4c420d-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-xghcj\" (UID: \"89b44339-93d2-467e-9380-e295fd4c420d\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj"
Apr 16 15:11:28.513863 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:28.513778 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89b44339-93d2-467e-9380-e295fd4c420d-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-xghcj\" (UID: \"89b44339-93d2-467e-9380-e295fd4c420d\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj"
Apr 16 15:11:28.514218 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:28.514196 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89b44339-93d2-467e-9380-e295fd4c420d-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-xghcj\" (UID: \"89b44339-93d2-467e-9380-e295fd4c420d\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj"
Apr 16 15:11:28.587958 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:28.587912 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj"
Apr 16 15:11:28.708894 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:28.708864 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj"]
Apr 16 15:11:28.710951 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:11:28.710916 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89b44339_93d2_467e_9380_e295fd4c420d.slice/crio-80a03d899e1c2310abe831af79c0019b29db4728131737fdb5c539defb807c1b WatchSource:0}: Error finding container 80a03d899e1c2310abe831af79c0019b29db4728131737fdb5c539defb807c1b: Status 404 returned error can't find the container with id 80a03d899e1c2310abe831af79c0019b29db4728131737fdb5c539defb807c1b
Apr 16 15:11:29.148010 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:29.147901 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj" event={"ID":"89b44339-93d2-467e-9380-e295fd4c420d","Type":"ContainerStarted","Data":"1dea9a51504764636707ec7878186943c550cc043d0ce79c9d2240d9ea1d6857"}
Apr 16 15:11:29.148010 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:29.147968 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj" event={"ID":"89b44339-93d2-467e-9380-e295fd4c420d","Type":"ContainerStarted","Data":"80a03d899e1c2310abe831af79c0019b29db4728131737fdb5c539defb807c1b"}
Apr 16 15:11:30.867498 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:30.867445 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 16 15:11:30.867877 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:30.867722 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:11:33.165073 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:33.165039 2576 generic.go:358] "Generic (PLEG): container finished" podID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerID="f87e07dd9cc46e35f88813cd2e59dabca2c0f842fd7a4f0aa98107ef8e87ada8" exitCode=0
Apr 16 15:11:33.165458 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:33.165093 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" event={"ID":"ed0fc457-d488-4087-bfa9-5cc3853f0a2a","Type":"ContainerDied","Data":"f87e07dd9cc46e35f88813cd2e59dabca2c0f842fd7a4f0aa98107ef8e87ada8"}
Apr 16 15:11:33.166369 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:33.166348 2576 generic.go:358] "Generic (PLEG): container finished" podID="89b44339-93d2-467e-9380-e295fd4c420d" containerID="1dea9a51504764636707ec7878186943c550cc043d0ce79c9d2240d9ea1d6857" exitCode=0
Apr 16 15:11:33.166449 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:33.166422 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj" event={"ID":"89b44339-93d2-467e-9380-e295fd4c420d","Type":"ContainerDied","Data":"1dea9a51504764636707ec7878186943c550cc043d0ce79c9d2240d9ea1d6857"}
Apr 16 15:11:40.196733 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:40.196698 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj" event={"ID":"89b44339-93d2-467e-9380-e295fd4c420d","Type":"ContainerStarted","Data":"782963570f5733de3e9f8014cf40686c6930c226fc30fc6ec2ca094d1f03700f"}
Apr 16 15:11:40.197115 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:40.196979 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj"
Apr 16 15:11:40.198240 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:40.198215 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj" podUID="89b44339-93d2-467e-9380-e295fd4c420d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 16 15:11:40.213880 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:40.213835 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj" podStartSLOduration=5.323258683 podStartE2EDuration="12.213824095s" podCreationTimestamp="2026-04-16 15:11:28 +0000 UTC" firstStartedPulling="2026-04-16 15:11:33.167590577 +0000 UTC m=+1127.586725530" lastFinishedPulling="2026-04-16 15:11:40.058155989 +0000 UTC m=+1134.477290942" observedRunningTime="2026-04-16 15:11:40.211341189 +0000 UTC m=+1134.630476163" watchObservedRunningTime="2026-04-16 15:11:40.213824095 +0000 UTC m=+1134.632959070"
Apr 16 15:11:40.866739 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:40.866690 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 16 15:11:40.867055 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:40.867009 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:11:41.200972 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:41.200934 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj" podUID="89b44339-93d2-467e-9380-e295fd4c420d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 16 15:11:50.867517 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:50.867468 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 16 15:11:50.868053 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:50.867628 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf"
Apr 16 15:11:50.868053 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:50.867816 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 15:11:50.868053 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:50.867918 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf"
Apr 16 15:11:51.201102 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:51.201066 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj" podUID="89b44339-93d2-467e-9380-e295fd4c420d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 16 15:11:58.403014 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:58.402990 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf"
Apr 16 15:11:58.458128 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:58.458100 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed0fc457-d488-4087-bfa9-5cc3853f0a2a-kserve-provision-location\") pod \"ed0fc457-d488-4087-bfa9-5cc3853f0a2a\" (UID: \"ed0fc457-d488-4087-bfa9-5cc3853f0a2a\") "
Apr 16 15:11:58.458441 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:58.458415 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed0fc457-d488-4087-bfa9-5cc3853f0a2a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ed0fc457-d488-4087-bfa9-5cc3853f0a2a" (UID: "ed0fc457-d488-4087-bfa9-5cc3853f0a2a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:11:58.559398 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:58.559314 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed0fc457-d488-4087-bfa9-5cc3853f0a2a-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\""
Apr 16 15:11:59.268620 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:59.268588 2576 generic.go:358] "Generic (PLEG): container finished" podID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerID="76715681552d50fc86e221c6a1d03886fc58d98fa0cad5ab1ef4de652b4a44ea" exitCode=137
Apr 16 15:11:59.268804 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:59.268671 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf"
Apr 16 15:11:59.268804 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:59.268670 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" event={"ID":"ed0fc457-d488-4087-bfa9-5cc3853f0a2a","Type":"ContainerDied","Data":"76715681552d50fc86e221c6a1d03886fc58d98fa0cad5ab1ef4de652b4a44ea"}
Apr 16 15:11:59.268804 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:59.268713 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf" event={"ID":"ed0fc457-d488-4087-bfa9-5cc3853f0a2a","Type":"ContainerDied","Data":"6f98b9bd7c8284b804dad84b4130efd82f2368b30bef9a36524f05d5511da76a"}
Apr 16 15:11:59.268804 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:59.268728 2576 scope.go:117] "RemoveContainer" containerID="76715681552d50fc86e221c6a1d03886fc58d98fa0cad5ab1ef4de652b4a44ea"
Apr 16 15:11:59.277722 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:59.277706 2576 scope.go:117] "RemoveContainer" containerID="f87e07dd9cc46e35f88813cd2e59dabca2c0f842fd7a4f0aa98107ef8e87ada8"
Apr 16 15:11:59.285120 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:59.285100 2576 scope.go:117] "RemoveContainer" containerID="d376a4f81676efa352efad64bbed8ab4c736b98f9df14f780ec21903310a0fa3"
Apr 16 15:11:59.291281 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:59.291255 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf"]
Apr 16 15:11:59.293073 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:59.293052 2576 scope.go:117] "RemoveContainer" containerID="76715681552d50fc86e221c6a1d03886fc58d98fa0cad5ab1ef4de652b4a44ea"
Apr 16 15:11:59.293371 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:11:59.293353 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76715681552d50fc86e221c6a1d03886fc58d98fa0cad5ab1ef4de652b4a44ea\": container with ID starting with 76715681552d50fc86e221c6a1d03886fc58d98fa0cad5ab1ef4de652b4a44ea not found: ID does not exist" containerID="76715681552d50fc86e221c6a1d03886fc58d98fa0cad5ab1ef4de652b4a44ea"
Apr 16 15:11:59.293453 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:59.293380 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76715681552d50fc86e221c6a1d03886fc58d98fa0cad5ab1ef4de652b4a44ea"} err="failed to get container status \"76715681552d50fc86e221c6a1d03886fc58d98fa0cad5ab1ef4de652b4a44ea\": rpc error: code = NotFound desc = could not find container \"76715681552d50fc86e221c6a1d03886fc58d98fa0cad5ab1ef4de652b4a44ea\": container with ID starting with 76715681552d50fc86e221c6a1d03886fc58d98fa0cad5ab1ef4de652b4a44ea not found: ID does not exist"
Apr 16 15:11:59.293453 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:59.293398 2576 scope.go:117] "RemoveContainer" containerID="f87e07dd9cc46e35f88813cd2e59dabca2c0f842fd7a4f0aa98107ef8e87ada8"
Apr 16 15:11:59.293688 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:11:59.293666 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f87e07dd9cc46e35f88813cd2e59dabca2c0f842fd7a4f0aa98107ef8e87ada8\": container with ID starting with f87e07dd9cc46e35f88813cd2e59dabca2c0f842fd7a4f0aa98107ef8e87ada8 not found: ID does not exist" containerID="f87e07dd9cc46e35f88813cd2e59dabca2c0f842fd7a4f0aa98107ef8e87ada8"
Apr 16 15:11:59.293772 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:59.293699 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f87e07dd9cc46e35f88813cd2e59dabca2c0f842fd7a4f0aa98107ef8e87ada8"} err="failed to get container status \"f87e07dd9cc46e35f88813cd2e59dabca2c0f842fd7a4f0aa98107ef8e87ada8\": rpc error: code = NotFound desc = could not find container \"f87e07dd9cc46e35f88813cd2e59dabca2c0f842fd7a4f0aa98107ef8e87ada8\": container with ID starting with f87e07dd9cc46e35f88813cd2e59dabca2c0f842fd7a4f0aa98107ef8e87ada8 not found: ID does not exist"
Apr 16 15:11:59.293772 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:59.293720 2576 scope.go:117] "RemoveContainer" containerID="d376a4f81676efa352efad64bbed8ab4c736b98f9df14f780ec21903310a0fa3"
Apr 16 15:11:59.293979 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:11:59.293959 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d376a4f81676efa352efad64bbed8ab4c736b98f9df14f780ec21903310a0fa3\": container with ID starting with d376a4f81676efa352efad64bbed8ab4c736b98f9df14f780ec21903310a0fa3 not found: ID does not exist" containerID="d376a4f81676efa352efad64bbed8ab4c736b98f9df14f780ec21903310a0fa3"
Apr 16 15:11:59.294153 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:59.293982 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d376a4f81676efa352efad64bbed8ab4c736b98f9df14f780ec21903310a0fa3"} err="failed to get container status \"d376a4f81676efa352efad64bbed8ab4c736b98f9df14f780ec21903310a0fa3\": rpc error: code = NotFound desc = could not find container \"d376a4f81676efa352efad64bbed8ab4c736b98f9df14f780ec21903310a0fa3\": container with ID starting with d376a4f81676efa352efad64bbed8ab4c736b98f9df14f780ec21903310a0fa3 not found: ID does not exist"
Apr 16 15:11:59.294781 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:11:59.294764 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7c7767ff77-fcpnf"]
Apr 16 15:12:00.307613 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:12:00.307581 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" path="/var/lib/kubelet/pods/ed0fc457-d488-4087-bfa9-5cc3853f0a2a/volumes"
Apr 16 15:12:01.201755 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:12:01.201714 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj" podUID="89b44339-93d2-467e-9380-e295fd4c420d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 16 15:12:11.200932 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:12:11.200885 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj" podUID="89b44339-93d2-467e-9380-e295fd4c420d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 16 15:12:21.201958 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:12:21.201914 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj" podUID="89b44339-93d2-467e-9380-e295fd4c420d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 16 15:12:31.201145 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:12:31.201093 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj" podUID="89b44339-93d2-467e-9380-e295fd4c420d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 16 15:12:41.201170 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:12:41.201121 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj" podUID="89b44339-93d2-467e-9380-e295fd4c420d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 16 15:12:46.273739 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:12:46.273709 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log"
Apr 16 15:12:46.275007 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:12:46.274842 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log"
Apr 16 15:12:46.276474 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:12:46.276450 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log"
Apr 16 15:12:46.277362 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:12:46.277345 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log"
Apr 16 15:12:51.201331 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:12:51.201283 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj" podUID="89b44339-93d2-467e-9380-e295fd4c420d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 16 15:13:01.202340 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:01.202304 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj"
Apr 16 15:13:08.429717 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:08.429682 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj"]
Apr 16 15:13:08.430195 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:08.430017 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj" podUID="89b44339-93d2-467e-9380-e295fd4c420d" containerName="kserve-container" containerID="cri-o://782963570f5733de3e9f8014cf40686c6930c226fc30fc6ec2ca094d1f03700f" gracePeriod=30
Apr 16 15:13:08.496758 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:08.496719 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv"]
Apr 16 15:13:08.497142 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:08.497126 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="kserve-container"
Apr 16 15:13:08.497142 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:08.497142 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="kserve-container"
Apr 16 15:13:08.497259 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:08.497157 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="agent"
Apr 16 15:13:08.497259 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:08.497162 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="agent"
Apr 16 15:13:08.497259 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:08.497183 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="storage-initializer"
Apr 16 15:13:08.497259 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:08.497188 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="storage-initializer"
Apr 16 15:13:08.497259 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:08.497248 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="agent"
Apr 16 15:13:08.497259 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:08.497257 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ed0fc457-d488-4087-bfa9-5cc3853f0a2a" containerName="kserve-container"
Apr 16 15:13:08.500092 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:08.500070 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv"
Apr 16 15:13:08.508908 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:08.508880 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv"]
Apr 16 15:13:08.562222 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:08.562188 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8091cd92-4777-4c13-a06a-184513576ec8-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv\" (UID: \"8091cd92-4777-4c13-a06a-184513576ec8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv"
Apr 16 15:13:08.663400 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:08.663362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8091cd92-4777-4c13-a06a-184513576ec8-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv\" (UID: \"8091cd92-4777-4c13-a06a-184513576ec8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv"
Apr 16 15:13:08.663754 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:08.663733 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8091cd92-4777-4c13-a06a-184513576ec8-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv\" (UID: \"8091cd92-4777-4c13-a06a-184513576ec8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv"
Apr 16 15:13:08.812864 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:08.812782 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv"
Apr 16 15:13:08.936962 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:08.936933 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv"]
Apr 16 15:13:08.938972 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:13:08.938930 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8091cd92_4777_4c13_a06a_184513576ec8.slice/crio-c4723125781a3033f14a31f513f7f2a2f766515721bf7f3f1f517a260af7e173 WatchSource:0}: Error finding container c4723125781a3033f14a31f513f7f2a2f766515721bf7f3f1f517a260af7e173: Status 404 returned error can't find the container with id c4723125781a3033f14a31f513f7f2a2f766515721bf7f3f1f517a260af7e173
Apr 16 15:13:08.941393 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:08.941375 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 15:13:09.506073 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:09.506016 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv" event={"ID":"8091cd92-4777-4c13-a06a-184513576ec8","Type":"ContainerStarted","Data":"de37dbcc00f6e64c6019a54e43d0eff99a66de6251c7a21997b9dd3d90dcaa85"}
Apr 16 15:13:09.506073 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:09.506073 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv" event={"ID":"8091cd92-4777-4c13-a06a-184513576ec8","Type":"ContainerStarted","Data":"c4723125781a3033f14a31f513f7f2a2f766515721bf7f3f1f517a260af7e173"}
Apr 16 15:13:11.201214 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:11.201168 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj" podUID="89b44339-93d2-467e-9380-e295fd4c420d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 16 15:13:12.978590 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:12.978565 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj"
Apr 16 15:13:13.108307 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:13.108224 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89b44339-93d2-467e-9380-e295fd4c420d-kserve-provision-location\") pod \"89b44339-93d2-467e-9380-e295fd4c420d\" (UID: \"89b44339-93d2-467e-9380-e295fd4c420d\") "
Apr 16 15:13:13.108583 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:13.108560 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89b44339-93d2-467e-9380-e295fd4c420d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "89b44339-93d2-467e-9380-e295fd4c420d" (UID: "89b44339-93d2-467e-9380-e295fd4c420d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:13:13.209586 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:13.209535 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89b44339-93d2-467e-9380-e295fd4c420d-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\""
Apr 16 15:13:13.519844 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:13.519806 2576 generic.go:358] "Generic (PLEG): container finished" podID="89b44339-93d2-467e-9380-e295fd4c420d" containerID="782963570f5733de3e9f8014cf40686c6930c226fc30fc6ec2ca094d1f03700f" exitCode=0
Apr 16 15:13:13.520047 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:13.519885 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj"
Apr 16 15:13:13.520047 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:13.519892 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj" event={"ID":"89b44339-93d2-467e-9380-e295fd4c420d","Type":"ContainerDied","Data":"782963570f5733de3e9f8014cf40686c6930c226fc30fc6ec2ca094d1f03700f"}
Apr 16 15:13:13.520047 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:13.519935 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj" event={"ID":"89b44339-93d2-467e-9380-e295fd4c420d","Type":"ContainerDied","Data":"80a03d899e1c2310abe831af79c0019b29db4728131737fdb5c539defb807c1b"}
Apr 16 15:13:13.520047 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:13.519954 2576 scope.go:117] "RemoveContainer" containerID="782963570f5733de3e9f8014cf40686c6930c226fc30fc6ec2ca094d1f03700f"
Apr 16 15:13:13.521232 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:13.521211 2576 generic.go:358] "Generic (PLEG): container finished" podID="8091cd92-4777-4c13-a06a-184513576ec8" containerID="de37dbcc00f6e64c6019a54e43d0eff99a66de6251c7a21997b9dd3d90dcaa85" exitCode=0
Apr 16 15:13:13.521332 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:13.521285 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv" event={"ID":"8091cd92-4777-4c13-a06a-184513576ec8","Type":"ContainerDied","Data":"de37dbcc00f6e64c6019a54e43d0eff99a66de6251c7a21997b9dd3d90dcaa85"}
Apr 16 15:13:13.528675 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:13.528655 2576 scope.go:117] "RemoveContainer" containerID="1dea9a51504764636707ec7878186943c550cc043d0ce79c9d2240d9ea1d6857"
Apr 16 15:13:13.536485 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:13.536441 2576 scope.go:117] "RemoveContainer" containerID="782963570f5733de3e9f8014cf40686c6930c226fc30fc6ec2ca094d1f03700f"
Apr 16 15:13:13.537655 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:13:13.537159 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"782963570f5733de3e9f8014cf40686c6930c226fc30fc6ec2ca094d1f03700f\": container with ID starting with 782963570f5733de3e9f8014cf40686c6930c226fc30fc6ec2ca094d1f03700f not found: ID does not exist" containerID="782963570f5733de3e9f8014cf40686c6930c226fc30fc6ec2ca094d1f03700f"
Apr 16 15:13:13.537655 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:13.537201 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"782963570f5733de3e9f8014cf40686c6930c226fc30fc6ec2ca094d1f03700f"} err="failed to get container status \"782963570f5733de3e9f8014cf40686c6930c226fc30fc6ec2ca094d1f03700f\": rpc error: code = NotFound desc = could not find container \"782963570f5733de3e9f8014cf40686c6930c226fc30fc6ec2ca094d1f03700f\": container with ID starting with 782963570f5733de3e9f8014cf40686c6930c226fc30fc6ec2ca094d1f03700f not found: ID does not exist"
Apr 16 15:13:13.537655 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:13.537228 2576 scope.go:117] "RemoveContainer" containerID="1dea9a51504764636707ec7878186943c550cc043d0ce79c9d2240d9ea1d6857"
Apr 16 15:13:13.537655 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:13:13.537502 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dea9a51504764636707ec7878186943c550cc043d0ce79c9d2240d9ea1d6857\": container with ID starting with 1dea9a51504764636707ec7878186943c550cc043d0ce79c9d2240d9ea1d6857 not found: ID does not exist" containerID="1dea9a51504764636707ec7878186943c550cc043d0ce79c9d2240d9ea1d6857"
Apr 16 15:13:13.537655 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:13.537535 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dea9a51504764636707ec7878186943c550cc043d0ce79c9d2240d9ea1d6857"} err="failed to get container status \"1dea9a51504764636707ec7878186943c550cc043d0ce79c9d2240d9ea1d6857\": rpc error: code = NotFound desc = could not find container \"1dea9a51504764636707ec7878186943c550cc043d0ce79c9d2240d9ea1d6857\": container with ID starting with 1dea9a51504764636707ec7878186943c550cc043d0ce79c9d2240d9ea1d6857 not found: ID does not exist"
Apr 16 15:13:13.549397 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:13.549365 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj"]
Apr 16 15:13:13.551140 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:13.551115 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-xghcj"]
Apr 16 15:13:14.307675 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:14.307640 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89b44339-93d2-467e-9380-e295fd4c420d" path="/var/lib/kubelet/pods/89b44339-93d2-467e-9380-e295fd4c420d/volumes"
Apr 16 15:13:14.526610 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:14.526571 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv" event={"ID":"8091cd92-4777-4c13-a06a-184513576ec8","Type":"ContainerStarted","Data":"623556529d87adc654bc9432817e265d5facc014b98c08b6448b62bfb3412480"}
Apr 16 15:13:14.526863 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:14.526846 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv"
Apr 16 15:13:14.528081 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:14.528054 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv" podUID="8091cd92-4777-4c13-a06a-184513576ec8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 15:13:14.542853 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:14.542804 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv" podStartSLOduration=6.542789129 podStartE2EDuration="6.542789129s" podCreationTimestamp="2026-04-16 15:13:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:13:14.541870261 +0000 UTC m=+1228.961005236" watchObservedRunningTime="2026-04-16 15:13:14.542789129 +0000 UTC m=+1228.961924138"
Apr 16 15:13:15.530952 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:15.530910 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv" podUID="8091cd92-4777-4c13-a06a-184513576ec8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 15:13:25.531494 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:25.531444 2576 prober.go:120] "Probe failed"
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv" podUID="8091cd92-4777-4c13-a06a-184513576ec8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 15:13:35.531088 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:35.531048 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv" podUID="8091cd92-4777-4c13-a06a-184513576ec8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 15:13:45.531891 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:45.531837 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv" podUID="8091cd92-4777-4c13-a06a-184513576ec8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 15:13:55.531393 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:13:55.531344 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv" podUID="8091cd92-4777-4c13-a06a-184513576ec8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 15:14:05.531037 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:05.530980 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv" podUID="8091cd92-4777-4c13-a06a-184513576ec8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 15:14:15.531113 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:15.531070 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv" 
podUID="8091cd92-4777-4c13-a06a-184513576ec8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 15:14:25.531688 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:25.531648 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv" podUID="8091cd92-4777-4c13-a06a-184513576ec8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 16 15:14:35.532193 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:35.532163 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv" Apr 16 15:14:38.974418 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:38.974383 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv"] Apr 16 15:14:38.974896 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:38.974705 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv" podUID="8091cd92-4777-4c13-a06a-184513576ec8" containerName="kserve-container" containerID="cri-o://623556529d87adc654bc9432817e265d5facc014b98c08b6448b62bfb3412480" gracePeriod=30 Apr 16 15:14:39.062117 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:39.062087 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5"] Apr 16 15:14:39.062601 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:39.062581 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89b44339-93d2-467e-9380-e295fd4c420d" containerName="storage-initializer" Apr 16 15:14:39.062601 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:39.062602 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="89b44339-93d2-467e-9380-e295fd4c420d" containerName="storage-initializer" Apr 16 15:14:39.062735 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:39.062611 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89b44339-93d2-467e-9380-e295fd4c420d" containerName="kserve-container" Apr 16 15:14:39.062735 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:39.062618 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b44339-93d2-467e-9380-e295fd4c420d" containerName="kserve-container" Apr 16 15:14:39.062735 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:39.062686 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="89b44339-93d2-467e-9380-e295fd4c420d" containerName="kserve-container" Apr 16 15:14:39.065908 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:39.065889 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5" Apr 16 15:14:39.075506 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:39.075482 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5"] Apr 16 15:14:39.165162 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:39.165123 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c6a4913-0e35-48aa-be05-69b4e3b32a68-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5\" (UID: \"5c6a4913-0e35-48aa-be05-69b4e3b32a68\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5" Apr 16 15:14:39.266591 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:39.266514 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c6a4913-0e35-48aa-be05-69b4e3b32a68-kserve-provision-location\") 
pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5\" (UID: \"5c6a4913-0e35-48aa-be05-69b4e3b32a68\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5" Apr 16 15:14:39.266911 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:39.266888 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c6a4913-0e35-48aa-be05-69b4e3b32a68-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5\" (UID: \"5c6a4913-0e35-48aa-be05-69b4e3b32a68\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5" Apr 16 15:14:39.377041 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:39.376980 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5" Apr 16 15:14:39.500415 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:39.500387 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5"] Apr 16 15:14:39.502598 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:14:39.502572 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c6a4913_0e35_48aa_be05_69b4e3b32a68.slice/crio-e13dafe846701b597bf4432593f1af46348d6837695819e1bea039aa6e962d7b WatchSource:0}: Error finding container e13dafe846701b597bf4432593f1af46348d6837695819e1bea039aa6e962d7b: Status 404 returned error can't find the container with id e13dafe846701b597bf4432593f1af46348d6837695819e1bea039aa6e962d7b Apr 16 15:14:39.838953 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:39.838867 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5" 
event={"ID":"5c6a4913-0e35-48aa-be05-69b4e3b32a68","Type":"ContainerStarted","Data":"a1a45354f04d8209c4493057c7c096b4db3cde431accedfc2d4e465045bc7ef6"} Apr 16 15:14:39.838953 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:39.838911 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5" event={"ID":"5c6a4913-0e35-48aa-be05-69b4e3b32a68","Type":"ContainerStarted","Data":"e13dafe846701b597bf4432593f1af46348d6837695819e1bea039aa6e962d7b"} Apr 16 15:14:43.533648 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:43.533625 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv" Apr 16 15:14:43.604060 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:43.603970 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8091cd92-4777-4c13-a06a-184513576ec8-kserve-provision-location\") pod \"8091cd92-4777-4c13-a06a-184513576ec8\" (UID: \"8091cd92-4777-4c13-a06a-184513576ec8\") " Apr 16 15:14:43.604310 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:43.604287 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8091cd92-4777-4c13-a06a-184513576ec8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8091cd92-4777-4c13-a06a-184513576ec8" (UID: "8091cd92-4777-4c13-a06a-184513576ec8"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:14:43.705431 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:43.705398 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8091cd92-4777-4c13-a06a-184513576ec8-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:14:43.856354 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:43.856270 2576 generic.go:358] "Generic (PLEG): container finished" podID="8091cd92-4777-4c13-a06a-184513576ec8" containerID="623556529d87adc654bc9432817e265d5facc014b98c08b6448b62bfb3412480" exitCode=0 Apr 16 15:14:43.856354 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:43.856342 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv" Apr 16 15:14:43.856542 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:43.856353 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv" event={"ID":"8091cd92-4777-4c13-a06a-184513576ec8","Type":"ContainerDied","Data":"623556529d87adc654bc9432817e265d5facc014b98c08b6448b62bfb3412480"} Apr 16 15:14:43.856542 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:43.856393 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv" event={"ID":"8091cd92-4777-4c13-a06a-184513576ec8","Type":"ContainerDied","Data":"c4723125781a3033f14a31f513f7f2a2f766515721bf7f3f1f517a260af7e173"} Apr 16 15:14:43.856542 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:43.856412 2576 scope.go:117] "RemoveContainer" containerID="623556529d87adc654bc9432817e265d5facc014b98c08b6448b62bfb3412480" Apr 16 15:14:43.858080 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:43.858057 2576 generic.go:358] "Generic (PLEG): container finished" podID="5c6a4913-0e35-48aa-be05-69b4e3b32a68" 
containerID="a1a45354f04d8209c4493057c7c096b4db3cde431accedfc2d4e465045bc7ef6" exitCode=0 Apr 16 15:14:43.858177 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:43.858109 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5" event={"ID":"5c6a4913-0e35-48aa-be05-69b4e3b32a68","Type":"ContainerDied","Data":"a1a45354f04d8209c4493057c7c096b4db3cde431accedfc2d4e465045bc7ef6"} Apr 16 15:14:43.865279 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:43.865268 2576 scope.go:117] "RemoveContainer" containerID="de37dbcc00f6e64c6019a54e43d0eff99a66de6251c7a21997b9dd3d90dcaa85" Apr 16 15:14:43.872762 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:43.872747 2576 scope.go:117] "RemoveContainer" containerID="623556529d87adc654bc9432817e265d5facc014b98c08b6448b62bfb3412480" Apr 16 15:14:43.872989 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:14:43.872969 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"623556529d87adc654bc9432817e265d5facc014b98c08b6448b62bfb3412480\": container with ID starting with 623556529d87adc654bc9432817e265d5facc014b98c08b6448b62bfb3412480 not found: ID does not exist" containerID="623556529d87adc654bc9432817e265d5facc014b98c08b6448b62bfb3412480" Apr 16 15:14:43.873081 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:43.873001 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"623556529d87adc654bc9432817e265d5facc014b98c08b6448b62bfb3412480"} err="failed to get container status \"623556529d87adc654bc9432817e265d5facc014b98c08b6448b62bfb3412480\": rpc error: code = NotFound desc = could not find container \"623556529d87adc654bc9432817e265d5facc014b98c08b6448b62bfb3412480\": container with ID starting with 623556529d87adc654bc9432817e265d5facc014b98c08b6448b62bfb3412480 not found: ID does not exist" Apr 16 15:14:43.873081 ip-10-0-129-76 kubenswrapper[2576]: 
I0416 15:14:43.873038 2576 scope.go:117] "RemoveContainer" containerID="de37dbcc00f6e64c6019a54e43d0eff99a66de6251c7a21997b9dd3d90dcaa85" Apr 16 15:14:43.873301 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:14:43.873281 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de37dbcc00f6e64c6019a54e43d0eff99a66de6251c7a21997b9dd3d90dcaa85\": container with ID starting with de37dbcc00f6e64c6019a54e43d0eff99a66de6251c7a21997b9dd3d90dcaa85 not found: ID does not exist" containerID="de37dbcc00f6e64c6019a54e43d0eff99a66de6251c7a21997b9dd3d90dcaa85" Apr 16 15:14:43.873343 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:43.873312 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de37dbcc00f6e64c6019a54e43d0eff99a66de6251c7a21997b9dd3d90dcaa85"} err="failed to get container status \"de37dbcc00f6e64c6019a54e43d0eff99a66de6251c7a21997b9dd3d90dcaa85\": rpc error: code = NotFound desc = could not find container \"de37dbcc00f6e64c6019a54e43d0eff99a66de6251c7a21997b9dd3d90dcaa85\": container with ID starting with de37dbcc00f6e64c6019a54e43d0eff99a66de6251c7a21997b9dd3d90dcaa85 not found: ID does not exist" Apr 16 15:14:43.887263 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:43.887225 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv"] Apr 16 15:14:43.888957 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:43.888938 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dwxrv"] Apr 16 15:14:44.309646 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:14:44.309604 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8091cd92-4777-4c13-a06a-184513576ec8" path="/var/lib/kubelet/pods/8091cd92-4777-4c13-a06a-184513576ec8/volumes" Apr 16 15:17:01.389737 ip-10-0-129-76 kubenswrapper[2576]: I0416 
15:17:01.389702 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5" event={"ID":"5c6a4913-0e35-48aa-be05-69b4e3b32a68","Type":"ContainerStarted","Data":"2f0a2a5a9c2010e8876e979e3a47e19fb06c2e1661b792d3214901af014a7eb1"} Apr 16 15:17:01.390174 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:01.389807 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5" Apr 16 15:17:01.416062 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:01.415992 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5" podStartSLOduration=5.670623831 podStartE2EDuration="2m22.415977596s" podCreationTimestamp="2026-04-16 15:14:39 +0000 UTC" firstStartedPulling="2026-04-16 15:14:43.859347544 +0000 UTC m=+1318.278482523" lastFinishedPulling="2026-04-16 15:17:00.604701328 +0000 UTC m=+1455.023836288" observedRunningTime="2026-04-16 15:17:01.41309897 +0000 UTC m=+1455.832233945" watchObservedRunningTime="2026-04-16 15:17:01.415977596 +0000 UTC m=+1455.835112571" Apr 16 15:17:32.398797 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:32.398766 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5" Apr 16 15:17:39.241632 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:39.241592 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5"] Apr 16 15:17:39.242221 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:39.241967 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5" podUID="5c6a4913-0e35-48aa-be05-69b4e3b32a68" containerName="kserve-container" 
containerID="cri-o://2f0a2a5a9c2010e8876e979e3a47e19fb06c2e1661b792d3214901af014a7eb1" gracePeriod=30 Apr 16 15:17:39.351061 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:39.351012 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7"] Apr 16 15:17:39.351430 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:39.351416 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8091cd92-4777-4c13-a06a-184513576ec8" containerName="storage-initializer" Apr 16 15:17:39.351430 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:39.351430 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8091cd92-4777-4c13-a06a-184513576ec8" containerName="storage-initializer" Apr 16 15:17:39.351559 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:39.351458 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8091cd92-4777-4c13-a06a-184513576ec8" containerName="kserve-container" Apr 16 15:17:39.351559 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:39.351464 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8091cd92-4777-4c13-a06a-184513576ec8" containerName="kserve-container" Apr 16 15:17:39.351559 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:39.351538 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8091cd92-4777-4c13-a06a-184513576ec8" containerName="kserve-container" Apr 16 15:17:39.355314 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:39.355287 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7" Apr 16 15:17:39.361908 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:39.361880 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7"] Apr 16 15:17:39.468095 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:39.468058 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d4eff692-25d1-48e4-810c-a5325fabf194-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7\" (UID: \"d4eff692-25d1-48e4-810c-a5325fabf194\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7" Apr 16 15:17:39.568698 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:39.568607 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d4eff692-25d1-48e4-810c-a5325fabf194-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7\" (UID: \"d4eff692-25d1-48e4-810c-a5325fabf194\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7" Apr 16 15:17:39.569001 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:39.568979 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d4eff692-25d1-48e4-810c-a5325fabf194-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7\" (UID: \"d4eff692-25d1-48e4-810c-a5325fabf194\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7" Apr 16 15:17:39.667119 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:39.667083 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7" Apr 16 15:17:39.789228 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:39.789199 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7"] Apr 16 15:17:39.791466 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:17:39.791431 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4eff692_25d1_48e4_810c_a5325fabf194.slice/crio-6a2136766e494d2837bdc9350625d6e9a27de29a8c49e1cc8075e556886f3b8e WatchSource:0}: Error finding container 6a2136766e494d2837bdc9350625d6e9a27de29a8c49e1cc8075e556886f3b8e: Status 404 returned error can't find the container with id 6a2136766e494d2837bdc9350625d6e9a27de29a8c49e1cc8075e556886f3b8e Apr 16 15:17:40.285949 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:40.285925 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5" Apr 16 15:17:40.376143 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:40.376092 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c6a4913-0e35-48aa-be05-69b4e3b32a68-kserve-provision-location\") pod \"5c6a4913-0e35-48aa-be05-69b4e3b32a68\" (UID: \"5c6a4913-0e35-48aa-be05-69b4e3b32a68\") " Apr 16 15:17:40.376423 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:40.376401 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c6a4913-0e35-48aa-be05-69b4e3b32a68-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5c6a4913-0e35-48aa-be05-69b4e3b32a68" (UID: "5c6a4913-0e35-48aa-be05-69b4e3b32a68"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:17:40.477182 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:40.477136 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c6a4913-0e35-48aa-be05-69b4e3b32a68-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:17:40.522293 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:40.522246 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7" event={"ID":"d4eff692-25d1-48e4-810c-a5325fabf194","Type":"ContainerStarted","Data":"7f6f803aba8531c13217feaec6eb0fe6956a973bc7477b3657956e35d7bc1325"} Apr 16 15:17:40.522463 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:40.522304 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7" event={"ID":"d4eff692-25d1-48e4-810c-a5325fabf194","Type":"ContainerStarted","Data":"6a2136766e494d2837bdc9350625d6e9a27de29a8c49e1cc8075e556886f3b8e"} Apr 16 15:17:40.523603 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:40.523576 2576 generic.go:358] "Generic (PLEG): container finished" podID="5c6a4913-0e35-48aa-be05-69b4e3b32a68" containerID="2f0a2a5a9c2010e8876e979e3a47e19fb06c2e1661b792d3214901af014a7eb1" exitCode=0 Apr 16 15:17:40.523712 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:40.523637 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5"
Apr 16 15:17:40.523712 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:40.523658 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5" event={"ID":"5c6a4913-0e35-48aa-be05-69b4e3b32a68","Type":"ContainerDied","Data":"2f0a2a5a9c2010e8876e979e3a47e19fb06c2e1661b792d3214901af014a7eb1"}
Apr 16 15:17:40.523712 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:40.523692 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5" event={"ID":"5c6a4913-0e35-48aa-be05-69b4e3b32a68","Type":"ContainerDied","Data":"e13dafe846701b597bf4432593f1af46348d6837695819e1bea039aa6e962d7b"}
Apr 16 15:17:40.523712 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:40.523707 2576 scope.go:117] "RemoveContainer" containerID="2f0a2a5a9c2010e8876e979e3a47e19fb06c2e1661b792d3214901af014a7eb1"
Apr 16 15:17:40.532717 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:40.532699 2576 scope.go:117] "RemoveContainer" containerID="a1a45354f04d8209c4493057c7c096b4db3cde431accedfc2d4e465045bc7ef6"
Apr 16 15:17:40.540801 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:40.540782 2576 scope.go:117] "RemoveContainer" containerID="2f0a2a5a9c2010e8876e979e3a47e19fb06c2e1661b792d3214901af014a7eb1"
Apr 16 15:17:40.541084 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:17:40.541060 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f0a2a5a9c2010e8876e979e3a47e19fb06c2e1661b792d3214901af014a7eb1\": container with ID starting with 2f0a2a5a9c2010e8876e979e3a47e19fb06c2e1661b792d3214901af014a7eb1 not found: ID does not exist" containerID="2f0a2a5a9c2010e8876e979e3a47e19fb06c2e1661b792d3214901af014a7eb1"
Apr 16 15:17:40.541225 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:40.541090 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f0a2a5a9c2010e8876e979e3a47e19fb06c2e1661b792d3214901af014a7eb1"} err="failed to get container status \"2f0a2a5a9c2010e8876e979e3a47e19fb06c2e1661b792d3214901af014a7eb1\": rpc error: code = NotFound desc = could not find container \"2f0a2a5a9c2010e8876e979e3a47e19fb06c2e1661b792d3214901af014a7eb1\": container with ID starting with 2f0a2a5a9c2010e8876e979e3a47e19fb06c2e1661b792d3214901af014a7eb1 not found: ID does not exist"
Apr 16 15:17:40.541225 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:40.541110 2576 scope.go:117] "RemoveContainer" containerID="a1a45354f04d8209c4493057c7c096b4db3cde431accedfc2d4e465045bc7ef6"
Apr 16 15:17:40.541379 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:17:40.541357 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1a45354f04d8209c4493057c7c096b4db3cde431accedfc2d4e465045bc7ef6\": container with ID starting with a1a45354f04d8209c4493057c7c096b4db3cde431accedfc2d4e465045bc7ef6 not found: ID does not exist" containerID="a1a45354f04d8209c4493057c7c096b4db3cde431accedfc2d4e465045bc7ef6"
Apr 16 15:17:40.541420 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:40.541389 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1a45354f04d8209c4493057c7c096b4db3cde431accedfc2d4e465045bc7ef6"} err="failed to get container status \"a1a45354f04d8209c4493057c7c096b4db3cde431accedfc2d4e465045bc7ef6\": rpc error: code = NotFound desc = could not find container \"a1a45354f04d8209c4493057c7c096b4db3cde431accedfc2d4e465045bc7ef6\": container with ID starting with a1a45354f04d8209c4493057c7c096b4db3cde431accedfc2d4e465045bc7ef6 not found: ID does not exist"
Apr 16 15:17:40.552856 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:40.552834 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5"]
Apr 16 15:17:40.554866 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:40.554848 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-7kmd5"]
Apr 16 15:17:42.307573 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:42.307542 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c6a4913-0e35-48aa-be05-69b4e3b32a68" path="/var/lib/kubelet/pods/5c6a4913-0e35-48aa-be05-69b4e3b32a68/volumes"
Apr 16 15:17:43.539116 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:43.539086 2576 generic.go:358] "Generic (PLEG): container finished" podID="d4eff692-25d1-48e4-810c-a5325fabf194" containerID="7f6f803aba8531c13217feaec6eb0fe6956a973bc7477b3657956e35d7bc1325" exitCode=0
Apr 16 15:17:43.539434 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:43.539169 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7" event={"ID":"d4eff692-25d1-48e4-810c-a5325fabf194","Type":"ContainerDied","Data":"7f6f803aba8531c13217feaec6eb0fe6956a973bc7477b3657956e35d7bc1325"}
Apr 16 15:17:44.544467 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:44.544432 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7" event={"ID":"d4eff692-25d1-48e4-810c-a5325fabf194","Type":"ContainerStarted","Data":"e37efdd6d61017cf38a62c83f680492cb928eec6556cbe0b3cf1ed57bd2187a7"}
Apr 16 15:17:44.544941 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:44.544676 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7"
Apr 16 15:17:44.546110 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:44.546082 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7" podUID="d4eff692-25d1-48e4-810c-a5325fabf194" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 16 15:17:44.565697 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:44.565646 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7" podStartSLOduration=5.565634316 podStartE2EDuration="5.565634316s" podCreationTimestamp="2026-04-16 15:17:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:17:44.559782251 +0000 UTC m=+1498.978917226" watchObservedRunningTime="2026-04-16 15:17:44.565634316 +0000 UTC m=+1498.984769290"
Apr 16 15:17:45.548398 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:45.548355 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7" podUID="d4eff692-25d1-48e4-810c-a5325fabf194" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 16 15:17:46.303422 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:46.303389 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log"
Apr 16 15:17:46.305294 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:46.305272 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log"
Apr 16 15:17:46.307709 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:46.307686 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log"
Apr 16 15:17:46.309255 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:46.309236 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log"
Apr 16 15:17:55.550105 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:55.550069 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7"
Apr 16 15:17:59.410767 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:59.410679 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7"]
Apr 16 15:17:59.411235 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:59.410929 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7" podUID="d4eff692-25d1-48e4-810c-a5325fabf194" containerName="kserve-container" containerID="cri-o://e37efdd6d61017cf38a62c83f680492cb928eec6556cbe0b3cf1ed57bd2187a7" gracePeriod=30
Apr 16 15:17:59.441402 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:59.441361 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj"]
Apr 16 15:17:59.441784 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:59.441769 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c6a4913-0e35-48aa-be05-69b4e3b32a68" containerName="storage-initializer"
Apr 16 15:17:59.441784 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:59.441785 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c6a4913-0e35-48aa-be05-69b4e3b32a68" containerName="storage-initializer"
Apr 16 15:17:59.441872 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:59.441809 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c6a4913-0e35-48aa-be05-69b4e3b32a68" containerName="kserve-container"
Apr 16 15:17:59.441872 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:59.441815 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c6a4913-0e35-48aa-be05-69b4e3b32a68" containerName="kserve-container"
Apr 16 15:17:59.441933 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:59.441881 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c6a4913-0e35-48aa-be05-69b4e3b32a68" containerName="kserve-container"
Apr 16 15:17:59.446450 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:59.446428 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj"
Apr 16 15:17:59.452869 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:59.452837 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj"]
Apr 16 15:17:59.534569 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:59.534539 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ad3d097-ff49-4df7-8db1-ef1ca0ec8435-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj\" (UID: \"0ad3d097-ff49-4df7-8db1-ef1ca0ec8435\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj"
Apr 16 15:17:59.635688 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:59.635642 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ad3d097-ff49-4df7-8db1-ef1ca0ec8435-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj\" (UID: \"0ad3d097-ff49-4df7-8db1-ef1ca0ec8435\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj"
Apr 16 15:17:59.636127 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:59.636107 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ad3d097-ff49-4df7-8db1-ef1ca0ec8435-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj\" (UID: \"0ad3d097-ff49-4df7-8db1-ef1ca0ec8435\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj"
Apr 16 15:17:59.758093 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:59.757960 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj"
Apr 16 15:17:59.884965 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:17:59.884938 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj"]
Apr 16 15:17:59.923377 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:17:59.923341 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ad3d097_ff49_4df7_8db1_ef1ca0ec8435.slice/crio-c721c5d1111efb698b922ed4c48d32248389537962071b02fddbbe7b392b65e7 WatchSource:0}: Error finding container c721c5d1111efb698b922ed4c48d32248389537962071b02fddbbe7b392b65e7: Status 404 returned error can't find the container with id c721c5d1111efb698b922ed4c48d32248389537962071b02fddbbe7b392b65e7
Apr 16 15:18:00.042248 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:00.042226 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7"
Apr 16 15:18:00.139686 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:00.139649 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d4eff692-25d1-48e4-810c-a5325fabf194-kserve-provision-location\") pod \"d4eff692-25d1-48e4-810c-a5325fabf194\" (UID: \"d4eff692-25d1-48e4-810c-a5325fabf194\") "
Apr 16 15:18:00.139980 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:00.139959 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4eff692-25d1-48e4-810c-a5325fabf194-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d4eff692-25d1-48e4-810c-a5325fabf194" (UID: "d4eff692-25d1-48e4-810c-a5325fabf194"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:18:00.241259 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:00.241216 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d4eff692-25d1-48e4-810c-a5325fabf194-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\""
Apr 16 15:18:00.605162 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:00.605063 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj" event={"ID":"0ad3d097-ff49-4df7-8db1-ef1ca0ec8435","Type":"ContainerStarted","Data":"9f67c9c4fcb17d0665f7a8accc2d74e44b2dae7908e1282faa44179871ae4621"}
Apr 16 15:18:00.605162 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:00.605106 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj" event={"ID":"0ad3d097-ff49-4df7-8db1-ef1ca0ec8435","Type":"ContainerStarted","Data":"c721c5d1111efb698b922ed4c48d32248389537962071b02fddbbe7b392b65e7"}
Apr 16 15:18:00.606496 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:00.606471 2576 generic.go:358] "Generic (PLEG): container finished" podID="d4eff692-25d1-48e4-810c-a5325fabf194" containerID="e37efdd6d61017cf38a62c83f680492cb928eec6556cbe0b3cf1ed57bd2187a7" exitCode=0
Apr 16 15:18:00.606564 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:00.606513 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7" event={"ID":"d4eff692-25d1-48e4-810c-a5325fabf194","Type":"ContainerDied","Data":"e37efdd6d61017cf38a62c83f680492cb928eec6556cbe0b3cf1ed57bd2187a7"}
Apr 16 15:18:00.606564 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:00.606548 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7" event={"ID":"d4eff692-25d1-48e4-810c-a5325fabf194","Type":"ContainerDied","Data":"6a2136766e494d2837bdc9350625d6e9a27de29a8c49e1cc8075e556886f3b8e"}
Apr 16 15:18:00.606564 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:00.606556 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7"
Apr 16 15:18:00.606678 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:00.606565 2576 scope.go:117] "RemoveContainer" containerID="e37efdd6d61017cf38a62c83f680492cb928eec6556cbe0b3cf1ed57bd2187a7"
Apr 16 15:18:00.614718 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:00.614701 2576 scope.go:117] "RemoveContainer" containerID="7f6f803aba8531c13217feaec6eb0fe6956a973bc7477b3657956e35d7bc1325"
Apr 16 15:18:00.622215 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:00.622197 2576 scope.go:117] "RemoveContainer" containerID="e37efdd6d61017cf38a62c83f680492cb928eec6556cbe0b3cf1ed57bd2187a7"
Apr 16 15:18:00.622573 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:18:00.622546 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e37efdd6d61017cf38a62c83f680492cb928eec6556cbe0b3cf1ed57bd2187a7\": container with ID starting with e37efdd6d61017cf38a62c83f680492cb928eec6556cbe0b3cf1ed57bd2187a7 not found: ID does not exist" containerID="e37efdd6d61017cf38a62c83f680492cb928eec6556cbe0b3cf1ed57bd2187a7"
Apr 16 15:18:00.622661 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:00.622599 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e37efdd6d61017cf38a62c83f680492cb928eec6556cbe0b3cf1ed57bd2187a7"} err="failed to get container status \"e37efdd6d61017cf38a62c83f680492cb928eec6556cbe0b3cf1ed57bd2187a7\": rpc error: code = NotFound desc = could not find container \"e37efdd6d61017cf38a62c83f680492cb928eec6556cbe0b3cf1ed57bd2187a7\": container with ID starting with e37efdd6d61017cf38a62c83f680492cb928eec6556cbe0b3cf1ed57bd2187a7 not found: ID does not exist"
Apr 16 15:18:00.622661 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:00.622624 2576 scope.go:117] "RemoveContainer" containerID="7f6f803aba8531c13217feaec6eb0fe6956a973bc7477b3657956e35d7bc1325"
Apr 16 15:18:00.622890 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:18:00.622873 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f6f803aba8531c13217feaec6eb0fe6956a973bc7477b3657956e35d7bc1325\": container with ID starting with 7f6f803aba8531c13217feaec6eb0fe6956a973bc7477b3657956e35d7bc1325 not found: ID does not exist" containerID="7f6f803aba8531c13217feaec6eb0fe6956a973bc7477b3657956e35d7bc1325"
Apr 16 15:18:00.622933 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:00.622897 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f6f803aba8531c13217feaec6eb0fe6956a973bc7477b3657956e35d7bc1325"} err="failed to get container status \"7f6f803aba8531c13217feaec6eb0fe6956a973bc7477b3657956e35d7bc1325\": rpc error: code = NotFound desc = could not find container \"7f6f803aba8531c13217feaec6eb0fe6956a973bc7477b3657956e35d7bc1325\": container with ID starting with 7f6f803aba8531c13217feaec6eb0fe6956a973bc7477b3657956e35d7bc1325 not found: ID does not exist"
Apr 16 15:18:00.632851 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:00.632830 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7"]
Apr 16 15:18:00.636602 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:00.636581 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-bcss7"]
Apr 16 15:18:02.307567 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:02.307537 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4eff692-25d1-48e4-810c-a5325fabf194" path="/var/lib/kubelet/pods/d4eff692-25d1-48e4-810c-a5325fabf194/volumes"
Apr 16 15:18:04.625904 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:04.625870 2576 generic.go:358] "Generic (PLEG): container finished" podID="0ad3d097-ff49-4df7-8db1-ef1ca0ec8435" containerID="9f67c9c4fcb17d0665f7a8accc2d74e44b2dae7908e1282faa44179871ae4621" exitCode=0
Apr 16 15:18:04.626274 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:04.625940 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj" event={"ID":"0ad3d097-ff49-4df7-8db1-ef1ca0ec8435","Type":"ContainerDied","Data":"9f67c9c4fcb17d0665f7a8accc2d74e44b2dae7908e1282faa44179871ae4621"}
Apr 16 15:18:05.631147 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:05.631108 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj" event={"ID":"0ad3d097-ff49-4df7-8db1-ef1ca0ec8435","Type":"ContainerStarted","Data":"57a7000e2cfe93d19567af0a43bf90ab13babd34db7b62bf12fdbe1bd159f215"}
Apr 16 15:18:05.631566 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:05.631318 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj"
Apr 16 15:18:05.648372 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:05.648316 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj" podStartSLOduration=6.648300806 podStartE2EDuration="6.648300806s" podCreationTimestamp="2026-04-16 15:17:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:18:05.646404592 +0000 UTC m=+1520.065539567" watchObservedRunningTime="2026-04-16 15:18:05.648300806 +0000 UTC m=+1520.067435781"
Apr 16 15:18:36.639708 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:36.639672 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj"
Apr 16 15:18:39.555891 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:39.555861 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj"]
Apr 16 15:18:39.556302 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:39.556197 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj" podUID="0ad3d097-ff49-4df7-8db1-ef1ca0ec8435" containerName="kserve-container" containerID="cri-o://57a7000e2cfe93d19567af0a43bf90ab13babd34db7b62bf12fdbe1bd159f215" gracePeriod=30
Apr 16 15:18:39.618541 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:39.618505 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx"]
Apr 16 15:18:39.619110 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:39.619087 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d4eff692-25d1-48e4-810c-a5325fabf194" containerName="storage-initializer"
Apr 16 15:18:39.619110 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:39.619106 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4eff692-25d1-48e4-810c-a5325fabf194" containerName="storage-initializer"
Apr 16 15:18:39.619216 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:39.619141 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d4eff692-25d1-48e4-810c-a5325fabf194" containerName="kserve-container"
Apr 16 15:18:39.619216 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:39.619150 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4eff692-25d1-48e4-810c-a5325fabf194" containerName="kserve-container"
Apr 16 15:18:39.619286 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:39.619251 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d4eff692-25d1-48e4-810c-a5325fabf194" containerName="kserve-container"
Apr 16 15:18:39.622928 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:39.622906 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx"
Apr 16 15:18:39.633478 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:39.633453 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx"]
Apr 16 15:18:39.800379 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:39.800326 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08962624-179d-4b46-a7fd-c1ceece189f9-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-65975766bf-cz4hx\" (UID: \"08962624-179d-4b46-a7fd-c1ceece189f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx"
Apr 16 15:18:39.901087 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:39.901055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08962624-179d-4b46-a7fd-c1ceece189f9-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-65975766bf-cz4hx\" (UID: \"08962624-179d-4b46-a7fd-c1ceece189f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx"
Apr 16 15:18:39.901545 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:39.901516 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08962624-179d-4b46-a7fd-c1ceece189f9-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-65975766bf-cz4hx\" (UID: \"08962624-179d-4b46-a7fd-c1ceece189f9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx"
Apr 16 15:18:39.936529 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:39.936489 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx"
Apr 16 15:18:40.065963 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:40.065935 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx"]
Apr 16 15:18:40.068757 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:18:40.068691 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08962624_179d_4b46_a7fd_c1ceece189f9.slice/crio-566cfc3147615818f6574b8eb16fbe27e6b0cb84f8f52546cd7d2a44f98c500d WatchSource:0}: Error finding container 566cfc3147615818f6574b8eb16fbe27e6b0cb84f8f52546cd7d2a44f98c500d: Status 404 returned error can't find the container with id 566cfc3147615818f6574b8eb16fbe27e6b0cb84f8f52546cd7d2a44f98c500d
Apr 16 15:18:40.070885 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:40.070865 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 15:18:40.759310 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:40.759268 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx" event={"ID":"08962624-179d-4b46-a7fd-c1ceece189f9","Type":"ContainerStarted","Data":"89b0f37e17719bcec67c033d75d898c48e960b7e4fd448509e7f9e139a4d39a2"}
Apr 16 15:18:40.759687 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:40.759318 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx" event={"ID":"08962624-179d-4b46-a7fd-c1ceece189f9","Type":"ContainerStarted","Data":"566cfc3147615818f6574b8eb16fbe27e6b0cb84f8f52546cd7d2a44f98c500d"}
Apr 16 15:18:40.894085 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:40.894061 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj"
Apr 16 15:18:41.011332 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:41.011239 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ad3d097-ff49-4df7-8db1-ef1ca0ec8435-kserve-provision-location\") pod \"0ad3d097-ff49-4df7-8db1-ef1ca0ec8435\" (UID: \"0ad3d097-ff49-4df7-8db1-ef1ca0ec8435\") "
Apr 16 15:18:41.011610 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:41.011585 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ad3d097-ff49-4df7-8db1-ef1ca0ec8435-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0ad3d097-ff49-4df7-8db1-ef1ca0ec8435" (UID: "0ad3d097-ff49-4df7-8db1-ef1ca0ec8435"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:18:41.112242 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:41.112186 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ad3d097-ff49-4df7-8db1-ef1ca0ec8435-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\""
Apr 16 15:18:41.765685 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:41.765650 2576 generic.go:358] "Generic (PLEG): container finished" podID="0ad3d097-ff49-4df7-8db1-ef1ca0ec8435" containerID="57a7000e2cfe93d19567af0a43bf90ab13babd34db7b62bf12fdbe1bd159f215" exitCode=0
Apr 16 15:18:41.766134 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:41.765719 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj"
Apr 16 15:18:41.766134 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:41.765718 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj" event={"ID":"0ad3d097-ff49-4df7-8db1-ef1ca0ec8435","Type":"ContainerDied","Data":"57a7000e2cfe93d19567af0a43bf90ab13babd34db7b62bf12fdbe1bd159f215"}
Apr 16 15:18:41.766134 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:41.765761 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj" event={"ID":"0ad3d097-ff49-4df7-8db1-ef1ca0ec8435","Type":"ContainerDied","Data":"c721c5d1111efb698b922ed4c48d32248389537962071b02fddbbe7b392b65e7"}
Apr 16 15:18:41.766134 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:41.765780 2576 scope.go:117] "RemoveContainer" containerID="57a7000e2cfe93d19567af0a43bf90ab13babd34db7b62bf12fdbe1bd159f215"
Apr 16 15:18:41.774468 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:41.774299 2576 scope.go:117] "RemoveContainer" containerID="9f67c9c4fcb17d0665f7a8accc2d74e44b2dae7908e1282faa44179871ae4621"
Apr 16 15:18:41.781874 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:41.781851 2576 scope.go:117] "RemoveContainer" containerID="57a7000e2cfe93d19567af0a43bf90ab13babd34db7b62bf12fdbe1bd159f215"
Apr 16 15:18:41.782200 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:18:41.782181 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57a7000e2cfe93d19567af0a43bf90ab13babd34db7b62bf12fdbe1bd159f215\": container with ID starting with 57a7000e2cfe93d19567af0a43bf90ab13babd34db7b62bf12fdbe1bd159f215 not found: ID does not exist" containerID="57a7000e2cfe93d19567af0a43bf90ab13babd34db7b62bf12fdbe1bd159f215"
Apr 16 15:18:41.782256 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:41.782209 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57a7000e2cfe93d19567af0a43bf90ab13babd34db7b62bf12fdbe1bd159f215"} err="failed to get container status \"57a7000e2cfe93d19567af0a43bf90ab13babd34db7b62bf12fdbe1bd159f215\": rpc error: code = NotFound desc = could not find container \"57a7000e2cfe93d19567af0a43bf90ab13babd34db7b62bf12fdbe1bd159f215\": container with ID starting with 57a7000e2cfe93d19567af0a43bf90ab13babd34db7b62bf12fdbe1bd159f215 not found: ID does not exist"
Apr 16 15:18:41.782256 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:41.782230 2576 scope.go:117] "RemoveContainer" containerID="9f67c9c4fcb17d0665f7a8accc2d74e44b2dae7908e1282faa44179871ae4621"
Apr 16 15:18:41.782467 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:18:41.782451 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f67c9c4fcb17d0665f7a8accc2d74e44b2dae7908e1282faa44179871ae4621\": container with ID starting with 9f67c9c4fcb17d0665f7a8accc2d74e44b2dae7908e1282faa44179871ae4621 not found: ID does not exist" containerID="9f67c9c4fcb17d0665f7a8accc2d74e44b2dae7908e1282faa44179871ae4621"
Apr 16 15:18:41.782509 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:41.782471 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f67c9c4fcb17d0665f7a8accc2d74e44b2dae7908e1282faa44179871ae4621"} err="failed to get container status \"9f67c9c4fcb17d0665f7a8accc2d74e44b2dae7908e1282faa44179871ae4621\": rpc error: code = NotFound desc = could not find container \"9f67c9c4fcb17d0665f7a8accc2d74e44b2dae7908e1282faa44179871ae4621\": container with ID starting with 9f67c9c4fcb17d0665f7a8accc2d74e44b2dae7908e1282faa44179871ae4621 not found: ID does not exist"
Apr 16 15:18:41.786558 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:41.786533 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj"]
Apr 16 15:18:41.789900 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:41.789868 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-lxbwj"]
Apr 16 15:18:42.308305 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:42.308270 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ad3d097-ff49-4df7-8db1-ef1ca0ec8435" path="/var/lib/kubelet/pods/0ad3d097-ff49-4df7-8db1-ef1ca0ec8435/volumes"
Apr 16 15:18:44.778983 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:44.778947 2576 generic.go:358] "Generic (PLEG): container finished" podID="08962624-179d-4b46-a7fd-c1ceece189f9" containerID="89b0f37e17719bcec67c033d75d898c48e960b7e4fd448509e7f9e139a4d39a2" exitCode=0
Apr 16 15:18:44.779401 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:44.779030 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx" event={"ID":"08962624-179d-4b46-a7fd-c1ceece189f9","Type":"ContainerDied","Data":"89b0f37e17719bcec67c033d75d898c48e960b7e4fd448509e7f9e139a4d39a2"}
Apr 16 15:18:45.788682 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:45.788644 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx" event={"ID":"08962624-179d-4b46-a7fd-c1ceece189f9","Type":"ContainerStarted","Data":"0cd77457e0f6c26f2bfa28e00f7dfed663db9e81b726f49b1b5f126e86a253b4"}
Apr 16 15:18:48.800905 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:48.800871 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx" event={"ID":"08962624-179d-4b46-a7fd-c1ceece189f9","Type":"ContainerStarted","Data":"427b83292572dccb268bd76c046917e7f8290ac0ada6b7ad35c684f0e0740934"}
Apr 16 15:18:48.801533 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:48.800934 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx"
Apr 16 15:18:48.801533 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:48.800980 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx"
Apr 16 15:18:48.818618 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:18:48.818563 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx" podStartSLOduration=6.448109296 podStartE2EDuration="9.818547692s" podCreationTimestamp="2026-04-16 15:18:39 +0000 UTC" firstStartedPulling="2026-04-16 15:18:44.84630019 +0000 UTC m=+1559.265435143" lastFinishedPulling="2026-04-16 15:18:48.216738572 +0000 UTC m=+1562.635873539" observedRunningTime="2026-04-16 15:18:48.816252643 +0000 UTC m=+1563.235387618" watchObservedRunningTime="2026-04-16 15:18:48.818547692 +0000 UTC m=+1563.237682667"
Apr 16 15:19:19.808296 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:19:19.808261 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx"
Apr 16 15:19:49.809463 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:19:49.809379 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx"
Apr 16 15:19:59.709749 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:19:59.709704 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx"]
Apr 16 15:19:59.710194 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:19:59.710036 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx" podUID="08962624-179d-4b46-a7fd-c1ceece189f9" containerName="kserve-container" containerID="cri-o://0cd77457e0f6c26f2bfa28e00f7dfed663db9e81b726f49b1b5f126e86a253b4" gracePeriod=30
Apr 16 15:19:59.710194 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:19:59.710076 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx" podUID="08962624-179d-4b46-a7fd-c1ceece189f9" containerName="kserve-agent" containerID="cri-o://427b83292572dccb268bd76c046917e7f8290ac0ada6b7ad35c684f0e0740934" gracePeriod=30
Apr 16 15:19:59.754140 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:19:59.754098 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds"]
Apr 16 15:19:59.754672 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:19:59.754646 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ad3d097-ff49-4df7-8db1-ef1ca0ec8435" containerName="kserve-container"
Apr 16 15:19:59.754672 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:19:59.754669 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ad3d097-ff49-4df7-8db1-ef1ca0ec8435" containerName="kserve-container"
Apr 16 15:19:59.754846 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:19:59.754683 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ad3d097-ff49-4df7-8db1-ef1ca0ec8435" containerName="storage-initializer"
Apr 16 15:19:59.754846 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:19:59.754691 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ad3d097-ff49-4df7-8db1-ef1ca0ec8435" containerName="storage-initializer"
Apr 16 15:19:59.754846 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:19:59.754794 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0ad3d097-ff49-4df7-8db1-ef1ca0ec8435" containerName="kserve-container"
Apr 16 15:19:59.758458 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:19:59.758436 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds"
Apr 16 15:19:59.767635 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:19:59.767609 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds"]
Apr 16 15:19:59.805787 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:19:59.805750 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx" podUID="08962624-179d-4b46-a7fd-c1ceece189f9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.41:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 16 15:19:59.928330 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:19:59.928289 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/effd37f5-5b3f-4197-80c0-c972991f7394-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-h6xds\" (UID: \"effd37f5-5b3f-4197-80c0-c972991f7394\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds"
Apr 16 15:20:00.029768 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:00.029667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/effd37f5-5b3f-4197-80c0-c972991f7394-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-h6xds\" (UID: \"effd37f5-5b3f-4197-80c0-c972991f7394\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds"
Apr 16 15:20:00.030135 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:00.030114 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/effd37f5-5b3f-4197-80c0-c972991f7394-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-h6xds\" (UID: 
\"effd37f5-5b3f-4197-80c0-c972991f7394\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds" Apr 16 15:20:00.070557 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:00.070526 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds" Apr 16 15:20:00.197501 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:00.197202 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds"] Apr 16 15:20:01.064306 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:01.064269 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds" event={"ID":"effd37f5-5b3f-4197-80c0-c972991f7394","Type":"ContainerStarted","Data":"feed2acb898896debece4120e5425a1f6591b242d03d6d4e240000c22120bb2e"} Apr 16 15:20:01.064306 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:01.064311 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds" event={"ID":"effd37f5-5b3f-4197-80c0-c972991f7394","Type":"ContainerStarted","Data":"67f781dfb4a6b3a007f223feffcdad71853f01f7379882639ed432e5c41c93bf"} Apr 16 15:20:02.069833 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:02.069748 2576 generic.go:358] "Generic (PLEG): container finished" podID="08962624-179d-4b46-a7fd-c1ceece189f9" containerID="0cd77457e0f6c26f2bfa28e00f7dfed663db9e81b726f49b1b5f126e86a253b4" exitCode=0 Apr 16 15:20:02.069833 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:02.069820 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx" event={"ID":"08962624-179d-4b46-a7fd-c1ceece189f9","Type":"ContainerDied","Data":"0cd77457e0f6c26f2bfa28e00f7dfed663db9e81b726f49b1b5f126e86a253b4"} Apr 16 15:20:05.081804 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:05.081772 2576 generic.go:358] "Generic (PLEG): container 
finished" podID="effd37f5-5b3f-4197-80c0-c972991f7394" containerID="feed2acb898896debece4120e5425a1f6591b242d03d6d4e240000c22120bb2e" exitCode=0 Apr 16 15:20:05.082195 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:05.081819 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds" event={"ID":"effd37f5-5b3f-4197-80c0-c972991f7394","Type":"ContainerDied","Data":"feed2acb898896debece4120e5425a1f6591b242d03d6d4e240000c22120bb2e"} Apr 16 15:20:09.805222 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:09.805180 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx" podUID="08962624-179d-4b46-a7fd-c1ceece189f9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.41:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.41:8080: connect: connection refused" Apr 16 15:20:18.139471 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:18.139436 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds" event={"ID":"effd37f5-5b3f-4197-80c0-c972991f7394","Type":"ContainerStarted","Data":"1fe6605b5878289286f8b170c40ec87bc4b9ecbd0388cb97f0f1beb188e76503"} Apr 16 15:20:18.139871 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:18.139750 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds" Apr 16 15:20:18.141271 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:18.141239 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds" podUID="effd37f5-5b3f-4197-80c0-c972991f7394" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 15:20:18.156289 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:18.156244 2576 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds" podStartSLOduration=6.962444786 podStartE2EDuration="19.156232002s" podCreationTimestamp="2026-04-16 15:19:59 +0000 UTC" firstStartedPulling="2026-04-16 15:20:05.08305063 +0000 UTC m=+1639.502185587" lastFinishedPulling="2026-04-16 15:20:17.276837846 +0000 UTC m=+1651.695972803" observedRunningTime="2026-04-16 15:20:18.153966714 +0000 UTC m=+1652.573101702" watchObservedRunningTime="2026-04-16 15:20:18.156232002 +0000 UTC m=+1652.575366977" Apr 16 15:20:19.143985 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:19.143945 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds" podUID="effd37f5-5b3f-4197-80c0-c972991f7394" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 15:20:19.805460 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:19.805417 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx" podUID="08962624-179d-4b46-a7fd-c1ceece189f9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.41:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.41:8080: connect: connection refused" Apr 16 15:20:19.805651 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:19.805535 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx" Apr 16 15:20:29.144033 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:29.143987 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds" podUID="effd37f5-5b3f-4197-80c0-c972991f7394" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 15:20:29.876998 ip-10-0-129-76 kubenswrapper[2576]: I0416 
15:20:29.876975 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx" Apr 16 15:20:30.011947 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:30.011846 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08962624-179d-4b46-a7fd-c1ceece189f9-kserve-provision-location\") pod \"08962624-179d-4b46-a7fd-c1ceece189f9\" (UID: \"08962624-179d-4b46-a7fd-c1ceece189f9\") " Apr 16 15:20:30.012227 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:30.012204 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08962624-179d-4b46-a7fd-c1ceece189f9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "08962624-179d-4b46-a7fd-c1ceece189f9" (UID: "08962624-179d-4b46-a7fd-c1ceece189f9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:20:30.112560 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:30.112515 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08962624-179d-4b46-a7fd-c1ceece189f9-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:20:30.185145 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:30.185104 2576 generic.go:358] "Generic (PLEG): container finished" podID="08962624-179d-4b46-a7fd-c1ceece189f9" containerID="427b83292572dccb268bd76c046917e7f8290ac0ada6b7ad35c684f0e0740934" exitCode=137 Apr 16 15:20:30.185552 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:30.185152 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx" 
event={"ID":"08962624-179d-4b46-a7fd-c1ceece189f9","Type":"ContainerDied","Data":"427b83292572dccb268bd76c046917e7f8290ac0ada6b7ad35c684f0e0740934"} Apr 16 15:20:30.185552 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:30.185192 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx" event={"ID":"08962624-179d-4b46-a7fd-c1ceece189f9","Type":"ContainerDied","Data":"566cfc3147615818f6574b8eb16fbe27e6b0cb84f8f52546cd7d2a44f98c500d"} Apr 16 15:20:30.185552 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:30.185194 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx" Apr 16 15:20:30.185552 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:30.185208 2576 scope.go:117] "RemoveContainer" containerID="427b83292572dccb268bd76c046917e7f8290ac0ada6b7ad35c684f0e0740934" Apr 16 15:20:30.194090 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:30.194072 2576 scope.go:117] "RemoveContainer" containerID="0cd77457e0f6c26f2bfa28e00f7dfed663db9e81b726f49b1b5f126e86a253b4" Apr 16 15:20:30.201506 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:30.201486 2576 scope.go:117] "RemoveContainer" containerID="89b0f37e17719bcec67c033d75d898c48e960b7e4fd448509e7f9e139a4d39a2" Apr 16 15:20:30.212059 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:30.210085 2576 scope.go:117] "RemoveContainer" containerID="427b83292572dccb268bd76c046917e7f8290ac0ada6b7ad35c684f0e0740934" Apr 16 15:20:30.212059 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:20:30.210418 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"427b83292572dccb268bd76c046917e7f8290ac0ada6b7ad35c684f0e0740934\": container with ID starting with 427b83292572dccb268bd76c046917e7f8290ac0ada6b7ad35c684f0e0740934 not found: ID does not exist" 
containerID="427b83292572dccb268bd76c046917e7f8290ac0ada6b7ad35c684f0e0740934" Apr 16 15:20:30.212059 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:30.210472 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"427b83292572dccb268bd76c046917e7f8290ac0ada6b7ad35c684f0e0740934"} err="failed to get container status \"427b83292572dccb268bd76c046917e7f8290ac0ada6b7ad35c684f0e0740934\": rpc error: code = NotFound desc = could not find container \"427b83292572dccb268bd76c046917e7f8290ac0ada6b7ad35c684f0e0740934\": container with ID starting with 427b83292572dccb268bd76c046917e7f8290ac0ada6b7ad35c684f0e0740934 not found: ID does not exist" Apr 16 15:20:30.212059 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:30.210499 2576 scope.go:117] "RemoveContainer" containerID="0cd77457e0f6c26f2bfa28e00f7dfed663db9e81b726f49b1b5f126e86a253b4" Apr 16 15:20:30.212059 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:30.210590 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx"] Apr 16 15:20:30.212059 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:20:30.210757 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cd77457e0f6c26f2bfa28e00f7dfed663db9e81b726f49b1b5f126e86a253b4\": container with ID starting with 0cd77457e0f6c26f2bfa28e00f7dfed663db9e81b726f49b1b5f126e86a253b4 not found: ID does not exist" containerID="0cd77457e0f6c26f2bfa28e00f7dfed663db9e81b726f49b1b5f126e86a253b4" Apr 16 15:20:30.212059 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:30.210793 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cd77457e0f6c26f2bfa28e00f7dfed663db9e81b726f49b1b5f126e86a253b4"} err="failed to get container status \"0cd77457e0f6c26f2bfa28e00f7dfed663db9e81b726f49b1b5f126e86a253b4\": rpc error: code = NotFound desc = could not find container 
\"0cd77457e0f6c26f2bfa28e00f7dfed663db9e81b726f49b1b5f126e86a253b4\": container with ID starting with 0cd77457e0f6c26f2bfa28e00f7dfed663db9e81b726f49b1b5f126e86a253b4 not found: ID does not exist" Apr 16 15:20:30.212059 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:30.210813 2576 scope.go:117] "RemoveContainer" containerID="89b0f37e17719bcec67c033d75d898c48e960b7e4fd448509e7f9e139a4d39a2" Apr 16 15:20:30.212059 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:20:30.211077 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89b0f37e17719bcec67c033d75d898c48e960b7e4fd448509e7f9e139a4d39a2\": container with ID starting with 89b0f37e17719bcec67c033d75d898c48e960b7e4fd448509e7f9e139a4d39a2 not found: ID does not exist" containerID="89b0f37e17719bcec67c033d75d898c48e960b7e4fd448509e7f9e139a4d39a2" Apr 16 15:20:30.212059 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:30.211111 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89b0f37e17719bcec67c033d75d898c48e960b7e4fd448509e7f9e139a4d39a2"} err="failed to get container status \"89b0f37e17719bcec67c033d75d898c48e960b7e4fd448509e7f9e139a4d39a2\": rpc error: code = NotFound desc = could not find container \"89b0f37e17719bcec67c033d75d898c48e960b7e4fd448509e7f9e139a4d39a2\": container with ID starting with 89b0f37e17719bcec67c033d75d898c48e960b7e4fd448509e7f9e139a4d39a2 not found: ID does not exist" Apr 16 15:20:30.214188 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:30.214160 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx"] Apr 16 15:20:30.307370 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:30.307287 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08962624-179d-4b46-a7fd-c1ceece189f9" path="/var/lib/kubelet/pods/08962624-179d-4b46-a7fd-c1ceece189f9/volumes" Apr 16 15:20:30.805773 ip-10-0-129-76 
kubenswrapper[2576]: I0416 15:20:30.805725 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx" podUID="08962624-179d-4b46-a7fd-c1ceece189f9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.41:8080/v1/models/isvc-sklearn-mcp\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 16 15:20:30.809878 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:30.809850 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-65975766bf-cz4hx" podUID="08962624-179d-4b46-a7fd-c1ceece189f9" containerName="kserve-agent" probeResult="failure" output="dial tcp 10.134.0.41:9081: i/o timeout" Apr 16 15:20:39.144128 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:39.144085 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds" podUID="effd37f5-5b3f-4197-80c0-c972991f7394" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 15:20:49.144605 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:49.144561 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds" podUID="effd37f5-5b3f-4197-80c0-c972991f7394" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 15:20:59.144715 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:20:59.144604 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds" podUID="effd37f5-5b3f-4197-80c0-c972991f7394" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 15:21:09.145927 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:21:09.145894 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds" Apr 16 15:21:11.284986 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:21:11.284943 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds"] Apr 16 15:21:11.285367 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:21:11.285195 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds" podUID="effd37f5-5b3f-4197-80c0-c972991f7394" containerName="kserve-container" containerID="cri-o://1fe6605b5878289286f8b170c40ec87bc4b9ecbd0388cb97f0f1beb188e76503" gracePeriod=30 Apr 16 15:21:14.129541 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:21:14.129515 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds" Apr 16 15:21:14.196654 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:21:14.196579 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/effd37f5-5b3f-4197-80c0-c972991f7394-kserve-provision-location\") pod \"effd37f5-5b3f-4197-80c0-c972991f7394\" (UID: \"effd37f5-5b3f-4197-80c0-c972991f7394\") " Apr 16 15:21:14.206368 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:21:14.206336 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/effd37f5-5b3f-4197-80c0-c972991f7394-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "effd37f5-5b3f-4197-80c0-c972991f7394" (UID: "effd37f5-5b3f-4197-80c0-c972991f7394"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:21:14.297493 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:21:14.297452 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/effd37f5-5b3f-4197-80c0-c972991f7394-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:21:14.337175 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:21:14.337135 2576 generic.go:358] "Generic (PLEG): container finished" podID="effd37f5-5b3f-4197-80c0-c972991f7394" containerID="1fe6605b5878289286f8b170c40ec87bc4b9ecbd0388cb97f0f1beb188e76503" exitCode=0 Apr 16 15:21:14.337311 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:21:14.337198 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds" Apr 16 15:21:14.337311 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:21:14.337225 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds" event={"ID":"effd37f5-5b3f-4197-80c0-c972991f7394","Type":"ContainerDied","Data":"1fe6605b5878289286f8b170c40ec87bc4b9ecbd0388cb97f0f1beb188e76503"} Apr 16 15:21:14.337311 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:21:14.337267 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds" event={"ID":"effd37f5-5b3f-4197-80c0-c972991f7394","Type":"ContainerDied","Data":"67f781dfb4a6b3a007f223feffcdad71853f01f7379882639ed432e5c41c93bf"} Apr 16 15:21:14.337311 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:21:14.337286 2576 scope.go:117] "RemoveContainer" containerID="1fe6605b5878289286f8b170c40ec87bc4b9ecbd0388cb97f0f1beb188e76503" Apr 16 15:21:14.345839 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:21:14.345820 2576 scope.go:117] "RemoveContainer" containerID="feed2acb898896debece4120e5425a1f6591b242d03d6d4e240000c22120bb2e" Apr 16 
15:21:14.352180 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:21:14.352159 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds"] Apr 16 15:21:14.353660 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:21:14.353642 2576 scope.go:117] "RemoveContainer" containerID="1fe6605b5878289286f8b170c40ec87bc4b9ecbd0388cb97f0f1beb188e76503" Apr 16 15:21:14.354047 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:21:14.353995 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fe6605b5878289286f8b170c40ec87bc4b9ecbd0388cb97f0f1beb188e76503\": container with ID starting with 1fe6605b5878289286f8b170c40ec87bc4b9ecbd0388cb97f0f1beb188e76503 not found: ID does not exist" containerID="1fe6605b5878289286f8b170c40ec87bc4b9ecbd0388cb97f0f1beb188e76503" Apr 16 15:21:14.354154 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:21:14.354047 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fe6605b5878289286f8b170c40ec87bc4b9ecbd0388cb97f0f1beb188e76503"} err="failed to get container status \"1fe6605b5878289286f8b170c40ec87bc4b9ecbd0388cb97f0f1beb188e76503\": rpc error: code = NotFound desc = could not find container \"1fe6605b5878289286f8b170c40ec87bc4b9ecbd0388cb97f0f1beb188e76503\": container with ID starting with 1fe6605b5878289286f8b170c40ec87bc4b9ecbd0388cb97f0f1beb188e76503 not found: ID does not exist" Apr 16 15:21:14.354154 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:21:14.354072 2576 scope.go:117] "RemoveContainer" containerID="feed2acb898896debece4120e5425a1f6591b242d03d6d4e240000c22120bb2e" Apr 16 15:21:14.354399 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:21:14.354347 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feed2acb898896debece4120e5425a1f6591b242d03d6d4e240000c22120bb2e\": container with ID starting with 
feed2acb898896debece4120e5425a1f6591b242d03d6d4e240000c22120bb2e not found: ID does not exist" containerID="feed2acb898896debece4120e5425a1f6591b242d03d6d4e240000c22120bb2e" Apr 16 15:21:14.354399 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:21:14.354383 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feed2acb898896debece4120e5425a1f6591b242d03d6d4e240000c22120bb2e"} err="failed to get container status \"feed2acb898896debece4120e5425a1f6591b242d03d6d4e240000c22120bb2e\": rpc error: code = NotFound desc = could not find container \"feed2acb898896debece4120e5425a1f6591b242d03d6d4e240000c22120bb2e\": container with ID starting with feed2acb898896debece4120e5425a1f6591b242d03d6d4e240000c22120bb2e not found: ID does not exist" Apr 16 15:21:14.355957 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:21:14.355940 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-h6xds"] Apr 16 15:21:16.307733 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:21:16.307701 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="effd37f5-5b3f-4197-80c0-c972991f7394" path="/var/lib/kubelet/pods/effd37f5-5b3f-4197-80c0-c972991f7394/volumes" Apr 16 15:22:46.332526 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:22:46.332497 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log" Apr 16 15:22:46.334795 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:22:46.334776 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log" Apr 16 15:22:46.334925 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:22:46.334779 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log" Apr 16 15:22:46.337050 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:22:46.337009 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log" Apr 16 15:27:46.357666 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:46.357635 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log" Apr 16 15:27:46.359961 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:46.359936 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log" Apr 16 15:27:46.361966 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:46.361947 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log" Apr 16 15:27:46.364227 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:46.364208 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log" Apr 16 15:27:58.081433 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.081399 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv"] Apr 16 15:27:58.081891 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.081873 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08962624-179d-4b46-a7fd-c1ceece189f9" containerName="kserve-container" Apr 16 15:27:58.081940 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.081893 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="08962624-179d-4b46-a7fd-c1ceece189f9" containerName="kserve-container"
Apr 16 15:27:58.081940 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.081917 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="effd37f5-5b3f-4197-80c0-c972991f7394" containerName="storage-initializer"
Apr 16 15:27:58.081940 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.081923 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="effd37f5-5b3f-4197-80c0-c972991f7394" containerName="storage-initializer"
Apr 16 15:27:58.081940 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.081931 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="effd37f5-5b3f-4197-80c0-c972991f7394" containerName="kserve-container"
Apr 16 15:27:58.081940 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.081937 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="effd37f5-5b3f-4197-80c0-c972991f7394" containerName="kserve-container"
Apr 16 15:27:58.082108 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.081947 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08962624-179d-4b46-a7fd-c1ceece189f9" containerName="storage-initializer"
Apr 16 15:27:58.082108 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.081952 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="08962624-179d-4b46-a7fd-c1ceece189f9" containerName="storage-initializer"
Apr 16 15:27:58.082108 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.081958 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08962624-179d-4b46-a7fd-c1ceece189f9" containerName="kserve-agent"
Apr 16 15:27:58.082108 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.081963 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="08962624-179d-4b46-a7fd-c1ceece189f9" containerName="kserve-agent"
Apr 16 15:27:58.082108 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.082047 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="08962624-179d-4b46-a7fd-c1ceece189f9" containerName="kserve-container"
Apr 16 15:27:58.082108 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.082058 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="effd37f5-5b3f-4197-80c0-c972991f7394" containerName="kserve-container"
Apr 16 15:27:58.082108 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.082069 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="08962624-179d-4b46-a7fd-c1ceece189f9" containerName="kserve-agent"
Apr 16 15:27:58.085285 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.085266 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv"
Apr 16 15:27:58.087501 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.087475 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tsb5d\""
Apr 16 15:27:58.096471 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.096450 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv"]
Apr 16 15:27:58.198679 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.198641 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4af418a7-7442-4200-84d5-558d218343ec-kserve-provision-location\") pod \"isvc-primary-21dbf3-predictor-6bd6957cff-swgrv\" (UID: \"4af418a7-7442-4200-84d5-558d218343ec\") " pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv"
Apr 16 15:27:58.299927 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.299895 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4af418a7-7442-4200-84d5-558d218343ec-kserve-provision-location\") pod \"isvc-primary-21dbf3-predictor-6bd6957cff-swgrv\" (UID: \"4af418a7-7442-4200-84d5-558d218343ec\") " pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv"
Apr 16 15:27:58.300244 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.300221 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4af418a7-7442-4200-84d5-558d218343ec-kserve-provision-location\") pod \"isvc-primary-21dbf3-predictor-6bd6957cff-swgrv\" (UID: \"4af418a7-7442-4200-84d5-558d218343ec\") " pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv"
Apr 16 15:27:58.397415 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.397330 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv"
Apr 16 15:27:58.518773 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.518735 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv"]
Apr 16 15:27:58.521713 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:27:58.521675 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4af418a7_7442_4200_84d5_558d218343ec.slice/crio-05bbfd459d090b992a5f6288f72f262739784bcc9173a31927d56aa5625f1b60 WatchSource:0}: Error finding container 05bbfd459d090b992a5f6288f72f262739784bcc9173a31927d56aa5625f1b60: Status 404 returned error can't find the container with id 05bbfd459d090b992a5f6288f72f262739784bcc9173a31927d56aa5625f1b60
Apr 16 15:27:58.523558 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.523540 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 15:27:58.732891 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.732839 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv" event={"ID":"4af418a7-7442-4200-84d5-558d218343ec","Type":"ContainerStarted","Data":"4678f9649db50d82cfa6875480bacc59862c74cfb443061fc1ee90786eab4de2"}
Apr 16 15:27:58.732891 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:27:58.732894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv" event={"ID":"4af418a7-7442-4200-84d5-558d218343ec","Type":"ContainerStarted","Data":"05bbfd459d090b992a5f6288f72f262739784bcc9173a31927d56aa5625f1b60"}
Apr 16 15:28:02.748073 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:28:02.748010 2576 generic.go:358] "Generic (PLEG): container finished" podID="4af418a7-7442-4200-84d5-558d218343ec" containerID="4678f9649db50d82cfa6875480bacc59862c74cfb443061fc1ee90786eab4de2" exitCode=0
Apr 16 15:28:02.748448 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:28:02.748090 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv" event={"ID":"4af418a7-7442-4200-84d5-558d218343ec","Type":"ContainerDied","Data":"4678f9649db50d82cfa6875480bacc59862c74cfb443061fc1ee90786eab4de2"}
Apr 16 15:28:03.753353 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:28:03.753316 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv" event={"ID":"4af418a7-7442-4200-84d5-558d218343ec","Type":"ContainerStarted","Data":"aebae36f9a4f26386fd6b56ea31a47848e0be1e97dbde5015a2678657302f424"}
Apr 16 15:28:03.753728 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:28:03.753623 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv"
Apr 16 15:28:03.755070 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:28:03.755044 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv" podUID="4af418a7-7442-4200-84d5-558d218343ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 16 15:28:03.770549 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:28:03.770507 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv" podStartSLOduration=5.770493715 podStartE2EDuration="5.770493715s" podCreationTimestamp="2026-04-16 15:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:28:03.767767263 +0000 UTC m=+2118.186902238" watchObservedRunningTime="2026-04-16 15:28:03.770493715 +0000 UTC m=+2118.189628689"
Apr 16 15:28:04.757622 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:28:04.757575 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv" podUID="4af418a7-7442-4200-84d5-558d218343ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 16 15:28:14.757979 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:28:14.757927 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv" podUID="4af418a7-7442-4200-84d5-558d218343ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 16 15:28:24.758540 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:28:24.758491 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv" podUID="4af418a7-7442-4200-84d5-558d218343ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 16 15:28:34.758203 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:28:34.758098 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv" podUID="4af418a7-7442-4200-84d5-558d218343ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 16 15:28:44.758052 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:28:44.757988 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv" podUID="4af418a7-7442-4200-84d5-558d218343ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 16 15:28:54.758285 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:28:54.758242 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv" podUID="4af418a7-7442-4200-84d5-558d218343ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 16 15:29:04.757661 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:04.757617 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv" podUID="4af418a7-7442-4200-84d5-558d218343ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 16 15:29:13.304899 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:13.304864 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv"
Apr 16 15:29:18.155047 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:18.155000 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs"]
Apr 16 15:29:18.158750 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:18.158730 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs"
Apr 16 15:29:18.161005 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:18.160983 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-21dbf3-dockercfg-p9khm\""
Apr 16 15:29:18.161132 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:18.161035 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-21dbf3\""
Apr 16 15:29:18.161132 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:18.161035 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 16 15:29:18.167901 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:18.167877 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs"]
Apr 16 15:29:18.318478 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:18.318433 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1df531f6-39cd-4702-b47f-a3d6fe125f73-kserve-provision-location\") pod \"isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs\" (UID: \"1df531f6-39cd-4702-b47f-a3d6fe125f73\") " pod="kserve-ci-e2e-test/isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs"
Apr 16 15:29:18.318667 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:18.318564 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/1df531f6-39cd-4702-b47f-a3d6fe125f73-cabundle-cert\") pod \"isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs\" (UID: \"1df531f6-39cd-4702-b47f-a3d6fe125f73\") " pod="kserve-ci-e2e-test/isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs"
Apr 16 15:29:18.420477 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:18.420363 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/1df531f6-39cd-4702-b47f-a3d6fe125f73-cabundle-cert\") pod \"isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs\" (UID: \"1df531f6-39cd-4702-b47f-a3d6fe125f73\") " pod="kserve-ci-e2e-test/isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs"
Apr 16 15:29:18.420477 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:18.420478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1df531f6-39cd-4702-b47f-a3d6fe125f73-kserve-provision-location\") pod \"isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs\" (UID: \"1df531f6-39cd-4702-b47f-a3d6fe125f73\") " pod="kserve-ci-e2e-test/isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs"
Apr 16 15:29:18.421052 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:18.421009 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1df531f6-39cd-4702-b47f-a3d6fe125f73-kserve-provision-location\") pod \"isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs\" (UID: \"1df531f6-39cd-4702-b47f-a3d6fe125f73\") " pod="kserve-ci-e2e-test/isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs"
Apr 16 15:29:18.426983 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:18.426950 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/1df531f6-39cd-4702-b47f-a3d6fe125f73-cabundle-cert\") pod \"isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs\" (UID: \"1df531f6-39cd-4702-b47f-a3d6fe125f73\") " pod="kserve-ci-e2e-test/isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs"
Apr 16 15:29:18.470597 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:18.470570 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs"
Apr 16 15:29:18.598560 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:18.598537 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs"]
Apr 16 15:29:18.600772 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:29:18.600741 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1df531f6_39cd_4702_b47f_a3d6fe125f73.slice/crio-e492695cf0d4f4c383b7605d46e06d15430a5d4b1627baabeb1e1d497919373e WatchSource:0}: Error finding container e492695cf0d4f4c383b7605d46e06d15430a5d4b1627baabeb1e1d497919373e: Status 404 returned error can't find the container with id e492695cf0d4f4c383b7605d46e06d15430a5d4b1627baabeb1e1d497919373e
Apr 16 15:29:19.021377 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:19.021326 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs" event={"ID":"1df531f6-39cd-4702-b47f-a3d6fe125f73","Type":"ContainerStarted","Data":"6241ce5001c4ae2b207bdac0d35e5fbad5ac878990ab680d852e3641691f8630"}
Apr 16 15:29:19.021550 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:19.021385 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs" event={"ID":"1df531f6-39cd-4702-b47f-a3d6fe125f73","Type":"ContainerStarted","Data":"e492695cf0d4f4c383b7605d46e06d15430a5d4b1627baabeb1e1d497919373e"}
Apr 16 15:29:25.050760 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:25.050731 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs_1df531f6-39cd-4702-b47f-a3d6fe125f73/storage-initializer/0.log"
Apr 16 15:29:25.051142 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:25.050769 2576 generic.go:358] "Generic (PLEG): container finished" podID="1df531f6-39cd-4702-b47f-a3d6fe125f73" containerID="6241ce5001c4ae2b207bdac0d35e5fbad5ac878990ab680d852e3641691f8630" exitCode=1
Apr 16 15:29:25.051142 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:25.050847 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs" event={"ID":"1df531f6-39cd-4702-b47f-a3d6fe125f73","Type":"ContainerDied","Data":"6241ce5001c4ae2b207bdac0d35e5fbad5ac878990ab680d852e3641691f8630"}
Apr 16 15:29:26.056211 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:26.056181 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs_1df531f6-39cd-4702-b47f-a3d6fe125f73/storage-initializer/0.log"
Apr 16 15:29:26.056595 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:26.056268 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs" event={"ID":"1df531f6-39cd-4702-b47f-a3d6fe125f73","Type":"ContainerStarted","Data":"55cebb03058a8be06288e7fc255787c77cb81a4ef94ac6610294cda08b5bbf61"}
Apr 16 15:29:31.087925 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:31.087898 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs_1df531f6-39cd-4702-b47f-a3d6fe125f73/storage-initializer/1.log"
Apr 16 15:29:31.088440 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:31.088382 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs_1df531f6-39cd-4702-b47f-a3d6fe125f73/storage-initializer/0.log"
Apr 16 15:29:31.088440 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:31.088422 2576 generic.go:358] "Generic (PLEG): container finished" podID="1df531f6-39cd-4702-b47f-a3d6fe125f73" containerID="55cebb03058a8be06288e7fc255787c77cb81a4ef94ac6610294cda08b5bbf61" exitCode=1
Apr 16 15:29:31.088545 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:31.088476 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs" event={"ID":"1df531f6-39cd-4702-b47f-a3d6fe125f73","Type":"ContainerDied","Data":"55cebb03058a8be06288e7fc255787c77cb81a4ef94ac6610294cda08b5bbf61"}
Apr 16 15:29:31.088545 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:31.088506 2576 scope.go:117] "RemoveContainer" containerID="6241ce5001c4ae2b207bdac0d35e5fbad5ac878990ab680d852e3641691f8630"
Apr 16 15:29:31.088872 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:31.088854 2576 scope.go:117] "RemoveContainer" containerID="6241ce5001c4ae2b207bdac0d35e5fbad5ac878990ab680d852e3641691f8630"
Apr 16 15:29:31.099887 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:29:31.099860 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs_kserve-ci-e2e-test_1df531f6-39cd-4702-b47f-a3d6fe125f73_0 in pod sandbox e492695cf0d4f4c383b7605d46e06d15430a5d4b1627baabeb1e1d497919373e from index: no such id: '6241ce5001c4ae2b207bdac0d35e5fbad5ac878990ab680d852e3641691f8630'" containerID="6241ce5001c4ae2b207bdac0d35e5fbad5ac878990ab680d852e3641691f8630"
Apr 16 15:29:31.099961 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:29:31.099905 2576 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs_kserve-ci-e2e-test_1df531f6-39cd-4702-b47f-a3d6fe125f73_0 in pod sandbox e492695cf0d4f4c383b7605d46e06d15430a5d4b1627baabeb1e1d497919373e from index: no such id: '6241ce5001c4ae2b207bdac0d35e5fbad5ac878990ab680d852e3641691f8630'; Skipping pod \"isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs_kserve-ci-e2e-test(1df531f6-39cd-4702-b47f-a3d6fe125f73)\"" logger="UnhandledError"
Apr 16 15:29:31.101280 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:29:31.101253 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs_kserve-ci-e2e-test(1df531f6-39cd-4702-b47f-a3d6fe125f73)\"" pod="kserve-ci-e2e-test/isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs" podUID="1df531f6-39cd-4702-b47f-a3d6fe125f73"
Apr 16 15:29:32.093653 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:32.093623 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs_1df531f6-39cd-4702-b47f-a3d6fe125f73/storage-initializer/1.log"
Apr 16 15:29:36.198999 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:36.198967 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv"]
Apr 16 15:29:36.199515 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:36.199250 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv" podUID="4af418a7-7442-4200-84d5-558d218343ec" containerName="kserve-container" containerID="cri-o://aebae36f9a4f26386fd6b56ea31a47848e0be1e97dbde5015a2678657302f424" gracePeriod=30
Apr 16 15:29:36.251551 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:36.251514 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs"]
Apr 16 15:29:36.352804 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:36.352765 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w"]
Apr 16 15:29:36.363689 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:36.362156 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w"
Apr 16 15:29:36.365942 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:36.365910 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-db90c5-dockercfg-7ztxg\""
Apr 16 15:29:36.366152 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:36.366079 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-db90c5\""
Apr 16 15:29:36.367743 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:36.367716 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w"]
Apr 16 15:29:36.396172 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:36.396148 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs_1df531f6-39cd-4702-b47f-a3d6fe125f73/storage-initializer/1.log"
Apr 16 15:29:36.396315 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:36.396219 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs"
Apr 16 15:29:36.473389 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:36.473297 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1df531f6-39cd-4702-b47f-a3d6fe125f73-kserve-provision-location\") pod \"1df531f6-39cd-4702-b47f-a3d6fe125f73\" (UID: \"1df531f6-39cd-4702-b47f-a3d6fe125f73\") "
Apr 16 15:29:36.473389 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:36.473386 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/1df531f6-39cd-4702-b47f-a3d6fe125f73-cabundle-cert\") pod \"1df531f6-39cd-4702-b47f-a3d6fe125f73\" (UID: \"1df531f6-39cd-4702-b47f-a3d6fe125f73\") "
Apr 16 15:29:36.473623 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:36.473518 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9dbe00f2-9474-4aaa-b798-ef50c2b680d2-cabundle-cert\") pod \"isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w\" (UID: \"9dbe00f2-9474-4aaa-b798-ef50c2b680d2\") " pod="kserve-ci-e2e-test/isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w"
Apr 16 15:29:36.473623 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:36.473543 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9dbe00f2-9474-4aaa-b798-ef50c2b680d2-kserve-provision-location\") pod \"isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w\" (UID: \"9dbe00f2-9474-4aaa-b798-ef50c2b680d2\") " pod="kserve-ci-e2e-test/isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w"
Apr 16 15:29:36.473723 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:36.473622 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1df531f6-39cd-4702-b47f-a3d6fe125f73-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1df531f6-39cd-4702-b47f-a3d6fe125f73" (UID: "1df531f6-39cd-4702-b47f-a3d6fe125f73"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:29:36.473808 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:36.473786 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1df531f6-39cd-4702-b47f-a3d6fe125f73-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "1df531f6-39cd-4702-b47f-a3d6fe125f73" (UID: "1df531f6-39cd-4702-b47f-a3d6fe125f73"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 15:29:36.574081 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:36.574037 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9dbe00f2-9474-4aaa-b798-ef50c2b680d2-cabundle-cert\") pod \"isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w\" (UID: \"9dbe00f2-9474-4aaa-b798-ef50c2b680d2\") " pod="kserve-ci-e2e-test/isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w"
Apr 16 15:29:36.574081 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:36.574087 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9dbe00f2-9474-4aaa-b798-ef50c2b680d2-kserve-provision-location\") pod \"isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w\" (UID: \"9dbe00f2-9474-4aaa-b798-ef50c2b680d2\") " pod="kserve-ci-e2e-test/isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w"
Apr 16 15:29:36.574350 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:36.574174 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1df531f6-39cd-4702-b47f-a3d6fe125f73-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\""
Apr 16 15:29:36.574350 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:36.574187 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/1df531f6-39cd-4702-b47f-a3d6fe125f73-cabundle-cert\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\""
Apr 16 15:29:36.574552 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:36.574527 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9dbe00f2-9474-4aaa-b798-ef50c2b680d2-kserve-provision-location\") pod \"isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w\" (UID: \"9dbe00f2-9474-4aaa-b798-ef50c2b680d2\") " pod="kserve-ci-e2e-test/isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w"
Apr 16 15:29:36.574750 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:36.574726 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9dbe00f2-9474-4aaa-b798-ef50c2b680d2-cabundle-cert\") pod \"isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w\" (UID: \"9dbe00f2-9474-4aaa-b798-ef50c2b680d2\") " pod="kserve-ci-e2e-test/isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w"
Apr 16 15:29:36.691971 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:36.691935 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w"
Apr 16 15:29:36.825338 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:36.825306 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w"]
Apr 16 15:29:36.827304 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:29:36.827274 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dbe00f2_9474_4aaa_b798_ef50c2b680d2.slice/crio-08d1b17a9cb2581836aba0ab65162c9233c1a11220037636b6f330263d26fdf1 WatchSource:0}: Error finding container 08d1b17a9cb2581836aba0ab65162c9233c1a11220037636b6f330263d26fdf1: Status 404 returned error can't find the container with id 08d1b17a9cb2581836aba0ab65162c9233c1a11220037636b6f330263d26fdf1
Apr 16 15:29:37.113688 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:37.113614 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs_1df531f6-39cd-4702-b47f-a3d6fe125f73/storage-initializer/1.log"
Apr 16 15:29:37.113833 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:37.113736 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs"
Apr 16 15:29:37.113833 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:37.113746 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs" event={"ID":"1df531f6-39cd-4702-b47f-a3d6fe125f73","Type":"ContainerDied","Data":"e492695cf0d4f4c383b7605d46e06d15430a5d4b1627baabeb1e1d497919373e"}
Apr 16 15:29:37.113833 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:37.113796 2576 scope.go:117] "RemoveContainer" containerID="55cebb03058a8be06288e7fc255787c77cb81a4ef94ac6610294cda08b5bbf61"
Apr 16 15:29:37.115415 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:37.115386 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w" event={"ID":"9dbe00f2-9474-4aaa-b798-ef50c2b680d2","Type":"ContainerStarted","Data":"b399473433c66789e7c490f6ac235175fbefa3acf7327fa51a4694ee52529494"}
Apr 16 15:29:37.115555 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:37.115425 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w" event={"ID":"9dbe00f2-9474-4aaa-b798-ef50c2b680d2","Type":"ContainerStarted","Data":"08d1b17a9cb2581836aba0ab65162c9233c1a11220037636b6f330263d26fdf1"}
Apr 16 15:29:37.157922 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:37.157891 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs"]
Apr 16 15:29:37.159589 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:37.159562 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-21dbf3-predictor-78d7c59b5d-rfdhs"]
Apr 16 15:29:38.307624 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:38.307590 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1df531f6-39cd-4702-b47f-a3d6fe125f73" path="/var/lib/kubelet/pods/1df531f6-39cd-4702-b47f-a3d6fe125f73/volumes"
Apr 16 15:29:40.637280 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:40.637257 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv"
Apr 16 15:29:40.714967 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:40.714893 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4af418a7-7442-4200-84d5-558d218343ec-kserve-provision-location\") pod \"4af418a7-7442-4200-84d5-558d218343ec\" (UID: \"4af418a7-7442-4200-84d5-558d218343ec\") "
Apr 16 15:29:40.715215 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:40.715193 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4af418a7-7442-4200-84d5-558d218343ec-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4af418a7-7442-4200-84d5-558d218343ec" (UID: "4af418a7-7442-4200-84d5-558d218343ec"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:29:40.816147 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:40.816109 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4af418a7-7442-4200-84d5-558d218343ec-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\""
Apr 16 15:29:41.131277 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:41.131186 2576 generic.go:358] "Generic (PLEG): container finished" podID="4af418a7-7442-4200-84d5-558d218343ec" containerID="aebae36f9a4f26386fd6b56ea31a47848e0be1e97dbde5015a2678657302f424" exitCode=0
Apr 16 15:29:41.131443 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:41.131275 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv" event={"ID":"4af418a7-7442-4200-84d5-558d218343ec","Type":"ContainerDied","Data":"aebae36f9a4f26386fd6b56ea31a47848e0be1e97dbde5015a2678657302f424"}
Apr 16 15:29:41.131443 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:41.131311 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv" event={"ID":"4af418a7-7442-4200-84d5-558d218343ec","Type":"ContainerDied","Data":"05bbfd459d090b992a5f6288f72f262739784bcc9173a31927d56aa5625f1b60"}
Apr 16 15:29:41.131443 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:41.131280 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv"
Apr 16 15:29:41.131443 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:41.131326 2576 scope.go:117] "RemoveContainer" containerID="aebae36f9a4f26386fd6b56ea31a47848e0be1e97dbde5015a2678657302f424"
Apr 16 15:29:41.140237 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:41.140223 2576 scope.go:117] "RemoveContainer" containerID="4678f9649db50d82cfa6875480bacc59862c74cfb443061fc1ee90786eab4de2"
Apr 16 15:29:41.147715 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:41.147699 2576 scope.go:117] "RemoveContainer" containerID="aebae36f9a4f26386fd6b56ea31a47848e0be1e97dbde5015a2678657302f424"
Apr 16 15:29:41.147937 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:29:41.147919 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aebae36f9a4f26386fd6b56ea31a47848e0be1e97dbde5015a2678657302f424\": container with ID starting with aebae36f9a4f26386fd6b56ea31a47848e0be1e97dbde5015a2678657302f424 not found: ID does not exist" containerID="aebae36f9a4f26386fd6b56ea31a47848e0be1e97dbde5015a2678657302f424"
Apr 16 15:29:41.147985 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:41.147944 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aebae36f9a4f26386fd6b56ea31a47848e0be1e97dbde5015a2678657302f424"} err="failed to get container status \"aebae36f9a4f26386fd6b56ea31a47848e0be1e97dbde5015a2678657302f424\": rpc error: code = NotFound desc = could not find container \"aebae36f9a4f26386fd6b56ea31a47848e0be1e97dbde5015a2678657302f424\": container with ID starting with aebae36f9a4f26386fd6b56ea31a47848e0be1e97dbde5015a2678657302f424 not found: ID does not exist"
Apr 16 15:29:41.147985 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:41.147959 2576 scope.go:117] "RemoveContainer" containerID="4678f9649db50d82cfa6875480bacc59862c74cfb443061fc1ee90786eab4de2"
Apr 16 15:29:41.148224 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:29:41.148208 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4678f9649db50d82cfa6875480bacc59862c74cfb443061fc1ee90786eab4de2\": container with ID starting with 4678f9649db50d82cfa6875480bacc59862c74cfb443061fc1ee90786eab4de2 not found: ID does not exist" containerID="4678f9649db50d82cfa6875480bacc59862c74cfb443061fc1ee90786eab4de2"
Apr 16 15:29:41.148262 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:41.148229 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4678f9649db50d82cfa6875480bacc59862c74cfb443061fc1ee90786eab4de2"} err="failed to get container status \"4678f9649db50d82cfa6875480bacc59862c74cfb443061fc1ee90786eab4de2\": rpc error: code = NotFound desc = could not find container \"4678f9649db50d82cfa6875480bacc59862c74cfb443061fc1ee90786eab4de2\": container with ID starting with 4678f9649db50d82cfa6875480bacc59862c74cfb443061fc1ee90786eab4de2 not found: ID does not exist"
Apr 16 15:29:41.154231 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:41.154209 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv"]
Apr 16 15:29:41.156049 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:41.156013 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-21dbf3-predictor-6bd6957cff-swgrv"]
Apr 16 15:29:42.307281 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:42.307248 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4af418a7-7442-4200-84d5-558d218343ec" path="/var/lib/kubelet/pods/4af418a7-7442-4200-84d5-558d218343ec/volumes"
Apr 16 15:29:44.144632 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:44.144605 2576 log.go:25] "Finished parsing log file"
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w_9dbe00f2-9474-4aaa-b798-ef50c2b680d2/storage-initializer/0.log" Apr 16 15:29:44.145011 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:44.144643 2576 generic.go:358] "Generic (PLEG): container finished" podID="9dbe00f2-9474-4aaa-b798-ef50c2b680d2" containerID="b399473433c66789e7c490f6ac235175fbefa3acf7327fa51a4694ee52529494" exitCode=1 Apr 16 15:29:44.145011 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:44.144719 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w" event={"ID":"9dbe00f2-9474-4aaa-b798-ef50c2b680d2","Type":"ContainerDied","Data":"b399473433c66789e7c490f6ac235175fbefa3acf7327fa51a4694ee52529494"} Apr 16 15:29:45.149864 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:45.149836 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w_9dbe00f2-9474-4aaa-b798-ef50c2b680d2/storage-initializer/0.log" Apr 16 15:29:45.150275 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:45.149972 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w" event={"ID":"9dbe00f2-9474-4aaa-b798-ef50c2b680d2","Type":"ContainerStarted","Data":"d1bbba782c0e7d0a9ba9cd3220402693d8c618029f4490097f51aa019a86de42"} Apr 16 15:29:46.155121 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:46.155096 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w_9dbe00f2-9474-4aaa-b798-ef50c2b680d2/storage-initializer/1.log" Apr 16 15:29:46.155514 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:46.155430 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w_9dbe00f2-9474-4aaa-b798-ef50c2b680d2/storage-initializer/0.log" Apr 16 15:29:46.155514 
ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:46.155462 2576 generic.go:358] "Generic (PLEG): container finished" podID="9dbe00f2-9474-4aaa-b798-ef50c2b680d2" containerID="d1bbba782c0e7d0a9ba9cd3220402693d8c618029f4490097f51aa019a86de42" exitCode=1 Apr 16 15:29:46.155514 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:46.155501 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w" event={"ID":"9dbe00f2-9474-4aaa-b798-ef50c2b680d2","Type":"ContainerDied","Data":"d1bbba782c0e7d0a9ba9cd3220402693d8c618029f4490097f51aa019a86de42"} Apr 16 15:29:46.155616 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:46.155528 2576 scope.go:117] "RemoveContainer" containerID="b399473433c66789e7c490f6ac235175fbefa3acf7327fa51a4694ee52529494" Apr 16 15:29:46.155862 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:46.155844 2576 scope.go:117] "RemoveContainer" containerID="b399473433c66789e7c490f6ac235175fbefa3acf7327fa51a4694ee52529494" Apr 16 15:29:46.166392 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:29:46.166359 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w_kserve-ci-e2e-test_9dbe00f2-9474-4aaa-b798-ef50c2b680d2_0 in pod sandbox 08d1b17a9cb2581836aba0ab65162c9233c1a11220037636b6f330263d26fdf1 from index: no such id: 'b399473433c66789e7c490f6ac235175fbefa3acf7327fa51a4694ee52529494'" containerID="b399473433c66789e7c490f6ac235175fbefa3acf7327fa51a4694ee52529494" Apr 16 15:29:46.166474 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:29:46.166414 2576 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w_kserve-ci-e2e-test_9dbe00f2-9474-4aaa-b798-ef50c2b680d2_0 in pod sandbox 
08d1b17a9cb2581836aba0ab65162c9233c1a11220037636b6f330263d26fdf1 from index: no such id: 'b399473433c66789e7c490f6ac235175fbefa3acf7327fa51a4694ee52529494'; Skipping pod \"isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w_kserve-ci-e2e-test(9dbe00f2-9474-4aaa-b798-ef50c2b680d2)\"" logger="UnhandledError" Apr 16 15:29:46.167770 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:29:46.167748 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w_kserve-ci-e2e-test(9dbe00f2-9474-4aaa-b798-ef50c2b680d2)\"" pod="kserve-ci-e2e-test/isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w" podUID="9dbe00f2-9474-4aaa-b798-ef50c2b680d2" Apr 16 15:29:46.357118 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:46.357073 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w"] Apr 16 15:29:47.160220 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:47.160193 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w_9dbe00f2-9474-4aaa-b798-ef50c2b680d2/storage-initializer/1.log" Apr 16 15:29:47.305007 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:47.304983 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w_9dbe00f2-9474-4aaa-b798-ef50c2b680d2/storage-initializer/1.log" Apr 16 15:29:47.305160 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:47.305068 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w" Apr 16 15:29:47.380193 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:47.380159 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9dbe00f2-9474-4aaa-b798-ef50c2b680d2-kserve-provision-location\") pod \"9dbe00f2-9474-4aaa-b798-ef50c2b680d2\" (UID: \"9dbe00f2-9474-4aaa-b798-ef50c2b680d2\") " Apr 16 15:29:47.380193 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:47.380203 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9dbe00f2-9474-4aaa-b798-ef50c2b680d2-cabundle-cert\") pod \"9dbe00f2-9474-4aaa-b798-ef50c2b680d2\" (UID: \"9dbe00f2-9474-4aaa-b798-ef50c2b680d2\") " Apr 16 15:29:47.380504 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:47.380438 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dbe00f2-9474-4aaa-b798-ef50c2b680d2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9dbe00f2-9474-4aaa-b798-ef50c2b680d2" (UID: "9dbe00f2-9474-4aaa-b798-ef50c2b680d2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:29:47.380621 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:47.380601 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dbe00f2-9474-4aaa-b798-ef50c2b680d2-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "9dbe00f2-9474-4aaa-b798-ef50c2b680d2" (UID: "9dbe00f2-9474-4aaa-b798-ef50c2b680d2"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:29:47.481215 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:47.481125 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9dbe00f2-9474-4aaa-b798-ef50c2b680d2-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:29:47.481215 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:47.481151 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9dbe00f2-9474-4aaa-b798-ef50c2b680d2-cabundle-cert\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:29:48.165928 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:48.165904 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w_9dbe00f2-9474-4aaa-b798-ef50c2b680d2/storage-initializer/1.log" Apr 16 15:29:48.166341 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:48.165959 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w" event={"ID":"9dbe00f2-9474-4aaa-b798-ef50c2b680d2","Type":"ContainerDied","Data":"08d1b17a9cb2581836aba0ab65162c9233c1a11220037636b6f330263d26fdf1"} Apr 16 15:29:48.166341 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:48.165990 2576 scope.go:117] "RemoveContainer" containerID="d1bbba782c0e7d0a9ba9cd3220402693d8c618029f4490097f51aa019a86de42" Apr 16 15:29:48.166341 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:48.166011 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w" Apr 16 15:29:48.198830 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:48.198803 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w"] Apr 16 15:29:48.204958 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:48.204929 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-db90c5-predictor-cd8bb7b7d-ldt9w"] Apr 16 15:29:48.307585 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:29:48.307548 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dbe00f2-9474-4aaa-b798-ef50c2b680d2" path="/var/lib/kubelet/pods/9dbe00f2-9474-4aaa-b798-ef50c2b680d2/volumes" Apr 16 15:32:46.383793 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:32:46.383755 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log" Apr 16 15:32:46.386036 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:32:46.386009 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log" Apr 16 15:32:46.387584 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:32:46.387568 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log" Apr 16 15:32:46.389812 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:32:46.389794 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log" Apr 16 15:37:46.413134 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:37:46.413064 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log" Apr 16 15:37:46.415276 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:37:46.415252 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log" Apr 16 15:37:46.416604 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:37:46.416581 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log" Apr 16 15:37:46.418770 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:37:46.418749 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log" Apr 16 15:38:40.433372 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.433329 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r"] Apr 16 15:38:40.433954 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.433876 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1df531f6-39cd-4702-b47f-a3d6fe125f73" containerName="storage-initializer" Apr 16 15:38:40.433954 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.433896 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df531f6-39cd-4702-b47f-a3d6fe125f73" containerName="storage-initializer" Apr 16 15:38:40.433954 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.433912 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9dbe00f2-9474-4aaa-b798-ef50c2b680d2" containerName="storage-initializer" Apr 16 15:38:40.433954 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.433921 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dbe00f2-9474-4aaa-b798-ef50c2b680d2" 
containerName="storage-initializer" Apr 16 15:38:40.433954 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.433936 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4af418a7-7442-4200-84d5-558d218343ec" containerName="storage-initializer" Apr 16 15:38:40.433954 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.433944 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af418a7-7442-4200-84d5-558d218343ec" containerName="storage-initializer" Apr 16 15:38:40.434458 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.433965 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4af418a7-7442-4200-84d5-558d218343ec" containerName="kserve-container" Apr 16 15:38:40.434458 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.433973 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af418a7-7442-4200-84d5-558d218343ec" containerName="kserve-container" Apr 16 15:38:40.434458 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.433991 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9dbe00f2-9474-4aaa-b798-ef50c2b680d2" containerName="storage-initializer" Apr 16 15:38:40.434458 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.434000 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dbe00f2-9474-4aaa-b798-ef50c2b680d2" containerName="storage-initializer" Apr 16 15:38:40.434458 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.434119 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1df531f6-39cd-4702-b47f-a3d6fe125f73" containerName="storage-initializer" Apr 16 15:38:40.434458 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.434133 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9dbe00f2-9474-4aaa-b798-ef50c2b680d2" containerName="storage-initializer" Apr 16 15:38:40.434458 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.434148 2576 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="4af418a7-7442-4200-84d5-558d218343ec" containerName="kserve-container" Apr 16 15:38:40.434458 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.434235 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1df531f6-39cd-4702-b47f-a3d6fe125f73" containerName="storage-initializer" Apr 16 15:38:40.434458 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.434245 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df531f6-39cd-4702-b47f-a3d6fe125f73" containerName="storage-initializer" Apr 16 15:38:40.434458 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.434376 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1df531f6-39cd-4702-b47f-a3d6fe125f73" containerName="storage-initializer" Apr 16 15:38:40.434458 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.434389 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9dbe00f2-9474-4aaa-b798-ef50c2b680d2" containerName="storage-initializer" Apr 16 15:38:40.437556 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.437535 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" Apr 16 15:38:40.440002 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.439983 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tsb5d\"" Apr 16 15:38:40.446850 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.446827 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r"] Apr 16 15:38:40.540557 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.540517 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d66c6aea-998a-4868-98c9-bcf96745d1ca-kserve-provision-location\") pod \"isvc-sklearn-predictor-74df74d548-5559r\" (UID: \"d66c6aea-998a-4868-98c9-bcf96745d1ca\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" Apr 16 15:38:40.641517 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.641474 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d66c6aea-998a-4868-98c9-bcf96745d1ca-kserve-provision-location\") pod \"isvc-sklearn-predictor-74df74d548-5559r\" (UID: \"d66c6aea-998a-4868-98c9-bcf96745d1ca\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" Apr 16 15:38:40.641842 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.641824 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d66c6aea-998a-4868-98c9-bcf96745d1ca-kserve-provision-location\") pod \"isvc-sklearn-predictor-74df74d548-5559r\" (UID: \"d66c6aea-998a-4868-98c9-bcf96745d1ca\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" Apr 16 15:38:40.750521 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.750433 2576 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" Apr 16 15:38:40.869884 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.869859 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r"] Apr 16 15:38:40.872342 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:38:40.872312 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd66c6aea_998a_4868_98c9_bcf96745d1ca.slice/crio-b44dc4b526dead7c13887480d2ded6b89c94fe2de6f9a8264008fcda189e019d WatchSource:0}: Error finding container b44dc4b526dead7c13887480d2ded6b89c94fe2de6f9a8264008fcda189e019d: Status 404 returned error can't find the container with id b44dc4b526dead7c13887480d2ded6b89c94fe2de6f9a8264008fcda189e019d Apr 16 15:38:40.874288 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.874266 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:38:40.991276 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.991243 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" event={"ID":"d66c6aea-998a-4868-98c9-bcf96745d1ca","Type":"ContainerStarted","Data":"503e9a314b8678fba9819e16bf02d6ff77ca88a4160e7fff9d540efcaa9bce7e"} Apr 16 15:38:40.991276 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:40.991280 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" event={"ID":"d66c6aea-998a-4868-98c9-bcf96745d1ca","Type":"ContainerStarted","Data":"b44dc4b526dead7c13887480d2ded6b89c94fe2de6f9a8264008fcda189e019d"} Apr 16 15:38:45.008364 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:45.008278 2576 generic.go:358] "Generic (PLEG): container finished" podID="d66c6aea-998a-4868-98c9-bcf96745d1ca" 
containerID="503e9a314b8678fba9819e16bf02d6ff77ca88a4160e7fff9d540efcaa9bce7e" exitCode=0 Apr 16 15:38:45.008364 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:45.008352 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" event={"ID":"d66c6aea-998a-4868-98c9-bcf96745d1ca","Type":"ContainerDied","Data":"503e9a314b8678fba9819e16bf02d6ff77ca88a4160e7fff9d540efcaa9bce7e"} Apr 16 15:38:46.012952 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:46.012917 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" event={"ID":"d66c6aea-998a-4868-98c9-bcf96745d1ca","Type":"ContainerStarted","Data":"deeb02edbdc5538100d6e59c58d94a2dfd3cb5a855732152dfd0f7ae54a63c81"} Apr 16 15:38:46.013488 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:46.013218 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" Apr 16 15:38:46.014521 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:46.014494 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" podUID="d66c6aea-998a-4868-98c9-bcf96745d1ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 15:38:46.030284 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:46.030241 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" podStartSLOduration=6.030228023 podStartE2EDuration="6.030228023s" podCreationTimestamp="2026-04-16 15:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:38:46.028854191 +0000 UTC m=+2760.447989166" watchObservedRunningTime="2026-04-16 15:38:46.030228023 +0000 UTC m=+2760.449362998" Apr 16 
15:38:47.016392 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:47.016355 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" podUID="d66c6aea-998a-4868-98c9-bcf96745d1ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 15:38:57.017378 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:38:57.017272 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" podUID="d66c6aea-998a-4868-98c9-bcf96745d1ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 15:39:07.017193 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:39:07.017146 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" podUID="d66c6aea-998a-4868-98c9-bcf96745d1ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 15:39:17.016599 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:39:17.016551 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" podUID="d66c6aea-998a-4868-98c9-bcf96745d1ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 15:39:27.016349 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:39:27.016309 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" podUID="d66c6aea-998a-4868-98c9-bcf96745d1ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 15:39:37.017176 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:39:37.017132 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" podUID="d66c6aea-998a-4868-98c9-bcf96745d1ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 15:39:47.016981 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:39:47.016936 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" podUID="d66c6aea-998a-4868-98c9-bcf96745d1ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 15:39:57.017994 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:39:57.017959 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" Apr 16 15:40:00.573359 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:00.573325 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r"] Apr 16 15:40:00.573831 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:00.573614 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" podUID="d66c6aea-998a-4868-98c9-bcf96745d1ca" containerName="kserve-container" containerID="cri-o://deeb02edbdc5538100d6e59c58d94a2dfd3cb5a855732152dfd0f7ae54a63c81" gracePeriod=30 Apr 16 15:40:00.622635 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:00.622603 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs"] Apr 16 15:40:00.626349 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:00.626325 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs" Apr 16 15:40:00.632823 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:00.632800 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs"] Apr 16 15:40:00.739154 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:00.739119 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/067d8f2c-3ba7-45a0-8e53-c781c81775b2-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-krhvs\" (UID: \"067d8f2c-3ba7-45a0-8e53-c781c81775b2\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs" Apr 16 15:40:00.840299 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:00.840197 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/067d8f2c-3ba7-45a0-8e53-c781c81775b2-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-krhvs\" (UID: \"067d8f2c-3ba7-45a0-8e53-c781c81775b2\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs" Apr 16 15:40:00.840634 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:00.840607 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/067d8f2c-3ba7-45a0-8e53-c781c81775b2-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-krhvs\" (UID: \"067d8f2c-3ba7-45a0-8e53-c781c81775b2\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs" Apr 16 15:40:00.939323 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:00.939288 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs" Apr 16 15:40:01.061221 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:01.061200 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs"] Apr 16 15:40:01.063743 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:40:01.063712 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod067d8f2c_3ba7_45a0_8e53_c781c81775b2.slice/crio-c07beab4801c41a16f6a007ff33795944c05d0c6fa4516a4476f1ce247034e86 WatchSource:0}: Error finding container c07beab4801c41a16f6a007ff33795944c05d0c6fa4516a4476f1ce247034e86: Status 404 returned error can't find the container with id c07beab4801c41a16f6a007ff33795944c05d0c6fa4516a4476f1ce247034e86 Apr 16 15:40:01.282390 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:01.282357 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs" event={"ID":"067d8f2c-3ba7-45a0-8e53-c781c81775b2","Type":"ContainerStarted","Data":"a70980def0254b881e260081b9f2656d43a03599744511f98a129ac0f3f8dd51"} Apr 16 15:40:01.282390 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:01.282392 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs" event={"ID":"067d8f2c-3ba7-45a0-8e53-c781c81775b2","Type":"ContainerStarted","Data":"c07beab4801c41a16f6a007ff33795944c05d0c6fa4516a4476f1ce247034e86"} Apr 16 15:40:05.020068 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:05.020042 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" Apr 16 15:40:05.081816 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:05.081785 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d66c6aea-998a-4868-98c9-bcf96745d1ca-kserve-provision-location\") pod \"d66c6aea-998a-4868-98c9-bcf96745d1ca\" (UID: \"d66c6aea-998a-4868-98c9-bcf96745d1ca\") " Apr 16 15:40:05.082103 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:05.082079 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d66c6aea-998a-4868-98c9-bcf96745d1ca-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d66c6aea-998a-4868-98c9-bcf96745d1ca" (UID: "d66c6aea-998a-4868-98c9-bcf96745d1ca"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:40:05.182763 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:05.182728 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d66c6aea-998a-4868-98c9-bcf96745d1ca-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:40:05.297071 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:05.297008 2576 generic.go:358] "Generic (PLEG): container finished" podID="067d8f2c-3ba7-45a0-8e53-c781c81775b2" containerID="a70980def0254b881e260081b9f2656d43a03599744511f98a129ac0f3f8dd51" exitCode=0 Apr 16 15:40:05.297242 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:05.297082 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs" event={"ID":"067d8f2c-3ba7-45a0-8e53-c781c81775b2","Type":"ContainerDied","Data":"a70980def0254b881e260081b9f2656d43a03599744511f98a129ac0f3f8dd51"} Apr 16 15:40:05.298546 ip-10-0-129-76 kubenswrapper[2576]: 
I0416 15:40:05.298523 2576 generic.go:358] "Generic (PLEG): container finished" podID="d66c6aea-998a-4868-98c9-bcf96745d1ca" containerID="deeb02edbdc5538100d6e59c58d94a2dfd3cb5a855732152dfd0f7ae54a63c81" exitCode=0 Apr 16 15:40:05.298663 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:05.298581 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" Apr 16 15:40:05.298663 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:05.298582 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" event={"ID":"d66c6aea-998a-4868-98c9-bcf96745d1ca","Type":"ContainerDied","Data":"deeb02edbdc5538100d6e59c58d94a2dfd3cb5a855732152dfd0f7ae54a63c81"} Apr 16 15:40:05.298786 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:05.298681 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r" event={"ID":"d66c6aea-998a-4868-98c9-bcf96745d1ca","Type":"ContainerDied","Data":"b44dc4b526dead7c13887480d2ded6b89c94fe2de6f9a8264008fcda189e019d"} Apr 16 15:40:05.298786 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:05.298697 2576 scope.go:117] "RemoveContainer" containerID="deeb02edbdc5538100d6e59c58d94a2dfd3cb5a855732152dfd0f7ae54a63c81" Apr 16 15:40:05.308315 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:05.308295 2576 scope.go:117] "RemoveContainer" containerID="503e9a314b8678fba9819e16bf02d6ff77ca88a4160e7fff9d540efcaa9bce7e" Apr 16 15:40:05.316344 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:05.316319 2576 scope.go:117] "RemoveContainer" containerID="deeb02edbdc5538100d6e59c58d94a2dfd3cb5a855732152dfd0f7ae54a63c81" Apr 16 15:40:05.316678 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:40:05.316635 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"deeb02edbdc5538100d6e59c58d94a2dfd3cb5a855732152dfd0f7ae54a63c81\": container with ID starting with deeb02edbdc5538100d6e59c58d94a2dfd3cb5a855732152dfd0f7ae54a63c81 not found: ID does not exist" containerID="deeb02edbdc5538100d6e59c58d94a2dfd3cb5a855732152dfd0f7ae54a63c81" Apr 16 15:40:05.316775 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:05.316678 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deeb02edbdc5538100d6e59c58d94a2dfd3cb5a855732152dfd0f7ae54a63c81"} err="failed to get container status \"deeb02edbdc5538100d6e59c58d94a2dfd3cb5a855732152dfd0f7ae54a63c81\": rpc error: code = NotFound desc = could not find container \"deeb02edbdc5538100d6e59c58d94a2dfd3cb5a855732152dfd0f7ae54a63c81\": container with ID starting with deeb02edbdc5538100d6e59c58d94a2dfd3cb5a855732152dfd0f7ae54a63c81 not found: ID does not exist" Apr 16 15:40:05.316775 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:05.316705 2576 scope.go:117] "RemoveContainer" containerID="503e9a314b8678fba9819e16bf02d6ff77ca88a4160e7fff9d540efcaa9bce7e" Apr 16 15:40:05.316954 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:40:05.316931 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"503e9a314b8678fba9819e16bf02d6ff77ca88a4160e7fff9d540efcaa9bce7e\": container with ID starting with 503e9a314b8678fba9819e16bf02d6ff77ca88a4160e7fff9d540efcaa9bce7e not found: ID does not exist" containerID="503e9a314b8678fba9819e16bf02d6ff77ca88a4160e7fff9d540efcaa9bce7e" Apr 16 15:40:05.317077 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:05.316960 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"503e9a314b8678fba9819e16bf02d6ff77ca88a4160e7fff9d540efcaa9bce7e"} err="failed to get container status \"503e9a314b8678fba9819e16bf02d6ff77ca88a4160e7fff9d540efcaa9bce7e\": rpc error: code = NotFound desc = could not find container 
\"503e9a314b8678fba9819e16bf02d6ff77ca88a4160e7fff9d540efcaa9bce7e\": container with ID starting with 503e9a314b8678fba9819e16bf02d6ff77ca88a4160e7fff9d540efcaa9bce7e not found: ID does not exist" Apr 16 15:40:05.326524 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:05.326494 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r"] Apr 16 15:40:05.329147 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:05.329117 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-74df74d548-5559r"] Apr 16 15:40:06.307735 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:06.307706 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d66c6aea-998a-4868-98c9-bcf96745d1ca" path="/var/lib/kubelet/pods/d66c6aea-998a-4868-98c9-bcf96745d1ca/volumes" Apr 16 15:40:06.308146 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:06.308000 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs" event={"ID":"067d8f2c-3ba7-45a0-8e53-c781c81775b2","Type":"ContainerStarted","Data":"106cac25a0a9b6e773a7fbaaed0ebea48d27e7ed297746f0facb05dc267909ad"} Apr 16 15:40:06.308240 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:06.308222 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs" Apr 16 15:40:06.324810 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:06.324759 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs" podStartSLOduration=6.324743296 podStartE2EDuration="6.324743296s" podCreationTimestamp="2026-04-16 15:40:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:40:06.323148724 +0000 UTC m=+2840.742283700" 
watchObservedRunningTime="2026-04-16 15:40:06.324743296 +0000 UTC m=+2840.743878271" Apr 16 15:40:37.332550 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:37.332460 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs" podUID="067d8f2c-3ba7-45a0-8e53-c781c81775b2" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 15:40:47.310661 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:47.310629 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs" Apr 16 15:40:50.769512 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:50.769475 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs"] Apr 16 15:40:50.769976 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:50.769806 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs" podUID="067d8f2c-3ba7-45a0-8e53-c781c81775b2" containerName="kserve-container" containerID="cri-o://106cac25a0a9b6e773a7fbaaed0ebea48d27e7ed297746f0facb05dc267909ad" gracePeriod=30 Apr 16 15:40:50.825550 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:50.825517 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27"] Apr 16 15:40:50.826017 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:50.825998 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d66c6aea-998a-4868-98c9-bcf96745d1ca" containerName="kserve-container" Apr 16 15:40:50.826017 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:50.826031 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66c6aea-998a-4868-98c9-bcf96745d1ca" containerName="kserve-container" Apr 16 15:40:50.826197 ip-10-0-129-76 
kubenswrapper[2576]: I0416 15:40:50.826063 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d66c6aea-998a-4868-98c9-bcf96745d1ca" containerName="storage-initializer" Apr 16 15:40:50.826197 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:50.826071 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66c6aea-998a-4868-98c9-bcf96745d1ca" containerName="storage-initializer" Apr 16 15:40:50.826197 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:50.826182 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d66c6aea-998a-4868-98c9-bcf96745d1ca" containerName="kserve-container" Apr 16 15:40:50.830371 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:50.830350 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27" Apr 16 15:40:50.838932 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:50.838886 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27"] Apr 16 15:40:50.893720 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:50.893686 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/304258f8-e1fc-41e8-8322-aafb0998c2e1-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-9c688ddc4-nlk27\" (UID: \"304258f8-e1fc-41e8-8322-aafb0998c2e1\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27" Apr 16 15:40:50.994944 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:50.994904 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/304258f8-e1fc-41e8-8322-aafb0998c2e1-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-9c688ddc4-nlk27\" (UID: \"304258f8-e1fc-41e8-8322-aafb0998c2e1\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27" Apr 16 15:40:50.995359 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:50.995333 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/304258f8-e1fc-41e8-8322-aafb0998c2e1-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-9c688ddc4-nlk27\" (UID: \"304258f8-e1fc-41e8-8322-aafb0998c2e1\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27" Apr 16 15:40:51.144237 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:51.144145 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27" Apr 16 15:40:51.265913 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:51.265888 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27"] Apr 16 15:40:51.268541 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:40:51.268512 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod304258f8_e1fc_41e8_8322_aafb0998c2e1.slice/crio-70d99d0672474e29af658a4e3a73e34ca559f86ff2085306fe9ce2d289f7fedc WatchSource:0}: Error finding container 70d99d0672474e29af658a4e3a73e34ca559f86ff2085306fe9ce2d289f7fedc: Status 404 returned error can't find the container with id 70d99d0672474e29af658a4e3a73e34ca559f86ff2085306fe9ce2d289f7fedc Apr 16 15:40:51.466441 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:51.466406 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27" event={"ID":"304258f8-e1fc-41e8-8322-aafb0998c2e1","Type":"ContainerStarted","Data":"1a3e7ad7fb6dfb14e9d920febf43f240af1c4f1298e2805e14be7e08cd99cd3e"} Apr 16 15:40:51.466441 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:51.466442 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27" event={"ID":"304258f8-e1fc-41e8-8322-aafb0998c2e1","Type":"ContainerStarted","Data":"70d99d0672474e29af658a4e3a73e34ca559f86ff2085306fe9ce2d289f7fedc"} Apr 16 15:40:57.309191 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:57.309150 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs" podUID="067d8f2c-3ba7-45a0-8e53-c781c81775b2" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.47:8080/v2/models/sklearn-v2-mlserver/ready\": dial tcp 10.134.0.47:8080: connect: connection refused" Apr 16 15:40:57.489085 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:57.489051 2576 generic.go:358] "Generic (PLEG): container finished" podID="304258f8-e1fc-41e8-8322-aafb0998c2e1" containerID="1a3e7ad7fb6dfb14e9d920febf43f240af1c4f1298e2805e14be7e08cd99cd3e" exitCode=0 Apr 16 15:40:57.489248 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:57.489119 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27" event={"ID":"304258f8-e1fc-41e8-8322-aafb0998c2e1","Type":"ContainerDied","Data":"1a3e7ad7fb6dfb14e9d920febf43f240af1c4f1298e2805e14be7e08cd99cd3e"} Apr 16 15:40:58.212575 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:58.212554 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs" Apr 16 15:40:58.362896 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:58.362806 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/067d8f2c-3ba7-45a0-8e53-c781c81775b2-kserve-provision-location\") pod \"067d8f2c-3ba7-45a0-8e53-c781c81775b2\" (UID: \"067d8f2c-3ba7-45a0-8e53-c781c81775b2\") " Apr 16 15:40:58.363276 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:58.363124 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/067d8f2c-3ba7-45a0-8e53-c781c81775b2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "067d8f2c-3ba7-45a0-8e53-c781c81775b2" (UID: "067d8f2c-3ba7-45a0-8e53-c781c81775b2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:40:58.464270 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:58.464236 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/067d8f2c-3ba7-45a0-8e53-c781c81775b2-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:40:58.494669 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:58.494631 2576 generic.go:358] "Generic (PLEG): container finished" podID="067d8f2c-3ba7-45a0-8e53-c781c81775b2" containerID="106cac25a0a9b6e773a7fbaaed0ebea48d27e7ed297746f0facb05dc267909ad" exitCode=0 Apr 16 15:40:58.494841 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:58.494702 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs" event={"ID":"067d8f2c-3ba7-45a0-8e53-c781c81775b2","Type":"ContainerDied","Data":"106cac25a0a9b6e773a7fbaaed0ebea48d27e7ed297746f0facb05dc267909ad"} Apr 16 15:40:58.494841 ip-10-0-129-76 
kubenswrapper[2576]: I0416 15:40:58.494732 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs" Apr 16 15:40:58.494841 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:58.494747 2576 scope.go:117] "RemoveContainer" containerID="106cac25a0a9b6e773a7fbaaed0ebea48d27e7ed297746f0facb05dc267909ad" Apr 16 15:40:58.494841 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:58.494737 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs" event={"ID":"067d8f2c-3ba7-45a0-8e53-c781c81775b2","Type":"ContainerDied","Data":"c07beab4801c41a16f6a007ff33795944c05d0c6fa4516a4476f1ce247034e86"} Apr 16 15:40:58.496628 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:58.496600 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27" event={"ID":"304258f8-e1fc-41e8-8322-aafb0998c2e1","Type":"ContainerStarted","Data":"0785f712c78a2f4d2b04458b5c1e1d147fd7f685fbe70ef0e3e12c71e8e6b7f8"} Apr 16 15:40:58.496924 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:58.496876 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27" Apr 16 15:40:58.498556 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:58.498525 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27" podUID="304258f8-e1fc-41e8-8322-aafb0998c2e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 16 15:40:58.504067 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:58.504015 2576 scope.go:117] "RemoveContainer" containerID="a70980def0254b881e260081b9f2656d43a03599744511f98a129ac0f3f8dd51" Apr 16 15:40:58.511846 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:58.511824 2576 
scope.go:117] "RemoveContainer" containerID="106cac25a0a9b6e773a7fbaaed0ebea48d27e7ed297746f0facb05dc267909ad" Apr 16 15:40:58.512381 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:40:58.512251 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"106cac25a0a9b6e773a7fbaaed0ebea48d27e7ed297746f0facb05dc267909ad\": container with ID starting with 106cac25a0a9b6e773a7fbaaed0ebea48d27e7ed297746f0facb05dc267909ad not found: ID does not exist" containerID="106cac25a0a9b6e773a7fbaaed0ebea48d27e7ed297746f0facb05dc267909ad" Apr 16 15:40:58.512381 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:58.512289 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"106cac25a0a9b6e773a7fbaaed0ebea48d27e7ed297746f0facb05dc267909ad"} err="failed to get container status \"106cac25a0a9b6e773a7fbaaed0ebea48d27e7ed297746f0facb05dc267909ad\": rpc error: code = NotFound desc = could not find container \"106cac25a0a9b6e773a7fbaaed0ebea48d27e7ed297746f0facb05dc267909ad\": container with ID starting with 106cac25a0a9b6e773a7fbaaed0ebea48d27e7ed297746f0facb05dc267909ad not found: ID does not exist" Apr 16 15:40:58.512381 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:58.512314 2576 scope.go:117] "RemoveContainer" containerID="a70980def0254b881e260081b9f2656d43a03599744511f98a129ac0f3f8dd51" Apr 16 15:40:58.513083 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:40:58.513057 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a70980def0254b881e260081b9f2656d43a03599744511f98a129ac0f3f8dd51\": container with ID starting with a70980def0254b881e260081b9f2656d43a03599744511f98a129ac0f3f8dd51 not found: ID does not exist" containerID="a70980def0254b881e260081b9f2656d43a03599744511f98a129ac0f3f8dd51" Apr 16 15:40:58.513197 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:58.513092 2576 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a70980def0254b881e260081b9f2656d43a03599744511f98a129ac0f3f8dd51"} err="failed to get container status \"a70980def0254b881e260081b9f2656d43a03599744511f98a129ac0f3f8dd51\": rpc error: code = NotFound desc = could not find container \"a70980def0254b881e260081b9f2656d43a03599744511f98a129ac0f3f8dd51\": container with ID starting with a70980def0254b881e260081b9f2656d43a03599744511f98a129ac0f3f8dd51 not found: ID does not exist" Apr 16 15:40:58.513642 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:58.513604 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27" podStartSLOduration=8.513594395 podStartE2EDuration="8.513594395s" podCreationTimestamp="2026-04-16 15:40:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:40:58.512251744 +0000 UTC m=+2892.931386713" watchObservedRunningTime="2026-04-16 15:40:58.513594395 +0000 UTC m=+2892.932729435" Apr 16 15:40:58.524868 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:58.524845 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs"] Apr 16 15:40:58.528592 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:58.528573 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-krhvs"] Apr 16 15:40:59.502849 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:40:59.502812 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27" podUID="304258f8-e1fc-41e8-8322-aafb0998c2e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 16 15:41:00.307330 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:00.307297 2576 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="067d8f2c-3ba7-45a0-8e53-c781c81775b2" path="/var/lib/kubelet/pods/067d8f2c-3ba7-45a0-8e53-c781c81775b2/volumes" Apr 16 15:41:09.503379 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:09.503330 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27" podUID="304258f8-e1fc-41e8-8322-aafb0998c2e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 16 15:41:19.504309 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:19.504278 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27" Apr 16 15:41:27.829402 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:27.829377 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-9c688ddc4-nlk27_304258f8-e1fc-41e8-8322-aafb0998c2e1/kserve-container/0.log" Apr 16 15:41:27.969799 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:27.969768 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27"] Apr 16 15:41:27.970180 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:27.970154 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27" podUID="304258f8-e1fc-41e8-8322-aafb0998c2e1" containerName="kserve-container" containerID="cri-o://0785f712c78a2f4d2b04458b5c1e1d147fd7f685fbe70ef0e3e12c71e8e6b7f8" gracePeriod=30 Apr 16 15:41:28.007934 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:28.007889 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2"] Apr 16 15:41:28.008374 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:28.008355 2576 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="067d8f2c-3ba7-45a0-8e53-c781c81775b2" containerName="storage-initializer" Apr 16 15:41:28.008435 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:28.008376 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="067d8f2c-3ba7-45a0-8e53-c781c81775b2" containerName="storage-initializer" Apr 16 15:41:28.008435 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:28.008399 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="067d8f2c-3ba7-45a0-8e53-c781c81775b2" containerName="kserve-container" Apr 16 15:41:28.008435 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:28.008406 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="067d8f2c-3ba7-45a0-8e53-c781c81775b2" containerName="kserve-container" Apr 16 15:41:28.008536 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:28.008463 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="067d8f2c-3ba7-45a0-8e53-c781c81775b2" containerName="kserve-container" Apr 16 15:41:28.011109 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:28.011089 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2" Apr 16 15:41:28.019711 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:28.019613 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2"] Apr 16 15:41:28.125980 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:28.125888 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2\" (UID: \"2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2" Apr 16 15:41:28.226815 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:28.226769 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2\" (UID: \"2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2" Apr 16 15:41:28.227193 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:28.227174 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2\" (UID: \"2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2" Apr 16 15:41:28.324244 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:28.324215 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2" Apr 16 15:41:28.449389 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:28.449364 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2"] Apr 16 15:41:28.452043 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:41:28.451990 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b5534b6_0ae0_4d64_ab8c_b420ad3c65e2.slice/crio-16f56b1881863652077b019a1c5c68d4b59e3fb2422fee1b745988bb1054710c WatchSource:0}: Error finding container 16f56b1881863652077b019a1c5c68d4b59e3fb2422fee1b745988bb1054710c: Status 404 returned error can't find the container with id 16f56b1881863652077b019a1c5c68d4b59e3fb2422fee1b745988bb1054710c Apr 16 15:41:28.613768 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:28.613733 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2" event={"ID":"2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2","Type":"ContainerStarted","Data":"0ead2908172c2359d40ee036ec693bdca9d46bb5ee57d1445b3ba6f49fd8b403"} Apr 16 15:41:28.613768 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:28.613768 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2" event={"ID":"2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2","Type":"ContainerStarted","Data":"16f56b1881863652077b019a1c5c68d4b59e3fb2422fee1b745988bb1054710c"} Apr 16 15:41:29.100521 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:29.100497 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27" Apr 16 15:41:29.135033 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:29.134983 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/304258f8-e1fc-41e8-8322-aafb0998c2e1-kserve-provision-location\") pod \"304258f8-e1fc-41e8-8322-aafb0998c2e1\" (UID: \"304258f8-e1fc-41e8-8322-aafb0998c2e1\") " Apr 16 15:41:29.160446 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:29.160405 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/304258f8-e1fc-41e8-8322-aafb0998c2e1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "304258f8-e1fc-41e8-8322-aafb0998c2e1" (UID: "304258f8-e1fc-41e8-8322-aafb0998c2e1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:41:29.236568 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:29.236477 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/304258f8-e1fc-41e8-8322-aafb0998c2e1-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:41:29.618725 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:29.618639 2576 generic.go:358] "Generic (PLEG): container finished" podID="304258f8-e1fc-41e8-8322-aafb0998c2e1" containerID="0785f712c78a2f4d2b04458b5c1e1d147fd7f685fbe70ef0e3e12c71e8e6b7f8" exitCode=0 Apr 16 15:41:29.618725 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:29.618701 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27"
Apr 16 15:41:29.618914 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:29.618726 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27" event={"ID":"304258f8-e1fc-41e8-8322-aafb0998c2e1","Type":"ContainerDied","Data":"0785f712c78a2f4d2b04458b5c1e1d147fd7f685fbe70ef0e3e12c71e8e6b7f8"}
Apr 16 15:41:29.618914 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:29.618761 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27" event={"ID":"304258f8-e1fc-41e8-8322-aafb0998c2e1","Type":"ContainerDied","Data":"70d99d0672474e29af658a4e3a73e34ca559f86ff2085306fe9ce2d289f7fedc"}
Apr 16 15:41:29.618914 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:29.618779 2576 scope.go:117] "RemoveContainer" containerID="0785f712c78a2f4d2b04458b5c1e1d147fd7f685fbe70ef0e3e12c71e8e6b7f8"
Apr 16 15:41:29.627507 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:29.627480 2576 scope.go:117] "RemoveContainer" containerID="1a3e7ad7fb6dfb14e9d920febf43f240af1c4f1298e2805e14be7e08cd99cd3e"
Apr 16 15:41:29.635092 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:29.635073 2576 scope.go:117] "RemoveContainer" containerID="0785f712c78a2f4d2b04458b5c1e1d147fd7f685fbe70ef0e3e12c71e8e6b7f8"
Apr 16 15:41:29.635353 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:41:29.635333 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0785f712c78a2f4d2b04458b5c1e1d147fd7f685fbe70ef0e3e12c71e8e6b7f8\": container with ID starting with 0785f712c78a2f4d2b04458b5c1e1d147fd7f685fbe70ef0e3e12c71e8e6b7f8 not found: ID does not exist" containerID="0785f712c78a2f4d2b04458b5c1e1d147fd7f685fbe70ef0e3e12c71e8e6b7f8"
Apr 16 15:41:29.635392 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:29.635365 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0785f712c78a2f4d2b04458b5c1e1d147fd7f685fbe70ef0e3e12c71e8e6b7f8"} err="failed to get container status \"0785f712c78a2f4d2b04458b5c1e1d147fd7f685fbe70ef0e3e12c71e8e6b7f8\": rpc error: code = NotFound desc = could not find container \"0785f712c78a2f4d2b04458b5c1e1d147fd7f685fbe70ef0e3e12c71e8e6b7f8\": container with ID starting with 0785f712c78a2f4d2b04458b5c1e1d147fd7f685fbe70ef0e3e12c71e8e6b7f8 not found: ID does not exist"
Apr 16 15:41:29.635392 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:29.635385 2576 scope.go:117] "RemoveContainer" containerID="1a3e7ad7fb6dfb14e9d920febf43f240af1c4f1298e2805e14be7e08cd99cd3e"
Apr 16 15:41:29.635611 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:41:29.635597 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a3e7ad7fb6dfb14e9d920febf43f240af1c4f1298e2805e14be7e08cd99cd3e\": container with ID starting with 1a3e7ad7fb6dfb14e9d920febf43f240af1c4f1298e2805e14be7e08cd99cd3e not found: ID does not exist" containerID="1a3e7ad7fb6dfb14e9d920febf43f240af1c4f1298e2805e14be7e08cd99cd3e"
Apr 16 15:41:29.635672 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:29.635616 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a3e7ad7fb6dfb14e9d920febf43f240af1c4f1298e2805e14be7e08cd99cd3e"} err="failed to get container status \"1a3e7ad7fb6dfb14e9d920febf43f240af1c4f1298e2805e14be7e08cd99cd3e\": rpc error: code = NotFound desc = could not find container \"1a3e7ad7fb6dfb14e9d920febf43f240af1c4f1298e2805e14be7e08cd99cd3e\": container with ID starting with 1a3e7ad7fb6dfb14e9d920febf43f240af1c4f1298e2805e14be7e08cd99cd3e not found: ID does not exist"
Apr 16 15:41:29.639809 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:29.639780 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27"]
Apr 16 15:41:29.641506 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:29.641487 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-9c688ddc4-nlk27"]
Apr 16 15:41:30.307880 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:30.307846 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="304258f8-e1fc-41e8-8322-aafb0998c2e1" path="/var/lib/kubelet/pods/304258f8-e1fc-41e8-8322-aafb0998c2e1/volumes"
Apr 16 15:41:32.631993 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:32.631898 2576 generic.go:358] "Generic (PLEG): container finished" podID="2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2" containerID="0ead2908172c2359d40ee036ec693bdca9d46bb5ee57d1445b3ba6f49fd8b403" exitCode=0
Apr 16 15:41:32.631993 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:32.631975 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2" event={"ID":"2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2","Type":"ContainerDied","Data":"0ead2908172c2359d40ee036ec693bdca9d46bb5ee57d1445b3ba6f49fd8b403"}
Apr 16 15:41:33.637266 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:33.637224 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2" event={"ID":"2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2","Type":"ContainerStarted","Data":"34354e17fc752fb24e88c5efb760a810b3c53dac671e6ef8426ebdd2e4f552d9"}
Apr 16 15:41:33.637742 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:33.637440 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2"
Apr 16 15:41:33.652578 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:41:33.652527 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2" podStartSLOduration=6.652510943 podStartE2EDuration="6.652510943s" podCreationTimestamp="2026-04-16 15:41:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:41:33.651707901 +0000 UTC m=+2928.070842876" watchObservedRunningTime="2026-04-16 15:41:33.652510943 +0000 UTC m=+2928.071645941"
Apr 16 15:42:04.734214 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:04.734112 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2" podUID="2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400"
Apr 16 15:42:14.644161 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:14.644128 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2"
Apr 16 15:42:18.125947 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:18.125911 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2"]
Apr 16 15:42:18.126525 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:18.126207 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2" podUID="2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2" containerName="kserve-container" containerID="cri-o://34354e17fc752fb24e88c5efb760a810b3c53dac671e6ef8426ebdd2e4f552d9" gracePeriod=30
Apr 16 15:42:18.213145 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:18.213094 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh"]
Apr 16 15:42:18.213644 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:18.213625 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="304258f8-e1fc-41e8-8322-aafb0998c2e1" containerName="kserve-container"
Apr 16 15:42:18.213644 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:18.213645 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="304258f8-e1fc-41e8-8322-aafb0998c2e1" containerName="kserve-container"
Apr 16 15:42:18.213808 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:18.213708 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="304258f8-e1fc-41e8-8322-aafb0998c2e1" containerName="storage-initializer"
Apr 16 15:42:18.213808 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:18.213718 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="304258f8-e1fc-41e8-8322-aafb0998c2e1" containerName="storage-initializer"
Apr 16 15:42:18.213919 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:18.213902 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="304258f8-e1fc-41e8-8322-aafb0998c2e1" containerName="kserve-container"
Apr 16 15:42:18.216258 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:18.216237 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh"
Apr 16 15:42:18.223979 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:18.223951 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh"]
Apr 16 15:42:18.269959 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:18.269919 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0453890b-986c-4994-8276-a2c081bc2623-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-84556fcc44-2smfh\" (UID: \"0453890b-986c-4994-8276-a2c081bc2623\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh"
Apr 16 15:42:18.370697 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:18.370659 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0453890b-986c-4994-8276-a2c081bc2623-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-84556fcc44-2smfh\" (UID: \"0453890b-986c-4994-8276-a2c081bc2623\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh"
Apr 16 15:42:18.371118 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:18.371092 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0453890b-986c-4994-8276-a2c081bc2623-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-84556fcc44-2smfh\" (UID: \"0453890b-986c-4994-8276-a2c081bc2623\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh"
Apr 16 15:42:18.529092 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:18.529061 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh"
Apr 16 15:42:18.656957 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:18.656934 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh"]
Apr 16 15:42:18.659316 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:42:18.659285 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0453890b_986c_4994_8276_a2c081bc2623.slice/crio-ca9b01a974a31ad956e3a8b544cfb6efd601daadae890093ccd34fd2b5905ee7 WatchSource:0}: Error finding container ca9b01a974a31ad956e3a8b544cfb6efd601daadae890093ccd34fd2b5905ee7: Status 404 returned error can't find the container with id ca9b01a974a31ad956e3a8b544cfb6efd601daadae890093ccd34fd2b5905ee7
Apr 16 15:42:18.793818 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:18.793720 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh" event={"ID":"0453890b-986c-4994-8276-a2c081bc2623","Type":"ContainerStarted","Data":"7af92e9d5cdb57eb5d6da44ce4673b2c6e27b212cb6140251c6f129ad7077edc"}
Apr 16 15:42:18.793818 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:18.793758 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh" event={"ID":"0453890b-986c-4994-8276-a2c081bc2623","Type":"ContainerStarted","Data":"ca9b01a974a31ad956e3a8b544cfb6efd601daadae890093ccd34fd2b5905ee7"}
Apr 16 15:42:22.809564 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:22.809521 2576 generic.go:358] "Generic (PLEG): container finished" podID="0453890b-986c-4994-8276-a2c081bc2623" containerID="7af92e9d5cdb57eb5d6da44ce4673b2c6e27b212cb6140251c6f129ad7077edc" exitCode=0
Apr 16 15:42:22.810005 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:22.809600 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh" event={"ID":"0453890b-986c-4994-8276-a2c081bc2623","Type":"ContainerDied","Data":"7af92e9d5cdb57eb5d6da44ce4673b2c6e27b212cb6140251c6f129ad7077edc"}
Apr 16 15:42:23.814860 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:23.814823 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh" event={"ID":"0453890b-986c-4994-8276-a2c081bc2623","Type":"ContainerStarted","Data":"5e7980107d3bc1ab24c275fe2b6d142aa52e67cd673845d683cf28bb6b1251ff"}
Apr 16 15:42:23.815315 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:23.815145 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh"
Apr 16 15:42:23.816610 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:23.816585 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh" podUID="0453890b-986c-4994-8276-a2c081bc2623" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused"
Apr 16 15:42:23.831903 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:23.831847 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh" podStartSLOduration=5.831832059 podStartE2EDuration="5.831832059s" podCreationTimestamp="2026-04-16 15:42:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:42:23.829603662 +0000 UTC m=+2978.248738673" watchObservedRunningTime="2026-04-16 15:42:23.831832059 +0000 UTC m=+2978.250967033"
Apr 16 15:42:24.642371 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:24.642328 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2" podUID="2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.49:8080/v2/models/isvc-sklearn-v2-runtime/ready\": dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 16 15:42:24.818315 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:24.818264 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh" podUID="0453890b-986c-4994-8276-a2c081bc2623" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused"
Apr 16 15:42:25.823973 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:25.823931 2576 generic.go:358] "Generic (PLEG): container finished" podID="2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2" containerID="34354e17fc752fb24e88c5efb760a810b3c53dac671e6ef8426ebdd2e4f552d9" exitCode=0
Apr 16 15:42:25.824342 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:25.823986 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2" event={"ID":"2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2","Type":"ContainerDied","Data":"34354e17fc752fb24e88c5efb760a810b3c53dac671e6ef8426ebdd2e4f552d9"}
Apr 16 15:42:25.866665 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:25.866643 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2"
Apr 16 15:42:25.934975 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:25.934939 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2-kserve-provision-location\") pod \"2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2\" (UID: \"2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2\") "
Apr 16 15:42:25.935265 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:25.935241 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2" (UID: "2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:42:25.935376 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:25.935355 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\""
Apr 16 15:42:26.828344 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:26.828311 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2" event={"ID":"2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2","Type":"ContainerDied","Data":"16f56b1881863652077b019a1c5c68d4b59e3fb2422fee1b745988bb1054710c"}
Apr 16 15:42:26.828822 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:26.828348 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2"
Apr 16 15:42:26.828822 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:26.828351 2576 scope.go:117] "RemoveContainer" containerID="34354e17fc752fb24e88c5efb760a810b3c53dac671e6ef8426ebdd2e4f552d9"
Apr 16 15:42:26.838399 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:26.838378 2576 scope.go:117] "RemoveContainer" containerID="0ead2908172c2359d40ee036ec693bdca9d46bb5ee57d1445b3ba6f49fd8b403"
Apr 16 15:42:26.844563 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:26.844539 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2"]
Apr 16 15:42:26.846487 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:26.846468 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-c8wt2"]
Apr 16 15:42:28.308142 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:28.308111 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2" path="/var/lib/kubelet/pods/2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2/volumes"
Apr 16 15:42:34.819010 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:34.818961 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh" podUID="0453890b-986c-4994-8276-a2c081bc2623" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused"
Apr 16 15:42:44.818361 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:44.818313 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh" podUID="0453890b-986c-4994-8276-a2c081bc2623" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused"
Apr 16 15:42:46.441409 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:46.441370 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log"
Apr 16 15:42:46.443966 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:46.443940 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log"
Apr 16 15:42:46.446555 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:46.446534 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log"
Apr 16 15:42:46.448909 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:46.448892 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log"
Apr 16 15:42:54.818524 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:42:54.818482 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh" podUID="0453890b-986c-4994-8276-a2c081bc2623" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused"
Apr 16 15:43:04.819288 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:04.819245 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh" podUID="0453890b-986c-4994-8276-a2c081bc2623" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused"
Apr 16 15:43:14.818870 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:14.818820 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh" podUID="0453890b-986c-4994-8276-a2c081bc2623" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused"
Apr 16 15:43:24.818543 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:24.818497 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh" podUID="0453890b-986c-4994-8276-a2c081bc2623" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused"
Apr 16 15:43:34.820100 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:34.819979 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh"
Apr 16 15:43:38.444800 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:38.444763 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh"]
Apr 16 15:43:38.445217 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:38.445111 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh" podUID="0453890b-986c-4994-8276-a2c081bc2623" containerName="kserve-container" containerID="cri-o://5e7980107d3bc1ab24c275fe2b6d142aa52e67cd673845d683cf28bb6b1251ff" gracePeriod=30
Apr 16 15:43:38.484251 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:38.484203 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct"]
Apr 16 15:43:38.484599 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:38.484587 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2" containerName="kserve-container"
Apr 16 15:43:38.484649 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:38.484601 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2" containerName="kserve-container"
Apr 16 15:43:38.484649 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:38.484620 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2" containerName="storage-initializer"
Apr 16 15:43:38.484649 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:38.484627 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2" containerName="storage-initializer"
Apr 16 15:43:38.484743 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:38.484692 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b5534b6-0ae0-4d64-ab8c-b420ad3c65e2" containerName="kserve-container"
Apr 16 15:43:38.486937 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:38.486916 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct"
Apr 16 15:43:38.495806 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:38.495775 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct"]
Apr 16 15:43:38.592516 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:38.592472 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70fab0e4-c672-4281-a2c5-5f33259961af-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct\" (UID: \"70fab0e4-c672-4281-a2c5-5f33259961af\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct"
Apr 16 15:43:38.693987 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:38.693950 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70fab0e4-c672-4281-a2c5-5f33259961af-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct\" (UID: \"70fab0e4-c672-4281-a2c5-5f33259961af\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct"
Apr 16 15:43:38.694341 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:38.694322 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70fab0e4-c672-4281-a2c5-5f33259961af-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct\" (UID: \"70fab0e4-c672-4281-a2c5-5f33259961af\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct"
Apr 16 15:43:38.798269 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:38.798178 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct"
Apr 16 15:43:38.922325 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:38.922288 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct"]
Apr 16 15:43:38.925065 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:43:38.925007 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70fab0e4_c672_4281_a2c5_5f33259961af.slice/crio-5c148062e0f3685fbb8007ee7751a5f260aebe11b1cf9c2af447a0590cd3b33c WatchSource:0}: Error finding container 5c148062e0f3685fbb8007ee7751a5f260aebe11b1cf9c2af447a0590cd3b33c: Status 404 returned error can't find the container with id 5c148062e0f3685fbb8007ee7751a5f260aebe11b1cf9c2af447a0590cd3b33c
Apr 16 15:43:39.075395 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:39.075302 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct" event={"ID":"70fab0e4-c672-4281-a2c5-5f33259961af","Type":"ContainerStarted","Data":"43b0c87a21acb08a289b0fac54842db139c654f462e6e7dd4d2ce2c94831fc2e"}
Apr 16 15:43:39.075395 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:39.075350 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct" event={"ID":"70fab0e4-c672-4281-a2c5-5f33259961af","Type":"ContainerStarted","Data":"5c148062e0f3685fbb8007ee7751a5f260aebe11b1cf9c2af447a0590cd3b33c"}
Apr 16 15:43:42.796220 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:42.796198 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh"
Apr 16 15:43:42.828400 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:42.828365 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0453890b-986c-4994-8276-a2c081bc2623-kserve-provision-location\") pod \"0453890b-986c-4994-8276-a2c081bc2623\" (UID: \"0453890b-986c-4994-8276-a2c081bc2623\") "
Apr 16 15:43:42.828650 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:42.828626 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0453890b-986c-4994-8276-a2c081bc2623-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0453890b-986c-4994-8276-a2c081bc2623" (UID: "0453890b-986c-4994-8276-a2c081bc2623"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:43:42.929519 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:42.929486 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0453890b-986c-4994-8276-a2c081bc2623-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\""
Apr 16 15:43:43.090220 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:43.090134 2576 generic.go:358] "Generic (PLEG): container finished" podID="70fab0e4-c672-4281-a2c5-5f33259961af" containerID="43b0c87a21acb08a289b0fac54842db139c654f462e6e7dd4d2ce2c94831fc2e" exitCode=0
Apr 16 15:43:43.090220 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:43.090208 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct" event={"ID":"70fab0e4-c672-4281-a2c5-5f33259961af","Type":"ContainerDied","Data":"43b0c87a21acb08a289b0fac54842db139c654f462e6e7dd4d2ce2c94831fc2e"}
Apr 16 15:43:43.091422 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:43.091399 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 15:43:43.091725 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:43.091700 2576 generic.go:358] "Generic (PLEG): container finished" podID="0453890b-986c-4994-8276-a2c081bc2623" containerID="5e7980107d3bc1ab24c275fe2b6d142aa52e67cd673845d683cf28bb6b1251ff" exitCode=0
Apr 16 15:43:43.091825 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:43.091748 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh" event={"ID":"0453890b-986c-4994-8276-a2c081bc2623","Type":"ContainerDied","Data":"5e7980107d3bc1ab24c275fe2b6d142aa52e67cd673845d683cf28bb6b1251ff"}
Apr 16 15:43:43.091825 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:43.091764 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh"
Apr 16 15:43:43.091825 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:43.091782 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh" event={"ID":"0453890b-986c-4994-8276-a2c081bc2623","Type":"ContainerDied","Data":"ca9b01a974a31ad956e3a8b544cfb6efd601daadae890093ccd34fd2b5905ee7"}
Apr 16 15:43:43.091825 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:43.091803 2576 scope.go:117] "RemoveContainer" containerID="5e7980107d3bc1ab24c275fe2b6d142aa52e67cd673845d683cf28bb6b1251ff"
Apr 16 15:43:43.102857 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:43.102831 2576 scope.go:117] "RemoveContainer" containerID="7af92e9d5cdb57eb5d6da44ce4673b2c6e27b212cb6140251c6f129ad7077edc"
Apr 16 15:43:43.119248 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:43.119197 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh"]
Apr 16 15:43:43.120053 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:43.120015 2576 scope.go:117] "RemoveContainer" containerID="5e7980107d3bc1ab24c275fe2b6d142aa52e67cd673845d683cf28bb6b1251ff"
Apr 16 15:43:43.120337 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:43:43.120314 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e7980107d3bc1ab24c275fe2b6d142aa52e67cd673845d683cf28bb6b1251ff\": container with ID starting with 5e7980107d3bc1ab24c275fe2b6d142aa52e67cd673845d683cf28bb6b1251ff not found: ID does not exist" containerID="5e7980107d3bc1ab24c275fe2b6d142aa52e67cd673845d683cf28bb6b1251ff"
Apr 16 15:43:43.120447 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:43.120350 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e7980107d3bc1ab24c275fe2b6d142aa52e67cd673845d683cf28bb6b1251ff"} err="failed to get container status \"5e7980107d3bc1ab24c275fe2b6d142aa52e67cd673845d683cf28bb6b1251ff\": rpc error: code = NotFound desc = could not find container \"5e7980107d3bc1ab24c275fe2b6d142aa52e67cd673845d683cf28bb6b1251ff\": container with ID starting with 5e7980107d3bc1ab24c275fe2b6d142aa52e67cd673845d683cf28bb6b1251ff not found: ID does not exist"
Apr 16 15:43:43.120447 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:43.120370 2576 scope.go:117] "RemoveContainer" containerID="7af92e9d5cdb57eb5d6da44ce4673b2c6e27b212cb6140251c6f129ad7077edc"
Apr 16 15:43:43.120612 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:43:43.120597 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7af92e9d5cdb57eb5d6da44ce4673b2c6e27b212cb6140251c6f129ad7077edc\": container with ID starting with 7af92e9d5cdb57eb5d6da44ce4673b2c6e27b212cb6140251c6f129ad7077edc not found: ID does not exist" containerID="7af92e9d5cdb57eb5d6da44ce4673b2c6e27b212cb6140251c6f129ad7077edc"
Apr 16 15:43:43.120672 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:43.120615 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7af92e9d5cdb57eb5d6da44ce4673b2c6e27b212cb6140251c6f129ad7077edc"} err="failed to get container status \"7af92e9d5cdb57eb5d6da44ce4673b2c6e27b212cb6140251c6f129ad7077edc\": rpc error: code = NotFound desc = could not find container \"7af92e9d5cdb57eb5d6da44ce4673b2c6e27b212cb6140251c6f129ad7077edc\": container with ID starting with 7af92e9d5cdb57eb5d6da44ce4673b2c6e27b212cb6140251c6f129ad7077edc not found: ID does not exist"
Apr 16 15:43:43.121107 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:43.121088 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-84556fcc44-2smfh"]
Apr 16 15:43:44.096880 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:44.096840 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct" event={"ID":"70fab0e4-c672-4281-a2c5-5f33259961af","Type":"ContainerStarted","Data":"f69c40e5b8a4a37af03a597e186ae0064403f082ba915aa8017e74dfd5a15ba9"}
Apr 16 15:43:44.097341 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:44.097224 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct"
Apr 16 15:43:44.098714 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:44.098685 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct" podUID="70fab0e4-c672-4281-a2c5-5f33259961af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused"
Apr 16 15:43:44.113290 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:44.113252 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct" podStartSLOduration=6.113239332 podStartE2EDuration="6.113239332s" podCreationTimestamp="2026-04-16 15:43:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:43:44.111548655 +0000 UTC m=+3058.530683640" watchObservedRunningTime="2026-04-16 15:43:44.113239332 +0000 UTC m=+3058.532374307"
Apr 16 15:43:44.307901 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:44.307872 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0453890b-986c-4994-8276-a2c081bc2623" path="/var/lib/kubelet/pods/0453890b-986c-4994-8276-a2c081bc2623/volumes"
Apr 16 15:43:45.101336 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:45.101292 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct" podUID="70fab0e4-c672-4281-a2c5-5f33259961af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused"
Apr 16 15:43:55.101987 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:43:55.101940 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct" podUID="70fab0e4-c672-4281-a2c5-5f33259961af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused"
Apr 16 15:44:05.101999 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:44:05.101953 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct" podUID="70fab0e4-c672-4281-a2c5-5f33259961af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused"
Apr 16 15:44:15.101355 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:44:15.101301 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct" podUID="70fab0e4-c672-4281-a2c5-5f33259961af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused"
Apr 16 15:44:25.102126 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:44:25.102083 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct" podUID="70fab0e4-c672-4281-a2c5-5f33259961af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused"
Apr 16 15:44:35.101867 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:44:35.101823 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct" podUID="70fab0e4-c672-4281-a2c5-5f33259961af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection
refused" Apr 16 15:44:45.101488 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:44:45.101439 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct" podUID="70fab0e4-c672-4281-a2c5-5f33259961af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 16 15:44:48.302938 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:44:48.302890 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct" podUID="70fab0e4-c672-4281-a2c5-5f33259961af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 16 15:44:58.307535 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:44:58.307464 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct" Apr 16 15:44:58.641473 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:44:58.641385 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct"] Apr 16 15:44:58.641742 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:44:58.641715 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct" podUID="70fab0e4-c672-4281-a2c5-5f33259961af" containerName="kserve-container" containerID="cri-o://f69c40e5b8a4a37af03a597e186ae0064403f082ba915aa8017e74dfd5a15ba9" gracePeriod=30 Apr 16 15:45:03.089296 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:45:03.089268 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct" Apr 16 15:45:03.266852 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:45:03.266755 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70fab0e4-c672-4281-a2c5-5f33259961af-kserve-provision-location\") pod \"70fab0e4-c672-4281-a2c5-5f33259961af\" (UID: \"70fab0e4-c672-4281-a2c5-5f33259961af\") " Apr 16 15:45:03.267131 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:45:03.267105 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70fab0e4-c672-4281-a2c5-5f33259961af-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "70fab0e4-c672-4281-a2c5-5f33259961af" (UID: "70fab0e4-c672-4281-a2c5-5f33259961af"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:45:03.368211 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:45:03.368167 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70fab0e4-c672-4281-a2c5-5f33259961af-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:45:03.379276 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:45:03.379238 2576 generic.go:358] "Generic (PLEG): container finished" podID="70fab0e4-c672-4281-a2c5-5f33259961af" containerID="f69c40e5b8a4a37af03a597e186ae0064403f082ba915aa8017e74dfd5a15ba9" exitCode=0 Apr 16 15:45:03.379443 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:45:03.379292 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct" event={"ID":"70fab0e4-c672-4281-a2c5-5f33259961af","Type":"ContainerDied","Data":"f69c40e5b8a4a37af03a597e186ae0064403f082ba915aa8017e74dfd5a15ba9"} Apr 16 15:45:03.379443 ip-10-0-129-76 
kubenswrapper[2576]: I0416 15:45:03.379337 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct" event={"ID":"70fab0e4-c672-4281-a2c5-5f33259961af","Type":"ContainerDied","Data":"5c148062e0f3685fbb8007ee7751a5f260aebe11b1cf9c2af447a0590cd3b33c"} Apr 16 15:45:03.379443 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:45:03.379353 2576 scope.go:117] "RemoveContainer" containerID="f69c40e5b8a4a37af03a597e186ae0064403f082ba915aa8017e74dfd5a15ba9" Apr 16 15:45:03.379443 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:45:03.379304 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct" Apr 16 15:45:03.387653 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:45:03.387635 2576 scope.go:117] "RemoveContainer" containerID="43b0c87a21acb08a289b0fac54842db139c654f462e6e7dd4d2ce2c94831fc2e" Apr 16 15:45:03.395058 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:45:03.395014 2576 scope.go:117] "RemoveContainer" containerID="f69c40e5b8a4a37af03a597e186ae0064403f082ba915aa8017e74dfd5a15ba9" Apr 16 15:45:03.395297 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:45:03.395275 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f69c40e5b8a4a37af03a597e186ae0064403f082ba915aa8017e74dfd5a15ba9\": container with ID starting with f69c40e5b8a4a37af03a597e186ae0064403f082ba915aa8017e74dfd5a15ba9 not found: ID does not exist" containerID="f69c40e5b8a4a37af03a597e186ae0064403f082ba915aa8017e74dfd5a15ba9" Apr 16 15:45:03.395347 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:45:03.395306 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f69c40e5b8a4a37af03a597e186ae0064403f082ba915aa8017e74dfd5a15ba9"} err="failed to get container status \"f69c40e5b8a4a37af03a597e186ae0064403f082ba915aa8017e74dfd5a15ba9\": rpc error: 
code = NotFound desc = could not find container \"f69c40e5b8a4a37af03a597e186ae0064403f082ba915aa8017e74dfd5a15ba9\": container with ID starting with f69c40e5b8a4a37af03a597e186ae0064403f082ba915aa8017e74dfd5a15ba9 not found: ID does not exist" Apr 16 15:45:03.395347 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:45:03.395324 2576 scope.go:117] "RemoveContainer" containerID="43b0c87a21acb08a289b0fac54842db139c654f462e6e7dd4d2ce2c94831fc2e" Apr 16 15:45:03.395557 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:45:03.395540 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43b0c87a21acb08a289b0fac54842db139c654f462e6e7dd4d2ce2c94831fc2e\": container with ID starting with 43b0c87a21acb08a289b0fac54842db139c654f462e6e7dd4d2ce2c94831fc2e not found: ID does not exist" containerID="43b0c87a21acb08a289b0fac54842db139c654f462e6e7dd4d2ce2c94831fc2e" Apr 16 15:45:03.395609 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:45:03.395562 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43b0c87a21acb08a289b0fac54842db139c654f462e6e7dd4d2ce2c94831fc2e"} err="failed to get container status \"43b0c87a21acb08a289b0fac54842db139c654f462e6e7dd4d2ce2c94831fc2e\": rpc error: code = NotFound desc = could not find container \"43b0c87a21acb08a289b0fac54842db139c654f462e6e7dd4d2ce2c94831fc2e\": container with ID starting with 43b0c87a21acb08a289b0fac54842db139c654f462e6e7dd4d2ce2c94831fc2e not found: ID does not exist" Apr 16 15:45:03.399729 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:45:03.399706 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct"] Apr 16 15:45:03.403443 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:45:03.403422 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-56d7d747c9-zlrct"] Apr 16 15:45:04.307668 ip-10-0-129-76 
kubenswrapper[2576]: I0416 15:45:04.307634 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70fab0e4-c672-4281-a2c5-5f33259961af" path="/var/lib/kubelet/pods/70fab0e4-c672-4281-a2c5-5f33259961af/volumes" Apr 16 15:47:46.468986 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:47:46.468955 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log" Apr 16 15:47:46.471186 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:47:46.471162 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log" Apr 16 15:47:46.473738 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:47:46.473717 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log" Apr 16 15:47:46.476232 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:47:46.476215 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log" Apr 16 15:50:03.951695 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:03.951657 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs"] Apr 16 15:50:03.954160 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:03.952204 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0453890b-986c-4994-8276-a2c081bc2623" containerName="storage-initializer" Apr 16 15:50:03.954160 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:03.952225 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0453890b-986c-4994-8276-a2c081bc2623" containerName="storage-initializer" Apr 16 15:50:03.954160 ip-10-0-129-76 kubenswrapper[2576]: I0416 
15:50:03.952238 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70fab0e4-c672-4281-a2c5-5f33259961af" containerName="kserve-container" Apr 16 15:50:03.954160 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:03.952245 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="70fab0e4-c672-4281-a2c5-5f33259961af" containerName="kserve-container" Apr 16 15:50:03.954160 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:03.952255 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70fab0e4-c672-4281-a2c5-5f33259961af" containerName="storage-initializer" Apr 16 15:50:03.954160 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:03.952261 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="70fab0e4-c672-4281-a2c5-5f33259961af" containerName="storage-initializer" Apr 16 15:50:03.954160 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:03.952279 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0453890b-986c-4994-8276-a2c081bc2623" containerName="kserve-container" Apr 16 15:50:03.954160 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:03.952285 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0453890b-986c-4994-8276-a2c081bc2623" containerName="kserve-container" Apr 16 15:50:03.954160 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:03.952343 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="70fab0e4-c672-4281-a2c5-5f33259961af" containerName="kserve-container" Apr 16 15:50:03.954160 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:03.952353 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0453890b-986c-4994-8276-a2c081bc2623" containerName="kserve-container" Apr 16 15:50:03.955382 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:03.955366 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs" Apr 16 15:50:03.957569 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:03.957543 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tsb5d\"" Apr 16 15:50:03.965734 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:03.965711 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs"] Apr 16 15:50:04.048569 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:04.048536 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abbb9426-e596-48d4-8ae2-5d54895036c1-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs\" (UID: \"abbb9426-e596-48d4-8ae2-5d54895036c1\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs" Apr 16 15:50:04.149037 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:04.148992 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abbb9426-e596-48d4-8ae2-5d54895036c1-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs\" (UID: \"abbb9426-e596-48d4-8ae2-5d54895036c1\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs" Apr 16 15:50:04.149385 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:04.149367 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abbb9426-e596-48d4-8ae2-5d54895036c1-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs\" (UID: \"abbb9426-e596-48d4-8ae2-5d54895036c1\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs" Apr 16 
15:50:04.266578 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:04.266484 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs" Apr 16 15:50:04.392004 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:04.391979 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs"] Apr 16 15:50:04.394249 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:50:04.394216 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabbb9426_e596_48d4_8ae2_5d54895036c1.slice/crio-e14b51ffbe43b5c6d249c98b66ab8f806a66db38c6bc1b902cbd5a7991876dda WatchSource:0}: Error finding container e14b51ffbe43b5c6d249c98b66ab8f806a66db38c6bc1b902cbd5a7991876dda: Status 404 returned error can't find the container with id e14b51ffbe43b5c6d249c98b66ab8f806a66db38c6bc1b902cbd5a7991876dda Apr 16 15:50:04.396313 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:04.396293 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:50:04.425388 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:04.425360 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs" event={"ID":"abbb9426-e596-48d4-8ae2-5d54895036c1","Type":"ContainerStarted","Data":"e14b51ffbe43b5c6d249c98b66ab8f806a66db38c6bc1b902cbd5a7991876dda"} Apr 16 15:50:05.430690 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:05.430655 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs" event={"ID":"abbb9426-e596-48d4-8ae2-5d54895036c1","Type":"ContainerStarted","Data":"9ab9450e893ab24e7e6b910518db25ea69b943f36fdec93a71b3d8872c4b96ac"} Apr 16 15:50:08.442014 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:08.441920 2576 
generic.go:358] "Generic (PLEG): container finished" podID="abbb9426-e596-48d4-8ae2-5d54895036c1" containerID="9ab9450e893ab24e7e6b910518db25ea69b943f36fdec93a71b3d8872c4b96ac" exitCode=0 Apr 16 15:50:08.442014 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:08.441990 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs" event={"ID":"abbb9426-e596-48d4-8ae2-5d54895036c1","Type":"ContainerDied","Data":"9ab9450e893ab24e7e6b910518db25ea69b943f36fdec93a71b3d8872c4b96ac"} Apr 16 15:50:09.446924 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:09.446894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs" event={"ID":"abbb9426-e596-48d4-8ae2-5d54895036c1","Type":"ContainerStarted","Data":"0478b2350082b551f138b60d76ef0e21e8c9ad76571eef2a87f78137c190a84f"} Apr 16 15:50:09.447344 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:09.447163 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs" Apr 16 15:50:09.464007 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:09.463955 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs" podStartSLOduration=6.463938279 podStartE2EDuration="6.463938279s" podCreationTimestamp="2026-04-16 15:50:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:50:09.461474042 +0000 UTC m=+3443.880609016" watchObservedRunningTime="2026-04-16 15:50:09.463938279 +0000 UTC m=+3443.883073254" Apr 16 15:50:40.533685 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:40.533650 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs" Apr 16 15:50:44.166635 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:44.166603 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs"] Apr 16 15:50:44.167071 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:44.166843 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs" podUID="abbb9426-e596-48d4-8ae2-5d54895036c1" containerName="kserve-container" containerID="cri-o://0478b2350082b551f138b60d76ef0e21e8c9ad76571eef2a87f78137c190a84f" gracePeriod=30 Apr 16 15:50:44.242465 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:44.242429 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wjp8w"] Apr 16 15:50:44.246195 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:44.246173 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wjp8w" Apr 16 15:50:44.252576 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:44.252554 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wjp8w"] Apr 16 15:50:44.301140 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:44.301106 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c4178f3-aa32-4b87-8abf-8641c807e3e4-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-wjp8w\" (UID: \"3c4178f3-aa32-4b87-8abf-8641c807e3e4\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wjp8w" Apr 16 15:50:44.401943 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:44.401892 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c4178f3-aa32-4b87-8abf-8641c807e3e4-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-wjp8w\" (UID: \"3c4178f3-aa32-4b87-8abf-8641c807e3e4\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wjp8w" Apr 16 15:50:44.402502 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:44.402480 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c4178f3-aa32-4b87-8abf-8641c807e3e4-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-wjp8w\" (UID: \"3c4178f3-aa32-4b87-8abf-8641c807e3e4\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wjp8w" Apr 16 15:50:44.560540 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:44.560443 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wjp8w" Apr 16 15:50:44.683761 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:44.683723 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wjp8w"] Apr 16 15:50:44.685787 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:50:44.685753 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c4178f3_aa32_4b87_8abf_8641c807e3e4.slice/crio-429fb260405101cdfca9cdf630afdcafa9c6589aa813f26b9ec581f691979fc2 WatchSource:0}: Error finding container 429fb260405101cdfca9cdf630afdcafa9c6589aa813f26b9ec581f691979fc2: Status 404 returned error can't find the container with id 429fb260405101cdfca9cdf630afdcafa9c6589aa813f26b9ec581f691979fc2 Apr 16 15:50:45.579123 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:45.579085 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wjp8w" event={"ID":"3c4178f3-aa32-4b87-8abf-8641c807e3e4","Type":"ContainerStarted","Data":"8f24710e1d85ab01e2f1503ab3669c4861567bd997f0fb4fd3f56039d94844b8"} Apr 16 15:50:45.579123 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:45.579122 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wjp8w" event={"ID":"3c4178f3-aa32-4b87-8abf-8641c807e3e4","Type":"ContainerStarted","Data":"429fb260405101cdfca9cdf630afdcafa9c6589aa813f26b9ec581f691979fc2"} Apr 16 15:50:48.591681 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:48.591651 2576 generic.go:358] "Generic (PLEG): container finished" podID="3c4178f3-aa32-4b87-8abf-8641c807e3e4" containerID="8f24710e1d85ab01e2f1503ab3669c4861567bd997f0fb4fd3f56039d94844b8" exitCode=0 Apr 16 15:50:48.592071 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:48.591719 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wjp8w" event={"ID":"3c4178f3-aa32-4b87-8abf-8641c807e3e4","Type":"ContainerDied","Data":"8f24710e1d85ab01e2f1503ab3669c4861567bd997f0fb4fd3f56039d94844b8"} Apr 16 15:50:49.597184 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:49.597147 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wjp8w" event={"ID":"3c4178f3-aa32-4b87-8abf-8641c807e3e4","Type":"ContainerStarted","Data":"2dad7fbe2a6f463f905031702305b2e950e6ebec15dd47f3de4c6e25e62bf80f"} Apr 16 15:50:49.597583 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:49.597347 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wjp8w" Apr 16 15:50:49.613280 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:49.613231 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wjp8w" podStartSLOduration=5.61321553 podStartE2EDuration="5.61321553s" podCreationTimestamp="2026-04-16 15:50:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:50:49.611258654 +0000 UTC m=+3484.030393629" watchObservedRunningTime="2026-04-16 15:50:49.61321553 +0000 UTC m=+3484.032350564" Apr 16 15:50:50.450890 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:50.450849 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs" podUID="abbb9426-e596-48d4-8ae2-5d54895036c1" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.52:8080/v2/models/isvc-xgboost-v2-mlserver/ready\": dial tcp 10.134.0.52:8080: connect: connection refused" Apr 16 15:50:51.998013 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:50:51.997974 2576 cadvisor_stats_provider.go:525] "Partial 
failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabbb9426_e596_48d4_8ae2_5d54895036c1.slice/crio-0478b2350082b551f138b60d76ef0e21e8c9ad76571eef2a87f78137c190a84f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabbb9426_e596_48d4_8ae2_5d54895036c1.slice/crio-conmon-0478b2350082b551f138b60d76ef0e21e8c9ad76571eef2a87f78137c190a84f.scope\": RecentStats: unable to find data in memory cache]" Apr 16 15:50:52.110712 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:52.110686 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs" Apr 16 15:50:52.170352 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:52.170304 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abbb9426-e596-48d4-8ae2-5d54895036c1-kserve-provision-location\") pod \"abbb9426-e596-48d4-8ae2-5d54895036c1\" (UID: \"abbb9426-e596-48d4-8ae2-5d54895036c1\") " Apr 16 15:50:52.170644 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:52.170619 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abbb9426-e596-48d4-8ae2-5d54895036c1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "abbb9426-e596-48d4-8ae2-5d54895036c1" (UID: "abbb9426-e596-48d4-8ae2-5d54895036c1"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:50:52.271139 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:52.271051 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abbb9426-e596-48d4-8ae2-5d54895036c1-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:50:52.609828 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:52.609741 2576 generic.go:358] "Generic (PLEG): container finished" podID="abbb9426-e596-48d4-8ae2-5d54895036c1" containerID="0478b2350082b551f138b60d76ef0e21e8c9ad76571eef2a87f78137c190a84f" exitCode=0 Apr 16 15:50:52.609828 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:52.609808 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs" Apr 16 15:50:52.610097 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:52.609827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs" event={"ID":"abbb9426-e596-48d4-8ae2-5d54895036c1","Type":"ContainerDied","Data":"0478b2350082b551f138b60d76ef0e21e8c9ad76571eef2a87f78137c190a84f"} Apr 16 15:50:52.610097 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:52.609869 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs" event={"ID":"abbb9426-e596-48d4-8ae2-5d54895036c1","Type":"ContainerDied","Data":"e14b51ffbe43b5c6d249c98b66ab8f806a66db38c6bc1b902cbd5a7991876dda"} Apr 16 15:50:52.610097 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:52.609891 2576 scope.go:117] "RemoveContainer" containerID="0478b2350082b551f138b60d76ef0e21e8c9ad76571eef2a87f78137c190a84f" Apr 16 15:50:52.618104 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:52.618088 2576 scope.go:117] "RemoveContainer" 
containerID="9ab9450e893ab24e7e6b910518db25ea69b943f36fdec93a71b3d8872c4b96ac" Apr 16 15:50:52.625053 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:52.625012 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs"] Apr 16 15:50:52.626041 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:52.626002 2576 scope.go:117] "RemoveContainer" containerID="0478b2350082b551f138b60d76ef0e21e8c9ad76571eef2a87f78137c190a84f" Apr 16 15:50:52.626789 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:50:52.626730 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0478b2350082b551f138b60d76ef0e21e8c9ad76571eef2a87f78137c190a84f\": container with ID starting with 0478b2350082b551f138b60d76ef0e21e8c9ad76571eef2a87f78137c190a84f not found: ID does not exist" containerID="0478b2350082b551f138b60d76ef0e21e8c9ad76571eef2a87f78137c190a84f" Apr 16 15:50:52.626906 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:52.626768 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0478b2350082b551f138b60d76ef0e21e8c9ad76571eef2a87f78137c190a84f"} err="failed to get container status \"0478b2350082b551f138b60d76ef0e21e8c9ad76571eef2a87f78137c190a84f\": rpc error: code = NotFound desc = could not find container \"0478b2350082b551f138b60d76ef0e21e8c9ad76571eef2a87f78137c190a84f\": container with ID starting with 0478b2350082b551f138b60d76ef0e21e8c9ad76571eef2a87f78137c190a84f not found: ID does not exist" Apr 16 15:50:52.626906 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:52.626885 2576 scope.go:117] "RemoveContainer" containerID="9ab9450e893ab24e7e6b910518db25ea69b943f36fdec93a71b3d8872c4b96ac" Apr 16 15:50:52.627183 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:50:52.627163 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9ab9450e893ab24e7e6b910518db25ea69b943f36fdec93a71b3d8872c4b96ac\": container with ID starting with 9ab9450e893ab24e7e6b910518db25ea69b943f36fdec93a71b3d8872c4b96ac not found: ID does not exist" containerID="9ab9450e893ab24e7e6b910518db25ea69b943f36fdec93a71b3d8872c4b96ac" Apr 16 15:50:52.627242 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:52.627190 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ab9450e893ab24e7e6b910518db25ea69b943f36fdec93a71b3d8872c4b96ac"} err="failed to get container status \"9ab9450e893ab24e7e6b910518db25ea69b943f36fdec93a71b3d8872c4b96ac\": rpc error: code = NotFound desc = could not find container \"9ab9450e893ab24e7e6b910518db25ea69b943f36fdec93a71b3d8872c4b96ac\": container with ID starting with 9ab9450e893ab24e7e6b910518db25ea69b943f36fdec93a71b3d8872c4b96ac not found: ID does not exist" Apr 16 15:50:52.628809 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:52.628791 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-khmcs"] Apr 16 15:50:54.307729 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:50:54.307694 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abbb9426-e596-48d4-8ae2-5d54895036c1" path="/var/lib/kubelet/pods/abbb9426-e596-48d4-8ae2-5d54895036c1/volumes" Apr 16 15:51:20.633654 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:51:20.633571 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wjp8w" Apr 16 15:51:24.459182 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:51:24.459144 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wjp8w"] Apr 16 15:51:24.459638 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:51:24.459494 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wjp8w" podUID="3c4178f3-aa32-4b87-8abf-8641c807e3e4" containerName="kserve-container" containerID="cri-o://2dad7fbe2a6f463f905031702305b2e950e6ebec15dd47f3de4c6e25e62bf80f" gracePeriod=30 Apr 16 15:51:30.601932 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:51:30.601885 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wjp8w" podUID="3c4178f3-aa32-4b87-8abf-8641c807e3e4" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.53:8080/v2/models/xgboost-v2-mlserver/ready\": dial tcp 10.134.0.53:8080: connect: connection refused" Apr 16 15:51:31.755129 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:51:31.755098 2576 generic.go:358] "Generic (PLEG): container finished" podID="3c4178f3-aa32-4b87-8abf-8641c807e3e4" containerID="2dad7fbe2a6f463f905031702305b2e950e6ebec15dd47f3de4c6e25e62bf80f" exitCode=0 Apr 16 15:51:31.755477 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:51:31.755157 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wjp8w" event={"ID":"3c4178f3-aa32-4b87-8abf-8641c807e3e4","Type":"ContainerDied","Data":"2dad7fbe2a6f463f905031702305b2e950e6ebec15dd47f3de4c6e25e62bf80f"} Apr 16 15:51:31.803281 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:51:31.803259 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wjp8w" Apr 16 15:51:31.818218 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:51:31.818064 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c4178f3-aa32-4b87-8abf-8641c807e3e4-kserve-provision-location\") pod \"3c4178f3-aa32-4b87-8abf-8641c807e3e4\" (UID: \"3c4178f3-aa32-4b87-8abf-8641c807e3e4\") " Apr 16 15:51:31.818628 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:51:31.818592 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c4178f3-aa32-4b87-8abf-8641c807e3e4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3c4178f3-aa32-4b87-8abf-8641c807e3e4" (UID: "3c4178f3-aa32-4b87-8abf-8641c807e3e4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:51:31.818858 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:51:31.818836 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c4178f3-aa32-4b87-8abf-8641c807e3e4-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:51:32.760062 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:51:32.760011 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wjp8w" event={"ID":"3c4178f3-aa32-4b87-8abf-8641c807e3e4","Type":"ContainerDied","Data":"429fb260405101cdfca9cdf630afdcafa9c6589aa813f26b9ec581f691979fc2"} Apr 16 15:51:32.760490 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:51:32.760076 2576 scope.go:117] "RemoveContainer" containerID="2dad7fbe2a6f463f905031702305b2e950e6ebec15dd47f3de4c6e25e62bf80f" Apr 16 15:51:32.760490 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:51:32.760101 2576 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wjp8w" Apr 16 15:51:32.768965 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:51:32.768947 2576 scope.go:117] "RemoveContainer" containerID="8f24710e1d85ab01e2f1503ab3669c4861567bd997f0fb4fd3f56039d94844b8" Apr 16 15:51:32.776469 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:51:32.776444 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wjp8w"] Apr 16 15:51:32.782610 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:51:32.782588 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wjp8w"] Apr 16 15:51:34.307891 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:51:34.307855 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c4178f3-aa32-4b87-8abf-8641c807e3e4" path="/var/lib/kubelet/pods/3c4178f3-aa32-4b87-8abf-8641c807e3e4/volumes" Apr 16 15:52:34.710931 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:34.710892 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd"] Apr 16 15:52:34.711411 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:34.711399 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="abbb9426-e596-48d4-8ae2-5d54895036c1" containerName="storage-initializer" Apr 16 15:52:34.711467 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:34.711417 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="abbb9426-e596-48d4-8ae2-5d54895036c1" containerName="storage-initializer" Apr 16 15:52:34.711467 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:34.711439 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="abbb9426-e596-48d4-8ae2-5d54895036c1" containerName="kserve-container" Apr 16 15:52:34.711467 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:34.711445 2576 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="abbb9426-e596-48d4-8ae2-5d54895036c1" containerName="kserve-container" Apr 16 15:52:34.711467 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:34.711459 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c4178f3-aa32-4b87-8abf-8641c807e3e4" containerName="kserve-container" Apr 16 15:52:34.711615 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:34.711468 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4178f3-aa32-4b87-8abf-8641c807e3e4" containerName="kserve-container" Apr 16 15:52:34.711615 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:34.711480 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c4178f3-aa32-4b87-8abf-8641c807e3e4" containerName="storage-initializer" Apr 16 15:52:34.711615 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:34.711486 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4178f3-aa32-4b87-8abf-8641c807e3e4" containerName="storage-initializer" Apr 16 15:52:34.711615 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:34.711549 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="abbb9426-e596-48d4-8ae2-5d54895036c1" containerName="kserve-container" Apr 16 15:52:34.711615 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:34.711559 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3c4178f3-aa32-4b87-8abf-8641c807e3e4" containerName="kserve-container" Apr 16 15:52:34.714835 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:34.714809 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd" Apr 16 15:52:34.717186 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:34.717163 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tsb5d\"" Apr 16 15:52:34.723044 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:34.722685 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd"] Apr 16 15:52:34.783439 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:34.783408 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0cb75eb-e910-4dc8-9a37-fe9a45122e14-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd\" (UID: \"b0cb75eb-e910-4dc8-9a37-fe9a45122e14\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd" Apr 16 15:52:34.884752 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:34.884717 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0cb75eb-e910-4dc8-9a37-fe9a45122e14-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd\" (UID: \"b0cb75eb-e910-4dc8-9a37-fe9a45122e14\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd" Apr 16 15:52:34.885123 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:34.885105 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0cb75eb-e910-4dc8-9a37-fe9a45122e14-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd\" (UID: \"b0cb75eb-e910-4dc8-9a37-fe9a45122e14\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd" Apr 16 15:52:35.027486 
ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:35.027398 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd" Apr 16 15:52:35.153949 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:35.153919 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd"] Apr 16 15:52:35.157313 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:52:35.157286 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0cb75eb_e910_4dc8_9a37_fe9a45122e14.slice/crio-25eff2b6692a62d36ee436373015dac6750a98cc0505065ca804604907cac64f WatchSource:0}: Error finding container 25eff2b6692a62d36ee436373015dac6750a98cc0505065ca804604907cac64f: Status 404 returned error can't find the container with id 25eff2b6692a62d36ee436373015dac6750a98cc0505065ca804604907cac64f Apr 16 15:52:35.975554 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:35.975513 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd" event={"ID":"b0cb75eb-e910-4dc8-9a37-fe9a45122e14","Type":"ContainerStarted","Data":"4b600f7d1dc738a16ac2e82e9ade01d5146c1d3361680e8e30ff999f99270519"} Apr 16 15:52:35.975554 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:35.975551 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd" event={"ID":"b0cb75eb-e910-4dc8-9a37-fe9a45122e14","Type":"ContainerStarted","Data":"25eff2b6692a62d36ee436373015dac6750a98cc0505065ca804604907cac64f"} Apr 16 15:52:38.988520 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:38.988483 2576 generic.go:358] "Generic (PLEG): container finished" podID="b0cb75eb-e910-4dc8-9a37-fe9a45122e14" containerID="4b600f7d1dc738a16ac2e82e9ade01d5146c1d3361680e8e30ff999f99270519" exitCode=0 Apr 16 
15:52:38.988888 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:38.988525 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd" event={"ID":"b0cb75eb-e910-4dc8-9a37-fe9a45122e14","Type":"ContainerDied","Data":"4b600f7d1dc738a16ac2e82e9ade01d5146c1d3361680e8e30ff999f99270519"} Apr 16 15:52:39.993075 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:39.993040 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd" event={"ID":"b0cb75eb-e910-4dc8-9a37-fe9a45122e14","Type":"ContainerStarted","Data":"2907b0045b60dd31f72f96844f6ed3d8075be6bb6845d92a5a97cdf8a011b292"} Apr 16 15:52:39.993469 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:39.993273 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd" Apr 16 15:52:40.010383 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:40.010331 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd" podStartSLOduration=6.010318558 podStartE2EDuration="6.010318558s" podCreationTimestamp="2026-04-16 15:52:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:52:40.008668441 +0000 UTC m=+3594.427803429" watchObservedRunningTime="2026-04-16 15:52:40.010318558 +0000 UTC m=+3594.429453532" Apr 16 15:52:46.499079 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:46.499048 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log" Apr 16 15:52:46.501833 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:46.501813 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log" Apr 16 15:52:46.505573 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:46.505555 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log" Apr 16 15:52:46.507765 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:52:46.507746 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log" Apr 16 15:53:11.033873 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:53:11.033829 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd" podUID="b0cb75eb-e910-4dc8-9a37-fe9a45122e14" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 15:53:21.032398 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:53:21.032342 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd" podUID="b0cb75eb-e910-4dc8-9a37-fe9a45122e14" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 15:53:30.999586 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:53:30.999541 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd" Apr 16 15:53:34.852108 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:53:34.852056 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd"] Apr 16 15:53:34.852673 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:53:34.852395 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd" podUID="b0cb75eb-e910-4dc8-9a37-fe9a45122e14" containerName="kserve-container" containerID="cri-o://2907b0045b60dd31f72f96844f6ed3d8075be6bb6845d92a5a97cdf8a011b292" gracePeriod=30 Apr 16 15:53:40.996998 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:53:40.996953 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd" podUID="b0cb75eb-e910-4dc8-9a37-fe9a45122e14" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.54:8080/v2/models/isvc-xgboost-v2-runtime/ready\": dial tcp 10.134.0.54:8080: connect: connection refused" Apr 16 15:53:42.610118 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:53:42.610088 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd" Apr 16 15:53:42.687776 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:53:42.687695 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0cb75eb-e910-4dc8-9a37-fe9a45122e14-kserve-provision-location\") pod \"b0cb75eb-e910-4dc8-9a37-fe9a45122e14\" (UID: \"b0cb75eb-e910-4dc8-9a37-fe9a45122e14\") " Apr 16 15:53:42.688015 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:53:42.687991 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0cb75eb-e910-4dc8-9a37-fe9a45122e14-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b0cb75eb-e910-4dc8-9a37-fe9a45122e14" (UID: "b0cb75eb-e910-4dc8-9a37-fe9a45122e14"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:53:42.789183 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:53:42.789145 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0cb75eb-e910-4dc8-9a37-fe9a45122e14-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:53:43.228632 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:53:43.228595 2576 generic.go:358] "Generic (PLEG): container finished" podID="b0cb75eb-e910-4dc8-9a37-fe9a45122e14" containerID="2907b0045b60dd31f72f96844f6ed3d8075be6bb6845d92a5a97cdf8a011b292" exitCode=0 Apr 16 15:53:43.228818 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:53:43.228669 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd" Apr 16 15:53:43.228818 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:53:43.228665 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd" event={"ID":"b0cb75eb-e910-4dc8-9a37-fe9a45122e14","Type":"ContainerDied","Data":"2907b0045b60dd31f72f96844f6ed3d8075be6bb6845d92a5a97cdf8a011b292"} Apr 16 15:53:43.228818 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:53:43.228771 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd" event={"ID":"b0cb75eb-e910-4dc8-9a37-fe9a45122e14","Type":"ContainerDied","Data":"25eff2b6692a62d36ee436373015dac6750a98cc0505065ca804604907cac64f"} Apr 16 15:53:43.228818 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:53:43.228787 2576 scope.go:117] "RemoveContainer" containerID="2907b0045b60dd31f72f96844f6ed3d8075be6bb6845d92a5a97cdf8a011b292" Apr 16 15:53:43.237922 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:53:43.237905 2576 scope.go:117] "RemoveContainer" 
containerID="4b600f7d1dc738a16ac2e82e9ade01d5146c1d3361680e8e30ff999f99270519" Apr 16 15:53:43.245404 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:53:43.245383 2576 scope.go:117] "RemoveContainer" containerID="2907b0045b60dd31f72f96844f6ed3d8075be6bb6845d92a5a97cdf8a011b292" Apr 16 15:53:43.245707 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:53:43.245640 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2907b0045b60dd31f72f96844f6ed3d8075be6bb6845d92a5a97cdf8a011b292\": container with ID starting with 2907b0045b60dd31f72f96844f6ed3d8075be6bb6845d92a5a97cdf8a011b292 not found: ID does not exist" containerID="2907b0045b60dd31f72f96844f6ed3d8075be6bb6845d92a5a97cdf8a011b292" Apr 16 15:53:43.245816 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:53:43.245718 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2907b0045b60dd31f72f96844f6ed3d8075be6bb6845d92a5a97cdf8a011b292"} err="failed to get container status \"2907b0045b60dd31f72f96844f6ed3d8075be6bb6845d92a5a97cdf8a011b292\": rpc error: code = NotFound desc = could not find container \"2907b0045b60dd31f72f96844f6ed3d8075be6bb6845d92a5a97cdf8a011b292\": container with ID starting with 2907b0045b60dd31f72f96844f6ed3d8075be6bb6845d92a5a97cdf8a011b292 not found: ID does not exist" Apr 16 15:53:43.245816 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:53:43.245747 2576 scope.go:117] "RemoveContainer" containerID="4b600f7d1dc738a16ac2e82e9ade01d5146c1d3361680e8e30ff999f99270519" Apr 16 15:53:43.246210 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:53:43.246183 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b600f7d1dc738a16ac2e82e9ade01d5146c1d3361680e8e30ff999f99270519\": container with ID starting with 4b600f7d1dc738a16ac2e82e9ade01d5146c1d3361680e8e30ff999f99270519 not found: ID does not exist" 
containerID="4b600f7d1dc738a16ac2e82e9ade01d5146c1d3361680e8e30ff999f99270519" Apr 16 15:53:43.246297 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:53:43.246217 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b600f7d1dc738a16ac2e82e9ade01d5146c1d3361680e8e30ff999f99270519"} err="failed to get container status \"4b600f7d1dc738a16ac2e82e9ade01d5146c1d3361680e8e30ff999f99270519\": rpc error: code = NotFound desc = could not find container \"4b600f7d1dc738a16ac2e82e9ade01d5146c1d3361680e8e30ff999f99270519\": container with ID starting with 4b600f7d1dc738a16ac2e82e9ade01d5146c1d3361680e8e30ff999f99270519 not found: ID does not exist" Apr 16 15:53:43.248587 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:53:43.248567 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd"] Apr 16 15:53:43.250246 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:53:43.250227 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-g9rnd"] Apr 16 15:53:44.307393 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:53:44.307360 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0cb75eb-e910-4dc8-9a37-fe9a45122e14" path="/var/lib/kubelet/pods/b0cb75eb-e910-4dc8-9a37-fe9a45122e14/volumes" Apr 16 15:54:45.078468 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:45.078429 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt"] Apr 16 15:54:45.079468 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:45.079443 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0cb75eb-e910-4dc8-9a37-fe9a45122e14" containerName="storage-initializer" Apr 16 15:54:45.079468 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:45.079467 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b0cb75eb-e910-4dc8-9a37-fe9a45122e14" containerName="storage-initializer" Apr 16 15:54:45.079623 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:45.079480 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0cb75eb-e910-4dc8-9a37-fe9a45122e14" containerName="kserve-container" Apr 16 15:54:45.079623 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:45.079486 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb75eb-e910-4dc8-9a37-fe9a45122e14" containerName="kserve-container" Apr 16 15:54:45.079623 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:45.079548 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0cb75eb-e910-4dc8-9a37-fe9a45122e14" containerName="kserve-container" Apr 16 15:54:45.082699 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:45.082678 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" Apr 16 15:54:45.085003 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:45.084978 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 16 15:54:45.085130 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:45.085111 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tsb5d\"" Apr 16 15:54:45.088348 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:45.088325 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt"] Apr 16 15:54:45.128858 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:45.128825 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-54d746bb87-gk6gt\" (UID: 
\"8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" Apr 16 15:54:45.229631 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:45.229593 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-54d746bb87-gk6gt\" (UID: \"8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" Apr 16 15:54:45.229966 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:45.229944 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-54d746bb87-gk6gt\" (UID: \"8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" Apr 16 15:54:45.395147 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:45.395067 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" Apr 16 15:54:45.516199 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:45.516175 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt"] Apr 16 15:54:45.518694 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:54:45.518652 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ab93baa_06a6_41e9_9fe8_a4eee8bd4c8e.slice/crio-8510f83adf5247afa9daaa66cca322405c7874b55c367289017b7fa9f9d619ac WatchSource:0}: Error finding container 8510f83adf5247afa9daaa66cca322405c7874b55c367289017b7fa9f9d619ac: Status 404 returned error can't find the container with id 8510f83adf5247afa9daaa66cca322405c7874b55c367289017b7fa9f9d619ac Apr 16 15:54:46.448460 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:46.448426 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" event={"ID":"8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e","Type":"ContainerStarted","Data":"e239c18401be49cb81bb4c1ec01a3c8f5c6f9352146230c550b42ef95732cd89"} Apr 16 15:54:46.448460 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:46.448461 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" event={"ID":"8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e","Type":"ContainerStarted","Data":"8510f83adf5247afa9daaa66cca322405c7874b55c367289017b7fa9f9d619ac"} Apr 16 15:54:47.453223 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:47.453188 2576 generic.go:358] "Generic (PLEG): container finished" podID="8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e" containerID="e239c18401be49cb81bb4c1ec01a3c8f5c6f9352146230c550b42ef95732cd89" exitCode=0 Apr 16 15:54:47.453813 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:47.453237 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" event={"ID":"8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e","Type":"ContainerDied","Data":"e239c18401be49cb81bb4c1ec01a3c8f5c6f9352146230c550b42ef95732cd89"} Apr 16 15:54:48.463518 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:48.463479 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" event={"ID":"8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e","Type":"ContainerStarted","Data":"dd9f42808aba39a246f296e14419f1255ae186d3f06f1a2f12be34f3d9187d83"} Apr 16 15:54:48.464005 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:48.463670 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" Apr 16 15:54:48.465070 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:48.465044 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" podUID="8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused" Apr 16 15:54:48.480505 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:48.480462 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" podStartSLOduration=3.480449314 podStartE2EDuration="3.480449314s" podCreationTimestamp="2026-04-16 15:54:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:54:48.478165947 +0000 UTC m=+3722.897300924" watchObservedRunningTime="2026-04-16 15:54:48.480449314 +0000 UTC m=+3722.899584289" Apr 16 15:54:49.467996 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:49.467954 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" 
podUID="8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused" Apr 16 15:54:59.468409 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:54:59.468349 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" podUID="8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused" Apr 16 15:55:09.468974 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:55:09.468926 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" podUID="8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused" Apr 16 15:55:19.468794 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:55:19.468744 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" podUID="8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused" Apr 16 15:55:29.468796 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:55:29.468700 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" podUID="8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused" Apr 16 15:55:39.468623 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:55:39.468583 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" podUID="8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: 
connection refused" Apr 16 15:55:49.468172 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:55:49.468130 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" podUID="8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused" Apr 16 15:55:59.469200 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:55:59.469169 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" Apr 16 15:56:05.191171 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:05.191133 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt"] Apr 16 15:56:05.191747 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:05.191447 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" podUID="8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e" containerName="kserve-container" containerID="cri-o://dd9f42808aba39a246f296e14419f1255ae186d3f06f1a2f12be34f3d9187d83" gracePeriod=30 Apr 16 15:56:05.305567 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:05.305534 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp"] Apr 16 15:56:05.309277 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:05.309252 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" Apr 16 15:56:05.311402 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:05.311379 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 15:56:05.318128 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:05.318103 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp"] Apr 16 15:56:05.421959 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:05.421926 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6363f16d-7773-4dc6-8145-a7fa957178b6-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp\" (UID: \"6363f16d-7773-4dc6-8145-a7fa957178b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" Apr 16 15:56:05.421959 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:05.421962 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6363f16d-7773-4dc6-8145-a7fa957178b6-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp\" (UID: \"6363f16d-7773-4dc6-8145-a7fa957178b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" Apr 16 15:56:05.522640 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:05.522559 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6363f16d-7773-4dc6-8145-a7fa957178b6-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp\" (UID: \"6363f16d-7773-4dc6-8145-a7fa957178b6\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" Apr 16 15:56:05.522640 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:05.522599 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6363f16d-7773-4dc6-8145-a7fa957178b6-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp\" (UID: \"6363f16d-7773-4dc6-8145-a7fa957178b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" Apr 16 15:56:05.523078 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:05.523051 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6363f16d-7773-4dc6-8145-a7fa957178b6-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp\" (UID: \"6363f16d-7773-4dc6-8145-a7fa957178b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" Apr 16 15:56:05.523235 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:05.523219 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6363f16d-7773-4dc6-8145-a7fa957178b6-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp\" (UID: \"6363f16d-7773-4dc6-8145-a7fa957178b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" Apr 16 15:56:05.621271 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:05.621237 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" Apr 16 15:56:05.741465 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:05.741415 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp"] Apr 16 15:56:05.744405 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:56:05.744369 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6363f16d_7773_4dc6_8145_a7fa957178b6.slice/crio-0c44cb4435cd17d449a6f8e30b2365155c790f8d99d4bd20e0c4b7405aa643ab WatchSource:0}: Error finding container 0c44cb4435cd17d449a6f8e30b2365155c790f8d99d4bd20e0c4b7405aa643ab: Status 404 returned error can't find the container with id 0c44cb4435cd17d449a6f8e30b2365155c790f8d99d4bd20e0c4b7405aa643ab Apr 16 15:56:05.746802 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:05.746786 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:56:06.741415 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:06.741378 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" event={"ID":"6363f16d-7773-4dc6-8145-a7fa957178b6","Type":"ContainerStarted","Data":"3ac22602a2a4b98365e6d07ec2ff6ce7efd698471b42af13061ebb7db624de88"} Apr 16 15:56:06.741415 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:06.741415 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" event={"ID":"6363f16d-7773-4dc6-8145-a7fa957178b6","Type":"ContainerStarted","Data":"0c44cb4435cd17d449a6f8e30b2365155c790f8d99d4bd20e0c4b7405aa643ab"} Apr 16 15:56:07.746357 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:07.746309 2576 generic.go:358] "Generic (PLEG): container finished" podID="6363f16d-7773-4dc6-8145-a7fa957178b6" 
containerID="3ac22602a2a4b98365e6d07ec2ff6ce7efd698471b42af13061ebb7db624de88" exitCode=0 Apr 16 15:56:07.746758 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:07.746403 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" event={"ID":"6363f16d-7773-4dc6-8145-a7fa957178b6","Type":"ContainerDied","Data":"3ac22602a2a4b98365e6d07ec2ff6ce7efd698471b42af13061ebb7db624de88"} Apr 16 15:56:08.751440 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:08.751406 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" event={"ID":"6363f16d-7773-4dc6-8145-a7fa957178b6","Type":"ContainerStarted","Data":"984ba92872c7dde5527ae1bb1535e29b69a817e68621e6318e5f854dc3e538e9"} Apr 16 15:56:08.751859 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:08.751643 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" Apr 16 15:56:08.752943 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:08.752907 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" podUID="6363f16d-7773-4dc6-8145-a7fa957178b6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 16 15:56:08.766815 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:08.766690 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" podStartSLOduration=3.766674699 podStartE2EDuration="3.766674699s" podCreationTimestamp="2026-04-16 15:56:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:56:08.766016439 +0000 UTC 
m=+3803.185151415" watchObservedRunningTime="2026-04-16 15:56:08.766674699 +0000 UTC m=+3803.185809675" Apr 16 15:56:09.468489 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:09.468446 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" podUID="8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused" Apr 16 15:56:09.736605 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:09.736576 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" Apr 16 15:56:09.756726 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:09.756681 2576 generic.go:358] "Generic (PLEG): container finished" podID="8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e" containerID="dd9f42808aba39a246f296e14419f1255ae186d3f06f1a2f12be34f3d9187d83" exitCode=0 Apr 16 15:56:09.757220 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:09.756758 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" Apr 16 15:56:09.757220 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:09.756757 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" event={"ID":"8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e","Type":"ContainerDied","Data":"dd9f42808aba39a246f296e14419f1255ae186d3f06f1a2f12be34f3d9187d83"} Apr 16 15:56:09.757220 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:09.756840 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt" event={"ID":"8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e","Type":"ContainerDied","Data":"8510f83adf5247afa9daaa66cca322405c7874b55c367289017b7fa9f9d619ac"} Apr 16 15:56:09.757220 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:09.756869 2576 scope.go:117] "RemoveContainer" containerID="dd9f42808aba39a246f296e14419f1255ae186d3f06f1a2f12be34f3d9187d83" Apr 16 15:56:09.757494 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:09.757276 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" podUID="6363f16d-7773-4dc6-8145-a7fa957178b6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 16 15:56:09.766491 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:09.766475 2576 scope.go:117] "RemoveContainer" containerID="e239c18401be49cb81bb4c1ec01a3c8f5c6f9352146230c550b42ef95732cd89" Apr 16 15:56:09.774597 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:09.774577 2576 scope.go:117] "RemoveContainer" containerID="dd9f42808aba39a246f296e14419f1255ae186d3f06f1a2f12be34f3d9187d83" Apr 16 15:56:09.774869 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:56:09.774847 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"dd9f42808aba39a246f296e14419f1255ae186d3f06f1a2f12be34f3d9187d83\": container with ID starting with dd9f42808aba39a246f296e14419f1255ae186d3f06f1a2f12be34f3d9187d83 not found: ID does not exist" containerID="dd9f42808aba39a246f296e14419f1255ae186d3f06f1a2f12be34f3d9187d83" Apr 16 15:56:09.775000 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:09.774875 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd9f42808aba39a246f296e14419f1255ae186d3f06f1a2f12be34f3d9187d83"} err="failed to get container status \"dd9f42808aba39a246f296e14419f1255ae186d3f06f1a2f12be34f3d9187d83\": rpc error: code = NotFound desc = could not find container \"dd9f42808aba39a246f296e14419f1255ae186d3f06f1a2f12be34f3d9187d83\": container with ID starting with dd9f42808aba39a246f296e14419f1255ae186d3f06f1a2f12be34f3d9187d83 not found: ID does not exist" Apr 16 15:56:09.775000 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:09.774895 2576 scope.go:117] "RemoveContainer" containerID="e239c18401be49cb81bb4c1ec01a3c8f5c6f9352146230c550b42ef95732cd89" Apr 16 15:56:09.775166 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:56:09.775147 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e239c18401be49cb81bb4c1ec01a3c8f5c6f9352146230c550b42ef95732cd89\": container with ID starting with e239c18401be49cb81bb4c1ec01a3c8f5c6f9352146230c550b42ef95732cd89 not found: ID does not exist" containerID="e239c18401be49cb81bb4c1ec01a3c8f5c6f9352146230c550b42ef95732cd89" Apr 16 15:56:09.775231 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:09.775171 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e239c18401be49cb81bb4c1ec01a3c8f5c6f9352146230c550b42ef95732cd89"} err="failed to get container status \"e239c18401be49cb81bb4c1ec01a3c8f5c6f9352146230c550b42ef95732cd89\": rpc error: code = NotFound desc = could not find container 
\"e239c18401be49cb81bb4c1ec01a3c8f5c6f9352146230c550b42ef95732cd89\": container with ID starting with e239c18401be49cb81bb4c1ec01a3c8f5c6f9352146230c550b42ef95732cd89 not found: ID does not exist" Apr 16 15:56:09.864865 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:09.864823 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e-kserve-provision-location\") pod \"8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e\" (UID: \"8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e\") " Apr 16 15:56:09.865233 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:09.865206 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e" (UID: "8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:56:09.966293 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:09.966248 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:56:10.078419 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:10.078387 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt"] Apr 16 15:56:10.081477 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:10.081452 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-54d746bb87-gk6gt"] Apr 16 15:56:10.307783 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:10.307740 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e" path="/var/lib/kubelet/pods/8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e/volumes" Apr 16 15:56:19.758182 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:19.758131 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" podUID="6363f16d-7773-4dc6-8145-a7fa957178b6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 16 15:56:29.757616 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:29.757562 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" podUID="6363f16d-7773-4dc6-8145-a7fa957178b6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 16 15:56:39.757824 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:39.757775 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" podUID="6363f16d-7773-4dc6-8145-a7fa957178b6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 16 15:56:49.758044 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:49.757973 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" podUID="6363f16d-7773-4dc6-8145-a7fa957178b6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 16 15:56:59.757711 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:56:59.757606 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" podUID="6363f16d-7773-4dc6-8145-a7fa957178b6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 16 15:57:09.758358 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:09.758303 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" podUID="6363f16d-7773-4dc6-8145-a7fa957178b6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 16 15:57:15.304852 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:15.304816 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" Apr 16 15:57:25.354774 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:25.354737 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp"] Apr 16 15:57:25.355266 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:25.355128 2576 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" podUID="6363f16d-7773-4dc6-8145-a7fa957178b6" containerName="kserve-container" containerID="cri-o://984ba92872c7dde5527ae1bb1535e29b69a817e68621e6318e5f854dc3e538e9" gracePeriod=30 Apr 16 15:57:26.411807 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:26.411771 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j"] Apr 16 15:57:26.412275 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:26.412223 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e" containerName="storage-initializer" Apr 16 15:57:26.412275 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:26.412241 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e" containerName="storage-initializer" Apr 16 15:57:26.412275 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:26.412268 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e" containerName="kserve-container" Apr 16 15:57:26.412464 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:26.412278 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e" containerName="kserve-container" Apr 16 15:57:26.412464 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:26.412406 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ab93baa-06a6-41e9-9fe8-a4eee8bd4c8e" containerName="kserve-container" Apr 16 15:57:26.415635 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:26.415616 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j" Apr 16 15:57:26.426531 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:26.426505 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j"] Apr 16 15:57:26.542416 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:26.542372 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bdd3b94f-e6c9-4b65-badc-258007abdc25-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j\" (UID: \"bdd3b94f-e6c9-4b65-badc-258007abdc25\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j" Apr 16 15:57:26.643895 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:26.643854 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bdd3b94f-e6c9-4b65-badc-258007abdc25-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j\" (UID: \"bdd3b94f-e6c9-4b65-badc-258007abdc25\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j" Apr 16 15:57:26.644271 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:26.644248 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bdd3b94f-e6c9-4b65-badc-258007abdc25-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j\" (UID: \"bdd3b94f-e6c9-4b65-badc-258007abdc25\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j" Apr 16 15:57:26.727683 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:26.727586 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j" Apr 16 15:57:26.850466 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:26.850432 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j"] Apr 16 15:57:26.853827 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:57:26.853796 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdd3b94f_e6c9_4b65_badc_258007abdc25.slice/crio-547c883bdc5ada4621cdee9b61eb2e660c2072d48390de56a4322b6b69cfc840 WatchSource:0}: Error finding container 547c883bdc5ada4621cdee9b61eb2e660c2072d48390de56a4322b6b69cfc840: Status 404 returned error can't find the container with id 547c883bdc5ada4621cdee9b61eb2e660c2072d48390de56a4322b6b69cfc840 Apr 16 15:57:27.027982 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:27.027884 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j" event={"ID":"bdd3b94f-e6c9-4b65-badc-258007abdc25","Type":"ContainerStarted","Data":"f1d20f6b2220a2664912c0394cd5c1859170daa1ff8c188d65c4f267b3641342"} Apr 16 15:57:27.027982 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:27.027934 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j" event={"ID":"bdd3b94f-e6c9-4b65-badc-258007abdc25","Type":"ContainerStarted","Data":"547c883bdc5ada4621cdee9b61eb2e660c2072d48390de56a4322b6b69cfc840"} Apr 16 15:57:29.792857 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:29.792830 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" Apr 16 15:57:29.976564 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:29.976522 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6363f16d-7773-4dc6-8145-a7fa957178b6-cabundle-cert\") pod \"6363f16d-7773-4dc6-8145-a7fa957178b6\" (UID: \"6363f16d-7773-4dc6-8145-a7fa957178b6\") " Apr 16 15:57:29.976747 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:29.976602 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6363f16d-7773-4dc6-8145-a7fa957178b6-kserve-provision-location\") pod \"6363f16d-7773-4dc6-8145-a7fa957178b6\" (UID: \"6363f16d-7773-4dc6-8145-a7fa957178b6\") " Apr 16 15:57:29.976920 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:29.976898 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6363f16d-7773-4dc6-8145-a7fa957178b6-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "6363f16d-7773-4dc6-8145-a7fa957178b6" (UID: "6363f16d-7773-4dc6-8145-a7fa957178b6"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:57:29.976961 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:29.976937 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6363f16d-7773-4dc6-8145-a7fa957178b6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6363f16d-7773-4dc6-8145-a7fa957178b6" (UID: "6363f16d-7773-4dc6-8145-a7fa957178b6"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:57:30.041688 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:30.041651 2576 generic.go:358] "Generic (PLEG): container finished" podID="6363f16d-7773-4dc6-8145-a7fa957178b6" containerID="984ba92872c7dde5527ae1bb1535e29b69a817e68621e6318e5f854dc3e538e9" exitCode=0 Apr 16 15:57:30.041852 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:30.041712 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" Apr 16 15:57:30.041852 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:30.041730 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" event={"ID":"6363f16d-7773-4dc6-8145-a7fa957178b6","Type":"ContainerDied","Data":"984ba92872c7dde5527ae1bb1535e29b69a817e68621e6318e5f854dc3e538e9"} Apr 16 15:57:30.041852 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:30.041767 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp" event={"ID":"6363f16d-7773-4dc6-8145-a7fa957178b6","Type":"ContainerDied","Data":"0c44cb4435cd17d449a6f8e30b2365155c790f8d99d4bd20e0c4b7405aa643ab"} Apr 16 15:57:30.041852 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:30.041782 2576 scope.go:117] "RemoveContainer" containerID="984ba92872c7dde5527ae1bb1535e29b69a817e68621e6318e5f854dc3e538e9" Apr 16 15:57:30.050644 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:30.050629 2576 scope.go:117] "RemoveContainer" containerID="3ac22602a2a4b98365e6d07ec2ff6ce7efd698471b42af13061ebb7db624de88" Apr 16 15:57:30.058571 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:30.058554 2576 scope.go:117] "RemoveContainer" containerID="984ba92872c7dde5527ae1bb1535e29b69a817e68621e6318e5f854dc3e538e9" Apr 16 15:57:30.058797 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:57:30.058779 2576 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"984ba92872c7dde5527ae1bb1535e29b69a817e68621e6318e5f854dc3e538e9\": container with ID starting with 984ba92872c7dde5527ae1bb1535e29b69a817e68621e6318e5f854dc3e538e9 not found: ID does not exist" containerID="984ba92872c7dde5527ae1bb1535e29b69a817e68621e6318e5f854dc3e538e9" Apr 16 15:57:30.058840 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:30.058803 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"984ba92872c7dde5527ae1bb1535e29b69a817e68621e6318e5f854dc3e538e9"} err="failed to get container status \"984ba92872c7dde5527ae1bb1535e29b69a817e68621e6318e5f854dc3e538e9\": rpc error: code = NotFound desc = could not find container \"984ba92872c7dde5527ae1bb1535e29b69a817e68621e6318e5f854dc3e538e9\": container with ID starting with 984ba92872c7dde5527ae1bb1535e29b69a817e68621e6318e5f854dc3e538e9 not found: ID does not exist" Apr 16 15:57:30.058840 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:30.058819 2576 scope.go:117] "RemoveContainer" containerID="3ac22602a2a4b98365e6d07ec2ff6ce7efd698471b42af13061ebb7db624de88" Apr 16 15:57:30.059055 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:57:30.059014 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ac22602a2a4b98365e6d07ec2ff6ce7efd698471b42af13061ebb7db624de88\": container with ID starting with 3ac22602a2a4b98365e6d07ec2ff6ce7efd698471b42af13061ebb7db624de88 not found: ID does not exist" containerID="3ac22602a2a4b98365e6d07ec2ff6ce7efd698471b42af13061ebb7db624de88" Apr 16 15:57:30.059111 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:30.059062 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ac22602a2a4b98365e6d07ec2ff6ce7efd698471b42af13061ebb7db624de88"} err="failed to get container status 
\"3ac22602a2a4b98365e6d07ec2ff6ce7efd698471b42af13061ebb7db624de88\": rpc error: code = NotFound desc = could not find container \"3ac22602a2a4b98365e6d07ec2ff6ce7efd698471b42af13061ebb7db624de88\": container with ID starting with 3ac22602a2a4b98365e6d07ec2ff6ce7efd698471b42af13061ebb7db624de88 not found: ID does not exist" Apr 16 15:57:30.061954 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:30.061932 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp"] Apr 16 15:57:30.065385 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:30.065360 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7dcd76cc75-hsrgp"] Apr 16 15:57:30.077901 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:30.077880 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6363f16d-7773-4dc6-8145-a7fa957178b6-cabundle-cert\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:57:30.077987 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:30.077903 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6363f16d-7773-4dc6-8145-a7fa957178b6-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:57:30.308254 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:30.308160 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6363f16d-7773-4dc6-8145-a7fa957178b6" path="/var/lib/kubelet/pods/6363f16d-7773-4dc6-8145-a7fa957178b6/volumes" Apr 16 15:57:31.047149 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:31.047124 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j_bdd3b94f-e6c9-4b65-badc-258007abdc25/storage-initializer/0.log" Apr 16 15:57:31.047499 
ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:31.047162 2576 generic.go:358] "Generic (PLEG): container finished" podID="bdd3b94f-e6c9-4b65-badc-258007abdc25" containerID="f1d20f6b2220a2664912c0394cd5c1859170daa1ff8c188d65c4f267b3641342" exitCode=1 Apr 16 15:57:31.047499 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:31.047230 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j" event={"ID":"bdd3b94f-e6c9-4b65-badc-258007abdc25","Type":"ContainerDied","Data":"f1d20f6b2220a2664912c0394cd5c1859170daa1ff8c188d65c4f267b3641342"} Apr 16 15:57:32.051783 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:32.051757 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j_bdd3b94f-e6c9-4b65-badc-258007abdc25/storage-initializer/0.log" Apr 16 15:57:32.052166 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:32.051806 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j" event={"ID":"bdd3b94f-e6c9-4b65-badc-258007abdc25","Type":"ContainerStarted","Data":"8e31af3688630b63f20aaa671196e53b3e3c36a9fba6b155dd87e63f599bfcd4"} Apr 16 15:57:35.064563 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:35.064536 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j_bdd3b94f-e6c9-4b65-badc-258007abdc25/storage-initializer/1.log" Apr 16 15:57:35.064968 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:35.064866 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j_bdd3b94f-e6c9-4b65-badc-258007abdc25/storage-initializer/0.log" Apr 16 15:57:35.064968 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:35.064902 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="bdd3b94f-e6c9-4b65-badc-258007abdc25" containerID="8e31af3688630b63f20aaa671196e53b3e3c36a9fba6b155dd87e63f599bfcd4" exitCode=1 Apr 16 15:57:35.065059 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:35.064974 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j" event={"ID":"bdd3b94f-e6c9-4b65-badc-258007abdc25","Type":"ContainerDied","Data":"8e31af3688630b63f20aaa671196e53b3e3c36a9fba6b155dd87e63f599bfcd4"} Apr 16 15:57:35.065059 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:35.065011 2576 scope.go:117] "RemoveContainer" containerID="f1d20f6b2220a2664912c0394cd5c1859170daa1ff8c188d65c4f267b3641342" Apr 16 15:57:35.065409 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:35.065385 2576 scope.go:117] "RemoveContainer" containerID="f1d20f6b2220a2664912c0394cd5c1859170daa1ff8c188d65c4f267b3641342" Apr 16 15:57:35.081190 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:57:35.081148 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j_kserve-ci-e2e-test_bdd3b94f-e6c9-4b65-badc-258007abdc25_0 in pod sandbox 547c883bdc5ada4621cdee9b61eb2e660c2072d48390de56a4322b6b69cfc840 from index: no such id: 'f1d20f6b2220a2664912c0394cd5c1859170daa1ff8c188d65c4f267b3641342'" containerID="f1d20f6b2220a2664912c0394cd5c1859170daa1ff8c188d65c4f267b3641342" Apr 16 15:57:35.081336 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:57:35.081211 2576 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j_kserve-ci-e2e-test_bdd3b94f-e6c9-4b65-badc-258007abdc25_0 in pod sandbox 547c883bdc5ada4621cdee9b61eb2e660c2072d48390de56a4322b6b69cfc840 from index: no such 
id: 'f1d20f6b2220a2664912c0394cd5c1859170daa1ff8c188d65c4f267b3641342'; Skipping pod \"isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j_kserve-ci-e2e-test(bdd3b94f-e6c9-4b65-badc-258007abdc25)\"" logger="UnhandledError" Apr 16 15:57:35.082627 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:57:35.082606 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j_kserve-ci-e2e-test(bdd3b94f-e6c9-4b65-badc-258007abdc25)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j" podUID="bdd3b94f-e6c9-4b65-badc-258007abdc25" Apr 16 15:57:36.070373 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:36.070344 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j_bdd3b94f-e6c9-4b65-badc-258007abdc25/storage-initializer/1.log" Apr 16 15:57:36.430202 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:36.430171 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j"] Apr 16 15:57:36.553696 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:36.553672 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j_bdd3b94f-e6c9-4b65-badc-258007abdc25/storage-initializer/1.log" Apr 16 15:57:36.553809 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:36.553732 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j" Apr 16 15:57:36.631778 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:36.631739 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bdd3b94f-e6c9-4b65-badc-258007abdc25-kserve-provision-location\") pod \"bdd3b94f-e6c9-4b65-badc-258007abdc25\" (UID: \"bdd3b94f-e6c9-4b65-badc-258007abdc25\") " Apr 16 15:57:36.632060 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:36.632010 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdd3b94f-e6c9-4b65-badc-258007abdc25-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bdd3b94f-e6c9-4b65-badc-258007abdc25" (UID: "bdd3b94f-e6c9-4b65-badc-258007abdc25"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:57:36.733106 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:36.733005 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bdd3b94f-e6c9-4b65-badc-258007abdc25-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:57:37.075437 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.075348 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j_bdd3b94f-e6c9-4b65-badc-258007abdc25/storage-initializer/1.log" Apr 16 15:57:37.075437 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.075418 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j" event={"ID":"bdd3b94f-e6c9-4b65-badc-258007abdc25","Type":"ContainerDied","Data":"547c883bdc5ada4621cdee9b61eb2e660c2072d48390de56a4322b6b69cfc840"} Apr 16 
15:57:37.075971 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.075459 2576 scope.go:117] "RemoveContainer" containerID="8e31af3688630b63f20aaa671196e53b3e3c36a9fba6b155dd87e63f599bfcd4" Apr 16 15:57:37.075971 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.075474 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j" Apr 16 15:57:37.107609 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.107579 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j"] Apr 16 15:57:37.113285 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.113257 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-76467cdc57-xcm9j"] Apr 16 15:57:37.478258 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.478225 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7"] Apr 16 15:57:37.478767 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.478752 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bdd3b94f-e6c9-4b65-badc-258007abdc25" containerName="storage-initializer" Apr 16 15:57:37.478812 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.478771 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdd3b94f-e6c9-4b65-badc-258007abdc25" containerName="storage-initializer" Apr 16 15:57:37.478812 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.478792 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6363f16d-7773-4dc6-8145-a7fa957178b6" containerName="storage-initializer" Apr 16 15:57:37.478812 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.478801 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6363f16d-7773-4dc6-8145-a7fa957178b6" 
containerName="storage-initializer" Apr 16 15:57:37.478903 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.478815 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6363f16d-7773-4dc6-8145-a7fa957178b6" containerName="kserve-container" Apr 16 15:57:37.478903 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.478825 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6363f16d-7773-4dc6-8145-a7fa957178b6" containerName="kserve-container" Apr 16 15:57:37.478903 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.478852 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bdd3b94f-e6c9-4b65-badc-258007abdc25" containerName="storage-initializer" Apr 16 15:57:37.478903 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.478860 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdd3b94f-e6c9-4b65-badc-258007abdc25" containerName="storage-initializer" Apr 16 15:57:37.479038 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.478945 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6363f16d-7773-4dc6-8145-a7fa957178b6" containerName="kserve-container" Apr 16 15:57:37.479038 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.478957 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="bdd3b94f-e6c9-4b65-badc-258007abdc25" containerName="storage-initializer" Apr 16 15:57:37.479159 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.479148 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="bdd3b94f-e6c9-4b65-badc-258007abdc25" containerName="storage-initializer" Apr 16 15:57:37.483765 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.483744 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" Apr 16 15:57:37.486653 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.486629 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tsb5d\"" Apr 16 15:57:37.486795 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.486656 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 16 15:57:37.486795 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.486665 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 15:57:37.499316 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.499291 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7"] Apr 16 15:57:37.541894 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.541858 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f94a7c3a-e337-419d-b050-295ef17831eb-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7\" (UID: \"f94a7c3a-e337-419d-b050-295ef17831eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" Apr 16 15:57:37.542106 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.541906 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f94a7c3a-e337-419d-b050-295ef17831eb-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7\" (UID: \"f94a7c3a-e337-419d-b050-295ef17831eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" Apr 16 15:57:37.643486 
ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.643443 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f94a7c3a-e337-419d-b050-295ef17831eb-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7\" (UID: \"f94a7c3a-e337-419d-b050-295ef17831eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" Apr 16 15:57:37.643486 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.643491 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f94a7c3a-e337-419d-b050-295ef17831eb-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7\" (UID: \"f94a7c3a-e337-419d-b050-295ef17831eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" Apr 16 15:57:37.643958 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.643933 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f94a7c3a-e337-419d-b050-295ef17831eb-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7\" (UID: \"f94a7c3a-e337-419d-b050-295ef17831eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" Apr 16 15:57:37.644366 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.644345 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f94a7c3a-e337-419d-b050-295ef17831eb-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7\" (UID: \"f94a7c3a-e337-419d-b050-295ef17831eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" Apr 16 15:57:37.795789 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.795696 2576 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" Apr 16 15:57:37.921705 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:37.921677 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7"] Apr 16 15:57:37.924445 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:57:37.924403 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf94a7c3a_e337_419d_b050_295ef17831eb.slice/crio-8b700c3b61bbb7820381888ccf9100edbbbdd34272ca5927c61c0398da7e7919 WatchSource:0}: Error finding container 8b700c3b61bbb7820381888ccf9100edbbbdd34272ca5927c61c0398da7e7919: Status 404 returned error can't find the container with id 8b700c3b61bbb7820381888ccf9100edbbbdd34272ca5927c61c0398da7e7919 Apr 16 15:57:38.085050 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:38.084945 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" event={"ID":"f94a7c3a-e337-419d-b050-295ef17831eb","Type":"ContainerStarted","Data":"748f136517a906d039ec8c88619e465c4c6b779db616c0b888ca01ae227968a8"} Apr 16 15:57:38.085050 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:38.084978 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" event={"ID":"f94a7c3a-e337-419d-b050-295ef17831eb","Type":"ContainerStarted","Data":"8b700c3b61bbb7820381888ccf9100edbbbdd34272ca5927c61c0398da7e7919"} Apr 16 15:57:38.308372 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:38.308340 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdd3b94f-e6c9-4b65-badc-258007abdc25" path="/var/lib/kubelet/pods/bdd3b94f-e6c9-4b65-badc-258007abdc25/volumes" Apr 16 15:57:39.090192 ip-10-0-129-76 
kubenswrapper[2576]: I0416 15:57:39.090156 2576 generic.go:358] "Generic (PLEG): container finished" podID="f94a7c3a-e337-419d-b050-295ef17831eb" containerID="748f136517a906d039ec8c88619e465c4c6b779db616c0b888ca01ae227968a8" exitCode=0 Apr 16 15:57:39.090576 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:39.090225 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" event={"ID":"f94a7c3a-e337-419d-b050-295ef17831eb","Type":"ContainerDied","Data":"748f136517a906d039ec8c88619e465c4c6b779db616c0b888ca01ae227968a8"} Apr 16 15:57:40.101838 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:40.101805 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" event={"ID":"f94a7c3a-e337-419d-b050-295ef17831eb","Type":"ContainerStarted","Data":"aca291303b7029bc3e4373dba38540127ff1becdbcc09392168c5fd91f2a40a7"} Apr 16 15:57:40.102298 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:40.102001 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" Apr 16 15:57:40.103485 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:40.103460 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" podUID="f94a7c3a-e337-419d-b050-295ef17831eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 16 15:57:40.119865 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:40.119817 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" podStartSLOduration=3.119801627 podStartE2EDuration="3.119801627s" podCreationTimestamp="2026-04-16 15:57:37 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:57:40.117509694 +0000 UTC m=+3894.536644680" watchObservedRunningTime="2026-04-16 15:57:40.119801627 +0000 UTC m=+3894.538936602" Apr 16 15:57:41.105710 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:41.105671 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" podUID="f94a7c3a-e337-419d-b050-295ef17831eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 16 15:57:46.532626 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:46.532597 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log" Apr 16 15:57:46.535284 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:46.535258 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log" Apr 16 15:57:46.537153 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:46.537133 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log" Apr 16 15:57:46.539463 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:46.539445 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log" Apr 16 15:57:51.105934 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:57:51.105889 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" podUID="f94a7c3a-e337-419d-b050-295ef17831eb" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.58:8080: connect: connection refused" Apr 16 15:58:01.106495 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:58:01.106445 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" podUID="f94a7c3a-e337-419d-b050-295ef17831eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 16 15:58:11.105799 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:58:11.105756 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" podUID="f94a7c3a-e337-419d-b050-295ef17831eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 16 15:58:21.106178 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:58:21.106133 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" podUID="f94a7c3a-e337-419d-b050-295ef17831eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 16 15:58:31.106432 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:58:31.106331 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" podUID="f94a7c3a-e337-419d-b050-295ef17831eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 16 15:58:41.105986 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:58:41.105942 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" podUID="f94a7c3a-e337-419d-b050-295ef17831eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 16 
15:58:51.106804 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:58:51.106766 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" Apr 16 15:58:57.545528 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:58:57.545493 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7"] Apr 16 15:58:57.546125 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:58:57.545768 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" podUID="f94a7c3a-e337-419d-b050-295ef17831eb" containerName="kserve-container" containerID="cri-o://aca291303b7029bc3e4373dba38540127ff1becdbcc09392168c5fd91f2a40a7" gracePeriod=30 Apr 16 15:58:58.591266 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:58:58.591222 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv"] Apr 16 15:58:58.595140 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:58:58.595117 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv" Apr 16 15:58:58.605368 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:58:58.605342 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv"] Apr 16 15:58:58.771214 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:58:58.771179 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d41694db-5315-4f07-b194-358b4d0530f8-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv\" (UID: \"d41694db-5315-4f07-b194-358b4d0530f8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv" Apr 16 15:58:58.871801 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:58:58.871695 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d41694db-5315-4f07-b194-358b4d0530f8-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv\" (UID: \"d41694db-5315-4f07-b194-358b4d0530f8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv" Apr 16 15:58:58.872193 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:58:58.872168 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d41694db-5315-4f07-b194-358b4d0530f8-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv\" (UID: \"d41694db-5315-4f07-b194-358b4d0530f8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv" Apr 16 15:58:58.906266 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:58:58.906228 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv" Apr 16 15:58:59.031895 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:58:59.031801 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv"] Apr 16 15:58:59.034747 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:58:59.034716 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd41694db_5315_4f07_b194_358b4d0530f8.slice/crio-2743d14cd8378a470256fb5ac36dcb21a5297974d27f181cc2a7f08ecacbd2ec WatchSource:0}: Error finding container 2743d14cd8378a470256fb5ac36dcb21a5297974d27f181cc2a7f08ecacbd2ec: Status 404 returned error can't find the container with id 2743d14cd8378a470256fb5ac36dcb21a5297974d27f181cc2a7f08ecacbd2ec Apr 16 15:58:59.381774 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:58:59.381737 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv" event={"ID":"d41694db-5315-4f07-b194-358b4d0530f8","Type":"ContainerStarted","Data":"5ec023cf9b12d762964aae1fac1d53edce6547029f6381f20584921f0d763a06"} Apr 16 15:58:59.381774 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:58:59.381775 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv" event={"ID":"d41694db-5315-4f07-b194-358b4d0530f8","Type":"ContainerStarted","Data":"2743d14cd8378a470256fb5ac36dcb21a5297974d27f181cc2a7f08ecacbd2ec"} Apr 16 15:59:01.106033 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:01.105979 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" podUID="f94a7c3a-e337-419d-b050-295ef17831eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: 
connection refused" Apr 16 15:59:02.001319 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:02.001294 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" Apr 16 15:59:02.201174 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:02.201140 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f94a7c3a-e337-419d-b050-295ef17831eb-kserve-provision-location\") pod \"f94a7c3a-e337-419d-b050-295ef17831eb\" (UID: \"f94a7c3a-e337-419d-b050-295ef17831eb\") " Apr 16 15:59:02.201174 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:02.201191 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f94a7c3a-e337-419d-b050-295ef17831eb-cabundle-cert\") pod \"f94a7c3a-e337-419d-b050-295ef17831eb\" (UID: \"f94a7c3a-e337-419d-b050-295ef17831eb\") " Apr 16 15:59:02.201645 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:02.201392 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f94a7c3a-e337-419d-b050-295ef17831eb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f94a7c3a-e337-419d-b050-295ef17831eb" (UID: "f94a7c3a-e337-419d-b050-295ef17831eb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:59:02.201645 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:02.201528 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f94a7c3a-e337-419d-b050-295ef17831eb-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "f94a7c3a-e337-419d-b050-295ef17831eb" (UID: "f94a7c3a-e337-419d-b050-295ef17831eb"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:59:02.301777 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:02.301744 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f94a7c3a-e337-419d-b050-295ef17831eb-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:59:02.301777 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:02.301773 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f94a7c3a-e337-419d-b050-295ef17831eb-cabundle-cert\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:59:02.393601 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:02.393561 2576 generic.go:358] "Generic (PLEG): container finished" podID="f94a7c3a-e337-419d-b050-295ef17831eb" containerID="aca291303b7029bc3e4373dba38540127ff1becdbcc09392168c5fd91f2a40a7" exitCode=0 Apr 16 15:59:02.393767 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:02.393638 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" event={"ID":"f94a7c3a-e337-419d-b050-295ef17831eb","Type":"ContainerDied","Data":"aca291303b7029bc3e4373dba38540127ff1becdbcc09392168c5fd91f2a40a7"} Apr 16 15:59:02.393767 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:02.393669 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" Apr 16 15:59:02.393767 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:02.393681 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7" event={"ID":"f94a7c3a-e337-419d-b050-295ef17831eb","Type":"ContainerDied","Data":"8b700c3b61bbb7820381888ccf9100edbbbdd34272ca5927c61c0398da7e7919"} Apr 16 15:59:02.393767 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:02.393697 2576 scope.go:117] "RemoveContainer" containerID="aca291303b7029bc3e4373dba38540127ff1becdbcc09392168c5fd91f2a40a7" Apr 16 15:59:02.402521 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:02.402499 2576 scope.go:117] "RemoveContainer" containerID="748f136517a906d039ec8c88619e465c4c6b779db616c0b888ca01ae227968a8" Apr 16 15:59:02.412558 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:02.409564 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7"] Apr 16 15:59:02.414679 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:02.414650 2576 scope.go:117] "RemoveContainer" containerID="aca291303b7029bc3e4373dba38540127ff1becdbcc09392168c5fd91f2a40a7" Apr 16 15:59:02.414749 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:02.414731 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-65cf754f99-lp4l7"] Apr 16 15:59:02.414999 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:59:02.414977 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aca291303b7029bc3e4373dba38540127ff1becdbcc09392168c5fd91f2a40a7\": container with ID starting with aca291303b7029bc3e4373dba38540127ff1becdbcc09392168c5fd91f2a40a7 not found: ID does not exist" containerID="aca291303b7029bc3e4373dba38540127ff1becdbcc09392168c5fd91f2a40a7" Apr 
16 15:59:02.415097 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:02.415012 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aca291303b7029bc3e4373dba38540127ff1becdbcc09392168c5fd91f2a40a7"} err="failed to get container status \"aca291303b7029bc3e4373dba38540127ff1becdbcc09392168c5fd91f2a40a7\": rpc error: code = NotFound desc = could not find container \"aca291303b7029bc3e4373dba38540127ff1becdbcc09392168c5fd91f2a40a7\": container with ID starting with aca291303b7029bc3e4373dba38540127ff1becdbcc09392168c5fd91f2a40a7 not found: ID does not exist" Apr 16 15:59:02.415097 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:02.415055 2576 scope.go:117] "RemoveContainer" containerID="748f136517a906d039ec8c88619e465c4c6b779db616c0b888ca01ae227968a8" Apr 16 15:59:02.415335 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:59:02.415319 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"748f136517a906d039ec8c88619e465c4c6b779db616c0b888ca01ae227968a8\": container with ID starting with 748f136517a906d039ec8c88619e465c4c6b779db616c0b888ca01ae227968a8 not found: ID does not exist" containerID="748f136517a906d039ec8c88619e465c4c6b779db616c0b888ca01ae227968a8" Apr 16 15:59:02.415397 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:02.415342 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"748f136517a906d039ec8c88619e465c4c6b779db616c0b888ca01ae227968a8"} err="failed to get container status \"748f136517a906d039ec8c88619e465c4c6b779db616c0b888ca01ae227968a8\": rpc error: code = NotFound desc = could not find container \"748f136517a906d039ec8c88619e465c4c6b779db616c0b888ca01ae227968a8\": container with ID starting with 748f136517a906d039ec8c88619e465c4c6b779db616c0b888ca01ae227968a8 not found: ID does not exist" Apr 16 15:59:04.307739 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:04.307702 2576 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f94a7c3a-e337-419d-b050-295ef17831eb" path="/var/lib/kubelet/pods/f94a7c3a-e337-419d-b050-295ef17831eb/volumes" Apr 16 15:59:06.409784 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:06.409757 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv_d41694db-5315-4f07-b194-358b4d0530f8/storage-initializer/0.log" Apr 16 15:59:06.410180 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:06.409794 2576 generic.go:358] "Generic (PLEG): container finished" podID="d41694db-5315-4f07-b194-358b4d0530f8" containerID="5ec023cf9b12d762964aae1fac1d53edce6547029f6381f20584921f0d763a06" exitCode=1 Apr 16 15:59:06.410180 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:06.409875 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv" event={"ID":"d41694db-5315-4f07-b194-358b4d0530f8","Type":"ContainerDied","Data":"5ec023cf9b12d762964aae1fac1d53edce6547029f6381f20584921f0d763a06"} Apr 16 15:59:07.415277 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:07.415248 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv_d41694db-5315-4f07-b194-358b4d0530f8/storage-initializer/0.log" Apr 16 15:59:07.415694 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:07.415366 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv" event={"ID":"d41694db-5315-4f07-b194-358b4d0530f8","Type":"ContainerStarted","Data":"60fce722bccef9c45d00df0f7ac1a254646f33a0872289e75545e24917329918"} Apr 16 15:59:08.623261 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:08.623225 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv"] Apr 16 
15:59:08.623751 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:08.623525 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv" podUID="d41694db-5315-4f07-b194-358b4d0530f8" containerName="storage-initializer" containerID="cri-o://60fce722bccef9c45d00df0f7ac1a254646f33a0872289e75545e24917329918" gracePeriod=30 Apr 16 15:59:09.274077 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.274052 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv_d41694db-5315-4f07-b194-358b4d0530f8/storage-initializer/1.log" Apr 16 15:59:09.274436 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.274416 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv_d41694db-5315-4f07-b194-358b4d0530f8/storage-initializer/0.log" Apr 16 15:59:09.274544 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.274477 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv" Apr 16 15:59:09.361214 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.361118 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d41694db-5315-4f07-b194-358b4d0530f8-kserve-provision-location\") pod \"d41694db-5315-4f07-b194-358b4d0530f8\" (UID: \"d41694db-5315-4f07-b194-358b4d0530f8\") " Apr 16 15:59:09.361398 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.361346 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d41694db-5315-4f07-b194-358b4d0530f8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d41694db-5315-4f07-b194-358b4d0530f8" (UID: "d41694db-5315-4f07-b194-358b4d0530f8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:59:09.361465 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.361434 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d41694db-5315-4f07-b194-358b4d0530f8-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 15:59:09.423986 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.423957 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv_d41694db-5315-4f07-b194-358b4d0530f8/storage-initializer/1.log" Apr 16 15:59:09.424323 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.424309 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv_d41694db-5315-4f07-b194-358b4d0530f8/storage-initializer/0.log" Apr 16 15:59:09.424395 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.424345 2576 
generic.go:358] "Generic (PLEG): container finished" podID="d41694db-5315-4f07-b194-358b4d0530f8" containerID="60fce722bccef9c45d00df0f7ac1a254646f33a0872289e75545e24917329918" exitCode=1 Apr 16 15:59:09.424453 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.424440 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv" Apr 16 15:59:09.424541 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.424437 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv" event={"ID":"d41694db-5315-4f07-b194-358b4d0530f8","Type":"ContainerDied","Data":"60fce722bccef9c45d00df0f7ac1a254646f33a0872289e75545e24917329918"} Apr 16 15:59:09.424595 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.424558 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv" event={"ID":"d41694db-5315-4f07-b194-358b4d0530f8","Type":"ContainerDied","Data":"2743d14cd8378a470256fb5ac36dcb21a5297974d27f181cc2a7f08ecacbd2ec"} Apr 16 15:59:09.424595 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.424574 2576 scope.go:117] "RemoveContainer" containerID="60fce722bccef9c45d00df0f7ac1a254646f33a0872289e75545e24917329918" Apr 16 15:59:09.432989 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.432973 2576 scope.go:117] "RemoveContainer" containerID="5ec023cf9b12d762964aae1fac1d53edce6547029f6381f20584921f0d763a06" Apr 16 15:59:09.440488 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.440464 2576 scope.go:117] "RemoveContainer" containerID="60fce722bccef9c45d00df0f7ac1a254646f33a0872289e75545e24917329918" Apr 16 15:59:09.440719 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:59:09.440701 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"60fce722bccef9c45d00df0f7ac1a254646f33a0872289e75545e24917329918\": container with ID starting with 60fce722bccef9c45d00df0f7ac1a254646f33a0872289e75545e24917329918 not found: ID does not exist" containerID="60fce722bccef9c45d00df0f7ac1a254646f33a0872289e75545e24917329918" Apr 16 15:59:09.440766 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.440730 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60fce722bccef9c45d00df0f7ac1a254646f33a0872289e75545e24917329918"} err="failed to get container status \"60fce722bccef9c45d00df0f7ac1a254646f33a0872289e75545e24917329918\": rpc error: code = NotFound desc = could not find container \"60fce722bccef9c45d00df0f7ac1a254646f33a0872289e75545e24917329918\": container with ID starting with 60fce722bccef9c45d00df0f7ac1a254646f33a0872289e75545e24917329918 not found: ID does not exist" Apr 16 15:59:09.440766 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.440750 2576 scope.go:117] "RemoveContainer" containerID="5ec023cf9b12d762964aae1fac1d53edce6547029f6381f20584921f0d763a06" Apr 16 15:59:09.440973 ip-10-0-129-76 kubenswrapper[2576]: E0416 15:59:09.440957 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ec023cf9b12d762964aae1fac1d53edce6547029f6381f20584921f0d763a06\": container with ID starting with 5ec023cf9b12d762964aae1fac1d53edce6547029f6381f20584921f0d763a06 not found: ID does not exist" containerID="5ec023cf9b12d762964aae1fac1d53edce6547029f6381f20584921f0d763a06" Apr 16 15:59:09.441011 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.440980 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ec023cf9b12d762964aae1fac1d53edce6547029f6381f20584921f0d763a06"} err="failed to get container status \"5ec023cf9b12d762964aae1fac1d53edce6547029f6381f20584921f0d763a06\": rpc error: code = NotFound desc = could not find container 
\"5ec023cf9b12d762964aae1fac1d53edce6547029f6381f20584921f0d763a06\": container with ID starting with 5ec023cf9b12d762964aae1fac1d53edce6547029f6381f20584921f0d763a06 not found: ID does not exist" Apr 16 15:59:09.458214 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.458183 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv"] Apr 16 15:59:09.468961 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.468933 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-746ccb9c65-kspkv"] Apr 16 15:59:09.653472 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.653435 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr"] Apr 16 15:59:09.653939 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.653894 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d41694db-5315-4f07-b194-358b4d0530f8" containerName="storage-initializer" Apr 16 15:59:09.653939 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.653908 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d41694db-5315-4f07-b194-358b4d0530f8" containerName="storage-initializer" Apr 16 15:59:09.653939 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.653918 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f94a7c3a-e337-419d-b050-295ef17831eb" containerName="storage-initializer" Apr 16 15:59:09.653939 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.653924 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f94a7c3a-e337-419d-b050-295ef17831eb" containerName="storage-initializer" Apr 16 15:59:09.653939 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.653941 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f94a7c3a-e337-419d-b050-295ef17831eb" 
containerName="kserve-container" Apr 16 15:59:09.654207 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.653948 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f94a7c3a-e337-419d-b050-295ef17831eb" containerName="kserve-container" Apr 16 15:59:09.654207 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.653954 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d41694db-5315-4f07-b194-358b4d0530f8" containerName="storage-initializer" Apr 16 15:59:09.654207 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.653959 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d41694db-5315-4f07-b194-358b4d0530f8" containerName="storage-initializer" Apr 16 15:59:09.654207 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.654046 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f94a7c3a-e337-419d-b050-295ef17831eb" containerName="kserve-container" Apr 16 15:59:09.654207 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.654059 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d41694db-5315-4f07-b194-358b4d0530f8" containerName="storage-initializer" Apr 16 15:59:09.654207 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.654067 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d41694db-5315-4f07-b194-358b4d0530f8" containerName="storage-initializer" Apr 16 15:59:09.658667 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.658650 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" Apr 16 15:59:09.660919 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.660896 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tsb5d\"" Apr 16 15:59:09.661056 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.661016 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 16 15:59:09.661230 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.661211 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 15:59:09.664472 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.664452 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr"] Apr 16 15:59:09.769540 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.769503 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/865a934c-26b3-4273-95ad-05dc8000f605-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr\" (UID: \"865a934c-26b3-4273-95ad-05dc8000f605\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" Apr 16 15:59:09.769718 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.769651 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/865a934c-26b3-4273-95ad-05dc8000f605-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr\" (UID: \"865a934c-26b3-4273-95ad-05dc8000f605\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" Apr 16 15:59:09.871132 
ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.871088 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/865a934c-26b3-4273-95ad-05dc8000f605-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr\" (UID: \"865a934c-26b3-4273-95ad-05dc8000f605\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" Apr 16 15:59:09.871283 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.871140 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/865a934c-26b3-4273-95ad-05dc8000f605-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr\" (UID: \"865a934c-26b3-4273-95ad-05dc8000f605\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" Apr 16 15:59:09.871500 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.871479 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/865a934c-26b3-4273-95ad-05dc8000f605-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr\" (UID: \"865a934c-26b3-4273-95ad-05dc8000f605\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" Apr 16 15:59:09.871733 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.871717 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/865a934c-26b3-4273-95ad-05dc8000f605-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr\" (UID: \"865a934c-26b3-4273-95ad-05dc8000f605\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" Apr 16 15:59:09.970983 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:09.970886 2576 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" Apr 16 15:59:10.093514 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:10.093484 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr"] Apr 16 15:59:10.096497 ip-10-0-129-76 kubenswrapper[2576]: W0416 15:59:10.096467 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod865a934c_26b3_4273_95ad_05dc8000f605.slice/crio-00f6cb0f26fbd0c28f0b5a900d427891613a336ebc55d91c739716bead00d36b WatchSource:0}: Error finding container 00f6cb0f26fbd0c28f0b5a900d427891613a336ebc55d91c739716bead00d36b: Status 404 returned error can't find the container with id 00f6cb0f26fbd0c28f0b5a900d427891613a336ebc55d91c739716bead00d36b Apr 16 15:59:10.307735 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:10.307650 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d41694db-5315-4f07-b194-358b4d0530f8" path="/var/lib/kubelet/pods/d41694db-5315-4f07-b194-358b4d0530f8/volumes" Apr 16 15:59:10.429637 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:10.429600 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" event={"ID":"865a934c-26b3-4273-95ad-05dc8000f605","Type":"ContainerStarted","Data":"cf9c81424ca2006427c887a889baa8481fd73cc69efc4617d7b19371586aa5c6"} Apr 16 15:59:10.429637 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:10.429644 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" event={"ID":"865a934c-26b3-4273-95ad-05dc8000f605","Type":"ContainerStarted","Data":"00f6cb0f26fbd0c28f0b5a900d427891613a336ebc55d91c739716bead00d36b"} Apr 16 15:59:11.435427 ip-10-0-129-76 
kubenswrapper[2576]: I0416 15:59:11.435394 2576 generic.go:358] "Generic (PLEG): container finished" podID="865a934c-26b3-4273-95ad-05dc8000f605" containerID="cf9c81424ca2006427c887a889baa8481fd73cc69efc4617d7b19371586aa5c6" exitCode=0 Apr 16 15:59:11.435816 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:11.435477 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" event={"ID":"865a934c-26b3-4273-95ad-05dc8000f605","Type":"ContainerDied","Data":"cf9c81424ca2006427c887a889baa8481fd73cc69efc4617d7b19371586aa5c6"} Apr 16 15:59:12.440696 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:12.440662 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" event={"ID":"865a934c-26b3-4273-95ad-05dc8000f605","Type":"ContainerStarted","Data":"4d49d949abb55cb28d66e9824d13a105812caa8aa31619d72966df083ba1d3a8"} Apr 16 15:59:12.441161 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:12.440832 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" Apr 16 15:59:12.442320 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:12.442283 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" podUID="865a934c-26b3-4273-95ad-05dc8000f605" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 16 15:59:12.456415 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:12.456240 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" podStartSLOduration=3.456223443 podStartE2EDuration="3.456223443s" podCreationTimestamp="2026-04-16 15:59:09 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:59:12.456086218 +0000 UTC m=+3986.875221195" watchObservedRunningTime="2026-04-16 15:59:12.456223443 +0000 UTC m=+3986.875358419" Apr 16 15:59:13.445560 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:13.445525 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" podUID="865a934c-26b3-4273-95ad-05dc8000f605" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 16 15:59:23.445794 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:23.445752 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" podUID="865a934c-26b3-4273-95ad-05dc8000f605" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 16 15:59:33.446005 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:33.445955 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" podUID="865a934c-26b3-4273-95ad-05dc8000f605" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 16 15:59:43.446096 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:43.446043 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" podUID="865a934c-26b3-4273-95ad-05dc8000f605" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 16 15:59:53.446333 ip-10-0-129-76 kubenswrapper[2576]: I0416 15:59:53.446286 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" podUID="865a934c-26b3-4273-95ad-05dc8000f605" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 16 16:00:03.446366 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:03.446306 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" podUID="865a934c-26b3-4273-95ad-05dc8000f605" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 16 16:00:13.445814 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:13.445757 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" podUID="865a934c-26b3-4273-95ad-05dc8000f605" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 16 16:00:18.308119 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:18.308090 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" Apr 16 16:00:19.689048 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:19.689003 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr"] Apr 16 16:00:19.692995 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:19.692938 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" podUID="865a934c-26b3-4273-95ad-05dc8000f605" containerName="kserve-container" containerID="cri-o://4d49d949abb55cb28d66e9824d13a105812caa8aa31619d72966df083ba1d3a8" gracePeriod=30 Apr 16 16:00:20.740759 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:20.740716 2576 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82"] Apr 16 16:00:20.744678 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:20.744659 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82" Apr 16 16:00:20.754662 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:20.754627 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82"] Apr 16 16:00:20.913411 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:20.913371 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6feeac88-5812-4814-a868-6a1bf68dea1e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82\" (UID: \"6feeac88-5812-4814-a868-6a1bf68dea1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82" Apr 16 16:00:21.014076 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:21.013944 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6feeac88-5812-4814-a868-6a1bf68dea1e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82\" (UID: \"6feeac88-5812-4814-a868-6a1bf68dea1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82" Apr 16 16:00:21.014432 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:21.014407 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6feeac88-5812-4814-a868-6a1bf68dea1e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82\" (UID: 
\"6feeac88-5812-4814-a868-6a1bf68dea1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82" Apr 16 16:00:21.056206 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:21.056171 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82" Apr 16 16:00:21.181335 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:21.181273 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82"] Apr 16 16:00:21.184013 ip-10-0-129-76 kubenswrapper[2576]: W0416 16:00:21.183982 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6feeac88_5812_4814_a868_6a1bf68dea1e.slice/crio-ce36b2dc5d3af4e4403ce883988951feae201a22fb80981f1dcdc1f6c33af443 WatchSource:0}: Error finding container ce36b2dc5d3af4e4403ce883988951feae201a22fb80981f1dcdc1f6c33af443: Status 404 returned error can't find the container with id ce36b2dc5d3af4e4403ce883988951feae201a22fb80981f1dcdc1f6c33af443 Apr 16 16:00:21.702772 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:21.702730 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82" event={"ID":"6feeac88-5812-4814-a868-6a1bf68dea1e","Type":"ContainerStarted","Data":"6db0074f6bb2ee38fedf1511f10a3d9da512db08061b2e7c8725ec1b075ddf5a"} Apr 16 16:00:21.702772 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:21.702771 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82" event={"ID":"6feeac88-5812-4814-a868-6a1bf68dea1e","Type":"ContainerStarted","Data":"ce36b2dc5d3af4e4403ce883988951feae201a22fb80981f1dcdc1f6c33af443"} Apr 16 16:00:24.063261 ip-10-0-129-76 kubenswrapper[2576]: E0416 16:00:24.063224 2576 
cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod865a934c_26b3_4273_95ad_05dc8000f605.slice/crio-conmon-4d49d949abb55cb28d66e9824d13a105812caa8aa31619d72966df083ba1d3a8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod865a934c_26b3_4273_95ad_05dc8000f605.slice/crio-4d49d949abb55cb28d66e9824d13a105812caa8aa31619d72966df083ba1d3a8.scope\": RecentStats: unable to find data in memory cache]" Apr 16 16:00:24.243529 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:24.243504 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" Apr 16 16:00:24.343745 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:24.343709 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/865a934c-26b3-4273-95ad-05dc8000f605-kserve-provision-location\") pod \"865a934c-26b3-4273-95ad-05dc8000f605\" (UID: \"865a934c-26b3-4273-95ad-05dc8000f605\") " Apr 16 16:00:24.343935 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:24.343798 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/865a934c-26b3-4273-95ad-05dc8000f605-cabundle-cert\") pod \"865a934c-26b3-4273-95ad-05dc8000f605\" (UID: \"865a934c-26b3-4273-95ad-05dc8000f605\") " Apr 16 16:00:24.344069 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:24.344041 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/865a934c-26b3-4273-95ad-05dc8000f605-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "865a934c-26b3-4273-95ad-05dc8000f605" (UID: "865a934c-26b3-4273-95ad-05dc8000f605"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:00:24.344140 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:24.344103 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/865a934c-26b3-4273-95ad-05dc8000f605-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "865a934c-26b3-4273-95ad-05dc8000f605" (UID: "865a934c-26b3-4273-95ad-05dc8000f605"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:00:24.444778 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:24.444745 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/865a934c-26b3-4273-95ad-05dc8000f605-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 16:00:24.444778 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:24.444775 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/865a934c-26b3-4273-95ad-05dc8000f605-cabundle-cert\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 16:00:24.717583 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:24.717558 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82_6feeac88-5812-4814-a868-6a1bf68dea1e/storage-initializer/0.log" Apr 16 16:00:24.717773 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:24.717596 2576 generic.go:358] "Generic (PLEG): container finished" podID="6feeac88-5812-4814-a868-6a1bf68dea1e" containerID="6db0074f6bb2ee38fedf1511f10a3d9da512db08061b2e7c8725ec1b075ddf5a" exitCode=1 Apr 16 16:00:24.717773 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:24.717678 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82" 
event={"ID":"6feeac88-5812-4814-a868-6a1bf68dea1e","Type":"ContainerDied","Data":"6db0074f6bb2ee38fedf1511f10a3d9da512db08061b2e7c8725ec1b075ddf5a"} Apr 16 16:00:24.718976 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:24.718953 2576 generic.go:358] "Generic (PLEG): container finished" podID="865a934c-26b3-4273-95ad-05dc8000f605" containerID="4d49d949abb55cb28d66e9824d13a105812caa8aa31619d72966df083ba1d3a8" exitCode=0 Apr 16 16:00:24.719118 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:24.719034 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" Apr 16 16:00:24.719118 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:24.719061 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" event={"ID":"865a934c-26b3-4273-95ad-05dc8000f605","Type":"ContainerDied","Data":"4d49d949abb55cb28d66e9824d13a105812caa8aa31619d72966df083ba1d3a8"} Apr 16 16:00:24.719118 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:24.719091 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr" event={"ID":"865a934c-26b3-4273-95ad-05dc8000f605","Type":"ContainerDied","Data":"00f6cb0f26fbd0c28f0b5a900d427891613a336ebc55d91c739716bead00d36b"} Apr 16 16:00:24.719245 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:24.719108 2576 scope.go:117] "RemoveContainer" containerID="4d49d949abb55cb28d66e9824d13a105812caa8aa31619d72966df083ba1d3a8" Apr 16 16:00:24.729064 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:24.729041 2576 scope.go:117] "RemoveContainer" containerID="cf9c81424ca2006427c887a889baa8481fd73cc69efc4617d7b19371586aa5c6" Apr 16 16:00:24.745733 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:24.745708 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr"] Apr 16 16:00:24.746845 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:24.746822 2576 scope.go:117] "RemoveContainer" containerID="4d49d949abb55cb28d66e9824d13a105812caa8aa31619d72966df083ba1d3a8" Apr 16 16:00:24.747198 ip-10-0-129-76 kubenswrapper[2576]: E0416 16:00:24.747175 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d49d949abb55cb28d66e9824d13a105812caa8aa31619d72966df083ba1d3a8\": container with ID starting with 4d49d949abb55cb28d66e9824d13a105812caa8aa31619d72966df083ba1d3a8 not found: ID does not exist" containerID="4d49d949abb55cb28d66e9824d13a105812caa8aa31619d72966df083ba1d3a8" Apr 16 16:00:24.747275 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:24.747214 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d49d949abb55cb28d66e9824d13a105812caa8aa31619d72966df083ba1d3a8"} err="failed to get container status \"4d49d949abb55cb28d66e9824d13a105812caa8aa31619d72966df083ba1d3a8\": rpc error: code = NotFound desc = could not find container \"4d49d949abb55cb28d66e9824d13a105812caa8aa31619d72966df083ba1d3a8\": container with ID starting with 4d49d949abb55cb28d66e9824d13a105812caa8aa31619d72966df083ba1d3a8 not found: ID does not exist" Apr 16 16:00:24.747275 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:24.747239 2576 scope.go:117] "RemoveContainer" containerID="cf9c81424ca2006427c887a889baa8481fd73cc69efc4617d7b19371586aa5c6" Apr 16 16:00:24.747517 ip-10-0-129-76 kubenswrapper[2576]: E0416 16:00:24.747497 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf9c81424ca2006427c887a889baa8481fd73cc69efc4617d7b19371586aa5c6\": container with ID starting with cf9c81424ca2006427c887a889baa8481fd73cc69efc4617d7b19371586aa5c6 not found: ID does not exist" 
containerID="cf9c81424ca2006427c887a889baa8481fd73cc69efc4617d7b19371586aa5c6" Apr 16 16:00:24.747571 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:24.747524 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf9c81424ca2006427c887a889baa8481fd73cc69efc4617d7b19371586aa5c6"} err="failed to get container status \"cf9c81424ca2006427c887a889baa8481fd73cc69efc4617d7b19371586aa5c6\": rpc error: code = NotFound desc = could not find container \"cf9c81424ca2006427c887a889baa8481fd73cc69efc4617d7b19371586aa5c6\": container with ID starting with cf9c81424ca2006427c887a889baa8481fd73cc69efc4617d7b19371586aa5c6 not found: ID does not exist" Apr 16 16:00:24.747643 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:24.747626 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-64ddbd787d-g8vmr"] Apr 16 16:00:25.723808 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:25.723778 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82_6feeac88-5812-4814-a868-6a1bf68dea1e/storage-initializer/0.log" Apr 16 16:00:25.724337 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:25.723893 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82" event={"ID":"6feeac88-5812-4814-a868-6a1bf68dea1e","Type":"ContainerStarted","Data":"d01cc5240d2f466a6fc9614057419af1fd62d677d1ff5fce3c08615a4b525199"} Apr 16 16:00:26.307864 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:26.307825 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="865a934c-26b3-4273-95ad-05dc8000f605" path="/var/lib/kubelet/pods/865a934c-26b3-4273-95ad-05dc8000f605/volumes" Apr 16 16:00:29.738750 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:29.738723 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82_6feeac88-5812-4814-a868-6a1bf68dea1e/storage-initializer/1.log" Apr 16 16:00:29.739148 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:29.739119 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82_6feeac88-5812-4814-a868-6a1bf68dea1e/storage-initializer/0.log" Apr 16 16:00:29.739190 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:29.739152 2576 generic.go:358] "Generic (PLEG): container finished" podID="6feeac88-5812-4814-a868-6a1bf68dea1e" containerID="d01cc5240d2f466a6fc9614057419af1fd62d677d1ff5fce3c08615a4b525199" exitCode=1 Apr 16 16:00:29.739244 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:29.739220 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82" event={"ID":"6feeac88-5812-4814-a868-6a1bf68dea1e","Type":"ContainerDied","Data":"d01cc5240d2f466a6fc9614057419af1fd62d677d1ff5fce3c08615a4b525199"} Apr 16 16:00:29.739288 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:29.739269 2576 scope.go:117] "RemoveContainer" containerID="6db0074f6bb2ee38fedf1511f10a3d9da512db08061b2e7c8725ec1b075ddf5a" Apr 16 16:00:29.739546 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:29.739532 2576 scope.go:117] "RemoveContainer" containerID="6db0074f6bb2ee38fedf1511f10a3d9da512db08061b2e7c8725ec1b075ddf5a" Apr 16 16:00:29.750232 ip-10-0-129-76 kubenswrapper[2576]: E0416 16:00:29.750205 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82_kserve-ci-e2e-test_6feeac88-5812-4814-a868-6a1bf68dea1e_0 in pod sandbox ce36b2dc5d3af4e4403ce883988951feae201a22fb80981f1dcdc1f6c33af443 from index: no such id: 
'6db0074f6bb2ee38fedf1511f10a3d9da512db08061b2e7c8725ec1b075ddf5a'" containerID="6db0074f6bb2ee38fedf1511f10a3d9da512db08061b2e7c8725ec1b075ddf5a" Apr 16 16:00:29.750325 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:29.750241 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6db0074f6bb2ee38fedf1511f10a3d9da512db08061b2e7c8725ec1b075ddf5a"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82_kserve-ci-e2e-test_6feeac88-5812-4814-a868-6a1bf68dea1e_0 in pod sandbox ce36b2dc5d3af4e4403ce883988951feae201a22fb80981f1dcdc1f6c33af443 from index: no such id: '6db0074f6bb2ee38fedf1511f10a3d9da512db08061b2e7c8725ec1b075ddf5a'" Apr 16 16:00:29.750477 ip-10-0-129-76 kubenswrapper[2576]: E0416 16:00:29.750455 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82_kserve-ci-e2e-test(6feeac88-5812-4814-a868-6a1bf68dea1e)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82" podUID="6feeac88-5812-4814-a868-6a1bf68dea1e" Apr 16 16:00:30.745101 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:30.745075 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82_6feeac88-5812-4814-a868-6a1bf68dea1e/storage-initializer/1.log" Apr 16 16:00:30.753306 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:30.753274 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82"] Apr 16 16:00:30.889017 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:30.888995 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82_6feeac88-5812-4814-a868-6a1bf68dea1e/storage-initializer/1.log" Apr 16 16:00:30.889182 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:30.889076 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82" Apr 16 16:00:31.008835 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:31.008741 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6feeac88-5812-4814-a868-6a1bf68dea1e-kserve-provision-location\") pod \"6feeac88-5812-4814-a868-6a1bf68dea1e\" (UID: \"6feeac88-5812-4814-a868-6a1bf68dea1e\") " Apr 16 16:00:31.009063 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:31.009003 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6feeac88-5812-4814-a868-6a1bf68dea1e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6feeac88-5812-4814-a868-6a1bf68dea1e" (UID: "6feeac88-5812-4814-a868-6a1bf68dea1e"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:00:31.110121 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:31.110085 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6feeac88-5812-4814-a868-6a1bf68dea1e-kserve-provision-location\") on node \"ip-10-0-129-76.ec2.internal\" DevicePath \"\"" Apr 16 16:00:31.749840 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:31.749811 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82_6feeac88-5812-4814-a868-6a1bf68dea1e/storage-initializer/1.log" Apr 16 16:00:31.750279 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:31.749888 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82" event={"ID":"6feeac88-5812-4814-a868-6a1bf68dea1e","Type":"ContainerDied","Data":"ce36b2dc5d3af4e4403ce883988951feae201a22fb80981f1dcdc1f6c33af443"} Apr 16 16:00:31.750279 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:31.749917 2576 scope.go:117] "RemoveContainer" containerID="d01cc5240d2f466a6fc9614057419af1fd62d677d1ff5fce3c08615a4b525199" Apr 16 16:00:31.750279 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:31.749919 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82" Apr 16 16:00:31.784051 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:31.783350 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82"] Apr 16 16:00:31.791004 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:31.790731 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b99d6b556-zqc82"] Apr 16 16:00:32.307471 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:00:32.307437 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6feeac88-5812-4814-a868-6a1bf68dea1e" path="/var/lib/kubelet/pods/6feeac88-5812-4814-a868-6a1bf68dea1e/volumes" Apr 16 16:01:01.179130 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:01.179098 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-6zlvj_9da141d8-7c00-4479-bb5d-0cc7c31814ff/global-pull-secret-syncer/0.log" Apr 16 16:01:01.383466 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:01.383429 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-tb9c9_a6e086d8-b850-425d-9896-6df3cec2442b/konnectivity-agent/0.log" Apr 16 16:01:01.457570 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:01.457541 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-76.ec2.internal_fa97968d5b4634bd4f9419795593b093/haproxy/0.log" Apr 16 16:01:04.800944 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:04.800911 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_27e1b5f9-340a-4274-9148-50c65175772e/alertmanager/0.log" Apr 16 16:01:04.827815 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:04.827787 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_27e1b5f9-340a-4274-9148-50c65175772e/config-reloader/0.log" Apr 16 16:01:04.855618 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:04.855588 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_27e1b5f9-340a-4274-9148-50c65175772e/kube-rbac-proxy-web/0.log" Apr 16 16:01:04.873863 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:04.873807 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_27e1b5f9-340a-4274-9148-50c65175772e/kube-rbac-proxy/0.log" Apr 16 16:01:04.893244 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:04.893220 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_27e1b5f9-340a-4274-9148-50c65175772e/kube-rbac-proxy-metric/0.log" Apr 16 16:01:04.914546 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:04.914521 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_27e1b5f9-340a-4274-9148-50c65175772e/prom-label-proxy/0.log" Apr 16 16:01:04.940594 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:04.940564 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_27e1b5f9-340a-4274-9148-50c65175772e/init-config-reloader/0.log" Apr 16 16:01:04.989744 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:04.989715 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-9pd4w_9b07e8fb-9184-409a-ac6c-ab62ef5c0a79/cluster-monitoring-operator/0.log" Apr 16 16:01:05.016336 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:05.016313 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-7xvvx_460ad184-72b0-4b47-b454-93b01b7a7648/kube-state-metrics/0.log" Apr 16 16:01:05.036571 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:05.036547 
2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-7xvvx_460ad184-72b0-4b47-b454-93b01b7a7648/kube-rbac-proxy-main/0.log" Apr 16 16:01:05.059575 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:05.059507 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-7xvvx_460ad184-72b0-4b47-b454-93b01b7a7648/kube-rbac-proxy-self/0.log" Apr 16 16:01:05.084879 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:05.084854 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6b4dc95984-98mbp_4a4f7a19-49a1-4031-ae9b-2947eb0b2a2c/metrics-server/0.log" Apr 16 16:01:05.111460 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:05.111428 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-vm2nt_b40e3801-d782-46b4-9a70-170dbfac4af1/monitoring-plugin/0.log" Apr 16 16:01:05.326198 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:05.326118 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qpqgz_528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd/node-exporter/0.log" Apr 16 16:01:05.351480 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:05.351453 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qpqgz_528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd/kube-rbac-proxy/0.log" Apr 16 16:01:05.374822 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:05.374795 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qpqgz_528cc92a-aa51-4bd0-9c9c-c21ddf5d16bd/init-textfile/0.log" Apr 16 16:01:05.399173 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:05.399149 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-b4xpk_2cfd5d54-0a18-4111-bb16-ee0e795d6f34/kube-rbac-proxy-main/0.log" Apr 16 
16:01:05.420535 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:05.420507 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-b4xpk_2cfd5d54-0a18-4111-bb16-ee0e795d6f34/kube-rbac-proxy-self/0.log" Apr 16 16:01:05.440741 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:05.440712 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-b4xpk_2cfd5d54-0a18-4111-bb16-ee0e795d6f34/openshift-state-metrics/0.log" Apr 16 16:01:05.709893 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:05.709865 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-864cbb5958-7xfqt_0fe47cdc-0294-4716-8f99-0a0ff91f3da3/telemeter-client/0.log" Apr 16 16:01:05.732294 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:05.732267 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-864cbb5958-7xfqt_0fe47cdc-0294-4716-8f99-0a0ff91f3da3/reload/0.log" Apr 16 16:01:05.751603 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:05.751572 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-864cbb5958-7xfqt_0fe47cdc-0294-4716-8f99-0a0ff91f3da3/kube-rbac-proxy/0.log" Apr 16 16:01:05.786765 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:05.786736 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6bf54c7b45-tvjxb_64e3ae53-70bf-4635-8294-d5d8634be750/thanos-query/0.log" Apr 16 16:01:05.808742 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:05.808717 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6bf54c7b45-tvjxb_64e3ae53-70bf-4635-8294-d5d8634be750/kube-rbac-proxy-web/0.log" Apr 16 16:01:05.831604 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:05.831577 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-6bf54c7b45-tvjxb_64e3ae53-70bf-4635-8294-d5d8634be750/kube-rbac-proxy/0.log" Apr 16 16:01:05.858782 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:05.858753 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6bf54c7b45-tvjxb_64e3ae53-70bf-4635-8294-d5d8634be750/prom-label-proxy/0.log" Apr 16 16:01:05.877002 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:05.876974 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6bf54c7b45-tvjxb_64e3ae53-70bf-4635-8294-d5d8634be750/kube-rbac-proxy-rules/0.log" Apr 16 16:01:05.900113 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:05.900088 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6bf54c7b45-tvjxb_64e3ae53-70bf-4635-8294-d5d8634be750/kube-rbac-proxy-metrics/0.log" Apr 16 16:01:07.068292 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:07.068268 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-swmsg_dd701306-bfea-4f3a-a4b0-47ea87d026f6/networking-console-plugin/0.log" Apr 16 16:01:07.450564 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:07.450536 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/1.log" Apr 16 16:01:07.456404 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:07.456381 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dhmf_d238297c-7c93-4211-8678-2ecfa5f39967/console-operator/2.log" Apr 16 16:01:08.253137 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.253105 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-hgj2w_62bdf3db-4656-4d51-9053-16e6c9a90d0a/volume-data-source-validator/0.log" Apr 16 16:01:08.278201 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.278160 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4"] Apr 16 16:01:08.278628 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.278608 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6feeac88-5812-4814-a868-6a1bf68dea1e" containerName="storage-initializer" Apr 16 16:01:08.278718 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.278630 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6feeac88-5812-4814-a868-6a1bf68dea1e" containerName="storage-initializer" Apr 16 16:01:08.278718 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.278660 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="865a934c-26b3-4273-95ad-05dc8000f605" containerName="kserve-container" Apr 16 16:01:08.278718 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.278669 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="865a934c-26b3-4273-95ad-05dc8000f605" containerName="kserve-container" Apr 16 16:01:08.278718 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.278681 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6feeac88-5812-4814-a868-6a1bf68dea1e" containerName="storage-initializer" Apr 16 16:01:08.278718 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.278690 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6feeac88-5812-4814-a868-6a1bf68dea1e" containerName="storage-initializer" Apr 16 16:01:08.278718 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.278718 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="865a934c-26b3-4273-95ad-05dc8000f605" containerName="storage-initializer" Apr 16 16:01:08.279005 
ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.278727 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="865a934c-26b3-4273-95ad-05dc8000f605" containerName="storage-initializer" Apr 16 16:01:08.279005 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.278816 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6feeac88-5812-4814-a868-6a1bf68dea1e" containerName="storage-initializer" Apr 16 16:01:08.279005 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.278830 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="865a934c-26b3-4273-95ad-05dc8000f605" containerName="kserve-container" Apr 16 16:01:08.282164 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.282149 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4" Apr 16 16:01:08.284479 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.284459 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gvdx8\"/\"openshift-service-ca.crt\"" Apr 16 16:01:08.284731 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.284718 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gvdx8\"/\"kube-root-ca.crt\"" Apr 16 16:01:08.285496 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.285477 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gvdx8\"/\"default-dockercfg-sldsr\"" Apr 16 16:01:08.290222 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.290200 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4"] Apr 16 16:01:08.437898 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.437864 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2bcf1e39-f30a-437b-9d2b-e4536791d850-proc\") 
pod \"perf-node-gather-daemonset-7tdb4\" (UID: \"2bcf1e39-f30a-437b-9d2b-e4536791d850\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4"
Apr 16 16:01:08.438117 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.437914 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mrbs\" (UniqueName: \"kubernetes.io/projected/2bcf1e39-f30a-437b-9d2b-e4536791d850-kube-api-access-2mrbs\") pod \"perf-node-gather-daemonset-7tdb4\" (UID: \"2bcf1e39-f30a-437b-9d2b-e4536791d850\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4"
Apr 16 16:01:08.438117 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.437981 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2bcf1e39-f30a-437b-9d2b-e4536791d850-sys\") pod \"perf-node-gather-daemonset-7tdb4\" (UID: \"2bcf1e39-f30a-437b-9d2b-e4536791d850\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4"
Apr 16 16:01:08.438117 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.437999 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2bcf1e39-f30a-437b-9d2b-e4536791d850-lib-modules\") pod \"perf-node-gather-daemonset-7tdb4\" (UID: \"2bcf1e39-f30a-437b-9d2b-e4536791d850\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4"
Apr 16 16:01:08.438117 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.438050 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2bcf1e39-f30a-437b-9d2b-e4536791d850-podres\") pod \"perf-node-gather-daemonset-7tdb4\" (UID: \"2bcf1e39-f30a-437b-9d2b-e4536791d850\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4"
Apr 16 16:01:08.538901 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.538811 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2bcf1e39-f30a-437b-9d2b-e4536791d850-proc\") pod \"perf-node-gather-daemonset-7tdb4\" (UID: \"2bcf1e39-f30a-437b-9d2b-e4536791d850\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4"
Apr 16 16:01:08.538901 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.538864 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mrbs\" (UniqueName: \"kubernetes.io/projected/2bcf1e39-f30a-437b-9d2b-e4536791d850-kube-api-access-2mrbs\") pod \"perf-node-gather-daemonset-7tdb4\" (UID: \"2bcf1e39-f30a-437b-9d2b-e4536791d850\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4"
Apr 16 16:01:08.539181 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.538919 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2bcf1e39-f30a-437b-9d2b-e4536791d850-sys\") pod \"perf-node-gather-daemonset-7tdb4\" (UID: \"2bcf1e39-f30a-437b-9d2b-e4536791d850\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4"
Apr 16 16:01:08.539181 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.538937 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2bcf1e39-f30a-437b-9d2b-e4536791d850-lib-modules\") pod \"perf-node-gather-daemonset-7tdb4\" (UID: \"2bcf1e39-f30a-437b-9d2b-e4536791d850\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4"
Apr 16 16:01:08.539181 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.538950 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2bcf1e39-f30a-437b-9d2b-e4536791d850-proc\") pod \"perf-node-gather-daemonset-7tdb4\" (UID: \"2bcf1e39-f30a-437b-9d2b-e4536791d850\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4"
Apr 16 16:01:08.539181 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.539064 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2bcf1e39-f30a-437b-9d2b-e4536791d850-podres\") pod \"perf-node-gather-daemonset-7tdb4\" (UID: \"2bcf1e39-f30a-437b-9d2b-e4536791d850\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4"
Apr 16 16:01:08.539181 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.539065 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2bcf1e39-f30a-437b-9d2b-e4536791d850-sys\") pod \"perf-node-gather-daemonset-7tdb4\" (UID: \"2bcf1e39-f30a-437b-9d2b-e4536791d850\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4"
Apr 16 16:01:08.539181 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.539108 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2bcf1e39-f30a-437b-9d2b-e4536791d850-lib-modules\") pod \"perf-node-gather-daemonset-7tdb4\" (UID: \"2bcf1e39-f30a-437b-9d2b-e4536791d850\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4"
Apr 16 16:01:08.539181 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.539176 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2bcf1e39-f30a-437b-9d2b-e4536791d850-podres\") pod \"perf-node-gather-daemonset-7tdb4\" (UID: \"2bcf1e39-f30a-437b-9d2b-e4536791d850\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4"
Apr 16 16:01:08.547190 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.547164 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mrbs\" (UniqueName: \"kubernetes.io/projected/2bcf1e39-f30a-437b-9d2b-e4536791d850-kube-api-access-2mrbs\") pod \"perf-node-gather-daemonset-7tdb4\" (UID: \"2bcf1e39-f30a-437b-9d2b-e4536791d850\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4"
Apr 16 16:01:08.592561 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.592523 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4"
Apr 16 16:01:08.717586 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.717547 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4"]
Apr 16 16:01:08.719963 ip-10-0-129-76 kubenswrapper[2576]: W0416 16:01:08.719934 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2bcf1e39_f30a_437b_9d2b_e4536791d850.slice/crio-09756419e68cba0374a6517bd0368d2768f1b376c16109e08e3bfb3f8eaf4a22 WatchSource:0}: Error finding container 09756419e68cba0374a6517bd0368d2768f1b376c16109e08e3bfb3f8eaf4a22: Status 404 returned error can't find the container with id 09756419e68cba0374a6517bd0368d2768f1b376c16109e08e3bfb3f8eaf4a22
Apr 16 16:01:08.721576 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.721559 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:01:08.884364 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.884328 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4" event={"ID":"2bcf1e39-f30a-437b-9d2b-e4536791d850","Type":"ContainerStarted","Data":"5222ee83768e500af4c8e2e8e68b87acd7cb3b9d26603292f968c2b79fc77b4d"}
Apr 16 16:01:08.884364 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.884367 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4" event={"ID":"2bcf1e39-f30a-437b-9d2b-e4536791d850","Type":"ContainerStarted","Data":"09756419e68cba0374a6517bd0368d2768f1b376c16109e08e3bfb3f8eaf4a22"}
Apr 16 16:01:08.884620 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.884454 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4"
Apr 16 16:01:08.899485 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.899440 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4" podStartSLOduration=0.899427105 podStartE2EDuration="899.427105ms" podCreationTimestamp="2026-04-16 16:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:01:08.89832257 +0000 UTC m=+4103.317457538" watchObservedRunningTime="2026-04-16 16:01:08.899427105 +0000 UTC m=+4103.318562079"
Apr 16 16:01:08.950349 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.950319 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rsgph_574ca2b9-aeca-4a60-8152-838c7e3d1902/dns/0.log"
Apr 16 16:01:08.970987 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.970959 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rsgph_574ca2b9-aeca-4a60-8152-838c7e3d1902/kube-rbac-proxy/0.log"
Apr 16 16:01:08.991660 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:08.991636 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mjsr6_69a10374-32da-4de3-b491-3854f69f1613/dns-node-resolver/0.log"
Apr 16 16:01:09.406132 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:09.406090 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-654d7bdccf-4mwkc_59cfc831-0e33-47bf-91f5-3c4c514090ec/registry/0.log"
Apr 16 16:01:09.449265 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:09.449235 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rxvdm_ab74fce9-eb83-4941-97e9-42f6ed125bf5/node-ca/0.log"
Apr 16 16:01:10.407194 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:10.407157 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-94b8j_f82e5609-2a2d-49f8-aae5-da767543bb3d/serve-healthcheck-canary/0.log"
Apr 16 16:01:10.804765 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:10.804692 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-b8j7n_8be3d31a-0693-4554-ab6c-0e45affa2eee/kube-rbac-proxy/0.log"
Apr 16 16:01:10.823036 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:10.823001 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-b8j7n_8be3d31a-0693-4554-ab6c-0e45affa2eee/exporter/0.log"
Apr 16 16:01:10.840308 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:10.840282 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-b8j7n_8be3d31a-0693-4554-ab6c-0e45affa2eee/extractor/0.log"
Apr 16 16:01:12.835220 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:12.835172 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-7669bdc57-5tbbk_ce99f83c-1d37-49b5-930d-a5d043dcc6e3/manager/0.log"
Apr 16 16:01:12.875087 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:12.875057 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-2jxbh_f31ce00d-1f54-4549-befe-7b377443d8b2/server/0.log"
Apr 16 16:01:13.349730 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:13.349692 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-7zsmf_51f30a24-dd29-4fa2-a9bf-e8c4a58aab7a/seaweedfs/0.log"
Apr 16 16:01:13.393197 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:13.393165 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-rvntz_0e9169df-66a6-46a5-b2de-67e2ef9a9606/seaweedfs-tls-serving/0.log"
Apr 16 16:01:14.898039 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:14.897993 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-7tdb4"
Apr 16 16:01:17.703893 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:17.703861 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-gkt5f_e6d80c27-15aa-4aea-8508-8913412eba90/kube-storage-version-migrator-operator/1.log"
Apr 16 16:01:17.704656 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:17.704640 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-gkt5f_e6d80c27-15aa-4aea-8508-8913412eba90/kube-storage-version-migrator-operator/0.log"
Apr 16 16:01:18.652547 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:18.652517 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8kcqd_31294a51-df01-4523-afff-845ceb6be0cc/kube-multus-additional-cni-plugins/0.log"
Apr 16 16:01:18.671267 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:18.671238 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8kcqd_31294a51-df01-4523-afff-845ceb6be0cc/egress-router-binary-copy/0.log"
Apr 16 16:01:18.692612 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:18.692589 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8kcqd_31294a51-df01-4523-afff-845ceb6be0cc/cni-plugins/0.log"
Apr 16 16:01:18.711602 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:18.711579 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8kcqd_31294a51-df01-4523-afff-845ceb6be0cc/bond-cni-plugin/0.log"
Apr 16 16:01:18.732763 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:18.732735 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8kcqd_31294a51-df01-4523-afff-845ceb6be0cc/routeoverride-cni/0.log"
Apr 16 16:01:18.750891 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:18.750864 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8kcqd_31294a51-df01-4523-afff-845ceb6be0cc/whereabouts-cni-bincopy/0.log"
Apr 16 16:01:18.768391 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:18.768367 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8kcqd_31294a51-df01-4523-afff-845ceb6be0cc/whereabouts-cni/0.log"
Apr 16 16:01:19.119305 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:19.119229 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cdg2t_a36d1747-2a52-4941-aa0e-8d1fe90b9b00/kube-multus/0.log"
Apr 16 16:01:19.279871 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:19.279840 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9p5t7_deecc941-e868-4306-99e5-4f30afef0f95/network-metrics-daemon/0.log"
Apr 16 16:01:19.295588 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:19.295562 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9p5t7_deecc941-e868-4306-99e5-4f30afef0f95/kube-rbac-proxy/0.log"
Apr 16 16:01:20.361056 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:20.361013 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-controller/0.log"
Apr 16 16:01:20.375075 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:20.375054 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/0.log"
Apr 16 16:01:20.393080 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:20.393055 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovn-acl-logging/1.log"
Apr 16 16:01:20.411674 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:20.411624 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/kube-rbac-proxy-node/0.log"
Apr 16 16:01:20.431307 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:20.431279 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 16:01:20.447777 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:20.447752 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/northd/0.log"
Apr 16 16:01:20.466654 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:20.466633 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/nbdb/0.log"
Apr 16 16:01:20.483801 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:20.483779 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/sbdb/0.log"
Apr 16 16:01:20.585936 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:20.585902 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ddt96_85fdf4e1-8738-483f-a40e-a9112c7098d5/ovnkube-controller/0.log"
Apr 16 16:01:21.964926 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:21.964896 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-6nklq_d6e0e8e5-d659-4175-b96f-52c250d77fd0/network-check-target-container/0.log"
Apr 16 16:01:22.882414 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:22.882382 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-v7nk5_616aaf5a-f208-4fe5-97a1-96f1815fe9ac/iptables-alerter/0.log"
Apr 16 16:01:23.514846 ip-10-0-129-76 kubenswrapper[2576]: I0416 16:01:23.514818 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-btrdx_281d16c8-10bf-4c91-91f2-472d3584db2f/tuned/0.log"