Apr 24 23:53:37.249707 ip-10-0-129-98 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 23:53:37.681917 ip-10-0-129-98 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:37.681917 ip-10-0-129-98 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 23:53:37.681917 ip-10-0-129-98 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:37.684273 ip-10-0-129-98 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 23:53:37.684273 ip-10-0-129-98 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:37.685710 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.685627 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 23:53:37.688873 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688860 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:37.688873 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688874 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:37.688934 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688879 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:37.688934 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688882 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:37.688934 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688885 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:37.688934 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688888 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:37.688934 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688892 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:37.688934 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688895 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:37.688934 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688897 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:37.688934 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688900 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:37.688934 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688903 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:37.688934 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688905 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:37.688934 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688908 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:37.688934 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688910 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:37.688934 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688914 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:37.688934 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688917 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:37.688934 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688919 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:37.688934 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688922 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:37.688934 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688924 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:37.688934 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688927 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:37.688934 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688929 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:37.688934 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688932 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:37.689407 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688934 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:37.689407 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688937 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:37.689407 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688939 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:37.689407 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688942 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:37.689407 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688945 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:37.689407 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688948 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:37.689407 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688951 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:37.689407 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688955 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:37.689407 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688958 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:37.689407 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688961 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:37.689407 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688964 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:37.689407 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688966 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:37.689407 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688969 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:37.689407 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688971 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:37.689407 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688973 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:37.689407 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688976 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:37.689407 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688979 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:37.689407 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688981 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:37.689407 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688984 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:37.689875 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688986 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:37.689875 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688988 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:37.689875 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688991 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:37.689875 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688993 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:37.689875 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688996 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:37.689875 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.688998 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:37.689875 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689001 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:37.689875 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689004 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:37.689875 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689006 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:37.689875 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689008 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:37.689875 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689011 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:37.689875 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689013 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:37.689875 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689016 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:37.689875 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689018 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:37.689875 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689022 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:37.689875 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689025 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:37.689875 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689028 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:37.689875 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689030 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:37.689875 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689033 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:37.689875 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689036 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:37.690361 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689038 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:37.690361 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689041 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:37.690361 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689043 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:37.690361 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689046 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:37.690361 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689050 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:37.690361 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689053 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:37.690361 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689055 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:37.690361 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689058 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:37.690361 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689060 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:37.690361 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689062 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:37.690361 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689065 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:37.690361 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689067 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:37.690361 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689069 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:37.690361 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689072 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:37.690361 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689075 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:37.690361 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689077 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:37.690361 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689080 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:37.690361 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689082 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:37.690361 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689085 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:37.690361 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689088 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:37.690896 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689090 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:37.690896 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689093 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:37.690896 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689095 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:37.690896 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689100 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:37.690896 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689103 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:37.690896 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689552 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:37.690896 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689559 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:37.690896 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689563 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:37.690896 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689566 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:37.690896 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689569 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:37.690896 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689572 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:37.690896 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689576 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:37.690896 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689579 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:37.690896 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689582 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:37.690896 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689585 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:37.690896 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689587 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:37.690896 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689590 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:37.690896 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689593 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:37.690896 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689595 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:37.691355 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689598 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:37.691355 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689600 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:37.691355 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689603 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:37.691355 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689605 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:37.691355 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689608 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:37.691355 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689610 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:37.691355 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689613 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:37.691355 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689615 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:37.691355 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689618 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:37.691355 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689621 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:37.691355 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689623 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:37.691355 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689626 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:37.691355 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689629 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:37.691355 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689631 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:37.691355 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689634 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:37.691355 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689636 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:37.691355 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689639 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:37.691355 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689643 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:37.691355 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689646 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:37.691841 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689651 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:37.691841 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689654 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:37.691841 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689656 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:37.691841 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689659 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:37.691841 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689661 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:37.691841 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689664 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:37.691841 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689666 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:37.691841 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689670 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:37.691841 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689672 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:37.691841 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689674 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:37.691841 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689677 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:37.691841 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689680 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:37.691841 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689682 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:37.691841 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689685 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:37.691841 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689687 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:37.691841 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689690 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:37.691841 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689692 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:37.691841 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689695 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:37.691841 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689697 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:37.691841 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689700 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:37.692351 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689702 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:37.692351 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689705 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:37.692351 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689707 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:37.692351 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689710 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:37.692351 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689712 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:37.692351 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689715 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:37.692351 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689718 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:37.692351 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689721 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:37.692351 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689724 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:37.692351 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689726 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:37.692351 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689728 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:37.692351 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689731 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:37.692351 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689735 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:37.692351 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689739 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:37.692351 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689742 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 23:53:37.692351 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689745 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 23:53:37.692351 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689748 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 23:53:37.692351 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689750 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 23:53:37.692351 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689753 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 23:53:37.692826 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689756 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 23:53:37.692826 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689758 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 23:53:37.692826 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689761 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 23:53:37.692826 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689764 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 23:53:37.692826 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689766 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 23:53:37.692826 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689769 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 23:53:37.692826 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689772 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 23:53:37.692826 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689774 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 23:53:37.692826 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689777 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 23:53:37.692826 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689779 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 23:53:37.692826 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689782 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 23:53:37.692826 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689784 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 23:53:37.692826 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689787 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 23:53:37.692826 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.689789 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 23:53:37.692826 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.689868 2578 flags.go:64] FLAG: --address="0.0.0.0" Apr 24 23:53:37.692826 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.689878 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 24 23:53:37.692826 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.689888 2578 flags.go:64] FLAG: --anonymous-auth="true" Apr 24 23:53:37.692826 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.689893 2578 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 
24 23:53:37.692826 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.689898 2578 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 24 23:53:37.692826 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.689901 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 24 23:53:37.692826 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.689907 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 24 23:53:37.693436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.689912 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 24 23:53:37.693436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.689915 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 24 23:53:37.693436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.689919 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 24 23:53:37.693436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.689922 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 24 23:53:37.693436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.689925 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 24 23:53:37.693436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.689929 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 24 23:53:37.693436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.689932 2578 flags.go:64] FLAG: --cgroup-root="" Apr 24 23:53:37.693436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.689934 2578 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 24 23:53:37.693436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.689937 2578 flags.go:64] FLAG: --client-ca-file="" Apr 24 23:53:37.693436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.689940 2578 flags.go:64] FLAG: --cloud-config="" Apr 24 23:53:37.693436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.689943 2578 flags.go:64] FLAG: --cloud-provider="external" Apr 24 23:53:37.693436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.689946 2578 flags.go:64] FLAG: --cluster-dns="[]" Apr 24 23:53:37.693436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690827 2578 flags.go:64] FLAG: --cluster-domain="" Apr 24 23:53:37.693436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690831 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 24 23:53:37.693436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690834 2578 flags.go:64] FLAG: --config-dir="" Apr 24 23:53:37.693436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690837 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 24 23:53:37.693436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690841 2578 flags.go:64] FLAG: --container-log-max-files="5" Apr 24 23:53:37.693436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690846 2578 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 24 23:53:37.693436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690849 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 24 23:53:37.693436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690852 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 24 23:53:37.693436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690855 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 24 23:53:37.693436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690858 2578 flags.go:64] FLAG: --contention-profiling="false" Apr 24 23:53:37.693436 ip-10-0-129-98 kubenswrapper[2578]: 
I0424 23:53:37.690862 2578 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 24 23:53:37.693436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690865 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690868 2578 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690871 2578 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690875 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690879 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690882 2578 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690884 2578 flags.go:64] FLAG: --enable-load-reader="false" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690888 2578 flags.go:64] FLAG: --enable-server="true" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690891 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690896 2578 flags.go:64] FLAG: --event-burst="100" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690900 2578 flags.go:64] FLAG: --event-qps="50" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690903 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690906 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690910 2578 flags.go:64] FLAG: --eviction-hard="" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690914 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690917 2578 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690920 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690923 2578 flags.go:64] FLAG: --eviction-soft="" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690926 2578 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690929 2578 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690931 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690934 2578 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690937 2578 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690940 2578 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690943 2578 flags.go:64] FLAG: --feature-gates="" Apr 24 23:53:37.694025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690946 2578 
flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 23:53:37.694652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690950 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 23:53:37.694652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690953 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 23:53:37.694652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690956 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 23:53:37.694652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690959 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 24 23:53:37.694652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690962 2578 flags.go:64] FLAG: --help="false" Apr 24 23:53:37.694652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690965 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-129-98.ec2.internal" Apr 24 23:53:37.694652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690968 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 23:53:37.694652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690971 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 23:53:37.694652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690974 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 23:53:37.694652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690977 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 23:53:37.694652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690980 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 23:53:37.694652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690983 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 23:53:37.694652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690986 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 23:53:37.694652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690988 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 23:53:37.694652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690991 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 23:53:37.694652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690995 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 23:53:37.694652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.690998 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 23:53:37.694652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691001 2578 flags.go:64] FLAG: --kube-reserved="" Apr 24 23:53:37.694652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691004 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 23:53:37.694652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691007 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 23:53:37.694652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691010 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 23:53:37.694652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691013 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 23:53:37.694652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691016 2578 flags.go:64] FLAG: --lock-file="" Apr 24 23:53:37.694652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691018 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 23:53:37.695223 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691021 2578 flags.go:64] FLAG: 
--log-flush-frequency="5s" Apr 24 23:53:37.695223 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691024 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 23:53:37.695223 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691030 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 23:53:37.695223 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691033 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 23:53:37.695223 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691036 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 23:53:37.695223 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691038 2578 flags.go:64] FLAG: --logging-format="text" Apr 24 23:53:37.695223 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691041 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 23:53:37.695223 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691044 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 23:53:37.695223 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691047 2578 flags.go:64] FLAG: --manifest-url="" Apr 24 23:53:37.695223 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691050 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 24 23:53:37.695223 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691054 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 23:53:37.695223 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691058 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 23:53:37.695223 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691062 2578 flags.go:64] FLAG: --max-pods="110" Apr 24 23:53:37.695223 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691065 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 23:53:37.695223 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691067 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 23:53:37.695223 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691071 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 23:53:37.695223 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691073 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 23:53:37.695223 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691076 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 23:53:37.695223 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691079 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 23:53:37.695223 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691082 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 23:53:37.695223 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691089 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 23:53:37.695223 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691092 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 23:53:37.695223 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691095 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 23:53:37.695223 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691098 2578 flags.go:64] FLAG: --pod-cidr="" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691101 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691107 2578 flags.go:64] FLAG: 
--pod-manifest-path="" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691110 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691113 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691116 2578 flags.go:64] FLAG: --port="10250" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691120 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691122 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0544aba995927530d" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691125 2578 flags.go:64] FLAG: --qos-reserved="" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691128 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691131 2578 flags.go:64] FLAG: --register-node="true" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691134 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691136 2578 flags.go:64] FLAG: --register-with-taints="" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691140 2578 flags.go:64] FLAG: --registry-burst="10" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691142 2578 flags.go:64] FLAG: --registry-qps="5" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691145 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691148 2578 flags.go:64] FLAG: --reserved-memory="" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691152 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691154 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691157 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691160 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691163 2578 flags.go:64] FLAG: --runonce="false" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691166 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691169 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691172 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 24 23:53:37.695840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691175 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691177 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691181 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691183 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: 
I0424 23:53:37.691186 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691189 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691191 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691194 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691197 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691200 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691203 2578 flags.go:64] FLAG: --system-cgroups="" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691207 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691212 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691215 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691218 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691222 2578 flags.go:64] FLAG: --tls-min-version="" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691225 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691228 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691231 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691234 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691236 2578 flags.go:64] FLAG: --v="2" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691240 2578 flags.go:64] FLAG: --version="false" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691244 2578 flags.go:64] FLAG: --vmodule="" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691249 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.691252 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 23:53:37.696451 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691355 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 23:53:37.697058 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691359 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 23:53:37.697058 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691363 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 23:53:37.697058 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691366 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 23:53:37.697058 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691369 2578 
feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 23:53:37.697058 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691371 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 23:53:37.697058 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691374 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 23:53:37.697058 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691376 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 23:53:37.697058 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691379 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 23:53:37.697058 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691381 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 23:53:37.697058 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691383 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 23:53:37.697058 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691386 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 23:53:37.697058 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691388 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 23:53:37.697058 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691391 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 23:53:37.697058 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691393 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 23:53:37.697058 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691396 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 23:53:37.697058 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691399 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 24 23:53:37.697058 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691401 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 23:53:37.697058 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691405 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 23:53:37.697058 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691407 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 23:53:37.697520 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691423 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 23:53:37.697520 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691426 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 23:53:37.697520 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691429 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 23:53:37.697520 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691431 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 23:53:37.697520 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691434 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 23:53:37.697520 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691437 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 23:53:37.697520 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691439 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 
24 23:53:37.697520 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691442 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 23:53:37.697520 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691445 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 23:53:37.697520 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691448 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 23:53:37.697520 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691450 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 23:53:37.697520 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691453 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 23:53:37.697520 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691455 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 23:53:37.697520 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691458 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 23:53:37.697520 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691462 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 23:53:37.697520 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691465 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 23:53:37.697520 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691468 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 23:53:37.697520 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691471 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 23:53:37.697520 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691474 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 23:53:37.698007 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691477 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 23:53:37.698007 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691480 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 23:53:37.698007 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691482 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 23:53:37.698007 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691485 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 23:53:37.698007 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691487 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 23:53:37.698007 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691490 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 23:53:37.698007 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691493 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 23:53:37.698007 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691495 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 23:53:37.698007 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691498 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 23:53:37.698007 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691501 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release. Apr 24 23:53:37.698007 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691505 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 23:53:37.698007 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691508 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 23:53:37.698007 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691511 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 23:53:37.698007 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691514 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 23:53:37.698007 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691516 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 23:53:37.698007 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691519 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 23:53:37.698007 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691522 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 23:53:37.698007 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691525 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 23:53:37.698007 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691527 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 23:53:37.698007 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691530 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 23:53:37.698551 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691532 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 23:53:37.698551 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691534 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 23:53:37.698551 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691537 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 23:53:37.698551 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691539 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 23:53:37.698551 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691542 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 23:53:37.698551 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691544 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 23:53:37.698551 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691546 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 23:53:37.698551 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691549 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 23:53:37.698551 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691551 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 23:53:37.698551 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691553 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 23:53:37.698551 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691556 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 23:53:37.698551 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691563 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 23:53:37.698551 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691565 2578 feature_gate.go:328] unrecognized 
feature gate: UpgradeStatus Apr 24 23:53:37.698551 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691568 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 23:53:37.698551 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691570 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 23:53:37.698551 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691572 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 23:53:37.698551 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691575 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 23:53:37.698551 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691577 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 23:53:37.698551 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691580 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 23:53:37.698551 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691582 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 23:53:37.699171 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691584 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 23:53:37.699171 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691587 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 23:53:37.699171 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691589 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 23:53:37.699171 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691593 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 23:53:37.699171 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691595 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 23:53:37.699171 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691598 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 23:53:37.699171 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.691601 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 23:53:37.699171 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.692626 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 23:53:37.701694 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.701673 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 23:53:37.701694 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.701695 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 23:53:37.701802 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701744 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 23:53:37.701802 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701749 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 23:53:37.701802 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701753 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver 
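
The long runs of W-level "unrecognized feature gate" lines at feature_gate.go:328 are expected on OpenShift: the node's kubelet config carries cluster-level gate names (GatewayAPI, ManagedBootImages, NewOLM, and so on) that the kubelet's upstream Kubernetes gate registry does not define, so the kubelet logs one warning per unknown name and keeps starting up instead of failing. Below is a minimal stdlib Go sketch of that warn-and-continue behavior; the gate names are an illustrative subset, and the real logic lives in k8s.io/component-base/featuregate, not here.

package main

import (
	"fmt"
	"sort"
)

// known mimics the kubelet's registry of upstream gates (illustrative subset,
// not the real list).
var known = map[string]bool{
	"DynamicResourceAllocation": true,
	"ImageVolume":               true,
	"KMSv1":                     true,
	"NodeSwap":                  true,
}

// apply records overrides, warning on unknown names instead of failing,
// matching the W-level "unrecognized feature gate" lines in this log.
func apply(overrides map[string]bool) map[string]bool {
	resolved := map[string]bool{}
	names := make([]string, 0, len(overrides))
	for name := range overrides {
		names = append(names, name)
	}
	sort.Strings(names) // deterministic output for the example
	for _, name := range names {
		if !known[name] {
			fmt.Printf("W unrecognized feature gate: %s\n", name)
			continue
		}
		resolved[name] = overrides[name]
	}
	return resolved
}

func main() {
	fmt.Println(apply(map[string]bool{
		"KMSv1":             true,
		"GatewayAPI":        true, // cluster-level gate, unknown to the kubelet
		"ManagedBootImages": false,
		"NodeSwap":          false,
	}))
}
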
Apr 24 23:53:37.701802 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701756 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 24 23:53:37.701802 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701759 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 23:53:37.701802 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701762 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 23:53:37.701802 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701765 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 23:53:37.701802 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701768 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 23:53:37.701802 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701771 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 23:53:37.701802 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701774 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 23:53:37.701802 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701776 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 23:53:37.701802 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701779 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 23:53:37.701802 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701782 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 23:53:37.701802 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701784 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 23:53:37.701802 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701787 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 23:53:37.701802 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701790 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 23:53:37.701802 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701792 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 23:53:37.701802 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701795 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 23:53:37.701802 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701797 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 23:53:37.701802 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701799 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 23:53:37.702287 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701802 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 23:53:37.702287 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701804 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 23:53:37.702287 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701809 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 23:53:37.702287 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701814 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 23:53:37.702287 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701818 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 23:53:37.702287 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701821 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 23:53:37.702287 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701824 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 23:53:37.702287 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701827 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 23:53:37.702287 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701830 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 23:53:37.702287 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701832 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 23:53:37.702287 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701835 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 23:53:37.702287 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701837 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 23:53:37.702287 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701840 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 23:53:37.702287 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701843 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 23:53:37.702287 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701846 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 23:53:37.702287 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701848 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 23:53:37.702287 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701851 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 23:53:37.702287 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701853 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 23:53:37.702287 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701855 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 23:53:37.702287 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701858 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 23:53:37.702807 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701860 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 23:53:37.702807 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701863 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 23:53:37.702807 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701865 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 23:53:37.702807 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701869 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
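
Two other warning shapes recur in these passes: feature_gate.go:351 for explicitly setting a GA gate (ServiceAccountTokenNodeBinding=true, redundant because GA gates are already on and locked) and feature_gate.go:349 for setting a deprecated gate (KMSv1=true). In both, "It will be removed in a future release" refers to the override key itself ceasing to be accepted, not to behavior changing now. A small sketch of that lifecycle-based classification, with an assumed, illustrative lifecycle table:

package main

import "fmt"

type stage int

const (
	alpha stage = iota
	beta
	ga
	deprecated
)

// lifecycle is an illustrative subset; stages are inferred from the
// warnings in this log, not taken from the real registry.
var lifecycle = map[string]stage{
	"ServiceAccountTokenNodeBinding": ga,
	"KMSv1":                          deprecated,
	"NodeSwap":                       beta,
}

// warnOnOverride reproduces the two W-level shapes seen at
// feature_gate.go:351 (GA) and feature_gate.go:349 (deprecated).
func warnOnOverride(name string, value bool) {
	switch lifecycle[name] {
	case ga:
		fmt.Printf("W Setting GA feature gate %s=%t. It will be removed in a future release.\n", name, value)
	case deprecated:
		fmt.Printf("W Setting deprecated feature gate %s=%t. It will be removed in a future release.\n", name, value)
	}
}

func main() {
	warnOnOverride("ServiceAccountTokenNodeBinding", true)
	warnOnOverride("KMSv1", true)
	warnOnOverride("NodeSwap", false) // beta override: no warning
}
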
Apr 24 23:53:37.702807 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701872 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 23:53:37.702807 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701875 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 23:53:37.702807 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701878 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 23:53:37.702807 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701881 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 23:53:37.702807 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701883 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 23:53:37.702807 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701885 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 23:53:37.702807 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701888 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 23:53:37.702807 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701890 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 23:53:37.702807 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701893 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 23:53:37.702807 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701896 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 23:53:37.702807 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701898 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 23:53:37.702807 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701901 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 23:53:37.702807 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701903 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 23:53:37.702807 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701906 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 23:53:37.702807 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701908 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 23:53:37.703268 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701911 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 23:53:37.703268 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701913 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 23:53:37.703268 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701916 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 23:53:37.703268 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701918 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 23:53:37.703268 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701921 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 23:53:37.703268 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701923 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 23:53:37.703268 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701926 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 23:53:37.703268 ip-10-0-129-98 
kubenswrapper[2578]: W0424 23:53:37.701928 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 23:53:37.703268 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701930 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 23:53:37.703268 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701933 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 23:53:37.703268 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701935 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 23:53:37.703268 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701938 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 23:53:37.703268 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701940 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 23:53:37.703268 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701942 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 23:53:37.703268 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701945 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 23:53:37.703268 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701947 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 23:53:37.703268 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701949 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 23:53:37.703268 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701953 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 23:53:37.703268 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701955 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 23:53:37.703268 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701958 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 23:53:37.703781 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701960 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 23:53:37.703781 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701963 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 23:53:37.703781 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701965 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 23:53:37.703781 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701968 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 23:53:37.703781 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701970 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 23:53:37.703781 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701972 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 23:53:37.703781 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.701975 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 23:53:37.703781 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.701981 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true 
TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 23:53:37.703781 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702077 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 23:53:37.703781 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702083 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 23:53:37.703781 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702086 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 23:53:37.703781 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702089 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 23:53:37.703781 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702092 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 23:53:37.703781 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702094 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 23:53:37.703781 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702097 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 23:53:37.704164 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702100 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 23:53:37.704164 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702103 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 23:53:37.704164 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702105 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 23:53:37.704164 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702108 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 23:53:37.704164 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702110 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 23:53:37.704164 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702113 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 23:53:37.704164 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702115 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 23:53:37.704164 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702118 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 23:53:37.704164 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702120 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 23:53:37.704164 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702123 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 24 23:53:37.704164 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702125 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 23:53:37.704164 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702128 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 23:53:37.704164 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702130 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 23:53:37.704164 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702133 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 23:53:37.704164 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702135 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 
23:53:37.704164 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702138 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 23:53:37.704164 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702141 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 23:53:37.704164 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702143 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 23:53:37.704164 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702146 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 23:53:37.704164 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702148 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 23:53:37.704672 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702151 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 23:53:37.704672 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702153 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 23:53:37.704672 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702156 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 23:53:37.704672 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702159 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 23:53:37.704672 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702161 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 23:53:37.704672 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702164 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 23:53:37.704672 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702166 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 23:53:37.704672 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702169 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 23:53:37.704672 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702171 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 23:53:37.704672 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702174 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 23:53:37.704672 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702178 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 23:53:37.704672 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702182 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 23:53:37.704672 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702184 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 23:53:37.704672 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702187 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 23:53:37.704672 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702189 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 23:53:37.704672 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702192 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 23:53:37.704672 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702194 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 23:53:37.704672 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702196 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 23:53:37.704672 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702199 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 23:53:37.705167 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702201 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 23:53:37.705167 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702204 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 23:53:37.705167 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702206 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 23:53:37.705167 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702209 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 23:53:37.705167 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702211 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 23:53:37.705167 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702213 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 23:53:37.705167 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702216 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 23:53:37.705167 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702219 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
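
Each pass over the overrides ends with an I-level summary at feature_gate.go:384; it appears three times in this boot (first at 23:53:37.692626, again at .701981) with identical contents, and it appears to list only the recognized overrides (16 entries here), not the effective value of every registered gate. When grepping boot logs, a helper like the following can pull the map out of that line; it assumes the exact {map[Name:bool ...]} rendering shown in this log.

package main

import (
	"fmt"
	"strconv"
	"strings"
)

// parseGates extracts the override map from a feature_gate.go:384 line,
// e.g. `... feature gates: {map[KMSv1:true NodeSwap:false]}`.
func parseGates(line string) (map[string]bool, error) {
	start := strings.Index(line, "{map[")
	end := strings.LastIndex(line, "]}")
	if start < 0 || end < start {
		return nil, fmt.Errorf("no gate map in line")
	}
	gates := map[string]bool{}
	for _, pair := range strings.Fields(line[start+len("{map[") : end]) {
		name, raw, ok := strings.Cut(pair, ":")
		if !ok {
			return nil, fmt.Errorf("bad pair %q", pair)
		}
		val, err := strconv.ParseBool(raw)
		if err != nil {
			return nil, err
		}
		gates[name] = val
	}
	return gates, nil
}

func main() {
	m, err := parseGates(`I0424 23:53:37.692626 2578 feature_gate.go:384] feature gates: {map[ImageVolume:true KMSv1:true NodeSwap:false]}`)
	fmt.Println(m, err)
}
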
Apr 24 23:53:37.705167 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702223 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 23:53:37.705167 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702226 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 23:53:37.705167 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702228 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 23:53:37.705167 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702231 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 23:53:37.705167 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702233 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 23:53:37.705167 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702236 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 23:53:37.705167 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702238 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 23:53:37.705167 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702241 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 23:53:37.705167 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702243 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 23:53:37.705167 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702246 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 23:53:37.705167 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702248 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 23:53:37.705167 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702251 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 23:53:37.705677 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702253 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 23:53:37.705677 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702256 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 23:53:37.705677 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702258 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 23:53:37.705677 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702260 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 23:53:37.705677 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702263 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 23:53:37.705677 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702265 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 23:53:37.705677 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702267 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 23:53:37.705677 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702270 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 23:53:37.705677 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702273 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 23:53:37.705677 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702275 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 23:53:37.705677 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702277 2578 feature_gate.go:328] unrecognized 
feature gate: AzureClusterHostedDNSInstall Apr 24 23:53:37.705677 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702280 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 23:53:37.705677 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702282 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 23:53:37.705677 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702284 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 23:53:37.705677 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702287 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 23:53:37.705677 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702289 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 23:53:37.705677 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702292 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 23:53:37.705677 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702294 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 23:53:37.705677 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702297 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 23:53:37.706139 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:37.702299 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 23:53:37.706139 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.702305 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 23:53:37.706139 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.703074 2578 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 23:53:37.706629 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.706615 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 23:53:37.707952 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.707941 2578 server.go:1019] "Starting client certificate rotation" Apr 24 23:53:37.708055 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.708039 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 23:53:37.708095 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.708086 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 23:53:37.736026 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.736011 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 23:53:37.738536 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.738517 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 23:53:37.753449 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.753434 2578 log.go:25] "Validated CRI v1 runtime API" Apr 24 
23:53:37.759851 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.759836 2578 log.go:25] "Validated CRI v1 image API" Apr 24 23:53:37.761180 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.761156 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 23:53:37.764007 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.763978 2578 fs.go:135] Filesystem UUIDs: map[2bd16d11-4e70-45b5-8f7b-67aad71275ea:/dev/nvme0n1p4 5f1bcb57-88f8-440e-bf29-2e89aa61bcac:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 24 23:53:37.764080 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.764006 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 24 23:53:37.764173 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.764158 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 23:53:37.771646 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.771530 2578 manager.go:217] Machine: {Timestamp:2026-04-24 23:53:37.769466477 +0000 UTC m=+0.403448408 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3142028 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2e8fb9d7170796cfe03b8030fa0edd SystemUUID:ec2e8fb9-d717-0796-cfe0-3b8030fa0edd BootID:de7033d6-a397-4633-9602-f31b9f9835a9 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a2:70:0c:34:4f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a2:70:0c:34:4f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:e2:ff:2a:e0:6f:70 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 
Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 24 23:53:37.771646 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.771634 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 24 23:53:37.771779 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.771740 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 24 23:53:37.774551 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.774530 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 23:53:37.774682 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.774555 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-98.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 23:53:37.774731 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.774692 2578 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 23:53:37.774731 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.774701 2578 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 23:53:37.774731 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.774714 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 23:53:37.775444 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.775429 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" 
socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 23:53:37.776383 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.776372 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 24 23:53:37.776642 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.776632 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 23:53:37.778978 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.778968 2578 kubelet.go:491] "Attempting to sync node with API server" Apr 24 23:53:37.779017 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.778982 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 23:53:37.779017 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.778994 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 23:53:37.779017 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.779003 2578 kubelet.go:397] "Adding apiserver pod source" Apr 24 23:53:37.779017 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.779011 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 23:53:37.780058 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.780047 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 23:53:37.780093 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.780065 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 23:53:37.783096 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.783081 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 23:53:37.784628 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.784615 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 23:53:37.785637 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.785618 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wx5c5" Apr 24 23:53:37.786747 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.786735 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 23:53:37.786803 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.786753 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 23:53:37.786803 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.786760 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 23:53:37.786803 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.786767 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 23:53:37.786803 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.786776 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 23:53:37.786803 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.786785 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 23:53:37.786803 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.786792 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 23:53:37.786803 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.786798 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 23:53:37.786803 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.786807 2578 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 23:53:37.787155 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.786817 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 23:53:37.787155 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.786825 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 23:53:37.787155 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.786835 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 23:53:37.788534 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.788522 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 23:53:37.788534 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.788534 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 23:53:37.792172 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.792146 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-98.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 23:53:37.792172 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:37.792153 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-98.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 23:53:37.792295 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:37.792208 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 23:53:37.792464 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.792452 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 23:53:37.792502 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.792485 2578 server.go:1295] "Started kubelet" Apr 24 23:53:37.792583 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.792561 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 23:53:37.792691 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.792641 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 23:53:37.792741 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.792716 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 23:53:37.793283 ip-10-0-129-98 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 23:53:37.794140 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.793942 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 23:53:37.794274 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.794254 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wx5c5"
Apr 24 23:53:37.795322 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.795307 2578 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 23:53:37.800496 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:37.799526 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-98.ec2.internal.18a97021a5cfd26f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-98.ec2.internal,UID:ip-10-0-129-98.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-98.ec2.internal,},FirstTimestamp:2026-04-24 23:53:37.792463471 +0000 UTC m=+0.426445397,LastTimestamp:2026-04-24 23:53:37.792463471 +0000 UTC m=+0.426445397,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-98.ec2.internal,}"
Apr 24 23:53:37.802498 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.802480 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 24 23:53:37.802498 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.802489 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 24 23:53:37.803308 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.803184 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 24 23:53:37.803308 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.803197 2578 factory.go:55] Registering systemd factory
Apr 24 23:53:37.803308 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.803221 2578 factory.go:223] Registration of the systemd container factory successfully
Apr 24 23:53:37.803517 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:37.803367 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-98.ec2.internal\" not found"
Apr 24 23:53:37.803836 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.803730 2578 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 24 23:53:37.803836 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.803750 2578 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 24 23:53:37.803836 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.803789 2578 factory.go:153] Registering CRI-O factory
Apr 24 23:53:37.803836 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.803804 2578 factory.go:223] Registration of the crio container factory successfully
Apr 24 23:53:37.804195 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.804184 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 24 23:53:37.804299 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.804287 2578 factory.go:103] Registering Raw factory
Apr 24 23:53:37.804375 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.804367 2578 manager.go:1196] Started watching for new ooms in manager
Apr 24 23:53:37.804985 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.804971 2578 manager.go:319] Starting recovery of all containers
Apr 24 23:53:37.806929 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.806908 2578 reconstruct.go:97] "Volume reconstruction finished"
Apr 24 23:53:37.806929 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.806927 2578 reconciler.go:26] "Reconciler: start to sync state"
Apr 24 23:53:37.809266 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.809245 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:37.810718 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:37.810535 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 24 23:53:37.812657 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.812626 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 24 23:53:37.812806 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:37.812782 2578 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-98.ec2.internal\" not found" node="ip-10-0-129-98.ec2.internal"
Apr 24 23:53:37.818530 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.818517 2578 manager.go:324] Recovery completed
Apr 24 23:53:37.822557 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.822542 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 23:53:37.825199 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.825185 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 23:53:37.825271 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.825213 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 23:53:37.825271 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.825226 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasSufficientPID"
Apr 24 23:53:37.825748 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.825735 2578 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 24 23:53:37.825748 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.825747 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 24 23:53:37.825825 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.825762 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 23:53:37.828474 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.828457 2578 policy_none.go:49] "None policy: Start"
Apr 24 23:53:37.828550 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.828485 2578 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 24 23:53:37.828550 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.828496 2578 state_mem.go:35] "Initializing new in-memory state store"
Apr 24 23:53:37.862942 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.862926 2578 manager.go:341] "Starting Device Plugin manager"
Apr 24 23:53:37.863037 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:37.863001 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 23:53:37.863037 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.863016 2578 server.go:85] "Starting device plugin registration server"
Apr 24 23:53:37.863240 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.863227 2578 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 24 23:53:37.863302 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.863244 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 23:53:37.863351 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.863324 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 24 23:53:37.863448 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.863405 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 24 23:53:37.863539 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.863453 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 24 23:53:37.863943 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:37.863927 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 24 23:53:37.864014 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:37.863970 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-98.ec2.internal\" not found"
Apr 24 23:53:37.907113 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.907084 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 24 23:53:37.907113 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.907114 2578 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 24 23:53:37.908207 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.907131 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 24 23:53:37.908207 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.907138 2578 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 24 23:53:37.908207 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:37.907166 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 24 23:53:37.909552 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.909532 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:37.963576 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.963528 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 23:53:37.964319 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.964301 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 23:53:37.964391 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.964336 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 23:53:37.964391 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.964353 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasSufficientPID"
Apr 24 23:53:37.964391 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.964382 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-98.ec2.internal"
Apr 24 23:53:37.973137 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:37.973123 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-98.ec2.internal"
Apr 24 23:53:37.973185 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:37.973144 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-98.ec2.internal\": node \"ip-10-0-129-98.ec2.internal\" not found"
Apr 24 23:53:37.986169 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:37.986148 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-98.ec2.internal\" not found"
Apr 24 23:53:38.007504 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.007485 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-98.ec2.internal"]
Apr 24 23:53:38.007556 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.007548 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 23:53:38.008836 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.008813 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 23:53:38.008886 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.008846 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 23:53:38.008886 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.008859 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasSufficientPID"
Apr 24 23:53:38.010201 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.010189 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 23:53:38.010340 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.010327 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal"
Apr 24 23:53:38.010377 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.010355 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 23:53:38.010841 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.010825 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 23:53:38.010929 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.010864 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 23:53:38.010929 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.010879 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasSufficientPID"
Apr 24 23:53:38.011008 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.010877 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 23:53:38.011047 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.011024 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 23:53:38.011047 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.011035 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasSufficientPID"
Apr 24 23:53:38.012041 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.012027 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-98.ec2.internal"
Apr 24 23:53:38.012090 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.012051 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 23:53:38.012708 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.012689 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 23:53:38.012787 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.012729 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 23:53:38.012787 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.012750 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasSufficientPID"
Apr 24 23:53:38.048135 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:38.048114 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-98.ec2.internal\" not found" node="ip-10-0-129-98.ec2.internal"
Apr 24 23:53:38.052471 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:38.052456 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-98.ec2.internal\" not found" node="ip-10-0-129-98.ec2.internal"
Apr 24 23:53:38.086513 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:38.086491 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-98.ec2.internal\" not found"
Apr 24 23:53:38.107613 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.107594 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/32cc11a6fe1288d8e923d33bdeaf02c1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal\" (UID: \"32cc11a6fe1288d8e923d33bdeaf02c1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal"
Apr 24 23:53:38.107684 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.107617 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32cc11a6fe1288d8e923d33bdeaf02c1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal\" (UID: \"32cc11a6fe1288d8e923d33bdeaf02c1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal"
Apr 24 23:53:38.107684 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.107637 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/87dc53c55f73620bf5df44e2826c141e-config\") pod \"kube-apiserver-proxy-ip-10-0-129-98.ec2.internal\" (UID: \"87dc53c55f73620bf5df44e2826c141e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-98.ec2.internal"
Apr 24 23:53:38.187121 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:38.187091 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-98.ec2.internal\" not found"
Apr 24 23:53:38.208451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.208404 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/32cc11a6fe1288d8e923d33bdeaf02c1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal\" (UID: \"32cc11a6fe1288d8e923d33bdeaf02c1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal"
Apr 24 23:53:38.208523 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.208459 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32cc11a6fe1288d8e923d33bdeaf02c1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal\" (UID: \"32cc11a6fe1288d8e923d33bdeaf02c1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal"
Apr 24 23:53:38.208523 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.208478 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/87dc53c55f73620bf5df44e2826c141e-config\") pod \"kube-apiserver-proxy-ip-10-0-129-98.ec2.internal\" (UID: \"87dc53c55f73620bf5df44e2826c141e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-98.ec2.internal"
Apr 24 23:53:38.208523 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.208503 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/87dc53c55f73620bf5df44e2826c141e-config\") pod \"kube-apiserver-proxy-ip-10-0-129-98.ec2.internal\" (UID: \"87dc53c55f73620bf5df44e2826c141e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-98.ec2.internal"
Apr 24 23:53:38.208523 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.208506 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/32cc11a6fe1288d8e923d33bdeaf02c1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal\" (UID: \"32cc11a6fe1288d8e923d33bdeaf02c1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal"
Apr 24 23:53:38.208662 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.208506 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32cc11a6fe1288d8e923d33bdeaf02c1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal\" (UID: \"32cc11a6fe1288d8e923d33bdeaf02c1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal"
Apr 24 23:53:38.287861 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:38.287837 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-98.ec2.internal\" not found"
Apr 24 23:53:38.350394 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.350360 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal"
Apr 24 23:53:38.355017 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.355001 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-98.ec2.internal"
Apr 24 23:53:38.388757 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:38.388729 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-98.ec2.internal\" not found"
Apr 24 23:53:38.489295 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:38.489265 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-98.ec2.internal\" not found"
Apr 24 23:53:38.589787 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:38.589724 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-98.ec2.internal\" not found"
Apr 24 23:53:38.690276 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:38.690248 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-98.ec2.internal\" not found"
Apr 24 23:53:38.707587 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.707566 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 23:53:38.707705 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.707689 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 23:53:38.707756 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.707729 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 23:53:38.749775 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.749748 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:38.779673 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.779654 2578 apiserver.go:52] "Watching apiserver"
Apr 24 23:53:38.792830 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.792503 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 23:53:38.792903 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.792857 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-dwll4","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6","openshift-image-registry/node-ca-vdjkm","openshift-multus/network-metrics-daemon-rhtrz","openshift-network-operator/iptables-alerter-kngps","openshift-ovn-kubernetes/ovnkube-node-t4f4q","openshift-cluster-node-tuning-operator/tuned-tmhfx","openshift-dns/node-resolver-s2pf5","openshift-multus/multus-additional-cni-plugins-7zs6q","openshift-multus/multus-ktchn","openshift-network-diagnostics/network-check-target-w2qd9"]
Apr 24 23:53:38.794607 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.794589 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dwll4"
Apr 24 23:53:38.796578 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.796560 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tmhfx"
Apr 24 23:53:38.796874 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.796854 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 23:53:38.796874 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.796863 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-l24q2\""
Apr 24 23:53:38.796978 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.796869 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 23:53:38.797703 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.797683 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-s2pf5"
Apr 24 23:53:38.798278 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.798240 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 23:48:37 +0000 UTC" deadline="2027-09-25 17:12:55.468289642 +0000 UTC"
Apr 24 23:53:38.798278 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.798275 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12449h19m16.670017789s"
Apr 24 23:53:38.798862 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.798639 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ktchn"
Apr 24 23:53:38.798862 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.798656 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 23:53:38.798862 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.798721 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 23:53:38.798862 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.798649 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-tzfnr\""
Apr 24 23:53:38.799485 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.799469 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 23:53:38.799600 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.799584 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 23:53:38.799889 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.799866 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-d9czf\""
Apr 24 23:53:38.799976 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.799960 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6"
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6" Apr 24 23:53:38.800392 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.800375 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 23:53:38.800489 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.800477 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 23:53:38.800753 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.800738 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 23:53:38.800753 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.800748 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 23:53:38.801089 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.801073 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-fgsl7\"" Apr 24 23:53:38.801247 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.801231 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vdjkm" Apr 24 23:53:38.801366 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.801337 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:53:38.801528 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:38.801504 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29" Apr 24 23:53:38.801858 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.801839 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 23:53:38.802403 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.802379 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 23:53:38.802515 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.802487 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-5b882\"" Apr 24 23:53:38.802572 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.802555 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 23:53:38.802726 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.802704 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 23:53:38.803010 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.802993 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal" Apr 24 23:53:38.803251 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.803231 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 23:53:38.803318 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.803259 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-n62l4\"" Apr 24 23:53:38.803511 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.803493 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 23:53:38.804366 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.804347 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 23:53:38.805150 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.805128 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kngps" Apr 24 23:53:38.805646 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.805293 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.806781 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.806765 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7zs6q" Apr 24 23:53:38.807862 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.807842 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:53:38.807953 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:38.807890 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1" Apr 24 23:53:38.809268 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.809249 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 23:53:38.809337 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.809298 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-q4glt\"" Apr 24 23:53:38.809834 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.809816 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 23:53:38.809935 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.809867 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 23:53:38.809935 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.809818 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 23:53:38.809935 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.809901 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-n52cn\"" Apr 24 23:53:38.809935 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.809919 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 23:53:38.811000 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.810384 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 23:53:38.811000 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.810388 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 23:53:38.811000 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.810602 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 23:53:38.811000 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.810783 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 23:53:38.811000 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.810844 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 23:53:38.811269 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.811214 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 23:53:38.811393 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.811375 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-r5stf\"" Apr 24 23:53:38.811980 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.811948 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/31f651c0-8e2e-4e85-b153-94f4291085b1-iptables-alerter-script\") pod \"iptables-alerter-kngps\" (UID: 
\"31f651c0-8e2e-4e85-b153-94f4291085b1\") " pod="openshift-network-operator/iptables-alerter-kngps" Apr 24 23:53:38.812048 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.811994 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-sys\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.812048 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812025 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-var-lib-kubelet\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.812144 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812056 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-host-var-lib-cni-multus\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.812144 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812089 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-host-run-netns\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.812144 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812139 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b50de4c3-3440-4c81-81ac-23466ec3f726-hosts-file\") pod \"node-resolver-s2pf5\" (UID: \"b50de4c3-3440-4c81-81ac-23466ec3f726\") " pod="openshift-dns/node-resolver-s2pf5" Apr 24 23:53:38.812290 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812168 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-host-run-netns\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.812290 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812196 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/220c5498-d45f-48c2-a25e-01ac23225100-host\") pod \"node-ca-vdjkm\" (UID: \"220c5498-d45f-48c2-a25e-01ac23225100\") " pod="openshift-image-registry/node-ca-vdjkm" Apr 24 23:53:38.812290 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812234 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4193b598-cb84-4f01-b039-cd235fe68381-sys-fs\") pod \"aws-ebs-csi-driver-node-7w9t6\" (UID: \"4193b598-cb84-4f01-b039-cd235fe68381\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6" Apr 24 23:53:38.812290 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812258 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-var-lib-openvswitch\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.812290 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812287 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0939cda-0079-43e5-b1be-4f8099b11f56-ovn-node-metrics-cert\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.812528 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812317 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj86q\" (UniqueName: \"kubernetes.io/projected/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-kube-api-access-tj86q\") pod \"network-metrics-daemon-rhtrz\" (UID: \"d3fe756c-b2b5-42bc-8234-bd6d59e5dd29\") " pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:53:38.812528 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812363 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/31f651c0-8e2e-4e85-b153-94f4291085b1-host-slash\") pod \"iptables-alerter-kngps\" (UID: \"31f651c0-8e2e-4e85-b153-94f4291085b1\") " pod="openshift-network-operator/iptables-alerter-kngps" Apr 24 23:53:38.812528 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812392 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-etc-modprobe-d\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.812528 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812436 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-host\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.812528 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812459 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-etc-kubernetes\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.812528 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812488 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-host-cni-bin\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.812528 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812514 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-etc-sysctl-conf\") pod \"tuned-tmhfx\" (UID: 
\"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.812820 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812541 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-etc-systemd\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.812820 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812569 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-multus-cni-dir\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.812820 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812598 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-cnibin\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.812820 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812629 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-host-cni-netd\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.812820 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812678 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvlcd\" (UniqueName: \"kubernetes.io/projected/31f651c0-8e2e-4e85-b153-94f4291085b1-kube-api-access-mvlcd\") pod \"iptables-alerter-kngps\" (UID: \"31f651c0-8e2e-4e85-b153-94f4291085b1\") " pod="openshift-network-operator/iptables-alerter-kngps" Apr 24 23:53:38.812820 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812706 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-log-socket\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.812820 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812734 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/edeca547-37b0-442b-95dc-712808101f9a-tmp\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.812820 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812760 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4193b598-cb84-4f01-b039-cd235fe68381-device-dir\") pod \"aws-ebs-csi-driver-node-7w9t6\" (UID: \"4193b598-cb84-4f01-b039-cd235fe68381\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6" Apr 24 23:53:38.812820 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812783 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-etc-kubernetes\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.812820 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812811 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4193b598-cb84-4f01-b039-cd235fe68381-registration-dir\") pod \"aws-ebs-csi-driver-node-7w9t6\" (UID: \"4193b598-cb84-4f01-b039-cd235fe68381\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6" Apr 24 23:53:38.813238 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812838 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-host-kubelet\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.813238 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812867 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-run-ovn\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.813238 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812895 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-metrics-certs\") pod \"network-metrics-daemon-rhtrz\" (UID: \"d3fe756c-b2b5-42bc-8234-bd6d59e5dd29\") " pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:53:38.813238 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812919 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2xl6\" (UniqueName: \"kubernetes.io/projected/edeca547-37b0-442b-95dc-712808101f9a-kube-api-access-j2xl6\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.813238 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.812949 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-host-var-lib-cni-bin\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.813238 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.813022 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-host-var-lib-kubelet\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.813238 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.813062 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-multus-conf-dir\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.813238 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.813111 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-host-run-multus-certs\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.813238 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.813137 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4193b598-cb84-4f01-b039-cd235fe68381-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7w9t6\" (UID: \"4193b598-cb84-4f01-b039-cd235fe68381\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6" Apr 24 23:53:38.813644 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.813271 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-etc-sysctl-d\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.813644 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.813358 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-lib-modules\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.813644 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.813434 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-os-release\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.813644 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.813483 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-multus-socket-dir-parent\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.813644 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.813529 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e0939cda-0079-43e5-b1be-4f8099b11f56-env-overrides\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.813644 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.813582 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-cni-binary-copy\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.813644 ip-10-0-129-98 
kubenswrapper[2578]: I0424 23:53:38.813627 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4193b598-cb84-4f01-b039-cd235fe68381-etc-selinux\") pod \"aws-ebs-csi-driver-node-7w9t6\" (UID: \"4193b598-cb84-4f01-b039-cd235fe68381\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6" Apr 24 23:53:38.813930 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.813669 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-run-systemd\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.813930 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.813709 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zcs7\" (UniqueName: \"kubernetes.io/projected/220c5498-d45f-48c2-a25e-01ac23225100-kube-api-access-6zcs7\") pod \"node-ca-vdjkm\" (UID: \"220c5498-d45f-48c2-a25e-01ac23225100\") " pod="openshift-image-registry/node-ca-vdjkm" Apr 24 23:53:38.813930 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.813806 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-host-run-k8s-cni-cncf-io\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.813930 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.813842 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-run\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.813930 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.813903 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/edeca547-37b0-442b-95dc-712808101f9a-etc-tuned\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.814142 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.813985 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gltvf\" (UniqueName: \"kubernetes.io/projected/b50de4c3-3440-4c81-81ac-23466ec3f726-kube-api-access-gltvf\") pod \"node-resolver-s2pf5\" (UID: \"b50de4c3-3440-4c81-81ac-23466ec3f726\") " pod="openshift-dns/node-resolver-s2pf5" Apr 24 23:53:38.814142 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.814045 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-system-cni-dir\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.814357 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.814341 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/35b60d0c-8fbf-4b7b-a0e6-31a46f1a96d2-konnectivity-ca\") pod \"konnectivity-agent-dwll4\" (UID: \"35b60d0c-8fbf-4b7b-a0e6-31a46f1a96d2\") " pod="kube-system/konnectivity-agent-dwll4" Apr 24 23:53:38.814436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.814381 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-etc-openvswitch\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.814436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.814428 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-multus-daemon-config\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.814520 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.814454 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpvkh\" (UniqueName: \"kubernetes.io/projected/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-kube-api-access-zpvkh\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.814520 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.814497 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/35b60d0c-8fbf-4b7b-a0e6-31a46f1a96d2-agent-certs\") pod \"konnectivity-agent-dwll4\" (UID: \"35b60d0c-8fbf-4b7b-a0e6-31a46f1a96d2\") " pod="kube-system/konnectivity-agent-dwll4" Apr 24 23:53:38.814600 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.814541 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-host-slash\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.814600 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.814565 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-run-openvswitch\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.814600 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.814587 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-node-log\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.814713 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.814611 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-host-run-ovn-kubernetes\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" 
Apr 24 23:53:38.814713 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.814634 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q"
Apr 24 23:53:38.814713 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.814671 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4193b598-cb84-4f01-b039-cd235fe68381-socket-dir\") pod \"aws-ebs-csi-driver-node-7w9t6\" (UID: \"4193b598-cb84-4f01-b039-cd235fe68381\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6"
Apr 24 23:53:38.814713 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.814698 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psbcv\" (UniqueName: \"kubernetes.io/projected/4193b598-cb84-4f01-b039-cd235fe68381-kube-api-access-psbcv\") pod \"aws-ebs-csi-driver-node-7w9t6\" (UID: \"4193b598-cb84-4f01-b039-cd235fe68381\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6"
Apr 24 23:53:38.814854 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.814723 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/220c5498-d45f-48c2-a25e-01ac23225100-serviceca\") pod \"node-ca-vdjkm\" (UID: \"220c5498-d45f-48c2-a25e-01ac23225100\") " pod="openshift-image-registry/node-ca-vdjkm"
Apr 24 23:53:38.814854 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.814738 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-hostroot\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn"
Apr 24 23:53:38.814854 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.814752 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0939cda-0079-43e5-b1be-4f8099b11f56-ovnkube-config\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q"
Apr 24 23:53:38.814854 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.814767 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e0939cda-0079-43e5-b1be-4f8099b11f56-ovnkube-script-lib\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q"
Apr 24 23:53:38.814854 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.814811 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdj97\" (UniqueName: \"kubernetes.io/projected/e0939cda-0079-43e5-b1be-4f8099b11f56-kube-api-access-sdj97\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q"
Apr 24 23:53:38.814854 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.814827 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-etc-sysconfig\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx"
Apr 24 23:53:38.815040 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.814856 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b50de4c3-3440-4c81-81ac-23466ec3f726-tmp-dir\") pod \"node-resolver-s2pf5\" (UID: \"b50de4c3-3440-4c81-81ac-23466ec3f726\") " pod="openshift-dns/node-resolver-s2pf5"
Apr 24 23:53:38.815040 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.814879 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-systemd-units\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q"
Apr 24 23:53:38.820296 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.820274 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 23:53:38.820400 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.820351 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-98.ec2.internal"
Apr 24 23:53:38.820530 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.820464 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal"]
Apr 24 23:53:38.822393 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.822373 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 23:53:38.829866 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.829849 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 23:53:38.830037 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.830018 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-129-98.ec2.internal"]
Apr 24 23:53:38.842978 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.842958 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-xt7xt"
Apr 24 23:53:38.850586 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.850569 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-xt7xt"
Apr 24 23:53:38.904230 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.904206 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 23:53:38.915033 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915002 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tj86q\" (UniqueName: \"kubernetes.io/projected/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-kube-api-access-tj86q\") pod \"network-metrics-daemon-rhtrz\" (UID: \"d3fe756c-b2b5-42bc-8234-bd6d59e5dd29\") " pod="openshift-multus/network-metrics-daemon-rhtrz"
Apr 24 23:53:38.915134 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915044 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/31f651c0-8e2e-4e85-b153-94f4291085b1-host-slash\") pod \"iptables-alerter-kngps\" (UID: \"31f651c0-8e2e-4e85-b153-94f4291085b1\") " pod="openshift-network-operator/iptables-alerter-kngps"
Apr 24 23:53:38.915134 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915069 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-etc-modprobe-d\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx"
Apr 24 23:53:38.915134 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915093 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-host\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx"
Apr 24 23:53:38.915285 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915247 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-etc-kubernetes\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn"
Apr 24 23:53:38.915285 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915277 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-host-cni-bin\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q"
Apr 24 23:53:38.915390 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915301 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-host\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx"
Apr 24 23:53:38.915390 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915308 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d89d33b9-52c1-474f-a5b8-221754ae1cc6-system-cni-dir\") pod \"multus-additional-cni-plugins-7zs6q\" (UID: \"d89d33b9-52c1-474f-a5b8-221754ae1cc6\") " pod="openshift-multus/multus-additional-cni-plugins-7zs6q"
Apr 24 23:53:38.915390 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915337 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-etc-sysctl-conf\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx"
Apr 24 23:53:38.915390 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915362 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-etc-systemd\") pod \"tuned-tmhfx\" (UID:
\"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.915390 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915370 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/31f651c0-8e2e-4e85-b153-94f4291085b1-host-slash\") pod \"iptables-alerter-kngps\" (UID: \"31f651c0-8e2e-4e85-b153-94f4291085b1\") " pod="openshift-network-operator/iptables-alerter-kngps" Apr 24 23:53:38.915390 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915386 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-multus-cni-dir\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.915694 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915432 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-cnibin\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.915694 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915439 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-etc-kubernetes\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.915694 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915464 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-host-cni-netd\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.915694 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915481 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-etc-modprobe-d\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.915694 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915491 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvlcd\" (UniqueName: \"kubernetes.io/projected/31f651c0-8e2e-4e85-b153-94f4291085b1-kube-api-access-mvlcd\") pod \"iptables-alerter-kngps\" (UID: \"31f651c0-8e2e-4e85-b153-94f4291085b1\") " pod="openshift-network-operator/iptables-alerter-kngps" Apr 24 23:53:38.915694 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915494 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-etc-systemd\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.915694 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915373 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-host-cni-bin\") pod \"ovnkube-node-t4f4q\" (UID: 
\"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.915694 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915536 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-host-cni-netd\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.915694 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915593 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-etc-sysctl-conf\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.915694 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915587 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-multus-cni-dir\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.915694 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915638 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-log-socket\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.915694 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915647 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-cnibin\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.915694 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915664 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/edeca547-37b0-442b-95dc-712808101f9a-tmp\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.915694 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915683 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-log-socket\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.915694 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915698 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4193b598-cb84-4f01-b039-cd235fe68381-device-dir\") pod \"aws-ebs-csi-driver-node-7w9t6\" (UID: \"4193b598-cb84-4f01-b039-cd235fe68381\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6" Apr 24 23:53:38.916356 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915728 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d89d33b9-52c1-474f-a5b8-221754ae1cc6-os-release\") pod \"multus-additional-cni-plugins-7zs6q\" (UID: 
\"d89d33b9-52c1-474f-a5b8-221754ae1cc6\") " pod="openshift-multus/multus-additional-cni-plugins-7zs6q" Apr 24 23:53:38.916356 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915756 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d89d33b9-52c1-474f-a5b8-221754ae1cc6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7zs6q\" (UID: \"d89d33b9-52c1-474f-a5b8-221754ae1cc6\") " pod="openshift-multus/multus-additional-cni-plugins-7zs6q" Apr 24 23:53:38.916356 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915786 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-etc-kubernetes\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.916356 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915791 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4193b598-cb84-4f01-b039-cd235fe68381-device-dir\") pod \"aws-ebs-csi-driver-node-7w9t6\" (UID: \"4193b598-cb84-4f01-b039-cd235fe68381\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6" Apr 24 23:53:38.916356 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915827 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4193b598-cb84-4f01-b039-cd235fe68381-registration-dir\") pod \"aws-ebs-csi-driver-node-7w9t6\" (UID: \"4193b598-cb84-4f01-b039-cd235fe68381\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6" Apr 24 23:53:38.916356 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915848 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-etc-kubernetes\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.916356 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915886 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-host-kubelet\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.916356 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915912 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-run-ovn\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.916356 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915918 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 23:53:38.916356 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915937 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-metrics-certs\") pod \"network-metrics-daemon-rhtrz\" (UID: \"d3fe756c-b2b5-42bc-8234-bd6d59e5dd29\") " pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:53:38.916356 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915940 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-host-kubelet\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.916356 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915979 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2xl6\" (UniqueName: \"kubernetes.io/projected/edeca547-37b0-442b-95dc-712808101f9a-kube-api-access-j2xl6\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.916356 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916002 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-host-var-lib-cni-bin\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.916356 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.915979 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-run-ovn\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.916356 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916046 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-host-var-lib-kubelet\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.916356 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:38.916051 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:38.916356 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916072 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-multus-conf-dir\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.916356 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916105 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-host-run-multus-certs\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.917004 ip-10-0-129-98 
kubenswrapper[2578]: E0424 23:53:38.916137 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-metrics-certs podName:d3fe756c-b2b5-42bc-8234-bd6d59e5dd29 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:39.416105029 +0000 UTC m=+2.050086957 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-metrics-certs") pod "network-metrics-daemon-rhtrz" (UID: "d3fe756c-b2b5-42bc-8234-bd6d59e5dd29") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:38.917004 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916151 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-host-run-multus-certs\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.917004 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916177 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-host-var-lib-cni-bin\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.917004 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916212 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4193b598-cb84-4f01-b039-cd235fe68381-registration-dir\") pod \"aws-ebs-csi-driver-node-7w9t6\" (UID: \"4193b598-cb84-4f01-b039-cd235fe68381\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6" Apr 24 23:53:38.917004 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:38.916252 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87dc53c55f73620bf5df44e2826c141e.slice/crio-ab58bc99f9027988a04e67b21ba3b1a5285db2458895e1170412bf9072b402ac WatchSource:0}: Error finding container ab58bc99f9027988a04e67b21ba3b1a5285db2458895e1170412bf9072b402ac: Status 404 returned error can't find the container with id ab58bc99f9027988a04e67b21ba3b1a5285db2458895e1170412bf9072b402ac Apr 24 23:53:38.917004 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916299 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-host-var-lib-kubelet\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.917004 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916343 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4193b598-cb84-4f01-b039-cd235fe68381-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7w9t6\" (UID: \"4193b598-cb84-4f01-b039-cd235fe68381\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6" Apr 24 23:53:38.917004 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916396 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsfsm\" (UniqueName: \"kubernetes.io/projected/d89d33b9-52c1-474f-a5b8-221754ae1cc6-kube-api-access-xsfsm\") pod 
\"multus-additional-cni-plugins-7zs6q\" (UID: \"d89d33b9-52c1-474f-a5b8-221754ae1cc6\") " pod="openshift-multus/multus-additional-cni-plugins-7zs6q" Apr 24 23:53:38.917004 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916445 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-etc-sysctl-d\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.917004 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916450 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-multus-conf-dir\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.917004 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916490 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-lib-modules\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.917004 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916510 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4193b598-cb84-4f01-b039-cd235fe68381-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7w9t6\" (UID: \"4193b598-cb84-4f01-b039-cd235fe68381\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6" Apr 24 23:53:38.917004 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916514 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-os-release\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.917004 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916551 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-multus-socket-dir-parent\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.917004 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916573 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-os-release\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.917004 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916577 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e0939cda-0079-43e5-b1be-4f8099b11f56-env-overrides\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.917502 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916635 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-multus-socket-dir-parent\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.917502 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916652 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-etc-sysctl-d\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.917502 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916663 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-cni-binary-copy\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.917502 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916687 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4193b598-cb84-4f01-b039-cd235fe68381-etc-selinux\") pod \"aws-ebs-csi-driver-node-7w9t6\" (UID: \"4193b598-cb84-4f01-b039-cd235fe68381\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6" Apr 24 23:53:38.917502 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916728 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-run-systemd\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.917502 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916751 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zcs7\" (UniqueName: \"kubernetes.io/projected/220c5498-d45f-48c2-a25e-01ac23225100-kube-api-access-6zcs7\") pod \"node-ca-vdjkm\" (UID: \"220c5498-d45f-48c2-a25e-01ac23225100\") " pod="openshift-image-registry/node-ca-vdjkm" Apr 24 23:53:38.917502 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916776 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-host-run-k8s-cni-cncf-io\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.917502 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916802 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-lib-modules\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.917502 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916816 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-run\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.917502 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916840 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" 
(UniqueName: \"kubernetes.io/empty-dir/edeca547-37b0-442b-95dc-712808101f9a-etc-tuned\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.917502 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916851 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4193b598-cb84-4f01-b039-cd235fe68381-etc-selinux\") pod \"aws-ebs-csi-driver-node-7w9t6\" (UID: \"4193b598-cb84-4f01-b039-cd235fe68381\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6" Apr 24 23:53:38.917502 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916881 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gltvf\" (UniqueName: \"kubernetes.io/projected/b50de4c3-3440-4c81-81ac-23466ec3f726-kube-api-access-gltvf\") pod \"node-resolver-s2pf5\" (UID: \"b50de4c3-3440-4c81-81ac-23466ec3f726\") " pod="openshift-dns/node-resolver-s2pf5" Apr 24 23:53:38.917502 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:38.916903 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32cc11a6fe1288d8e923d33bdeaf02c1.slice/crio-11b0023c78a7e2a29e71b8c1d2b64fc5e2487e3a4aa7ecd9894a04cb36c19821 WatchSource:0}: Error finding container 11b0023c78a7e2a29e71b8c1d2b64fc5e2487e3a4aa7ecd9894a04cb36c19821: Status 404 returned error can't find the container with id 11b0023c78a7e2a29e71b8c1d2b64fc5e2487e3a4aa7ecd9894a04cb36c19821 Apr 24 23:53:38.917502 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.916907 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-system-cni-dir\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.917502 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917049 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/35b60d0c-8fbf-4b7b-a0e6-31a46f1a96d2-konnectivity-ca\") pod \"konnectivity-agent-dwll4\" (UID: \"35b60d0c-8fbf-4b7b-a0e6-31a46f1a96d2\") " pod="kube-system/konnectivity-agent-dwll4" Apr 24 23:53:38.917502 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917080 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-etc-openvswitch\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.917502 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917074 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e0939cda-0079-43e5-b1be-4f8099b11f56-env-overrides\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.918448 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917108 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-multus-daemon-config\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " 
pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.918448 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917134 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpvkh\" (UniqueName: \"kubernetes.io/projected/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-kube-api-access-zpvkh\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.918448 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917162 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/35b60d0c-8fbf-4b7b-a0e6-31a46f1a96d2-agent-certs\") pod \"konnectivity-agent-dwll4\" (UID: \"35b60d0c-8fbf-4b7b-a0e6-31a46f1a96d2\") " pod="kube-system/konnectivity-agent-dwll4" Apr 24 23:53:38.918448 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917164 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-run-systemd\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.918448 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917246 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-host-slash\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.918448 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917288 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-run-openvswitch\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.918448 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917323 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-node-log\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.918448 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917349 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-host-run-ovn-kubernetes\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.918448 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917382 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.918448 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917435 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4193b598-cb84-4f01-b039-cd235fe68381-socket-dir\") pod 
\"aws-ebs-csi-driver-node-7w9t6\" (UID: \"4193b598-cb84-4f01-b039-cd235fe68381\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6" Apr 24 23:53:38.918448 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917466 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-psbcv\" (UniqueName: \"kubernetes.io/projected/4193b598-cb84-4f01-b039-cd235fe68381-kube-api-access-psbcv\") pod \"aws-ebs-csi-driver-node-7w9t6\" (UID: \"4193b598-cb84-4f01-b039-cd235fe68381\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6" Apr 24 23:53:38.918448 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917494 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/220c5498-d45f-48c2-a25e-01ac23225100-serviceca\") pod \"node-ca-vdjkm\" (UID: \"220c5498-d45f-48c2-a25e-01ac23225100\") " pod="openshift-image-registry/node-ca-vdjkm" Apr 24 23:53:38.918448 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917585 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-hostroot\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.918448 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917614 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0939cda-0079-43e5-b1be-4f8099b11f56-ovnkube-config\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.918448 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917637 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e0939cda-0079-43e5-b1be-4f8099b11f56-ovnkube-script-lib\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.918448 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917671 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4193b598-cb84-4f01-b039-cd235fe68381-socket-dir\") pod \"aws-ebs-csi-driver-node-7w9t6\" (UID: \"4193b598-cb84-4f01-b039-cd235fe68381\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6" Apr 24 23:53:38.918448 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917685 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-cni-binary-copy\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.919287 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917693 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdj97\" (UniqueName: \"kubernetes.io/projected/e0939cda-0079-43e5-b1be-4f8099b11f56-kube-api-access-sdj97\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.919287 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917732 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d89d33b9-52c1-474f-a5b8-221754ae1cc6-cni-binary-copy\") pod \"multus-additional-cni-plugins-7zs6q\" (UID: \"d89d33b9-52c1-474f-a5b8-221754ae1cc6\") " pod="openshift-multus/multus-additional-cni-plugins-7zs6q" Apr 24 23:53:38.919287 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917752 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-run-openvswitch\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.919287 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917765 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xgh5\" (UniqueName: \"kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5\") pod \"network-check-target-w2qd9\" (UID: \"ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1\") " pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:53:38.919287 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917792 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-etc-sysconfig\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.919287 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917798 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-hostroot\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.919287 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917820 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b50de4c3-3440-4c81-81ac-23466ec3f726-tmp-dir\") pod \"node-resolver-s2pf5\" (UID: \"b50de4c3-3440-4c81-81ac-23466ec3f726\") " pod="openshift-dns/node-resolver-s2pf5" Apr 24 23:53:38.919287 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917858 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-systemd-units\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.919287 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.917910 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d89d33b9-52c1-474f-a5b8-221754ae1cc6-cnibin\") pod \"multus-additional-cni-plugins-7zs6q\" (UID: \"d89d33b9-52c1-474f-a5b8-221754ae1cc6\") " pod="openshift-multus/multus-additional-cni-plugins-7zs6q" Apr 24 23:53:38.919287 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.918013 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d89d33b9-52c1-474f-a5b8-221754ae1cc6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7zs6q\" (UID: \"d89d33b9-52c1-474f-a5b8-221754ae1cc6\") " pod="openshift-multus/multus-additional-cni-plugins-7zs6q" Apr 24 
23:53:38.919287 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.918079 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/31f651c0-8e2e-4e85-b153-94f4291085b1-iptables-alerter-script\") pod \"iptables-alerter-kngps\" (UID: \"31f651c0-8e2e-4e85-b153-94f4291085b1\") " pod="openshift-network-operator/iptables-alerter-kngps" Apr 24 23:53:38.919287 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.918108 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-sys\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.919287 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.918114 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b50de4c3-3440-4c81-81ac-23466ec3f726-tmp-dir\") pod \"node-resolver-s2pf5\" (UID: \"b50de4c3-3440-4c81-81ac-23466ec3f726\") " pod="openshift-dns/node-resolver-s2pf5" Apr 24 23:53:38.919287 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.918137 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-var-lib-kubelet\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.919287 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.918172 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/220c5498-d45f-48c2-a25e-01ac23225100-serviceca\") pod \"node-ca-vdjkm\" (UID: \"220c5498-d45f-48c2-a25e-01ac23225100\") " pod="openshift-image-registry/node-ca-vdjkm" Apr 24 23:53:38.919287 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.918170 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-host-var-lib-cni-multus\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.919287 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.918225 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-node-log\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.920174 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.918273 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-host-run-ovn-kubernetes\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.920174 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.918318 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.920174 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.918597 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-host-run-k8s-cni-cncf-io\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.920174 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.918697 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0939cda-0079-43e5-b1be-4f8099b11f56-ovnkube-config\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.920174 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.919232 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-host-var-lib-cni-multus\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.920174 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.919278 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-systemd-units\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.920174 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.919299 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/edeca547-37b0-442b-95dc-712808101f9a-tmp\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.920174 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.919824 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e0939cda-0079-43e5-b1be-4f8099b11f56-ovnkube-script-lib\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.920819 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.920801 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-sys\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.920912 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.920793 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/31f651c0-8e2e-4e85-b153-94f4291085b1-iptables-alerter-script\") pod \"iptables-alerter-kngps\" (UID: \"31f651c0-8e2e-4e85-b153-94f4291085b1\") " pod="openshift-network-operator/iptables-alerter-kngps" Apr 24 23:53:38.920989 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.920958 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/35b60d0c-8fbf-4b7b-a0e6-31a46f1a96d2-agent-certs\") pod \"konnectivity-agent-dwll4\" (UID: \"35b60d0c-8fbf-4b7b-a0e6-31a46f1a96d2\") " 
pod="kube-system/konnectivity-agent-dwll4" Apr 24 23:53:38.920989 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.920977 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-var-lib-kubelet\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.921136 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.921076 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-run\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.921577 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.921558 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/35b60d0c-8fbf-4b7b-a0e6-31a46f1a96d2-konnectivity-ca\") pod \"konnectivity-agent-dwll4\" (UID: \"35b60d0c-8fbf-4b7b-a0e6-31a46f1a96d2\") " pod="kube-system/konnectivity-agent-dwll4" Apr 24 23:53:38.921687 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.921671 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-system-cni-dir\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.922018 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.921996 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-host-slash\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.922109 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.922095 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/edeca547-37b0-442b-95dc-712808101f9a-etc-sysconfig\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" Apr 24 23:53:38.924098 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.922660 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-host-run-netns\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.924098 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.922705 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b50de4c3-3440-4c81-81ac-23466ec3f726-hosts-file\") pod \"node-resolver-s2pf5\" (UID: \"b50de4c3-3440-4c81-81ac-23466ec3f726\") " pod="openshift-dns/node-resolver-s2pf5" Apr 24 23:53:38.924098 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.922733 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-host-run-netns\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.924098 
ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.922764 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/220c5498-d45f-48c2-a25e-01ac23225100-host\") pod \"node-ca-vdjkm\" (UID: \"220c5498-d45f-48c2-a25e-01ac23225100\") " pod="openshift-image-registry/node-ca-vdjkm" Apr 24 23:53:38.924098 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.922773 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-etc-openvswitch\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.924098 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.922848 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/220c5498-d45f-48c2-a25e-01ac23225100-host\") pod \"node-ca-vdjkm\" (UID: \"220c5498-d45f-48c2-a25e-01ac23225100\") " pod="openshift-image-registry/node-ca-vdjkm" Apr 24 23:53:38.924098 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.922849 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b50de4c3-3440-4c81-81ac-23466ec3f726-hosts-file\") pod \"node-resolver-s2pf5\" (UID: \"b50de4c3-3440-4c81-81ac-23466ec3f726\") " pod="openshift-dns/node-resolver-s2pf5" Apr 24 23:53:38.924098 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.922886 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4193b598-cb84-4f01-b039-cd235fe68381-sys-fs\") pod \"aws-ebs-csi-driver-node-7w9t6\" (UID: \"4193b598-cb84-4f01-b039-cd235fe68381\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6" Apr 24 23:53:38.924098 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.922919 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-host-run-netns\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn" Apr 24 23:53:38.924098 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.922958 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-var-lib-openvswitch\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.924098 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.922996 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0939cda-0079-43e5-b1be-4f8099b11f56-ovn-node-metrics-cert\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.924098 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.923002 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-host-run-netns\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:53:38.924098 ip-10-0-129-98 kubenswrapper[2578]: 
Apr 24 23:53:38.924098 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.923056 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0939cda-0079-43e5-b1be-4f8099b11f56-var-lib-openvswitch\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q"
Apr 24 23:53:38.924098 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.923113 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4193b598-cb84-4f01-b039-cd235fe68381-sys-fs\") pod \"aws-ebs-csi-driver-node-7w9t6\" (UID: \"4193b598-cb84-4f01-b039-cd235fe68381\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6"
Apr 24 23:53:38.924098 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.923301 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d89d33b9-52c1-474f-a5b8-221754ae1cc6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7zs6q\" (UID: \"d89d33b9-52c1-474f-a5b8-221754ae1cc6\") " pod="openshift-multus/multus-additional-cni-plugins-7zs6q"
Apr 24 23:53:38.924979 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.924400 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-multus-daemon-config\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn"
Apr 24 23:53:38.924979 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.924860 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvlcd\" (UniqueName: \"kubernetes.io/projected/31f651c0-8e2e-4e85-b153-94f4291085b1-kube-api-access-mvlcd\") pod \"iptables-alerter-kngps\" (UID: \"31f651c0-8e2e-4e85-b153-94f4291085b1\") " pod="openshift-network-operator/iptables-alerter-kngps"
Apr 24 23:53:38.925849 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.925824 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/edeca547-37b0-442b-95dc-712808101f9a-etc-tuned\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx"
Apr 24 23:53:38.927034 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.927006 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj86q\" (UniqueName: \"kubernetes.io/projected/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-kube-api-access-tj86q\") pod \"network-metrics-daemon-rhtrz\" (UID: \"d3fe756c-b2b5-42bc-8234-bd6d59e5dd29\") " pod="openshift-multus/network-metrics-daemon-rhtrz"
Apr 24 23:53:38.927254 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.927228 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gltvf\" (UniqueName: \"kubernetes.io/projected/b50de4c3-3440-4c81-81ac-23466ec3f726-kube-api-access-gltvf\") pod \"node-resolver-s2pf5\" (UID: \"b50de4c3-3440-4c81-81ac-23466ec3f726\") " pod="openshift-dns/node-resolver-s2pf5"
Apr 24 23:53:38.927581 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.927558 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0939cda-0079-43e5-b1be-4f8099b11f56-ovn-node-metrics-cert\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q"
Apr 24 23:53:38.928373 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.928352 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 23:53:38.928661 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.928577 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psbcv\" (UniqueName: \"kubernetes.io/projected/4193b598-cb84-4f01-b039-cd235fe68381-kube-api-access-psbcv\") pod \"aws-ebs-csi-driver-node-7w9t6\" (UID: \"4193b598-cb84-4f01-b039-cd235fe68381\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6"
Apr 24 23:53:38.929002 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.928979 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2xl6\" (UniqueName: \"kubernetes.io/projected/edeca547-37b0-442b-95dc-712808101f9a-kube-api-access-j2xl6\") pod \"tuned-tmhfx\" (UID: \"edeca547-37b0-442b-95dc-712808101f9a\") " pod="openshift-cluster-node-tuning-operator/tuned-tmhfx"
Apr 24 23:53:38.930530 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.930514 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zcs7\" (UniqueName: \"kubernetes.io/projected/220c5498-d45f-48c2-a25e-01ac23225100-kube-api-access-6zcs7\") pod \"node-ca-vdjkm\" (UID: \"220c5498-d45f-48c2-a25e-01ac23225100\") " pod="openshift-image-registry/node-ca-vdjkm"
Apr 24 23:53:38.930993 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.930978 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpvkh\" (UniqueName: \"kubernetes.io/projected/8566f65b-b13b-4b52-8b4d-8dcbd70b502a-kube-api-access-zpvkh\") pod \"multus-ktchn\" (UID: \"8566f65b-b13b-4b52-8b4d-8dcbd70b502a\") " pod="openshift-multus/multus-ktchn"
Apr 24 23:53:38.931513 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:38.931496 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdj97\" (UniqueName: \"kubernetes.io/projected/e0939cda-0079-43e5-b1be-4f8099b11f56-kube-api-access-sdj97\") pod \"ovnkube-node-t4f4q\" (UID: \"e0939cda-0079-43e5-b1be-4f8099b11f56\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q"
Apr 24 23:53:39.024329 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.024297 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d89d33b9-52c1-474f-a5b8-221754ae1cc6-cni-binary-copy\") pod \"multus-additional-cni-plugins-7zs6q\" (UID: \"d89d33b9-52c1-474f-a5b8-221754ae1cc6\") " pod="openshift-multus/multus-additional-cni-plugins-7zs6q"
Apr 24 23:53:39.024329 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.024324 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xgh5\" (UniqueName: \"kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5\") pod \"network-check-target-w2qd9\" (UID: \"ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1\") " pod="openshift-network-diagnostics/network-check-target-w2qd9"
pod="openshift-multus/multus-additional-cni-plugins-7zs6q" Apr 24 23:53:39.024559 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.024387 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d89d33b9-52c1-474f-a5b8-221754ae1cc6-cnibin\") pod \"multus-additional-cni-plugins-7zs6q\" (UID: \"d89d33b9-52c1-474f-a5b8-221754ae1cc6\") " pod="openshift-multus/multus-additional-cni-plugins-7zs6q" Apr 24 23:53:39.024559 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.024460 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d89d33b9-52c1-474f-a5b8-221754ae1cc6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7zs6q\" (UID: \"d89d33b9-52c1-474f-a5b8-221754ae1cc6\") " pod="openshift-multus/multus-additional-cni-plugins-7zs6q" Apr 24 23:53:39.024559 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.024526 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d89d33b9-52c1-474f-a5b8-221754ae1cc6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7zs6q\" (UID: \"d89d33b9-52c1-474f-a5b8-221754ae1cc6\") " pod="openshift-multus/multus-additional-cni-plugins-7zs6q" Apr 24 23:53:39.024753 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.024565 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d89d33b9-52c1-474f-a5b8-221754ae1cc6-system-cni-dir\") pod \"multus-additional-cni-plugins-7zs6q\" (UID: \"d89d33b9-52c1-474f-a5b8-221754ae1cc6\") " pod="openshift-multus/multus-additional-cni-plugins-7zs6q" Apr 24 23:53:39.024753 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.024612 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d89d33b9-52c1-474f-a5b8-221754ae1cc6-os-release\") pod \"multus-additional-cni-plugins-7zs6q\" (UID: \"d89d33b9-52c1-474f-a5b8-221754ae1cc6\") " pod="openshift-multus/multus-additional-cni-plugins-7zs6q" Apr 24 23:53:39.024753 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.024632 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d89d33b9-52c1-474f-a5b8-221754ae1cc6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7zs6q\" (UID: \"d89d33b9-52c1-474f-a5b8-221754ae1cc6\") " pod="openshift-multus/multus-additional-cni-plugins-7zs6q" Apr 24 23:53:39.024753 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.024637 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d89d33b9-52c1-474f-a5b8-221754ae1cc6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7zs6q\" (UID: \"d89d33b9-52c1-474f-a5b8-221754ae1cc6\") " pod="openshift-multus/multus-additional-cni-plugins-7zs6q" Apr 24 23:53:39.024753 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.024702 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d89d33b9-52c1-474f-a5b8-221754ae1cc6-os-release\") pod \"multus-additional-cni-plugins-7zs6q\" (UID: \"d89d33b9-52c1-474f-a5b8-221754ae1cc6\") " pod="openshift-multus/multus-additional-cni-plugins-7zs6q" Apr 24 23:53:39.024753 ip-10-0-129-98 kubenswrapper[2578]: I0424 
23:53:39.024722 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xsfsm\" (UniqueName: \"kubernetes.io/projected/d89d33b9-52c1-474f-a5b8-221754ae1cc6-kube-api-access-xsfsm\") pod \"multus-additional-cni-plugins-7zs6q\" (UID: \"d89d33b9-52c1-474f-a5b8-221754ae1cc6\") " pod="openshift-multus/multus-additional-cni-plugins-7zs6q" Apr 24 23:53:39.024753 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.024739 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d89d33b9-52c1-474f-a5b8-221754ae1cc6-system-cni-dir\") pod \"multus-additional-cni-plugins-7zs6q\" (UID: \"d89d33b9-52c1-474f-a5b8-221754ae1cc6\") " pod="openshift-multus/multus-additional-cni-plugins-7zs6q" Apr 24 23:53:39.025083 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.024891 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d89d33b9-52c1-474f-a5b8-221754ae1cc6-cni-binary-copy\") pod \"multus-additional-cni-plugins-7zs6q\" (UID: \"d89d33b9-52c1-474f-a5b8-221754ae1cc6\") " pod="openshift-multus/multus-additional-cni-plugins-7zs6q" Apr 24 23:53:39.025083 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.025071 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d89d33b9-52c1-474f-a5b8-221754ae1cc6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7zs6q\" (UID: \"d89d33b9-52c1-474f-a5b8-221754ae1cc6\") " pod="openshift-multus/multus-additional-cni-plugins-7zs6q" Apr 24 23:53:39.025708 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.025693 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d89d33b9-52c1-474f-a5b8-221754ae1cc6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7zs6q\" (UID: \"d89d33b9-52c1-474f-a5b8-221754ae1cc6\") " pod="openshift-multus/multus-additional-cni-plugins-7zs6q" Apr 24 23:53:39.030215 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:39.030197 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:39.030275 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:39.030219 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:39.030275 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:39.030233 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8xgh5 for pod openshift-network-diagnostics/network-check-target-w2qd9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:39.030404 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:39.030391 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5 podName:ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:39.530372795 +0000 UTC m=+2.164354726 (durationBeforeRetry 500ms). 
Apr 24 23:53:39.030404 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:39.030391 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5 podName:ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:39.530372795 +0000 UTC m=+2.164354726 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-8xgh5" (UniqueName: "kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5") pod "network-check-target-w2qd9" (UID: "ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:39.032352 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.032336 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsfsm\" (UniqueName: \"kubernetes.io/projected/d89d33b9-52c1-474f-a5b8-221754ae1cc6-kube-api-access-xsfsm\") pod \"multus-additional-cni-plugins-7zs6q\" (UID: \"d89d33b9-52c1-474f-a5b8-221754ae1cc6\") " pod="openshift-multus/multus-additional-cni-plugins-7zs6q"
Apr 24 23:53:39.103184 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.103117 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:39.124060 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.124037 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dwll4"
Apr 24 23:53:39.130444 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:39.130405 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35b60d0c_8fbf_4b7b_a0e6_31a46f1a96d2.slice/crio-adba9af593cd1d0004bc2dd2b9a0ad90f65c0bebfe8809f7e87d6b2184a3ba8b WatchSource:0}: Error finding container adba9af593cd1d0004bc2dd2b9a0ad90f65c0bebfe8809f7e87d6b2184a3ba8b: Status 404 returned error can't find the container with id adba9af593cd1d0004bc2dd2b9a0ad90f65c0bebfe8809f7e87d6b2184a3ba8b
Apr 24 23:53:39.133264 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.133246 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q"
Apr 24 23:53:39.139400 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:39.139380 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0939cda_0079_43e5_b1be_4f8099b11f56.slice/crio-64d9ed7b2a69445a451663255e36cda87da0e8acb1b4ae7e53a787b11851e1d5 WatchSource:0}: Error finding container 64d9ed7b2a69445a451663255e36cda87da0e8acb1b4ae7e53a787b11851e1d5: Status 404 returned error can't find the container with id 64d9ed7b2a69445a451663255e36cda87da0e8acb1b4ae7e53a787b11851e1d5
Apr 24 23:53:39.146316 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.146302 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tmhfx"
Apr 24 23:53:39.151728 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:39.151709 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedeca547_37b0_442b_95dc_712808101f9a.slice/crio-f233c264280b36e92c2f697b95da8abbee474a554800bf2e8d9d2b09cc2e5a0f WatchSource:0}: Error finding container f233c264280b36e92c2f697b95da8abbee474a554800bf2e8d9d2b09cc2e5a0f: Status 404 returned error can't find the container with id f233c264280b36e92c2f697b95da8abbee474a554800bf2e8d9d2b09cc2e5a0f
Apr 24 23:53:39.155659 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.155646 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-s2pf5"
Apr 24 23:53:39.161361 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:39.161344 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb50de4c3_3440_4c81_81ac_23466ec3f726.slice/crio-82ece14b764c72a0f521cbc74a4a5d26bf06d998b06b3ec48c138d960d2d0f33 WatchSource:0}: Error finding container 82ece14b764c72a0f521cbc74a4a5d26bf06d998b06b3ec48c138d960d2d0f33: Status 404 returned error can't find the container with id 82ece14b764c72a0f521cbc74a4a5d26bf06d998b06b3ec48c138d960d2d0f33
Apr 24 23:53:39.168324 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.168308 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ktchn"
Apr 24 23:53:39.173822 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:39.173801 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8566f65b_b13b_4b52_8b4d_8dcbd70b502a.slice/crio-f62bd8bffd524eb1f2e4acf476b5b790d5346776b552c1cfb5b4514bd8492837 WatchSource:0}: Error finding container f62bd8bffd524eb1f2e4acf476b5b790d5346776b552c1cfb5b4514bd8492837: Status 404 returned error can't find the container with id f62bd8bffd524eb1f2e4acf476b5b790d5346776b552c1cfb5b4514bd8492837
Apr 24 23:53:39.186110 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.186094 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6"
Apr 24 23:53:39.194100 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:39.194084 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4193b598_cb84_4f01_b039_cd235fe68381.slice/crio-55052ec6570ca74fd3c70ea259f2bfe9774845fe5e778346d540d348aea51a08 WatchSource:0}: Error finding container 55052ec6570ca74fd3c70ea259f2bfe9774845fe5e778346d540d348aea51a08: Status 404 returned error can't find the container with id 55052ec6570ca74fd3c70ea259f2bfe9774845fe5e778346d540d348aea51a08
Apr 24 23:53:39.206642 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.206626 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vdjkm"
Apr 24 23:53:39.212165 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:39.211951 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod220c5498_d45f_48c2_a25e_01ac23225100.slice/crio-0de6ea305ecc607f58f479755f19b7e92cf63fa8d35b344ff41f5619e1e7dab8 WatchSource:0}: Error finding container 0de6ea305ecc607f58f479755f19b7e92cf63fa8d35b344ff41f5619e1e7dab8: Status 404 returned error can't find the container with id 0de6ea305ecc607f58f479755f19b7e92cf63fa8d35b344ff41f5619e1e7dab8
Need to start a new one" pod="openshift-network-operator/iptables-alerter-kngps" Apr 24 23:53:39.223200 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:39.223171 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31f651c0_8e2e_4e85_b153_94f4291085b1.slice/crio-f15b925fd326c462e380c3a82187dc8eb9ef51d4d2a5e6b5c5158f5e923ef4bb WatchSource:0}: Error finding container f15b925fd326c462e380c3a82187dc8eb9ef51d4d2a5e6b5c5158f5e923ef4bb: Status 404 returned error can't find the container with id f15b925fd326c462e380c3a82187dc8eb9ef51d4d2a5e6b5c5158f5e923ef4bb Apr 24 23:53:39.226768 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.226605 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7zs6q" Apr 24 23:53:39.235886 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:53:39.235860 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd89d33b9_52c1_474f_a5b8_221754ae1cc6.slice/crio-9e862c44985cac9d759956e98c52d575190748c5a9a4fa2f1018d38c7e1d2013 WatchSource:0}: Error finding container 9e862c44985cac9d759956e98c52d575190748c5a9a4fa2f1018d38c7e1d2013: Status 404 returned error can't find the container with id 9e862c44985cac9d759956e98c52d575190748c5a9a4fa2f1018d38c7e1d2013 Apr 24 23:53:39.428889 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.428803 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-metrics-certs\") pod \"network-metrics-daemon-rhtrz\" (UID: \"d3fe756c-b2b5-42bc-8234-bd6d59e5dd29\") " pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:53:39.429041 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:39.428977 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:39.429041 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:39.429038 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-metrics-certs podName:d3fe756c-b2b5-42bc-8234-bd6d59e5dd29 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:40.429018756 +0000 UTC m=+3.063000678 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-metrics-certs") pod "network-metrics-daemon-rhtrz" (UID: "d3fe756c-b2b5-42bc-8234-bd6d59e5dd29") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:39.616531 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.616501 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:53:39.629777 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.629671 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xgh5\" (UniqueName: \"kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5\") pod \"network-check-target-w2qd9\" (UID: \"ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1\") " pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:53:39.629958 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:39.629867 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:39.629958 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:39.629887 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:39.629958 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:39.629899 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8xgh5 for pod openshift-network-diagnostics/network-check-target-w2qd9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:39.629958 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:39.629956 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5 podName:ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:40.629937625 +0000 UTC m=+3.263919540 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-8xgh5" (UniqueName: "kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5") pod "network-check-target-w2qd9" (UID: "ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:39.851171 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.851129 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 23:48:38 +0000 UTC" deadline="2028-01-08 05:28:53.79507372 +0000 UTC" Apr 24 23:53:39.851171 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.851169 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14957h35m13.943908641s" Apr 24 23:53:39.910825 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.910142 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:53:39.910825 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:39.910262 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1" Apr 24 23:53:39.937251 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.937167 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6" event={"ID":"4193b598-cb84-4f01-b039-cd235fe68381","Type":"ContainerStarted","Data":"55052ec6570ca74fd3c70ea259f2bfe9774845fe5e778346d540d348aea51a08"} Apr 24 23:53:39.943347 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.943274 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s2pf5" event={"ID":"b50de4c3-3440-4c81-81ac-23466ec3f726","Type":"ContainerStarted","Data":"82ece14b764c72a0f521cbc74a4a5d26bf06d998b06b3ec48c138d960d2d0f33"} Apr 24 23:53:39.951664 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.951590 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" event={"ID":"edeca547-37b0-442b-95dc-712808101f9a","Type":"ContainerStarted","Data":"f233c264280b36e92c2f697b95da8abbee474a554800bf2e8d9d2b09cc2e5a0f"} Apr 24 23:53:39.956211 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.956139 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dwll4" event={"ID":"35b60d0c-8fbf-4b7b-a0e6-31a46f1a96d2","Type":"ContainerStarted","Data":"adba9af593cd1d0004bc2dd2b9a0ad90f65c0bebfe8809f7e87d6b2184a3ba8b"} Apr 24 23:53:39.963560 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.963526 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7zs6q" event={"ID":"d89d33b9-52c1-474f-a5b8-221754ae1cc6","Type":"ContainerStarted","Data":"9e862c44985cac9d759956e98c52d575190748c5a9a4fa2f1018d38c7e1d2013"} Apr 24 23:53:39.969479 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.969442 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vdjkm" event={"ID":"220c5498-d45f-48c2-a25e-01ac23225100","Type":"ContainerStarted","Data":"0de6ea305ecc607f58f479755f19b7e92cf63fa8d35b344ff41f5619e1e7dab8"} Apr 24 23:53:39.977581 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.977524 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ktchn" event={"ID":"8566f65b-b13b-4b52-8b4d-8dcbd70b502a","Type":"ContainerStarted","Data":"f62bd8bffd524eb1f2e4acf476b5b790d5346776b552c1cfb5b4514bd8492837"} Apr 24 23:53:39.987033 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.986999 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" event={"ID":"e0939cda-0079-43e5-b1be-4f8099b11f56","Type":"ContainerStarted","Data":"64d9ed7b2a69445a451663255e36cda87da0e8acb1b4ae7e53a787b11851e1d5"} Apr 24 23:53:39.991048 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:39.991022 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-98.ec2.internal" 
event={"ID":"87dc53c55f73620bf5df44e2826c141e","Type":"ContainerStarted","Data":"ab58bc99f9027988a04e67b21ba3b1a5285db2458895e1170412bf9072b402ac"} Apr 24 23:53:40.007265 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:40.007207 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal" event={"ID":"32cc11a6fe1288d8e923d33bdeaf02c1","Type":"ContainerStarted","Data":"11b0023c78a7e2a29e71b8c1d2b64fc5e2487e3a4aa7ecd9894a04cb36c19821"} Apr 24 23:53:40.011628 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:40.011488 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kngps" event={"ID":"31f651c0-8e2e-4e85-b153-94f4291085b1","Type":"ContainerStarted","Data":"f15b925fd326c462e380c3a82187dc8eb9ef51d4d2a5e6b5c5158f5e923ef4bb"} Apr 24 23:53:40.171202 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:40.170937 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:53:40.438436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:40.437766 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-metrics-certs\") pod \"network-metrics-daemon-rhtrz\" (UID: \"d3fe756c-b2b5-42bc-8234-bd6d59e5dd29\") " pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:53:40.438436 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:40.437933 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:40.438436 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:40.437997 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-metrics-certs podName:d3fe756c-b2b5-42bc-8234-bd6d59e5dd29 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:42.437979323 +0000 UTC m=+5.071961239 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-metrics-certs") pod "network-metrics-daemon-rhtrz" (UID: "d3fe756c-b2b5-42bc-8234-bd6d59e5dd29") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:40.639124 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:40.639087 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xgh5\" (UniqueName: \"kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5\") pod \"network-check-target-w2qd9\" (UID: \"ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1\") " pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:53:40.639308 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:40.639269 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:40.639308 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:40.639288 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:40.639308 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:40.639301 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8xgh5 for pod openshift-network-diagnostics/network-check-target-w2qd9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:40.639496 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:40.639356 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5 podName:ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:42.639338746 +0000 UTC m=+5.273320666 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-8xgh5" (UniqueName: "kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5") pod "network-check-target-w2qd9" (UID: "ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:40.852368 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:40.852298 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 23:48:38 +0000 UTC" deadline="2027-12-31 01:34:51.668228934 +0000 UTC" Apr 24 23:53:40.852368 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:40.852338 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14761h41m10.815894984s" Apr 24 23:53:40.907840 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:40.907800 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:53:40.908004 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:40.907928 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29" Apr 24 23:53:41.907641 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:41.907607 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:53:41.908096 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:41.907735 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1" Apr 24 23:53:42.456477 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:42.456428 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-metrics-certs\") pod \"network-metrics-daemon-rhtrz\" (UID: \"d3fe756c-b2b5-42bc-8234-bd6d59e5dd29\") " pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:53:42.456658 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:42.456575 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:42.456658 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:42.456638 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-metrics-certs podName:d3fe756c-b2b5-42bc-8234-bd6d59e5dd29 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:46.456622025 +0000 UTC m=+9.090603939 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-metrics-certs") pod "network-metrics-daemon-rhtrz" (UID: "d3fe756c-b2b5-42bc-8234-bd6d59e5dd29") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:42.659252 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:42.658644 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xgh5\" (UniqueName: \"kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5\") pod \"network-check-target-w2qd9\" (UID: \"ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1\") " pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:53:42.659252 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:42.658804 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:42.659252 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:42.658827 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:42.659252 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:42.658839 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8xgh5 for pod openshift-network-diagnostics/network-check-target-w2qd9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:42.659252 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:42.658905 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5 podName:ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:46.65888589 +0000 UTC m=+9.292867809 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-8xgh5" (UniqueName: "kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5") pod "network-check-target-w2qd9" (UID: "ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:42.907891 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:42.907853 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:53:42.908346 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:42.908009 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29" Apr 24 23:53:43.908597 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:43.908547 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:53:43.909108 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:43.908681 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1" Apr 24 23:53:44.907589 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:44.907555 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:53:44.907768 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:44.907670 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29" Apr 24 23:53:45.909925 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:45.909888 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:53:45.910376 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:45.910030 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1" Apr 24 23:53:46.490829 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:46.490793 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-metrics-certs\") pod \"network-metrics-daemon-rhtrz\" (UID: \"d3fe756c-b2b5-42bc-8234-bd6d59e5dd29\") " pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:53:46.491015 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:46.490951 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:46.491085 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:46.491026 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-metrics-certs podName:d3fe756c-b2b5-42bc-8234-bd6d59e5dd29 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:54.491005495 +0000 UTC m=+17.124987422 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-metrics-certs") pod "network-metrics-daemon-rhtrz" (UID: "d3fe756c-b2b5-42bc-8234-bd6d59e5dd29") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:46.692539 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:46.692491 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xgh5\" (UniqueName: \"kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5\") pod \"network-check-target-w2qd9\" (UID: \"ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1\") " pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:53:46.692716 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:46.692660 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:46.692716 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:46.692683 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:46.692716 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:46.692699 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8xgh5 for pod openshift-network-diagnostics/network-check-target-w2qd9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:46.692883 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:46.692756 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5 podName:ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:54.692737254 +0000 UTC m=+17.326719176 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-8xgh5" (UniqueName: "kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5") pod "network-check-target-w2qd9" (UID: "ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:46.908383 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:46.908345 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:53:46.908545 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:46.908511 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29" Apr 24 23:53:47.733105 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:47.733069 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-26g8h"] Apr 24 23:53:47.736039 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:47.735952 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:53:47.736039 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:47.736031 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-26g8h" podUID="532cdfc7-fd38-495f-b85d-70daea2998a1" Apr 24 23:53:47.800686 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:47.800655 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret\") pod \"global-pull-secret-syncer-26g8h\" (UID: \"532cdfc7-fd38-495f-b85d-70daea2998a1\") " pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:53:47.800843 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:47.800698 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/532cdfc7-fd38-495f-b85d-70daea2998a1-kubelet-config\") pod \"global-pull-secret-syncer-26g8h\" (UID: \"532cdfc7-fd38-495f-b85d-70daea2998a1\") " pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:53:47.800843 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:47.800715 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/532cdfc7-fd38-495f-b85d-70daea2998a1-dbus\") pod \"global-pull-secret-syncer-26g8h\" (UID: \"532cdfc7-fd38-495f-b85d-70daea2998a1\") " pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:53:47.901800 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:47.901752 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret\") pod \"global-pull-secret-syncer-26g8h\" (UID: \"532cdfc7-fd38-495f-b85d-70daea2998a1\") " pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:53:47.901800 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:47.901806 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/532cdfc7-fd38-495f-b85d-70daea2998a1-kubelet-config\") pod \"global-pull-secret-syncer-26g8h\" (UID: \"532cdfc7-fd38-495f-b85d-70daea2998a1\") " pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:53:47.901999 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:47.901830 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/532cdfc7-fd38-495f-b85d-70daea2998a1-dbus\") pod \"global-pull-secret-syncer-26g8h\" (UID: \"532cdfc7-fd38-495f-b85d-70daea2998a1\") " pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:53:47.901999 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:47.901866 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:47.901999 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:47.901937 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret 
Apr 24 23:53:47.901999 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:47.901937 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret podName:532cdfc7-fd38-495f-b85d-70daea2998a1 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:48.40191462 +0000 UTC m=+11.035896550 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret") pod "global-pull-secret-syncer-26g8h" (UID: "532cdfc7-fd38-495f-b85d-70daea2998a1") : object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:47.901999 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:47.901974 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/532cdfc7-fd38-495f-b85d-70daea2998a1-dbus\") pod \"global-pull-secret-syncer-26g8h\" (UID: \"532cdfc7-fd38-495f-b85d-70daea2998a1\") " pod="kube-system/global-pull-secret-syncer-26g8h"
Apr 24 23:53:47.902205 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:47.902024 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/532cdfc7-fd38-495f-b85d-70daea2998a1-kubelet-config\") pod \"global-pull-secret-syncer-26g8h\" (UID: \"532cdfc7-fd38-495f-b85d-70daea2998a1\") " pod="kube-system/global-pull-secret-syncer-26g8h"
Apr 24 23:53:47.908369 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:47.908341 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9"
Apr 24 23:53:47.908518 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:47.908448 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1"
Apr 24 23:53:48.405586 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:48.405546 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret\") pod \"global-pull-secret-syncer-26g8h\" (UID: \"532cdfc7-fd38-495f-b85d-70daea2998a1\") " pod="kube-system/global-pull-secret-syncer-26g8h"
Apr 24 23:53:48.405756 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:48.405683 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:48.405756 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:48.405751 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret podName:532cdfc7-fd38-495f-b85d-70daea2998a1 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:49.405733571 +0000 UTC m=+12.039715492 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret") pod "global-pull-secret-syncer-26g8h" (UID: "532cdfc7-fd38-495f-b85d-70daea2998a1") : object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:48.908386 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:48.908349 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h"
Apr 24 23:53:48.908790 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:48.908349 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz"
Apr 24 23:53:48.908790 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:48.908490 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-26g8h" podUID="532cdfc7-fd38-495f-b85d-70daea2998a1"
Apr 24 23:53:48.908790 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:48.908546 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29"
Apr 24 23:53:49.414067 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:49.414034 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret\") pod \"global-pull-secret-syncer-26g8h\" (UID: \"532cdfc7-fd38-495f-b85d-70daea2998a1\") " pod="kube-system/global-pull-secret-syncer-26g8h"
Apr 24 23:53:49.414243 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:49.414169 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:49.414243 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:49.414232 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret podName:532cdfc7-fd38-495f-b85d-70daea2998a1 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:51.414213851 +0000 UTC m=+14.048195779 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret") pod "global-pull-secret-syncer-26g8h" (UID: "532cdfc7-fd38-495f-b85d-70daea2998a1") : object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:49.907945 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:49.907907 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9"
Apr 24 23:53:49.908178 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:49.908031 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1"
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:53:50.908722 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:50.908250 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:53:50.908722 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:50.908369 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29" Apr 24 23:53:50.908722 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:50.908445 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-26g8h" podUID="532cdfc7-fd38-495f-b85d-70daea2998a1" Apr 24 23:53:51.428024 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:51.427993 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret\") pod \"global-pull-secret-syncer-26g8h\" (UID: \"532cdfc7-fd38-495f-b85d-70daea2998a1\") " pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:53:51.428224 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:51.428151 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:51.428284 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:51.428224 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret podName:532cdfc7-fd38-495f-b85d-70daea2998a1 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:55.428208012 +0000 UTC m=+18.062189931 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret") pod "global-pull-secret-syncer-26g8h" (UID: "532cdfc7-fd38-495f-b85d-70daea2998a1") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:51.907376 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:51.907344 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:53:51.907655 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:51.907481 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1" Apr 24 23:53:52.908208 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:52.908112 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:53:52.908613 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:52.908256 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-26g8h" podUID="532cdfc7-fd38-495f-b85d-70daea2998a1" Apr 24 23:53:52.908613 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:52.908313 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:53:52.908613 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:52.908453 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29" Apr 24 23:53:53.907495 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:53.907455 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:53:53.907675 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:53.907586 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1" Apr 24 23:53:54.548836 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:54.548801 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-metrics-certs\") pod \"network-metrics-daemon-rhtrz\" (UID: \"d3fe756c-b2b5-42bc-8234-bd6d59e5dd29\") " pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:53:54.549287 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:54.548974 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:54.549287 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:54.549046 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-metrics-certs podName:d3fe756c-b2b5-42bc-8234-bd6d59e5dd29 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:10.54902556 +0000 UTC m=+33.183007479 (durationBeforeRetry 16s). 
Apr 24 23:53:54.749928 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:54.749861 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xgh5\" (UniqueName: \"kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5\") pod \"network-check-target-w2qd9\" (UID: \"ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1\") " pod="openshift-network-diagnostics/network-check-target-w2qd9"
Apr 24 23:53:54.750149 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:54.749990 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:53:54.750149 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:54.750017 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:53:54.750149 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:54.750030 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8xgh5 for pod openshift-network-diagnostics/network-check-target-w2qd9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:54.750149 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:54.750083 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5 podName:ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:10.750064727 +0000 UTC m=+33.384046648 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-8xgh5" (UniqueName: "kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5") pod "network-check-target-w2qd9" (UID: "ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:54.908054 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:54.907975 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz"
Apr 24 23:53:54.908233 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:54.907975 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h"
Apr 24 23:53:54.908233 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:54.908121 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29"
Apr 24 23:53:54.908233 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:54.908197 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-26g8h" podUID="532cdfc7-fd38-495f-b85d-70daea2998a1"
Apr 24 23:53:55.454619 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:55.454585 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret\") pod \"global-pull-secret-syncer-26g8h\" (UID: \"532cdfc7-fd38-495f-b85d-70daea2998a1\") " pod="kube-system/global-pull-secret-syncer-26g8h"
Apr 24 23:53:55.454789 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:55.454750 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:55.454831 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:55.454824 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret podName:532cdfc7-fd38-495f-b85d-70daea2998a1 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:03.454803823 +0000 UTC m=+26.088785759 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret") pod "global-pull-secret-syncer-26g8h" (UID: "532cdfc7-fd38-495f-b85d-70daea2998a1") : object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:55.907543 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:55.907508 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9"
Apr 24 23:53:55.908000 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:55.907626 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1"
Apr 24 23:53:56.908157 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:56.908123 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h"
Apr 24 23:53:56.908569 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:56.908247 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-26g8h" podUID="532cdfc7-fd38-495f-b85d-70daea2998a1"
Apr 24 23:53:56.908569 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:56.908302 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz"
Apr 24 23:53:56.908569 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:56.908398 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29"
Apr 24 23:53:57.909847 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:57.909816 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9"
Apr 24 23:53:57.910836 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:57.909913 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1"
Apr 24 23:53:58.047931 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:58.047912 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/0.log"
Apr 24 23:53:58.048269 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:58.048247 2578 generic.go:358] "Generic (PLEG): container finished" podID="e0939cda-0079-43e5-b1be-4f8099b11f56" containerID="de159ec94431b651743a2d477ad1901e01b68a7ab6219744f1d63741590e4934" exitCode=1
Apr 24 23:53:58.048331 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:58.048303 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" event={"ID":"e0939cda-0079-43e5-b1be-4f8099b11f56","Type":"ContainerStarted","Data":"d8f2de8e8459aea09623880e976e69f61c2bcc56eb43b546cf1e1480b2ec7b14"}
Apr 24 23:53:58.048331 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:58.048322 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" event={"ID":"e0939cda-0079-43e5-b1be-4f8099b11f56","Type":"ContainerStarted","Data":"fc979c76beeaa7a51cc285993becaa556d18a38c0c55dcd4f12d43bfe7c3e2df"}
Apr 24 23:53:58.048331 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:58.048331 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" event={"ID":"e0939cda-0079-43e5-b1be-4f8099b11f56","Type":"ContainerStarted","Data":"c2b8991ea762cfb8715eaf8fa9339187658c371aafb2eed1c45fbc7ab84f9b82"}
Apr 24 23:53:58.048459 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:58.048338 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" event={"ID":"e0939cda-0079-43e5-b1be-4f8099b11f56","Type":"ContainerDied","Data":"de159ec94431b651743a2d477ad1901e01b68a7ab6219744f1d63741590e4934"}
Apr 24 23:53:58.048459 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:58.048347 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" event={"ID":"e0939cda-0079-43e5-b1be-4f8099b11f56","Type":"ContainerStarted","Data":"b63932143348efa385e23beb8110e76b4783856a52cd69fb09e9f802e5c3ffb1"}
Apr 24 23:53:58.049708 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:58.049680 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-98.ec2.internal" event={"ID":"87dc53c55f73620bf5df44e2826c141e","Type":"ContainerStarted","Data":"8e025d3f15f63d3ad77b2f35e4a2417f829d243ac4784b3ef11b7a5c28b57f5b"}
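The "Generic (PLEG)" and "SyncLoop (PLEG)" entries are the kubelet's Pod Lifecycle Event Generator relaying container state changes observed from the runtime into the sync loop; each event carries the pod UID, an event type, and a container or sandbox ID, as in the ovnkube-node-t4f4q lines above. A sketch of the event shape as rendered in these lines (field names taken from the log output; the kubelet's actual type differs in detail):

```go
package pleg

// Event shape as rendered in the log lines above, e.g.
// event={"ID":"e0939cda-...","Type":"ContainerStarted","Data":"d8f2de8e..."}.
// Sketch only; the kubelet's real PodLifecycleEvent uses its own types.
type PodLifecycleEvent struct {
	ID   string // pod UID the event belongs to
	Type string // "ContainerStarted", "ContainerDied", ...
	Data string // container or sandbox ID
}
```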
Apr 24 23:53:58.051073 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:58.051039 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" event={"ID":"edeca547-37b0-442b-95dc-712808101f9a","Type":"ContainerStarted","Data":"d0996462b0cfb8da28033fbe63b84272e5ed61ca65a0cd1de95f4b4e0a30c3cd"}
Apr 24 23:53:58.052250 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:58.052228 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ktchn" event={"ID":"8566f65b-b13b-4b52-8b4d-8dcbd70b502a","Type":"ContainerStarted","Data":"81bb70ad2cdac82be42fc20c76ac39d7f94db0d33e10d000c41d54dc252b3da0"}
Apr 24 23:53:58.065921 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:58.065883 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-98.ec2.internal" podStartSLOduration=20.065871483 podStartE2EDuration="20.065871483s" podCreationTimestamp="2026-04-24 23:53:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:53:58.065436035 +0000 UTC m=+20.699417963" watchObservedRunningTime="2026-04-24 23:53:58.065871483 +0000 UTC m=+20.699853501"
Apr 24 23:53:58.082772 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:58.082725 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-tmhfx" podStartSLOduration=2.8584912559999998 podStartE2EDuration="21.082709611s" podCreationTimestamp="2026-04-24 23:53:37 +0000 UTC" firstStartedPulling="2026-04-24 23:53:39.153269259 +0000 UTC m=+1.787251173" lastFinishedPulling="2026-04-24 23:53:57.377487597 +0000 UTC m=+20.011469528" observedRunningTime="2026-04-24 23:53:58.082156709 +0000 UTC m=+20.716138662" watchObservedRunningTime="2026-04-24 23:53:58.082709611 +0000 UTC m=+20.716691552"
Apr 24 23:53:58.100141 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:58.100104 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ktchn" podStartSLOduration=2.619785122 podStartE2EDuration="21.100091923s" podCreationTimestamp="2026-04-24 23:53:37 +0000 UTC" firstStartedPulling="2026-04-24 23:53:39.175130226 +0000 UTC m=+1.809112140" lastFinishedPulling="2026-04-24 23:53:57.655437027 +0000 UTC m=+20.289418941" observedRunningTime="2026-04-24 23:53:58.099343526 +0000 UTC m=+20.733325463" watchObservedRunningTime="2026-04-24 23:53:58.100091923 +0000 UTC m=+20.734073859"
Apr 24 23:53:58.907559 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:58.907528 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h"
Apr 24 23:53:58.907746 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:58.907590 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz"
Apr 24 23:53:58.907746 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:58.907696 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29"
Apr 24 23:53:58.907999 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:58.907968 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-26g8h" podUID="532cdfc7-fd38-495f-b85d-70daea2998a1"
Apr 24 23:53:59.054839 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:59.054807 2578 generic.go:358] "Generic (PLEG): container finished" podID="d89d33b9-52c1-474f-a5b8-221754ae1cc6" containerID="1ef97ac46ae6b05df094efaa6304f506c10bd16cc9d1c3b46769de0f009f6c19" exitCode=0
Apr 24 23:53:59.055273 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:59.054849 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7zs6q" event={"ID":"d89d33b9-52c1-474f-a5b8-221754ae1cc6","Type":"ContainerDied","Data":"1ef97ac46ae6b05df094efaa6304f506c10bd16cc9d1c3b46769de0f009f6c19"}
Apr 24 23:53:59.056226 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:59.056181 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vdjkm" event={"ID":"220c5498-d45f-48c2-a25e-01ac23225100","Type":"ContainerStarted","Data":"52a494480ea1af0671ea272790ba1479447dc77200a3310c04b84c7ea5edd85a"}
Apr 24 23:53:59.058722 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:59.058705 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/0.log"
Apr 24 23:53:59.059070 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:59.059051 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" event={"ID":"e0939cda-0079-43e5-b1be-4f8099b11f56","Type":"ContainerStarted","Data":"244acb0040fa7d18e2bc1d1699b94c2e553ae312c5022e663417ecff210112a2"}
Apr 24 23:53:59.060795 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:59.060777 2578 generic.go:358] "Generic (PLEG): container finished" podID="32cc11a6fe1288d8e923d33bdeaf02c1" containerID="4108c68f48c6c66bd16b61acd06070117c73951e92708580e895cdff0ffd491d" exitCode=0
Apr 24 23:53:59.060871 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:59.060831 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal" event={"ID":"32cc11a6fe1288d8e923d33bdeaf02c1","Type":"ContainerDied","Data":"4108c68f48c6c66bd16b61acd06070117c73951e92708580e895cdff0ffd491d"}
Apr 24 23:53:59.062110 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:59.062088 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kngps" event={"ID":"31f651c0-8e2e-4e85-b153-94f4291085b1","Type":"ContainerStarted","Data":"7822c065e92f1fe092257e924faf102c24e643109f8af952efd010625df2395a"}
Apr 24 23:53:59.063454 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:59.063388 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6" event={"ID":"4193b598-cb84-4f01-b039-cd235fe68381","Type":"ContainerStarted","Data":"b0947fa29326a3fee9df9cf6962f32c8add7d9ee1ac6999b5e87504d89b06067"}
Apr 24 23:53:59.064570 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:59.064547 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s2pf5" event={"ID":"b50de4c3-3440-4c81-81ac-23466ec3f726","Type":"ContainerStarted","Data":"d4961b7ff356acee50a39a1a2266665ddf1213f113470f65cf6e62be98ad6664"}
Apr 24 23:53:59.065829 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:59.065806 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dwll4" event={"ID":"35b60d0c-8fbf-4b7b-a0e6-31a46f1a96d2","Type":"ContainerStarted","Data":"60a614401d51513c85fca5489736038fc9ebc527c6c3f3c8d6d0fbefe01c31b0"}
Apr 24 23:53:59.105752 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:59.105676 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-dwll4" podStartSLOduration=3.811447624 podStartE2EDuration="22.105660956s" podCreationTimestamp="2026-04-24 23:53:37 +0000 UTC" firstStartedPulling="2026-04-24 23:53:39.131857162 +0000 UTC m=+1.765839076" lastFinishedPulling="2026-04-24 23:53:57.426070481 +0000 UTC m=+20.060052408" observedRunningTime="2026-04-24 23:53:59.104996328 +0000 UTC m=+21.738978266" watchObservedRunningTime="2026-04-24 23:53:59.105660956 +0000 UTC m=+21.739642893"
Apr 24 23:53:59.118591 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:59.118547 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-s2pf5" podStartSLOduration=3.841920602 podStartE2EDuration="22.118519348s" podCreationTimestamp="2026-04-24 23:53:37 +0000 UTC" firstStartedPulling="2026-04-24 23:53:39.163079202 +0000 UTC m=+1.797061116" lastFinishedPulling="2026-04-24 23:53:57.439677933 +0000 UTC m=+20.073659862" observedRunningTime="2026-04-24 23:53:59.117949732 +0000 UTC m=+21.751931669" watchObservedRunningTime="2026-04-24 23:53:59.118519348 +0000 UTC m=+21.752501283"
Apr 24 23:53:59.130828 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:59.130788 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vdjkm" podStartSLOduration=2.917965862 podStartE2EDuration="21.130775251s" podCreationTimestamp="2026-04-24 23:53:38 +0000 UTC" firstStartedPulling="2026-04-24 23:53:39.21346408 +0000 UTC m=+1.847445994" lastFinishedPulling="2026-04-24 23:53:57.426273468 +0000 UTC m=+20.060255383" observedRunningTime="2026-04-24 23:53:59.130336397 +0000 UTC m=+21.764318355" watchObservedRunningTime="2026-04-24 23:53:59.130775251 +0000 UTC m=+21.764757191"
Apr 24 23:53:59.144645 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:59.144612 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-kngps" podStartSLOduration=2.992160718 podStartE2EDuration="21.14459972s" podCreationTimestamp="2026-04-24 23:53:38 +0000 UTC" firstStartedPulling="2026-04-24 23:53:39.22506551 +0000 UTC m=+1.859047424" lastFinishedPulling="2026-04-24 23:53:57.377504512 +0000 UTC m=+20.011486426" observedRunningTime="2026-04-24 23:53:59.144460058 +0000 UTC m=+21.778441994" watchObservedRunningTime="2026-04-24 23:53:59.14459972 +0000 UTC m=+21.778581657"
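The podStartSLOduration values above are consistent with end-to-end startup time minus the image-pull window. For node-resolver-s2pf5: pulling ran from 23:53:39.163079202 to 23:53:57.439677933, i.e. 18.276598731s, and 22.118519348s - 18.276598731s = 3.841920617s, which matches the reported podStartSLOduration=3.841920602 to within tens of nanoseconds. A quick check using the timestamps copied from the log:

```go
package main

import (
	"fmt"
	"time"
)

// Verifies podStartSLOduration ≈ podStartE2EDuration minus the image-pull
// window for node-resolver-s2pf5, using values copied from the log above.
func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	first, _ := time.Parse(layout, "2026-04-24 23:53:39.163079202 +0000 UTC")
	last, _ := time.Parse(layout, "2026-04-24 23:53:57.439677933 +0000 UTC")
	e2e := 22118519348 * time.Nanosecond // podStartE2EDuration="22.118519348s"
	fmt.Println(e2e - last.Sub(first))   // ≈ 3.841920617s ≈ podStartSLOduration
}
```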
Apr 24 23:53:59.180451 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:59.180425 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 23:53:59.875923 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:59.875807 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T23:53:59.180447353Z","UUID":"860310e2-5b2c-422a-bd17-05d911a028d7","Handler":null,"Name":"","Endpoint":""}
Apr 24 23:53:59.879353 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:59.879328 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 23:53:59.879503 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:59.879362 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 23:53:59.907978 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:53:59.907946 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9"
Apr 24 23:53:59.908110 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:53:59.908051 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1"
Apr 24 23:54:00.070436 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:00.070115 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal" event={"ID":"32cc11a6fe1288d8e923d33bdeaf02c1","Type":"ContainerStarted","Data":"ff45f9681260329f43c43912207910b171e05786a5ba0ab097fe35ef777f2e76"}
Apr 24 23:54:00.073330 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:00.073306 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6" event={"ID":"4193b598-cb84-4f01-b039-cd235fe68381","Type":"ContainerStarted","Data":"c99e747eb148dc94e5e479754423fea55f361de5a8365fefe6843b652d248510"}
Apr 24 23:54:00.073439 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:00.073343 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6" event={"ID":"4193b598-cb84-4f01-b039-cd235fe68381","Type":"ContainerStarted","Data":"f303bb99254c5f7dfcc641b34cc04716c7134c7a5a3f158f5b6f5f2dd039a205"}
Apr 24 23:54:00.085836 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:00.085792 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal" podStartSLOduration=22.085779121 podStartE2EDuration="22.085779121s" podCreationTimestamp="2026-04-24 23:53:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:54:00.085407052 +0000 UTC m=+22.719388999" watchObservedRunningTime="2026-04-24 23:54:00.085779121 +0000 UTC m=+22.719761056"
Apr 24 23:54:00.102220 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:00.102178 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7w9t6" podStartSLOduration=1.36878732 podStartE2EDuration="22.102166367s" podCreationTimestamp="2026-04-24 23:53:38 +0000 UTC" firstStartedPulling="2026-04-24 23:53:39.195340949 +0000 UTC m=+1.829322862" lastFinishedPulling="2026-04-24 23:53:59.928719981 +0000 UTC m=+22.562701909" observedRunningTime="2026-04-24 23:54:00.101677953 +0000 UTC m=+22.735659889" watchObservedRunningTime="2026-04-24 23:54:00.102166367 +0000 UTC m=+22.736148302"
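The four entries at 23:53:59 above are the CSI node-plugin registration handshake: the plugin watcher picks up the registration socket under /var/lib/kubelet/plugins_registry/, RegisterPlugin runs, and the kubelet validates and registers ebs.csi.aws.com at its csi.sock endpoint. The plugin payload printed by RegisterPlugin has this shape (field names copied from the logged JSON; a sketch, not the kubelet's actual type):

```go
package pluginwatch

import "time"

// Registration record as rendered in the "OperationExecutor.RegisterPlugin
// started" entry above. Field names mirror the logged JSON; sketch only.
type PluginInfo struct {
	SocketPath string    // e.g. /var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock
	Timestamp  time.Time // when the watcher first saw the socket
	UUID       string
	Handler    any    // "Handler":null in the log entry
	Name       string // empty until the plugin reports its name
	Endpoint   string
}
```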
Apr 24 23:54:00.908234 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:00.908197 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h"
Apr 24 23:54:00.908496 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:00.908318 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-26g8h" podUID="532cdfc7-fd38-495f-b85d-70daea2998a1"
Apr 24 23:54:00.908496 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:00.908375 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz"
Apr 24 23:54:00.908624 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:00.908520 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29"
Apr 24 23:54:01.077883 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:01.077855 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/0.log"
Apr 24 23:54:01.078396 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:01.078239 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" event={"ID":"e0939cda-0079-43e5-b1be-4f8099b11f56","Type":"ContainerStarted","Data":"0f25ea582024f2c3b286d0b9c00f1be9b027ec37315cce2374f1660544e62109"}
Apr 24 23:54:01.908258 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:01.908222 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9"
Apr 24 23:54:01.908460 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:01.908342 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1"
Apr 24 23:54:02.907397 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:02.907258 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h"
Apr 24 23:54:02.907964 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:02.907259 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz"
Apr 24 23:54:02.907964 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:02.907496 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-26g8h" podUID="532cdfc7-fd38-495f-b85d-70daea2998a1"
Apr 24 23:54:02.907964 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:02.907857 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29"
Apr 24 23:54:03.083780 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:03.083748 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7zs6q" event={"ID":"d89d33b9-52c1-474f-a5b8-221754ae1cc6","Type":"ContainerStarted","Data":"208a4cd810895a8ca31e5509907b70a12335bcc1b33ee4e06b0cf9bb122eb923"}
Apr 24 23:54:03.086745 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:03.086715 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/0.log"
Apr 24 23:54:03.087023 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:03.086998 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" event={"ID":"e0939cda-0079-43e5-b1be-4f8099b11f56","Type":"ContainerStarted","Data":"9fdc2e28d3cb53369bc02bd4a3574d38aa70db07e4ecdaa1437f945775796a54"}
Apr 24 23:54:03.087304 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:03.087275 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q"
Apr 24 23:54:03.087354 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:03.087315 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q"
Apr 24 23:54:03.087519 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:03.087501 2578 scope.go:117] "RemoveContainer" containerID="de159ec94431b651743a2d477ad1901e01b68a7ab6219744f1d63741590e4934"
Apr 24 23:54:03.104124 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:03.104102 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q"
Apr 24 23:54:03.423582 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:03.423504 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-dwll4"
Apr 24 23:54:03.424079 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:03.424062 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-dwll4"
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret\") pod \"global-pull-secret-syncer-26g8h\" (UID: \"532cdfc7-fd38-495f-b85d-70daea2998a1\") " pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:54:03.516979 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:03.516953 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:54:03.517044 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:03.517011 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret podName:532cdfc7-fd38-495f-b85d-70daea2998a1 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:19.51699337 +0000 UTC m=+42.150975286 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret") pod "global-pull-secret-syncer-26g8h" (UID: "532cdfc7-fd38-495f-b85d-70daea2998a1") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:54:03.908287 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:03.908260 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:54:03.908664 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:03.908382 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1" Apr 24 23:54:04.091524 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:04.091403 2578 generic.go:358] "Generic (PLEG): container finished" podID="d89d33b9-52c1-474f-a5b8-221754ae1cc6" containerID="208a4cd810895a8ca31e5509907b70a12335bcc1b33ee4e06b0cf9bb122eb923" exitCode=0 Apr 24 23:54:04.091524 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:04.091497 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7zs6q" event={"ID":"d89d33b9-52c1-474f-a5b8-221754ae1cc6","Type":"ContainerDied","Data":"208a4cd810895a8ca31e5509907b70a12335bcc1b33ee4e06b0cf9bb122eb923"} Apr 24 23:54:04.106954 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:04.106935 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/0.log" Apr 24 23:54:04.107334 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:04.107302 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" event={"ID":"e0939cda-0079-43e5-b1be-4f8099b11f56","Type":"ContainerStarted","Data":"281fad92d1a88d1c9c2db57a164922467c4d1afba1870dfe569982bd7d4f2e92"} Apr 24 23:54:04.107617 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:04.107574 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-dwll4" Apr 24 23:54:04.107617 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:04.107599 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:54:04.108111 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:04.108092 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-dwll4" Apr 24 23:54:04.126456 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:04.124016 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" Apr 24 23:54:04.158758 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:04.158697 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q" podStartSLOduration=7.821314053 podStartE2EDuration="26.158679875s" podCreationTimestamp="2026-04-24 23:53:38 +0000 UTC" firstStartedPulling="2026-04-24 23:53:39.140884811 +0000 UTC m=+1.774866725" lastFinishedPulling="2026-04-24 23:53:57.478250633 +0000 UTC m=+20.112232547" observedRunningTime="2026-04-24 23:54:04.157988746 +0000 UTC m=+26.791970684" watchObservedRunningTime="2026-04-24 23:54:04.158679875 +0000 UTC m=+26.792661812" Apr 24 23:54:04.859667 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:04.859498 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-w2qd9"] Apr 24 23:54:04.859770 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:04.859742 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:54:04.859858 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:04.859831 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1" Apr 24 23:54:04.863205 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:04.863184 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rhtrz"] Apr 24 23:54:04.863297 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:04.863287 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:54:04.863387 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:04.863373 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29" Apr 24 23:54:04.870668 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:04.870646 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-26g8h"] Apr 24 23:54:04.870763 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:04.870754 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:54:04.870849 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:04.870830 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-26g8h" podUID="532cdfc7-fd38-495f-b85d-70daea2998a1" Apr 24 23:54:05.111352 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:05.111322 2578 generic.go:358] "Generic (PLEG): container finished" podID="d89d33b9-52c1-474f-a5b8-221754ae1cc6" containerID="0771a10f1b63a7b4f37f43315c0ecd54108f232f7470408d947fb1822c12a4d8" exitCode=0 Apr 24 23:54:05.111829 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:05.111427 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7zs6q" event={"ID":"d89d33b9-52c1-474f-a5b8-221754ae1cc6","Type":"ContainerDied","Data":"0771a10f1b63a7b4f37f43315c0ecd54108f232f7470408d947fb1822c12a4d8"} Apr 24 23:54:06.115700 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:06.115668 2578 generic.go:358] "Generic (PLEG): container finished" podID="d89d33b9-52c1-474f-a5b8-221754ae1cc6" containerID="1da78901478ff4ab0016562bb7e9ef0d255be2536fa08355a5fe3a460b5abecf" exitCode=0 Apr 24 23:54:06.116142 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:06.115746 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7zs6q" event={"ID":"d89d33b9-52c1-474f-a5b8-221754ae1cc6","Type":"ContainerDied","Data":"1da78901478ff4ab0016562bb7e9ef0d255be2536fa08355a5fe3a460b5abecf"} Apr 24 23:54:06.907856 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:06.907813 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:54:06.908076 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:06.907863 2578 util.go:30] "No sandbox for pod can be found. 
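Each of these pods is stuck at the same gate: a sandbox cannot be created until a CNI configuration appears in /etc/kubernetes/cni/net.d/, which on this node happens only once ovnkube-node-t4f4q is up (its readiness probe flipped to ready at 23:54:04 above). A minimal check mirroring the runtime's complaint (sketch only; the real NetworkReady check lives in the container runtime, not in code like this):

```go
package main

import (
	"fmt"
	"path/filepath"
)

// Reports the same condition the log keeps repeating: no CNI config file
// under /etc/kubernetes/cni/net.d/ means the pod network is not ready.
func main() {
	matches, err := filepath.Glob("/etc/kubernetes/cni/net.d/*")
	if err != nil || len(matches) == 0 {
		fmt.Println("NetworkReady=false reason:NetworkPluginNotReady")
		return
	}
	fmt.Println("CNI config present:", matches)
}
```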
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:54:06.908076 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:06.907924 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:54:06.908076 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:06.907945 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1" Apr 24 23:54:06.908076 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:06.907995 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-26g8h" podUID="532cdfc7-fd38-495f-b85d-70daea2998a1" Apr 24 23:54:06.908076 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:06.908073 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29" Apr 24 23:54:08.907773 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:08.907727 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:54:08.907773 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:08.907759 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:54:08.908337 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:08.907845 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1" Apr 24 23:54:08.908337 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:08.907868 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:54:08.908337 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:08.907953 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29" Apr 24 23:54:08.908337 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:08.908030 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-26g8h" podUID="532cdfc7-fd38-495f-b85d-70daea2998a1" Apr 24 23:54:10.567530 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:10.567469 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-metrics-certs\") pod \"network-metrics-daemon-rhtrz\" (UID: \"d3fe756c-b2b5-42bc-8234-bd6d59e5dd29\") " pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:54:10.567977 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:10.567642 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:10.567977 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:10.567718 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-metrics-certs podName:d3fe756c-b2b5-42bc-8234-bd6d59e5dd29 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:42.567696519 +0000 UTC m=+65.201678440 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-metrics-certs") pod "network-metrics-daemon-rhtrz" (UID: "d3fe756c-b2b5-42bc-8234-bd6d59e5dd29") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:10.769002 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:10.768934 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xgh5\" (UniqueName: \"kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5\") pod \"network-check-target-w2qd9\" (UID: \"ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1\") " pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:54:10.769157 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:10.769072 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:54:10.769157 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:10.769093 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:54:10.769157 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:10.769102 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8xgh5 for pod openshift-network-diagnostics/network-check-target-w2qd9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:54:10.769326 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:10.769162 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5 podName:ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1 nodeName:}" failed. 
No retries permitted until 2026-04-24 23:54:42.769147199 +0000 UTC m=+65.403129129 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-8xgh5" (UniqueName: "kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5") pod "network-check-target-w2qd9" (UID: "ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:54:10.907827 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:10.907743 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:54:10.907827 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:10.907770 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:54:10.908034 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:10.907865 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1" Apr 24 23:54:10.908034 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:10.907951 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-26g8h" podUID="532cdfc7-fd38-495f-b85d-70daea2998a1" Apr 24 23:54:10.908034 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:10.907987 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:54:10.908169 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:10.908073 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29" Apr 24 23:54:12.907596 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:12.907391 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:54:12.908070 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:12.907391 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:54:12.908070 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:12.907685 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
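By this point the mount retries for metrics-certs and kube-api-access-8xgh5 have backed off to 32s. The recurring failure is not an API error but a cache miss: the kubelet's cache-based secret and configmap managers serve only objects that have first been registered for a pod, and these objects have not been registered yet. A toy illustration of that failure mode (names invented for illustration; not the kubelet's actual API):

```go
package objcache

import "fmt"

// Cache is a toy stand-in for the kubelet's per-pod object cache: lookups
// fail for any object that was never registered, producing errors shaped
// exactly like the secret.go:189 lines above.
type Cache struct {
	registered map[string]bool // "namespace/name" -> registered for some pod
}

func (c *Cache) GetSecret(namespace, name string) error {
	if !c.registered[namespace+"/"+name] {
		return fmt.Errorf("object %q/%q not registered", namespace, name)
	}
	return nil // real code would return the cached secret object here
}
```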
pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1" Apr 24 23:54:12.908070 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:12.907397 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:54:12.908070 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:12.907748 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-26g8h" podUID="532cdfc7-fd38-495f-b85d-70daea2998a1" Apr 24 23:54:12.908070 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:12.907815 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29" Apr 24 23:54:13.131126 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:13.131090 2578 generic.go:358] "Generic (PLEG): container finished" podID="d89d33b9-52c1-474f-a5b8-221754ae1cc6" containerID="7ca72ca11f541cd28a2e0003c004ce9d32ca57fd12a9f74223e679710252d36c" exitCode=0 Apr 24 23:54:13.131296 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:13.131127 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7zs6q" event={"ID":"d89d33b9-52c1-474f-a5b8-221754ae1cc6","Type":"ContainerDied","Data":"7ca72ca11f541cd28a2e0003c004ce9d32ca57fd12a9f74223e679710252d36c"} Apr 24 23:54:14.135125 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:14.135097 2578 generic.go:358] "Generic (PLEG): container finished" podID="d89d33b9-52c1-474f-a5b8-221754ae1cc6" containerID="25d4af6ef1aaf197c4081f4429899ca4c9d4d5d89ecbe31bda99b694e5294888" exitCode=0 Apr 24 23:54:14.135509 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:14.135132 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7zs6q" event={"ID":"d89d33b9-52c1-474f-a5b8-221754ae1cc6","Type":"ContainerDied","Data":"25d4af6ef1aaf197c4081f4429899ca4c9d4d5d89ecbe31bda99b694e5294888"} Apr 24 23:54:14.908084 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:14.908047 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:54:14.908084 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:14.908082 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:54:14.908283 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:14.908161 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-26g8h" podUID="532cdfc7-fd38-495f-b85d-70daea2998a1" Apr 24 23:54:14.908283 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:14.908189 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:54:14.908283 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:14.908274 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29" Apr 24 23:54:14.908389 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:14.908351 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1" Apr 24 23:54:15.139505 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:15.139471 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7zs6q" event={"ID":"d89d33b9-52c1-474f-a5b8-221754ae1cc6","Type":"ContainerStarted","Data":"519cbf1d459bdaf0eeaf6d083ef3ba19df541fcf92cf8e117c2f629a8a4a154f"} Apr 24 23:54:16.907989 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:16.907960 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:54:16.908384 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:16.907964 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:54:16.908384 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:16.908122 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1" Apr 24 23:54:16.908384 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:16.908048 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-26g8h" podUID="532cdfc7-fd38-495f-b85d-70daea2998a1" Apr 24 23:54:16.908384 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:16.907964 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:54:16.908384 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:16.908233 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29" Apr 24 23:54:18.908123 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:18.908093 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:54:18.908123 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:18.908105 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:54:18.908590 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:18.908105 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:54:18.908590 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:18.908197 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-26g8h" podUID="532cdfc7-fd38-495f-b85d-70daea2998a1" Apr 24 23:54:18.908590 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:18.908291 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1" Apr 24 23:54:18.908590 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:18.908368 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29" Apr 24 23:54:19.534370 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:19.534337 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret\") pod \"global-pull-secret-syncer-26g8h\" (UID: \"532cdfc7-fd38-495f-b85d-70daea2998a1\") " pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:54:19.534571 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:19.534461 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:54:19.534571 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:19.534518 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret podName:532cdfc7-fd38-495f-b85d-70daea2998a1 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:51.534503831 +0000 UTC m=+74.168485752 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret") pod "global-pull-secret-syncer-26g8h" (UID: "532cdfc7-fd38-495f-b85d-70daea2998a1") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:54:20.907832 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:20.907799 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:54:20.908327 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:20.907910 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:54:20.908327 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:20.907917 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1" Apr 24 23:54:20.908327 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:20.908005 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-26g8h" podUID="532cdfc7-fd38-495f-b85d-70daea2998a1" Apr 24 23:54:20.908327 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:20.908038 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:54:20.908327 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:20.908130 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29" Apr 24 23:54:22.908026 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:22.907996 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:54:22.908504 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:22.908116 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:54:22.908504 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:22.908117 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-26g8h" podUID="532cdfc7-fd38-495f-b85d-70daea2998a1" Apr 24 23:54:22.908504 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:22.908142 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:54:22.908504 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:22.908209 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1" Apr 24 23:54:22.908504 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:22.908267 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29" Apr 24 23:54:24.907675 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:24.907644 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:54:24.908101 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:24.907644 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:54:24.908101 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:24.907753 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1" Apr 24 23:54:24.908101 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:24.907820 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29" Apr 24 23:54:24.908101 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:24.907644 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:54:24.908101 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:24.907892 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-26g8h" podUID="532cdfc7-fd38-495f-b85d-70daea2998a1" Apr 24 23:54:26.908044 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:26.908013 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:54:26.908468 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:26.908108 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:54:26.908468 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:26.908116 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-26g8h" podUID="532cdfc7-fd38-495f-b85d-70daea2998a1" Apr 24 23:54:26.908468 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:26.908131 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:54:26.908468 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:26.908189 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1" Apr 24 23:54:26.908468 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:26.908256 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29" Apr 24 23:54:28.908082 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:28.908045 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:54:28.908608 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:28.908045 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:54:28.908608 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:28.908149 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29" Apr 24 23:54:28.908608 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:28.908045 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:54:28.908608 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:28.908218 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1" Apr 24 23:54:28.908608 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:28.908309 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-26g8h" podUID="532cdfc7-fd38-495f-b85d-70daea2998a1" Apr 24 23:54:30.598938 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:30.598873 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7zs6q" podStartSLOduration=19.827084128 podStartE2EDuration="52.598856248s" podCreationTimestamp="2026-04-24 23:53:38 +0000 UTC" firstStartedPulling="2026-04-24 23:53:39.238035259 +0000 UTC m=+1.872017178" lastFinishedPulling="2026-04-24 23:54:12.00980738 +0000 UTC m=+34.643789298" observedRunningTime="2026-04-24 23:54:15.161112999 +0000 UTC m=+37.795094936" watchObservedRunningTime="2026-04-24 23:54:30.598856248 +0000 UTC m=+53.232838183" Apr 24 23:54:30.907801 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:30.907718 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:54:30.907954 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:30.907726 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:54:30.907954 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:30.907895 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rhtrz" podUID="d3fe756c-b2b5-42bc-8234-bd6d59e5dd29" Apr 24 23:54:30.907954 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:30.907800 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w2qd9" podUID="ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1" Apr 24 23:54:30.907954 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:30.907726 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:54:30.908103 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:30.907995 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-26g8h" podUID="532cdfc7-fd38-495f-b85d-70daea2998a1" Apr 24 23:54:31.191048 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.190965 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeReady" Apr 24 23:54:31.191184 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.191119 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 23:54:31.234727 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.234697 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-946b4db85-hh7bl"] Apr 24 23:54:31.239590 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.239568 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.242484 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.242465 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-n2vv4\"" Apr 24 23:54:31.242677 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.242490 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 23:54:31.242760 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.242490 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 23:54:31.242760 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.242568 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 23:54:31.248873 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.248857 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 23:54:31.251674 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.251655 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-946b4db85-hh7bl"] Apr 24 23:54:31.265965 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.265145 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-z6qcm"] Apr 24 23:54:31.269592 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.268649 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z6qcm" Apr 24 23:54:31.272241 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.272214 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 23:54:31.272735 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.272491 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 23:54:31.272832 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.272778 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 23:54:31.272927 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.272251 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-nb5d2\"" Apr 24 23:54:31.283180 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.283154 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-z6qcm"] Apr 24 23:54:31.322520 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.322487 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hs5n\" (UniqueName: \"kubernetes.io/projected/899f41b7-00d1-44b2-bcba-0541dea9fcb3-kube-api-access-9hs5n\") pod \"image-registry-946b4db85-hh7bl\" (UID: \"899f41b7-00d1-44b2-bcba-0541dea9fcb3\") " pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.322669 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.322527 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/899f41b7-00d1-44b2-bcba-0541dea9fcb3-image-registry-private-configuration\") pod \"image-registry-946b4db85-hh7bl\" (UID: \"899f41b7-00d1-44b2-bcba-0541dea9fcb3\") " pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.322669 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.322556 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/899f41b7-00d1-44b2-bcba-0541dea9fcb3-ca-trust-extracted\") pod \"image-registry-946b4db85-hh7bl\" (UID: \"899f41b7-00d1-44b2-bcba-0541dea9fcb3\") " pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.322669 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.322578 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/899f41b7-00d1-44b2-bcba-0541dea9fcb3-registry-certificates\") pod \"image-registry-946b4db85-hh7bl\" (UID: \"899f41b7-00d1-44b2-bcba-0541dea9fcb3\") " pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.322669 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.322594 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/899f41b7-00d1-44b2-bcba-0541dea9fcb3-installation-pull-secrets\") pod \"image-registry-946b4db85-hh7bl\" (UID: \"899f41b7-00d1-44b2-bcba-0541dea9fcb3\") " pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.322813 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.322679 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx2b7\" (UniqueName: \"kubernetes.io/projected/ed05dbf9-ea8c-41d5-ac86-6efec6560e64-kube-api-access-wx2b7\") pod \"ingress-canary-z6qcm\" (UID: \"ed05dbf9-ea8c-41d5-ac86-6efec6560e64\") " pod="openshift-ingress-canary/ingress-canary-z6qcm" Apr 24 23:54:31.322813 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.322723 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/899f41b7-00d1-44b2-bcba-0541dea9fcb3-trusted-ca\") pod \"image-registry-946b4db85-hh7bl\" (UID: \"899f41b7-00d1-44b2-bcba-0541dea9fcb3\") " pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.322813 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.322741 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ed05dbf9-ea8c-41d5-ac86-6efec6560e64-cert\") pod \"ingress-canary-z6qcm\" (UID: \"ed05dbf9-ea8c-41d5-ac86-6efec6560e64\") " pod="openshift-ingress-canary/ingress-canary-z6qcm" Apr 24 23:54:31.322813 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.322771 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/899f41b7-00d1-44b2-bcba-0541dea9fcb3-registry-tls\") pod \"image-registry-946b4db85-hh7bl\" (UID: \"899f41b7-00d1-44b2-bcba-0541dea9fcb3\") " pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.322813 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.322790 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/899f41b7-00d1-44b2-bcba-0541dea9fcb3-bound-sa-token\") pod \"image-registry-946b4db85-hh7bl\" (UID: \"899f41b7-00d1-44b2-bcba-0541dea9fcb3\") " pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.348255 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.348233 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-l2htg"] Apr 24 23:54:31.351503 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.351489 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-l2htg" Apr 24 23:54:31.353987 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.353963 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xgj74\"" Apr 24 23:54:31.354087 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.353965 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 23:54:31.354087 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.353983 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 23:54:31.359389 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.359365 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-l2htg"] Apr 24 23:54:31.423712 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.423681 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/899f41b7-00d1-44b2-bcba-0541dea9fcb3-registry-tls\") pod \"image-registry-946b4db85-hh7bl\" (UID: \"899f41b7-00d1-44b2-bcba-0541dea9fcb3\") " pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.423712 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.423713 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/899f41b7-00d1-44b2-bcba-0541dea9fcb3-bound-sa-token\") pod \"image-registry-946b4db85-hh7bl\" (UID: \"899f41b7-00d1-44b2-bcba-0541dea9fcb3\") " pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.423940 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.423733 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9hs5n\" (UniqueName: \"kubernetes.io/projected/899f41b7-00d1-44b2-bcba-0541dea9fcb3-kube-api-access-9hs5n\") pod \"image-registry-946b4db85-hh7bl\" (UID: \"899f41b7-00d1-44b2-bcba-0541dea9fcb3\") " pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.423940 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.423755 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d910288-c4b3-4b19-9188-f9ded54fd92f-config-volume\") pod \"dns-default-l2htg\" (UID: \"9d910288-c4b3-4b19-9188-f9ded54fd92f\") " pod="openshift-dns/dns-default-l2htg" Apr 24 23:54:31.423940 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.423771 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9d910288-c4b3-4b19-9188-f9ded54fd92f-tmp-dir\") pod \"dns-default-l2htg\" (UID: \"9d910288-c4b3-4b19-9188-f9ded54fd92f\") " pod="openshift-dns/dns-default-l2htg" Apr 24 23:54:31.423940 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.423820 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/899f41b7-00d1-44b2-bcba-0541dea9fcb3-image-registry-private-configuration\") pod \"image-registry-946b4db85-hh7bl\" (UID: \"899f41b7-00d1-44b2-bcba-0541dea9fcb3\") " pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.423940 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.423922 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/899f41b7-00d1-44b2-bcba-0541dea9fcb3-ca-trust-extracted\") pod \"image-registry-946b4db85-hh7bl\" (UID: \"899f41b7-00d1-44b2-bcba-0541dea9fcb3\") " pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.424202 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.423953 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/899f41b7-00d1-44b2-bcba-0541dea9fcb3-registry-certificates\") pod \"image-registry-946b4db85-hh7bl\" (UID: \"899f41b7-00d1-44b2-bcba-0541dea9fcb3\") " pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.424202 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.423972 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/899f41b7-00d1-44b2-bcba-0541dea9fcb3-installation-pull-secrets\") pod \"image-registry-946b4db85-hh7bl\" (UID: \"899f41b7-00d1-44b2-bcba-0541dea9fcb3\") " pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.424202 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.424042 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wx2b7\" (UniqueName: \"kubernetes.io/projected/ed05dbf9-ea8c-41d5-ac86-6efec6560e64-kube-api-access-wx2b7\") pod \"ingress-canary-z6qcm\" (UID: \"ed05dbf9-ea8c-41d5-ac86-6efec6560e64\") " pod="openshift-ingress-canary/ingress-canary-z6qcm" Apr 24 23:54:31.424202 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.424066 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpjkl\" (UniqueName: \"kubernetes.io/projected/9d910288-c4b3-4b19-9188-f9ded54fd92f-kube-api-access-cpjkl\") pod \"dns-default-l2htg\" (UID: \"9d910288-c4b3-4b19-9188-f9ded54fd92f\") " pod="openshift-dns/dns-default-l2htg" Apr 24 23:54:31.424202 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.424114 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/899f41b7-00d1-44b2-bcba-0541dea9fcb3-trusted-ca\") pod \"image-registry-946b4db85-hh7bl\" (UID: \"899f41b7-00d1-44b2-bcba-0541dea9fcb3\") " pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.424202 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.424142 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ed05dbf9-ea8c-41d5-ac86-6efec6560e64-cert\") pod \"ingress-canary-z6qcm\" (UID: \"ed05dbf9-ea8c-41d5-ac86-6efec6560e64\") " pod="openshift-ingress-canary/ingress-canary-z6qcm" Apr 24 23:54:31.424202 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.424167 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d910288-c4b3-4b19-9188-f9ded54fd92f-metrics-tls\") pod \"dns-default-l2htg\" (UID: \"9d910288-c4b3-4b19-9188-f9ded54fd92f\") " pod="openshift-dns/dns-default-l2htg" Apr 24 23:54:31.424566 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.424446 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/899f41b7-00d1-44b2-bcba-0541dea9fcb3-ca-trust-extracted\") pod \"image-registry-946b4db85-hh7bl\" (UID: 
\"899f41b7-00d1-44b2-bcba-0541dea9fcb3\") " pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.425017 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.424934 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/899f41b7-00d1-44b2-bcba-0541dea9fcb3-registry-certificates\") pod \"image-registry-946b4db85-hh7bl\" (UID: \"899f41b7-00d1-44b2-bcba-0541dea9fcb3\") " pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.425110 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.425086 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/899f41b7-00d1-44b2-bcba-0541dea9fcb3-trusted-ca\") pod \"image-registry-946b4db85-hh7bl\" (UID: \"899f41b7-00d1-44b2-bcba-0541dea9fcb3\") " pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.427968 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.427947 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/899f41b7-00d1-44b2-bcba-0541dea9fcb3-installation-pull-secrets\") pod \"image-registry-946b4db85-hh7bl\" (UID: \"899f41b7-00d1-44b2-bcba-0541dea9fcb3\") " pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.428071 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.428041 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/899f41b7-00d1-44b2-bcba-0541dea9fcb3-image-registry-private-configuration\") pod \"image-registry-946b4db85-hh7bl\" (UID: \"899f41b7-00d1-44b2-bcba-0541dea9fcb3\") " pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.428125 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.428096 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/899f41b7-00d1-44b2-bcba-0541dea9fcb3-registry-tls\") pod \"image-registry-946b4db85-hh7bl\" (UID: \"899f41b7-00d1-44b2-bcba-0541dea9fcb3\") " pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.428125 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.428112 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ed05dbf9-ea8c-41d5-ac86-6efec6560e64-cert\") pod \"ingress-canary-z6qcm\" (UID: \"ed05dbf9-ea8c-41d5-ac86-6efec6560e64\") " pod="openshift-ingress-canary/ingress-canary-z6qcm" Apr 24 23:54:31.431312 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.431285 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hs5n\" (UniqueName: \"kubernetes.io/projected/899f41b7-00d1-44b2-bcba-0541dea9fcb3-kube-api-access-9hs5n\") pod \"image-registry-946b4db85-hh7bl\" (UID: \"899f41b7-00d1-44b2-bcba-0541dea9fcb3\") " pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.432020 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.432000 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/899f41b7-00d1-44b2-bcba-0541dea9fcb3-bound-sa-token\") pod \"image-registry-946b4db85-hh7bl\" (UID: \"899f41b7-00d1-44b2-bcba-0541dea9fcb3\") " pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.432114 ip-10-0-129-98 kubenswrapper[2578]: I0424 
23:54:31.432099 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx2b7\" (UniqueName: \"kubernetes.io/projected/ed05dbf9-ea8c-41d5-ac86-6efec6560e64-kube-api-access-wx2b7\") pod \"ingress-canary-z6qcm\" (UID: \"ed05dbf9-ea8c-41d5-ac86-6efec6560e64\") " pod="openshift-ingress-canary/ingress-canary-z6qcm" Apr 24 23:54:31.524753 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.524729 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpjkl\" (UniqueName: \"kubernetes.io/projected/9d910288-c4b3-4b19-9188-f9ded54fd92f-kube-api-access-cpjkl\") pod \"dns-default-l2htg\" (UID: \"9d910288-c4b3-4b19-9188-f9ded54fd92f\") " pod="openshift-dns/dns-default-l2htg" Apr 24 23:54:31.524885 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.524779 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d910288-c4b3-4b19-9188-f9ded54fd92f-metrics-tls\") pod \"dns-default-l2htg\" (UID: \"9d910288-c4b3-4b19-9188-f9ded54fd92f\") " pod="openshift-dns/dns-default-l2htg" Apr 24 23:54:31.524924 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.524897 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d910288-c4b3-4b19-9188-f9ded54fd92f-config-volume\") pod \"dns-default-l2htg\" (UID: \"9d910288-c4b3-4b19-9188-f9ded54fd92f\") " pod="openshift-dns/dns-default-l2htg" Apr 24 23:54:31.524968 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.524928 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9d910288-c4b3-4b19-9188-f9ded54fd92f-tmp-dir\") pod \"dns-default-l2htg\" (UID: \"9d910288-c4b3-4b19-9188-f9ded54fd92f\") " pod="openshift-dns/dns-default-l2htg" Apr 24 23:54:31.525354 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.525289 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9d910288-c4b3-4b19-9188-f9ded54fd92f-tmp-dir\") pod \"dns-default-l2htg\" (UID: \"9d910288-c4b3-4b19-9188-f9ded54fd92f\") " pod="openshift-dns/dns-default-l2htg" Apr 24 23:54:31.525585 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.525453 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d910288-c4b3-4b19-9188-f9ded54fd92f-config-volume\") pod \"dns-default-l2htg\" (UID: \"9d910288-c4b3-4b19-9188-f9ded54fd92f\") " pod="openshift-dns/dns-default-l2htg" Apr 24 23:54:31.527108 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.527086 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d910288-c4b3-4b19-9188-f9ded54fd92f-metrics-tls\") pod \"dns-default-l2htg\" (UID: \"9d910288-c4b3-4b19-9188-f9ded54fd92f\") " pod="openshift-dns/dns-default-l2htg" Apr 24 23:54:31.532254 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.532237 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpjkl\" (UniqueName: \"kubernetes.io/projected/9d910288-c4b3-4b19-9188-f9ded54fd92f-kube-api-access-cpjkl\") pod \"dns-default-l2htg\" (UID: \"9d910288-c4b3-4b19-9188-f9ded54fd92f\") " pod="openshift-dns/dns-default-l2htg" Apr 24 23:54:31.549201 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.549182 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:31.579874 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.579847 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z6qcm" Apr 24 23:54:31.659832 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.659791 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-l2htg" Apr 24 23:54:31.695297 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.695264 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-946b4db85-hh7bl"] Apr 24 23:54:31.699380 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:54:31.699345 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod899f41b7_00d1_44b2_bcba_0541dea9fcb3.slice/crio-b66449d75dbdd8917dc73e5ff1a3304154a1a85d757dcbfb6fd451c1c1361d1c WatchSource:0}: Error finding container b66449d75dbdd8917dc73e5ff1a3304154a1a85d757dcbfb6fd451c1c1361d1c: Status 404 returned error can't find the container with id b66449d75dbdd8917dc73e5ff1a3304154a1a85d757dcbfb6fd451c1c1361d1c Apr 24 23:54:31.790348 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.790318 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-l2htg"] Apr 24 23:54:31.793374 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:54:31.793350 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d910288_c4b3_4b19_9188_f9ded54fd92f.slice/crio-5eccb950cc41d02c5fe0abe13ac40f346ddbd0481efac2722d7674a0af9cb34c WatchSource:0}: Error finding container 5eccb950cc41d02c5fe0abe13ac40f346ddbd0481efac2722d7674a0af9cb34c: Status 404 returned error can't find the container with id 5eccb950cc41d02c5fe0abe13ac40f346ddbd0481efac2722d7674a0af9cb34c Apr 24 23:54:31.914963 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:31.914932 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-z6qcm"] Apr 24 23:54:31.918713 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:54:31.918686 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded05dbf9_ea8c_41d5_ac86_6efec6560e64.slice/crio-b629da11fde3eb81fc4645de91d275c9b0ba10215e6b7d853f29f1c3a230657e WatchSource:0}: Error finding container b629da11fde3eb81fc4645de91d275c9b0ba10215e6b7d853f29f1c3a230657e: Status 404 returned error can't find the container with id b629da11fde3eb81fc4645de91d275c9b0ba10215e6b7d853f29f1c3a230657e Apr 24 23:54:32.168405 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:32.168322 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l2htg" event={"ID":"9d910288-c4b3-4b19-9188-f9ded54fd92f","Type":"ContainerStarted","Data":"5eccb950cc41d02c5fe0abe13ac40f346ddbd0481efac2722d7674a0af9cb34c"} Apr 24 23:54:32.169599 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:32.169576 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-946b4db85-hh7bl" event={"ID":"899f41b7-00d1-44b2-bcba-0541dea9fcb3","Type":"ContainerStarted","Data":"335c72e92ae7a2a789459a9ec7a840f6a6a69e921f1adef0f8c892b183b40f21"} Apr 24 23:54:32.169599 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:32.169602 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-946b4db85-hh7bl" event={"ID":"899f41b7-00d1-44b2-bcba-0541dea9fcb3","Type":"ContainerStarted","Data":"b66449d75dbdd8917dc73e5ff1a3304154a1a85d757dcbfb6fd451c1c1361d1c"} Apr 24 23:54:32.169790 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:32.169650 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:32.170546 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:32.170526 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-z6qcm" event={"ID":"ed05dbf9-ea8c-41d5-ac86-6efec6560e64","Type":"ContainerStarted","Data":"b629da11fde3eb81fc4645de91d275c9b0ba10215e6b7d853f29f1c3a230657e"} Apr 24 23:54:32.190582 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:32.190541 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-946b4db85-hh7bl" podStartSLOduration=2.190529773 podStartE2EDuration="2.190529773s" podCreationTimestamp="2026-04-24 23:54:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:54:32.189636086 +0000 UTC m=+54.823618033" watchObservedRunningTime="2026-04-24 23:54:32.190529773 +0000 UTC m=+54.824511708" Apr 24 23:54:32.908274 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:32.907686 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:54:32.908274 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:32.907728 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:54:32.908274 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:32.907861 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz" Apr 24 23:54:32.911088 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:32.911065 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 23:54:32.911216 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:32.911101 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 23:54:32.911357 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:32.911340 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 23:54:32.911493 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:32.911475 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 23:54:32.911568 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:32.911065 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-h8trl\"" Apr 24 23:54:32.911669 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:32.911654 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-ghb4v\"" Apr 24 23:54:34.175855 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:34.175816 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-z6qcm" event={"ID":"ed05dbf9-ea8c-41d5-ac86-6efec6560e64","Type":"ContainerStarted","Data":"c25a5eb4692b626379623329a0b54cbcca5316ff8e6c31d74378256808df58ed"} Apr 24 23:54:34.177308 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:34.177279 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l2htg" event={"ID":"9d910288-c4b3-4b19-9188-f9ded54fd92f","Type":"ContainerStarted","Data":"2b43b2457c4a69f37bd800303102989eb6c3c984094d035d03e91c128d72a8d3"} Apr 24 23:54:34.177442 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:34.177313 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l2htg" event={"ID":"9d910288-c4b3-4b19-9188-f9ded54fd92f","Type":"ContainerStarted","Data":"b991ed0de3cb4939332994cf6ceb5fee67b4fc7d2e775a0b294011bfe1a722ba"} Apr 24 23:54:34.177442 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:34.177403 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-l2htg" Apr 24 23:54:34.189933 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:34.189895 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-z6qcm" podStartSLOduration=1.489995266 podStartE2EDuration="3.189885341s" podCreationTimestamp="2026-04-24 23:54:31 +0000 UTC" firstStartedPulling="2026-04-24 23:54:31.920429767 +0000 UTC m=+54.554411685" lastFinishedPulling="2026-04-24 23:54:33.620319843 +0000 UTC m=+56.254301760" observedRunningTime="2026-04-24 23:54:34.189716012 +0000 UTC m=+56.823697950" watchObservedRunningTime="2026-04-24 23:54:34.189885341 +0000 UTC m=+56.823867277" Apr 24 23:54:34.205633 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:34.205585 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-l2htg" podStartSLOduration=1.383715921 podStartE2EDuration="3.205568324s" podCreationTimestamp="2026-04-24 23:54:31 +0000 UTC" firstStartedPulling="2026-04-24 
23:54:31.795105789 +0000 UTC m=+54.429087703" lastFinishedPulling="2026-04-24 23:54:33.616958188 +0000 UTC m=+56.250940106" observedRunningTime="2026-04-24 23:54:34.204389428 +0000 UTC m=+56.838371365" watchObservedRunningTime="2026-04-24 23:54:34.205568324 +0000 UTC m=+56.839550262"
Apr 24 23:54:36.127020 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:36.126993 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t4f4q"
Apr 24 23:54:37.749091 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:37.749053 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-l2htg_9d910288-c4b3-4b19-9188-f9ded54fd92f/dns/0.log"
Apr 24 23:54:37.922823 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:37.922795 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-l2htg_9d910288-c4b3-4b19-9188-f9ded54fd92f/kube-rbac-proxy/0.log"
Apr 24 23:54:38.926628 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:38.926595 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-s2pf5_b50de4c3-3440-4c81-81ac-23466ec3f726/dns-node-resolver/0.log"
Apr 24 23:54:39.323268 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:39.323244 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-946b4db85-hh7bl_899f41b7-00d1-44b2-bcba-0541dea9fcb3/registry/0.log"
Apr 24 23:54:39.923653 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:39.923629 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vdjkm_220c5498-d45f-48c2-a25e-01ac23225100/node-ca/0.log"
Apr 24 23:54:40.722389 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:40.722361 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-z6qcm_ed05dbf9-ea8c-41d5-ac86-6efec6560e64/serve-healthcheck-canary/0.log"
Apr 24 23:54:42.600591 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:42.600551 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-metrics-certs\") pod \"network-metrics-daemon-rhtrz\" (UID: \"d3fe756c-b2b5-42bc-8234-bd6d59e5dd29\") " pod="openshift-multus/network-metrics-daemon-rhtrz"
Apr 24 23:54:42.602778 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:42.602755 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 23:54:42.613925 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:42.613900 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3fe756c-b2b5-42bc-8234-bd6d59e5dd29-metrics-certs\") pod \"network-metrics-daemon-rhtrz\" (UID: \"d3fe756c-b2b5-42bc-8234-bd6d59e5dd29\") " pod="openshift-multus/network-metrics-daemon-rhtrz"
Apr 24 23:54:42.801959 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:42.801925 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xgh5\" (UniqueName: \"kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5\") pod \"network-check-target-w2qd9\" (UID: \"ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1\") " pod="openshift-network-diagnostics/network-check-target-w2qd9"
Apr 24 23:54:42.804276 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:42.804252 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 23:54:42.815242 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:42.815221 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 23:54:42.825851 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:42.825830 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xgh5\" (UniqueName: \"kubernetes.io/projected/ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1-kube-api-access-8xgh5\") pod \"network-check-target-w2qd9\" (UID: \"ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1\") " pod="openshift-network-diagnostics/network-check-target-w2qd9"
Apr 24 23:54:42.833943 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:42.833926 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-h8trl\""
Apr 24 23:54:42.841888 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:42.841869 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rhtrz"
Apr 24 23:54:42.963659 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:42.963629 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rhtrz"]
Apr 24 23:54:42.966617 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:54:42.966587 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3fe756c_b2b5_42bc_8234_bd6d59e5dd29.slice/crio-87e8abf2e706bb97c382679bd3502e104c1bb27864099ffa45aa8f29bdffbf75 WatchSource:0}: Error finding container 87e8abf2e706bb97c382679bd3502e104c1bb27864099ffa45aa8f29bdffbf75: Status 404 returned error can't find the container with id 87e8abf2e706bb97c382679bd3502e104c1bb27864099ffa45aa8f29bdffbf75
Apr 24 23:54:43.123560 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:43.123529 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-ghb4v\""
Apr 24 23:54:43.131660 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:43.131638 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w2qd9"
Apr 24 23:54:43.198710 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:43.198666 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rhtrz" event={"ID":"d3fe756c-b2b5-42bc-8234-bd6d59e5dd29","Type":"ContainerStarted","Data":"87e8abf2e706bb97c382679bd3502e104c1bb27864099ffa45aa8f29bdffbf75"}
Apr 24 23:54:43.249078 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:43.249045 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-w2qd9"]
Apr 24 23:54:43.252065 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:54:43.252036 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac28ac1b_bb45_4d9f_a544_3fa1e7fd33f1.slice/crio-ff9e7ce2600f134f82e74afed52274a6ead7f8e57f8576274ccc1d4a7ec27908 WatchSource:0}: Error finding container ff9e7ce2600f134f82e74afed52274a6ead7f8e57f8576274ccc1d4a7ec27908: Status 404 returned error can't find the container with id ff9e7ce2600f134f82e74afed52274a6ead7f8e57f8576274ccc1d4a7ec27908
Apr 24 23:54:44.181803 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:44.181774 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-l2htg"
Apr 24 23:54:44.201715 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:44.201683 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-w2qd9" event={"ID":"ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1","Type":"ContainerStarted","Data":"ff9e7ce2600f134f82e74afed52274a6ead7f8e57f8576274ccc1d4a7ec27908"}
Apr 24 23:54:44.903651 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:44.903617 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-4ps8n"]
Apr 24 23:54:44.906939 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:44.906918 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:44.909872 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:44.909601 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 23:54:44.909872 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:44.909617 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 23:54:44.909872 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:44.909636 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 23:54:44.909872 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:44.909644 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 23:54:44.910216 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:44.910200 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-m6xfv\""
Apr 24 23:54:44.910272 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:44.910251 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 23:54:44.910332 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:44.910304 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 23:54:45.017212 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.017178 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e360c868-68fa-4bd9-864f-093fce4cb0c5-node-exporter-wtmp\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.017212 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.017215 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e360c868-68fa-4bd9-864f-093fce4cb0c5-sys\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.017452 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.017270 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzptq\" (UniqueName: \"kubernetes.io/projected/e360c868-68fa-4bd9-864f-093fce4cb0c5-kube-api-access-xzptq\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.017452 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.017373 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e360c868-68fa-4bd9-864f-093fce4cb0c5-node-exporter-textfile\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.017452 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.017408 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e360c868-68fa-4bd9-864f-093fce4cb0c5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.017615 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.017482 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e360c868-68fa-4bd9-864f-093fce4cb0c5-root\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.017615 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.017506 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e360c868-68fa-4bd9-864f-093fce4cb0c5-node-exporter-tls\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.017615 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.017585 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e360c868-68fa-4bd9-864f-093fce4cb0c5-node-exporter-accelerators-collector-config\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.017749 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.017612 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e360c868-68fa-4bd9-864f-093fce4cb0c5-metrics-client-ca\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.118559 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.118516 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e360c868-68fa-4bd9-864f-093fce4cb0c5-node-exporter-accelerators-collector-config\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.118559 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.118557 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e360c868-68fa-4bd9-864f-093fce4cb0c5-metrics-client-ca\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.118792 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.118585 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e360c868-68fa-4bd9-864f-093fce4cb0c5-node-exporter-wtmp\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.118792 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.118610 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e360c868-68fa-4bd9-864f-093fce4cb0c5-sys\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.118792 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.118635 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzptq\" (UniqueName: \"kubernetes.io/projected/e360c868-68fa-4bd9-864f-093fce4cb0c5-kube-api-access-xzptq\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.118792 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.118663 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e360c868-68fa-4bd9-864f-093fce4cb0c5-node-exporter-textfile\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.118792 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.118690 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e360c868-68fa-4bd9-864f-093fce4cb0c5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.118792 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.118721 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e360c868-68fa-4bd9-864f-093fce4cb0c5-root\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.118792 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.118721 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e360c868-68fa-4bd9-864f-093fce4cb0c5-sys\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.118792 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.118738 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e360c868-68fa-4bd9-864f-093fce4cb0c5-node-exporter-tls\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.118792 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.118756 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e360c868-68fa-4bd9-864f-093fce4cb0c5-node-exporter-wtmp\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.119264 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.119149 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e360c868-68fa-4bd9-864f-093fce4cb0c5-root\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.119264 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.119193 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e360c868-68fa-4bd9-864f-093fce4cb0c5-metrics-client-ca\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.119394 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.119362 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e360c868-68fa-4bd9-864f-093fce4cb0c5-node-exporter-textfile\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.119660 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.119632 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e360c868-68fa-4bd9-864f-093fce4cb0c5-node-exporter-accelerators-collector-config\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.121614 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.121595 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e360c868-68fa-4bd9-864f-093fce4cb0c5-node-exporter-tls\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.121727 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.121633 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e360c868-68fa-4bd9-864f-093fce4cb0c5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.130727 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.130709 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzptq\" (UniqueName: \"kubernetes.io/projected/e360c868-68fa-4bd9-864f-093fce4cb0c5-kube-api-access-xzptq\") pod \"node-exporter-4ps8n\" (UID: \"e360c868-68fa-4bd9-864f-093fce4cb0c5\") " pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.207397 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.207302 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rhtrz" event={"ID":"d3fe756c-b2b5-42bc-8234-bd6d59e5dd29","Type":"ContainerStarted","Data":"b4a92446b43130d6d66bcb15eeffa2cf2747de572b0d6d92c328332f449bd39d"}
Apr 24 23:54:45.207397 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.207343 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rhtrz" event={"ID":"d3fe756c-b2b5-42bc-8234-bd6d59e5dd29","Type":"ContainerStarted","Data":"2178707b1ee963025f4827fc2b34b0d967ae67cbd56e34f429a08079cb13f20a"}
Apr 24 23:54:45.219338 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.219310 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4ps8n"
Apr 24 23:54:45.225651 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.225601 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rhtrz" podStartSLOduration=65.816727447 podStartE2EDuration="1m7.225583027s" podCreationTimestamp="2026-04-24 23:53:38 +0000 UTC" firstStartedPulling="2026-04-24 23:54:42.968374293 +0000 UTC m=+65.602356208" lastFinishedPulling="2026-04-24 23:54:44.37722986 +0000 UTC m=+67.011211788" observedRunningTime="2026-04-24 23:54:45.224572265 +0000 UTC m=+67.858554228" watchObservedRunningTime="2026-04-24 23:54:45.225583027 +0000 UTC m=+67.859564965"
Apr 24 23:54:45.937478 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:54:45.937448 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode360c868_68fa_4bd9_864f_093fce4cb0c5.slice/crio-39098d6f3d4d28027dc4ad41f6305802db25a0e023e510fdbbac4a18550cfb10 WatchSource:0}: Error finding container 39098d6f3d4d28027dc4ad41f6305802db25a0e023e510fdbbac4a18550cfb10: Status 404 returned error can't find the container with id 39098d6f3d4d28027dc4ad41f6305802db25a0e023e510fdbbac4a18550cfb10
Apr 24 23:54:45.971660 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.971631 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 23:54:45.979171 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.979145 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:45.982597 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.982575 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 24 23:54:45.982706 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.982583 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 24 23:54:45.982884 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.982867 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 24 23:54:45.983017 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.982981 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 24 23:54:45.983170 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.983101 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-b6hsp\""
Apr 24 23:54:45.983256 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.983234 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 24 23:54:45.983361 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.983343 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 24 23:54:45.983499 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.983453 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 24 23:54:45.983499 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.983483 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 24 23:54:45.983770 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.983754 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 24 23:54:45.995672 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:45.995648 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 23:54:46.026323 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.026295 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.026461 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.026331 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.026461 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.026360 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e07e2d64-a643-4e33-95e3-da4375fb0205-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.026461 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.026430 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85cqn\" (UniqueName: \"kubernetes.io/projected/e07e2d64-a643-4e33-95e3-da4375fb0205-kube-api-access-85cqn\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.026595 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.026476 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-config-volume\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.026595 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.026501 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e07e2d64-a643-4e33-95e3-da4375fb0205-config-out\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.026595 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.026522 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e07e2d64-a643-4e33-95e3-da4375fb0205-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.026595 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.026542 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-web-config\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.026595 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.026562 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.026763 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.026649 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.026763 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.026679 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e07e2d64-a643-4e33-95e3-da4375fb0205-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.026763 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.026731 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e07e2d64-a643-4e33-95e3-da4375fb0205-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.026871 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.026779 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.127654 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.127617 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e07e2d64-a643-4e33-95e3-da4375fb0205-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.127654 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.127655 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-web-config\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.127877 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.127673 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.127877 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.127695 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.127877 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.127713 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e07e2d64-a643-4e33-95e3-da4375fb0205-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.127877 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.127750 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e07e2d64-a643-4e33-95e3-da4375fb0205-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.127877 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.127806 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.127877 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.127837 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.127877 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.127859 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.128168 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:46.127886 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e07e2d64-a643-4e33-95e3-da4375fb0205-alertmanager-trusted-ca-bundle podName:e07e2d64-a643-4e33-95e3-da4375fb0205 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:46.62786027 +0000 UTC m=+69.261842397 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/e07e2d64-a643-4e33-95e3-da4375fb0205-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "e07e2d64-a643-4e33-95e3-da4375fb0205") : configmap references non-existent config key: ca-bundle.crt
Apr 24 23:54:46.128168 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.127936 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e07e2d64-a643-4e33-95e3-da4375fb0205-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.128168 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:46.127941 2578 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 24 23:54:46.128168 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.127969 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85cqn\" (UniqueName: \"kubernetes.io/projected/e07e2d64-a643-4e33-95e3-da4375fb0205-kube-api-access-85cqn\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.128168 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.128003 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-config-volume\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.128168 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:46.128027 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-main-tls podName:e07e2d64-a643-4e33-95e3-da4375fb0205 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:46.628009574 +0000 UTC m=+69.261991490 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "e07e2d64-a643-4e33-95e3-da4375fb0205") : secret "alertmanager-main-tls" not found
Apr 24 23:54:46.128168 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.128068 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e07e2d64-a643-4e33-95e3-da4375fb0205-config-out\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.128564 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.128238 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e07e2d64-a643-4e33-95e3-da4375fb0205-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.128564 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.128495 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e07e2d64-a643-4e33-95e3-da4375fb0205-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.131219 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.131189 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e07e2d64-a643-4e33-95e3-da4375fb0205-config-out\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.131219 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.131205 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-config-volume\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.131388 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.131264 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e07e2d64-a643-4e33-95e3-da4375fb0205-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.131388 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.131313 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.131388 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.131369 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.131755 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.131738 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.131755 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.131746 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-web-config\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.132070 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.132053 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.136235 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.136217 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85cqn\" (UniqueName: \"kubernetes.io/projected/e07e2d64-a643-4e33-95e3-da4375fb0205-kube-api-access-85cqn\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:46.211520 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.211437 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-w2qd9" event={"ID":"ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1","Type":"ContainerStarted","Data":"36ad2f229168ad781501207a009d1d147da91ccc30e85f80557990c7e6efad3f"}
Apr 24 23:54:46.211940 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.211646 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-w2qd9"
Apr 24 23:54:46.212562 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.212539 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4ps8n" event={"ID":"e360c868-68fa-4bd9-864f-093fce4cb0c5","Type":"ContainerStarted","Data":"39098d6f3d4d28027dc4ad41f6305802db25a0e023e510fdbbac4a18550cfb10"}
Apr 24 23:54:46.228217 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.228166 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-w2qd9" podStartSLOduration=65.497121581 podStartE2EDuration="1m8.228151914s" podCreationTimestamp="2026-04-24 23:53:38 +0000 UTC" firstStartedPulling="2026-04-24 23:54:43.25382873 +0000 UTC m=+65.887810644" lastFinishedPulling="2026-04-24 23:54:45.984859058 +0000 UTC m=+68.618840977" observedRunningTime="2026-04-24 23:54:46.226916394 +0000 UTC m=+68.860898331" watchObservedRunningTime="2026-04-24 23:54:46.228151914 +0000 UTC m=+68.862133851"
Apr 24 23:54:46.632550 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.632510 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:54:46.632736 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.632612 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e07e2d64-a643-4e33-95e3-da4375fb0205-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:54:46.632736 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:46.632676 2578 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 24 23:54:46.632851 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:54:46.632760 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-main-tls podName:e07e2d64-a643-4e33-95e3-da4375fb0205 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:47.632739177 +0000 UTC m=+70.266721091 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "e07e2d64-a643-4e33-95e3-da4375fb0205") : secret "alertmanager-main-tls" not found Apr 24 23:54:46.633228 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:46.633208 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e07e2d64-a643-4e33-95e3-da4375fb0205-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:54:47.188230 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.188188 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-rrznm"] Apr 24 23:54:47.191190 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.191164 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rrznm" Apr 24 23:54:47.193542 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.193520 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 23:54:47.193664 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.193555 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 23:54:47.193664 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.193563 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-r9wdl\"" Apr 24 23:54:47.194463 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.194448 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 23:54:47.194560 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.194480 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 23:54:47.202443 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.202405 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rrznm"] Apr 24 23:54:47.217171 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.217148 2578 generic.go:358] "Generic (PLEG): container finished" podID="e360c868-68fa-4bd9-864f-093fce4cb0c5" containerID="bf440f8fabfa3546af4ade4861c5d81a2525da35d38e3d35efcabe1561136173" exitCode=0 Apr 24 23:54:47.217518 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.217237 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4ps8n" event={"ID":"e360c868-68fa-4bd9-864f-093fce4cb0c5","Type":"ContainerDied","Data":"bf440f8fabfa3546af4ade4861c5d81a2525da35d38e3d35efcabe1561136173"} Apr 24 23:54:47.236604 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.236582 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bcc13d05-91af-4a72-98c9-e7de706cfb3c-data-volume\") pod \"insights-runtime-extractor-rrznm\" (UID: \"bcc13d05-91af-4a72-98c9-e7de706cfb3c\") " pod="openshift-insights/insights-runtime-extractor-rrznm" Apr 24 23:54:47.236711 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.236625 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bcc13d05-91af-4a72-98c9-e7de706cfb3c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rrznm\" (UID: \"bcc13d05-91af-4a72-98c9-e7de706cfb3c\") " pod="openshift-insights/insights-runtime-extractor-rrznm" Apr 24 23:54:47.236776 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.236706 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccm89\" (UniqueName: \"kubernetes.io/projected/bcc13d05-91af-4a72-98c9-e7de706cfb3c-kube-api-access-ccm89\") pod \"insights-runtime-extractor-rrznm\" (UID: \"bcc13d05-91af-4a72-98c9-e7de706cfb3c\") " pod="openshift-insights/insights-runtime-extractor-rrznm" Apr 24 23:54:47.236827 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.236794 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/bcc13d05-91af-4a72-98c9-e7de706cfb3c-crio-socket\") pod \"insights-runtime-extractor-rrznm\" (UID: \"bcc13d05-91af-4a72-98c9-e7de706cfb3c\") " pod="openshift-insights/insights-runtime-extractor-rrznm" Apr 24 23:54:47.236886 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.236869 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bcc13d05-91af-4a72-98c9-e7de706cfb3c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rrznm\" (UID: \"bcc13d05-91af-4a72-98c9-e7de706cfb3c\") " pod="openshift-insights/insights-runtime-extractor-rrznm" Apr 24 23:54:47.337787 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.337732 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bcc13d05-91af-4a72-98c9-e7de706cfb3c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rrznm\" (UID: \"bcc13d05-91af-4a72-98c9-e7de706cfb3c\") " pod="openshift-insights/insights-runtime-extractor-rrznm" Apr 24 23:54:47.337787 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.337779 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ccm89\" (UniqueName: \"kubernetes.io/projected/bcc13d05-91af-4a72-98c9-e7de706cfb3c-kube-api-access-ccm89\") pod \"insights-runtime-extractor-rrznm\" (UID: \"bcc13d05-91af-4a72-98c9-e7de706cfb3c\") " pod="openshift-insights/insights-runtime-extractor-rrznm" Apr 24 23:54:47.337966 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.337866 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bcc13d05-91af-4a72-98c9-e7de706cfb3c-crio-socket\") pod \"insights-runtime-extractor-rrznm\" (UID: \"bcc13d05-91af-4a72-98c9-e7de706cfb3c\") " pod="openshift-insights/insights-runtime-extractor-rrznm" Apr 24 23:54:47.337966 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.337946 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bcc13d05-91af-4a72-98c9-e7de706cfb3c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rrznm\" (UID: \"bcc13d05-91af-4a72-98c9-e7de706cfb3c\") " pod="openshift-insights/insights-runtime-extractor-rrznm" Apr 24 23:54:47.338070 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.338027 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bcc13d05-91af-4a72-98c9-e7de706cfb3c-data-volume\") pod \"insights-runtime-extractor-rrznm\" (UID: \"bcc13d05-91af-4a72-98c9-e7de706cfb3c\") " pod="openshift-insights/insights-runtime-extractor-rrznm" Apr 24 23:54:47.338668 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.338335 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bcc13d05-91af-4a72-98c9-e7de706cfb3c-data-volume\") pod \"insights-runtime-extractor-rrznm\" (UID: \"bcc13d05-91af-4a72-98c9-e7de706cfb3c\") " pod="openshift-insights/insights-runtime-extractor-rrznm" Apr 24 23:54:47.338668 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.338447 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bcc13d05-91af-4a72-98c9-e7de706cfb3c-crio-socket\") pod \"insights-runtime-extractor-rrznm\" (UID: 
\"bcc13d05-91af-4a72-98c9-e7de706cfb3c\") " pod="openshift-insights/insights-runtime-extractor-rrznm" Apr 24 23:54:47.339733 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.339703 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bcc13d05-91af-4a72-98c9-e7de706cfb3c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rrznm\" (UID: \"bcc13d05-91af-4a72-98c9-e7de706cfb3c\") " pod="openshift-insights/insights-runtime-extractor-rrznm" Apr 24 23:54:47.340209 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.340178 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bcc13d05-91af-4a72-98c9-e7de706cfb3c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rrznm\" (UID: \"bcc13d05-91af-4a72-98c9-e7de706cfb3c\") " pod="openshift-insights/insights-runtime-extractor-rrznm" Apr 24 23:54:47.345974 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.345954 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccm89\" (UniqueName: \"kubernetes.io/projected/bcc13d05-91af-4a72-98c9-e7de706cfb3c-kube-api-access-ccm89\") pod \"insights-runtime-extractor-rrznm\" (UID: \"bcc13d05-91af-4a72-98c9-e7de706cfb3c\") " pod="openshift-insights/insights-runtime-extractor-rrznm" Apr 24 23:54:47.500387 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.500308 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rrznm" Apr 24 23:54:47.566046 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.566017 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6cbfc9cdc5-hxhrp"] Apr 24 23:54:47.571334 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.570381 2578 util.go:30] "No sandbox for pod can be found. 
Apr 24 23:54:47.574106 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.574079 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-qmbm4\""
Apr 24 23:54:47.575256 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.574593 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 24 23:54:47.575256 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.574671 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 24 23:54:47.575256 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.574916 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 24 23:54:47.575256 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.574976 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 24 23:54:47.575256 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.575099 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 24 23:54:47.575621 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.575264 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 24 23:54:47.575621 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.575455 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 24 23:54:47.599969 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.599948 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cbfc9cdc5-hxhrp"]
Apr 24 23:54:47.619174 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.619147 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rrznm"]
Apr 24 23:54:47.621841 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:54:47.621818 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcc13d05_91af_4a72_98c9_e7de706cfb3c.slice/crio-8ef356e904990d8e69fdaf346dad023323cbb0b48d9b03ffdc5f43bf3444a744 WatchSource:0}: Error finding container 8ef356e904990d8e69fdaf346dad023323cbb0b48d9b03ffdc5f43bf3444a744: Status 404 returned error can't find the container with id 8ef356e904990d8e69fdaf346dad023323cbb0b48d9b03ffdc5f43bf3444a744
Apr 24 23:54:47.641725 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.641705 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/996408a8-eb49-4662-835d-8d0ec08e9dfe-console-oauth-config\") pod \"console-6cbfc9cdc5-hxhrp\" (UID: \"996408a8-eb49-4662-835d-8d0ec08e9dfe\") " pod="openshift-console/console-6cbfc9cdc5-hxhrp"
Apr 24 23:54:47.641861 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.641733 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/996408a8-eb49-4662-835d-8d0ec08e9dfe-oauth-serving-cert\") pod \"console-6cbfc9cdc5-hxhrp\" (UID: \"996408a8-eb49-4662-835d-8d0ec08e9dfe\") " pod="openshift-console/console-6cbfc9cdc5-hxhrp"
Apr 24 23:54:47.641861 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.641756 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/996408a8-eb49-4662-835d-8d0ec08e9dfe-console-serving-cert\") pod \"console-6cbfc9cdc5-hxhrp\" (UID: \"996408a8-eb49-4662-835d-8d0ec08e9dfe\") " pod="openshift-console/console-6cbfc9cdc5-hxhrp"
Apr 24 23:54:47.641861 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.641772 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt6c4\" (UniqueName: \"kubernetes.io/projected/996408a8-eb49-4662-835d-8d0ec08e9dfe-kube-api-access-jt6c4\") pod \"console-6cbfc9cdc5-hxhrp\" (UID: \"996408a8-eb49-4662-835d-8d0ec08e9dfe\") " pod="openshift-console/console-6cbfc9cdc5-hxhrp"
Apr 24 23:54:47.641861 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.641841 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/996408a8-eb49-4662-835d-8d0ec08e9dfe-service-ca\") pod \"console-6cbfc9cdc5-hxhrp\" (UID: \"996408a8-eb49-4662-835d-8d0ec08e9dfe\") " pod="openshift-console/console-6cbfc9cdc5-hxhrp"
Apr 24 23:54:47.641992 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.641885 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:47.641992 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.641908 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/996408a8-eb49-4662-835d-8d0ec08e9dfe-console-config\") pod \"console-6cbfc9cdc5-hxhrp\" (UID: \"996408a8-eb49-4662-835d-8d0ec08e9dfe\") " pod="openshift-console/console-6cbfc9cdc5-hxhrp"
Apr 24 23:54:47.644504 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.644486 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:47.743197 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.743173 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/996408a8-eb49-4662-835d-8d0ec08e9dfe-service-ca\") pod \"console-6cbfc9cdc5-hxhrp\" (UID: \"996408a8-eb49-4662-835d-8d0ec08e9dfe\") " pod="openshift-console/console-6cbfc9cdc5-hxhrp"
Apr 24 23:54:47.743351 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.743223 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/996408a8-eb49-4662-835d-8d0ec08e9dfe-console-config\") pod \"console-6cbfc9cdc5-hxhrp\" (UID: \"996408a8-eb49-4662-835d-8d0ec08e9dfe\") " pod="openshift-console/console-6cbfc9cdc5-hxhrp"
Apr 24 23:54:47.743351 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.743247 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/996408a8-eb49-4662-835d-8d0ec08e9dfe-console-oauth-config\") pod \"console-6cbfc9cdc5-hxhrp\" (UID: \"996408a8-eb49-4662-835d-8d0ec08e9dfe\") " pod="openshift-console/console-6cbfc9cdc5-hxhrp"
Apr 24 23:54:47.743351 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.743276 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/996408a8-eb49-4662-835d-8d0ec08e9dfe-oauth-serving-cert\") pod \"console-6cbfc9cdc5-hxhrp\" (UID: \"996408a8-eb49-4662-835d-8d0ec08e9dfe\") " pod="openshift-console/console-6cbfc9cdc5-hxhrp"
Apr 24 23:54:47.743351 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.743297 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/996408a8-eb49-4662-835d-8d0ec08e9dfe-console-serving-cert\") pod \"console-6cbfc9cdc5-hxhrp\" (UID: \"996408a8-eb49-4662-835d-8d0ec08e9dfe\") " pod="openshift-console/console-6cbfc9cdc5-hxhrp"
Apr 24 23:54:47.743351 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.743312 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jt6c4\" (UniqueName: \"kubernetes.io/projected/996408a8-eb49-4662-835d-8d0ec08e9dfe-kube-api-access-jt6c4\") pod \"console-6cbfc9cdc5-hxhrp\" (UID: \"996408a8-eb49-4662-835d-8d0ec08e9dfe\") " pod="openshift-console/console-6cbfc9cdc5-hxhrp"
Apr 24 23:54:47.743996 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.743953 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/996408a8-eb49-4662-835d-8d0ec08e9dfe-service-ca\") pod \"console-6cbfc9cdc5-hxhrp\" (UID: \"996408a8-eb49-4662-835d-8d0ec08e9dfe\") " pod="openshift-console/console-6cbfc9cdc5-hxhrp"
Apr 24 23:54:47.743996 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.743953 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/996408a8-eb49-4662-835d-8d0ec08e9dfe-console-config\") pod \"console-6cbfc9cdc5-hxhrp\" (UID: \"996408a8-eb49-4662-835d-8d0ec08e9dfe\") " pod="openshift-console/console-6cbfc9cdc5-hxhrp"
Apr 24 23:54:47.743996 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.743965 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/996408a8-eb49-4662-835d-8d0ec08e9dfe-oauth-serving-cert\") pod \"console-6cbfc9cdc5-hxhrp\" (UID: \"996408a8-eb49-4662-835d-8d0ec08e9dfe\") " pod="openshift-console/console-6cbfc9cdc5-hxhrp"
Apr 24 23:54:47.745751 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.745728 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/996408a8-eb49-4662-835d-8d0ec08e9dfe-console-oauth-config\") pod \"console-6cbfc9cdc5-hxhrp\" (UID: \"996408a8-eb49-4662-835d-8d0ec08e9dfe\") " pod="openshift-console/console-6cbfc9cdc5-hxhrp"
Apr 24 23:54:47.745880 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.745860 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/996408a8-eb49-4662-835d-8d0ec08e9dfe-console-serving-cert\") pod \"console-6cbfc9cdc5-hxhrp\" (UID: \"996408a8-eb49-4662-835d-8d0ec08e9dfe\") " pod="openshift-console/console-6cbfc9cdc5-hxhrp"
Apr 24 23:54:47.750636 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.750592 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt6c4\" (UniqueName: \"kubernetes.io/projected/996408a8-eb49-4662-835d-8d0ec08e9dfe-kube-api-access-jt6c4\") pod \"console-6cbfc9cdc5-hxhrp\" (UID: \"996408a8-eb49-4662-835d-8d0ec08e9dfe\") " pod="openshift-console/console-6cbfc9cdc5-hxhrp"
Apr 24 23:54:47.788632 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.788611 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:54:47.880751 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.880701 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cbfc9cdc5-hxhrp"
Apr 24 23:54:47.915570 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:47.915540 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 23:54:47.919362 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:54:47.919329 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode07e2d64_a643_4e33_95e3_da4375fb0205.slice/crio-6fad1828a003441fcdfd819cca2337f56669b821b0b73dd4a3e993be5d828830 WatchSource:0}: Error finding container 6fad1828a003441fcdfd819cca2337f56669b821b0b73dd4a3e993be5d828830: Status 404 returned error can't find the container with id 6fad1828a003441fcdfd819cca2337f56669b821b0b73dd4a3e993be5d828830
Apr 24 23:54:48.040159 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:48.040131 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cbfc9cdc5-hxhrp"]
Apr 24 23:54:48.043126 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:54:48.043097 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod996408a8_eb49_4662_835d_8d0ec08e9dfe.slice/crio-89ca7570c912771afd0b8c106561003317602e1da4b70ab35be5df8efcae913c WatchSource:0}: Error finding container 89ca7570c912771afd0b8c106561003317602e1da4b70ab35be5df8efcae913c: Status 404 returned error can't find the container with id 89ca7570c912771afd0b8c106561003317602e1da4b70ab35be5df8efcae913c
Apr 24 23:54:48.220775 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:48.220739 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cbfc9cdc5-hxhrp" event={"ID":"996408a8-eb49-4662-835d-8d0ec08e9dfe","Type":"ContainerStarted","Data":"89ca7570c912771afd0b8c106561003317602e1da4b70ab35be5df8efcae913c"}
Apr 24 23:54:48.222634 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:48.222607 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4ps8n" event={"ID":"e360c868-68fa-4bd9-864f-093fce4cb0c5","Type":"ContainerStarted","Data":"399c19d71ac6fdffdbae7f61e17ed066f8075ede31d53eb107f5bc43d885f570"}
Apr 24 23:54:48.222769 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:48.222642 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4ps8n" event={"ID":"e360c868-68fa-4bd9-864f-093fce4cb0c5","Type":"ContainerStarted","Data":"ef445552e418fcf92f8b91ecf105650ff7f7b2ccf6d2eb630fec581743a9688f"}
Apr 24 23:54:48.224168 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:48.224142 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0"
event={"ID":"e07e2d64-a643-4e33-95e3-da4375fb0205","Type":"ContainerStarted","Data":"6fad1828a003441fcdfd819cca2337f56669b821b0b73dd4a3e993be5d828830"} Apr 24 23:54:48.225320 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:48.225302 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rrznm" event={"ID":"bcc13d05-91af-4a72-98c9-e7de706cfb3c","Type":"ContainerStarted","Data":"cd9410572c1cffb60f3bbdf04507542c18874005e1af1ff08db11405cbd4120b"} Apr 24 23:54:48.225456 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:48.225324 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rrznm" event={"ID":"bcc13d05-91af-4a72-98c9-e7de706cfb3c","Type":"ContainerStarted","Data":"8ef356e904990d8e69fdaf346dad023323cbb0b48d9b03ffdc5f43bf3444a744"} Apr 24 23:54:48.241044 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:48.240998 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-4ps8n" podStartSLOduration=3.509976796 podStartE2EDuration="4.240984387s" podCreationTimestamp="2026-04-24 23:54:44 +0000 UTC" firstStartedPulling="2026-04-24 23:54:45.939392772 +0000 UTC m=+68.573374700" lastFinishedPulling="2026-04-24 23:54:46.670400373 +0000 UTC m=+69.304382291" observedRunningTime="2026-04-24 23:54:48.239804472 +0000 UTC m=+70.873786448" watchObservedRunningTime="2026-04-24 23:54:48.240984387 +0000 UTC m=+70.874966325" Apr 24 23:54:49.230664 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.230633 2578 generic.go:358] "Generic (PLEG): container finished" podID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerID="077f0c4c632a1cb0f174986f3c986c3601614ac72bd4a4741b05f4f7c9b21e20" exitCode=0 Apr 24 23:54:49.231035 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.230721 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e07e2d64-a643-4e33-95e3-da4375fb0205","Type":"ContainerDied","Data":"077f0c4c632a1cb0f174986f3c986c3601614ac72bd4a4741b05f4f7c9b21e20"} Apr 24 23:54:49.234071 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.234034 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rrznm" event={"ID":"bcc13d05-91af-4a72-98c9-e7de706cfb3c","Type":"ContainerStarted","Data":"0bf7ee09cef9cf7e794ee7ff46320563cdd9f7a15f8acb08aee57b97b9ae1bb5"} Apr 24 23:54:49.416519 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.416441 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5fb87d9599-nll7r"] Apr 24 23:54:49.419827 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.419800 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:54:49.422542 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.422323 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 24 23:54:49.422542 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.422454 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 23:54:49.422722 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.422682 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-1k44j118ofnr0\"" Apr 24 23:54:49.422782 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.422731 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 24 23:54:49.422954 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.422938 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 24 23:54:49.423025 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.422961 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-d2l8h\"" Apr 24 23:54:49.430975 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.430956 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5fb87d9599-nll7r"] Apr 24 23:54:49.562093 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.562058 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trr64\" (UniqueName: \"kubernetes.io/projected/ffe4cebf-4220-4ce3-bbbb-19bf7016f72a-kube-api-access-trr64\") pod \"metrics-server-5fb87d9599-nll7r\" (UID: \"ffe4cebf-4220-4ce3-bbbb-19bf7016f72a\") " pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:54:49.562254 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.562104 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ffe4cebf-4220-4ce3-bbbb-19bf7016f72a-audit-log\") pod \"metrics-server-5fb87d9599-nll7r\" (UID: \"ffe4cebf-4220-4ce3-bbbb-19bf7016f72a\") " pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:54:49.562254 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.562151 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffe4cebf-4220-4ce3-bbbb-19bf7016f72a-client-ca-bundle\") pod \"metrics-server-5fb87d9599-nll7r\" (UID: \"ffe4cebf-4220-4ce3-bbbb-19bf7016f72a\") " pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:54:49.562254 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.562190 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/ffe4cebf-4220-4ce3-bbbb-19bf7016f72a-secret-metrics-server-client-certs\") pod \"metrics-server-5fb87d9599-nll7r\" (UID: \"ffe4cebf-4220-4ce3-bbbb-19bf7016f72a\") " pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:54:49.562254 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.562208 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffe4cebf-4220-4ce3-bbbb-19bf7016f72a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5fb87d9599-nll7r\" (UID: \"ffe4cebf-4220-4ce3-bbbb-19bf7016f72a\") " pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:54:49.562254 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.562227 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ffe4cebf-4220-4ce3-bbbb-19bf7016f72a-metrics-server-audit-profiles\") pod \"metrics-server-5fb87d9599-nll7r\" (UID: \"ffe4cebf-4220-4ce3-bbbb-19bf7016f72a\") " pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:54:49.562456 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.562254 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ffe4cebf-4220-4ce3-bbbb-19bf7016f72a-secret-metrics-server-tls\") pod \"metrics-server-5fb87d9599-nll7r\" (UID: \"ffe4cebf-4220-4ce3-bbbb-19bf7016f72a\") " pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:54:49.663158 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.663102 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ffe4cebf-4220-4ce3-bbbb-19bf7016f72a-metrics-server-audit-profiles\") pod \"metrics-server-5fb87d9599-nll7r\" (UID: \"ffe4cebf-4220-4ce3-bbbb-19bf7016f72a\") " pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:54:49.663158 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.663156 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ffe4cebf-4220-4ce3-bbbb-19bf7016f72a-secret-metrics-server-tls\") pod \"metrics-server-5fb87d9599-nll7r\" (UID: \"ffe4cebf-4220-4ce3-bbbb-19bf7016f72a\") " pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:54:49.663388 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.663196 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trr64\" (UniqueName: \"kubernetes.io/projected/ffe4cebf-4220-4ce3-bbbb-19bf7016f72a-kube-api-access-trr64\") pod \"metrics-server-5fb87d9599-nll7r\" (UID: \"ffe4cebf-4220-4ce3-bbbb-19bf7016f72a\") " pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:54:49.663388 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.663228 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ffe4cebf-4220-4ce3-bbbb-19bf7016f72a-audit-log\") pod \"metrics-server-5fb87d9599-nll7r\" (UID: \"ffe4cebf-4220-4ce3-bbbb-19bf7016f72a\") " pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:54:49.663388 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.663296 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffe4cebf-4220-4ce3-bbbb-19bf7016f72a-client-ca-bundle\") pod \"metrics-server-5fb87d9599-nll7r\" (UID: \"ffe4cebf-4220-4ce3-bbbb-19bf7016f72a\") " pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:54:49.663388 ip-10-0-129-98 kubenswrapper[2578]: I0424 
23:54:49.663342 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/ffe4cebf-4220-4ce3-bbbb-19bf7016f72a-secret-metrics-server-client-certs\") pod \"metrics-server-5fb87d9599-nll7r\" (UID: \"ffe4cebf-4220-4ce3-bbbb-19bf7016f72a\") " pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:54:49.663388 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.663370 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffe4cebf-4220-4ce3-bbbb-19bf7016f72a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5fb87d9599-nll7r\" (UID: \"ffe4cebf-4220-4ce3-bbbb-19bf7016f72a\") " pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:54:49.664353 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.663816 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ffe4cebf-4220-4ce3-bbbb-19bf7016f72a-audit-log\") pod \"metrics-server-5fb87d9599-nll7r\" (UID: \"ffe4cebf-4220-4ce3-bbbb-19bf7016f72a\") " pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:54:49.664968 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.664915 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffe4cebf-4220-4ce3-bbbb-19bf7016f72a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5fb87d9599-nll7r\" (UID: \"ffe4cebf-4220-4ce3-bbbb-19bf7016f72a\") " pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:54:49.665205 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.665184 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ffe4cebf-4220-4ce3-bbbb-19bf7016f72a-metrics-server-audit-profiles\") pod \"metrics-server-5fb87d9599-nll7r\" (UID: \"ffe4cebf-4220-4ce3-bbbb-19bf7016f72a\") " pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:54:49.666593 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.666540 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffe4cebf-4220-4ce3-bbbb-19bf7016f72a-client-ca-bundle\") pod \"metrics-server-5fb87d9599-nll7r\" (UID: \"ffe4cebf-4220-4ce3-bbbb-19bf7016f72a\") " pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:54:49.666681 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.666637 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ffe4cebf-4220-4ce3-bbbb-19bf7016f72a-secret-metrics-server-tls\") pod \"metrics-server-5fb87d9599-nll7r\" (UID: \"ffe4cebf-4220-4ce3-bbbb-19bf7016f72a\") " pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:54:49.666748 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.666685 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/ffe4cebf-4220-4ce3-bbbb-19bf7016f72a-secret-metrics-server-client-certs\") pod \"metrics-server-5fb87d9599-nll7r\" (UID: \"ffe4cebf-4220-4ce3-bbbb-19bf7016f72a\") " pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:54:49.679255 ip-10-0-129-98 kubenswrapper[2578]: I0424 
23:54:49.679234 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trr64\" (UniqueName: \"kubernetes.io/projected/ffe4cebf-4220-4ce3-bbbb-19bf7016f72a-kube-api-access-trr64\") pod \"metrics-server-5fb87d9599-nll7r\" (UID: \"ffe4cebf-4220-4ce3-bbbb-19bf7016f72a\") " pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:54:49.733078 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.733050 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:54:49.988066 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:49.988039 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5fb87d9599-nll7r"] Apr 24 23:54:50.160430 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.160374 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-8d467dc86-ltlcr"] Apr 24 23:54:50.163926 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.163893 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.168123 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.168054 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 24 23:54:50.168305 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.168127 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 24 23:54:50.168709 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.168683 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 24 23:54:50.169005 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.168988 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-lrwk6\"" Apr 24 23:54:50.169222 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.169207 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 24 23:54:50.169347 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.169329 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 24 23:54:50.175172 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.175150 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 24 23:54:50.183210 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.183185 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-8d467dc86-ltlcr"] Apr 24 23:54:50.238044 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.238012 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rrznm" event={"ID":"bcc13d05-91af-4a72-98c9-e7de706cfb3c","Type":"ContainerStarted","Data":"dc05468f8cba974677003834829fde02345b4b9ad6660eee4dcf490ccb1e72d7"} Apr 24 23:54:50.257065 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.257015 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-rrznm" podStartSLOduration=1.044206297 podStartE2EDuration="3.256997806s" 
podCreationTimestamp="2026-04-24 23:54:47 +0000 UTC" firstStartedPulling="2026-04-24 23:54:47.67743758 +0000 UTC m=+70.311419494" lastFinishedPulling="2026-04-24 23:54:49.890229075 +0000 UTC m=+72.524211003" observedRunningTime="2026-04-24 23:54:50.256402386 +0000 UTC m=+72.890384323" watchObservedRunningTime="2026-04-24 23:54:50.256997806 +0000 UTC m=+72.890979744" Apr 24 23:54:50.267761 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.267731 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/55e30091-910e-4fab-9cad-4ef17aa7f6f6-secret-telemeter-client\") pod \"telemeter-client-8d467dc86-ltlcr\" (UID: \"55e30091-910e-4fab-9cad-4ef17aa7f6f6\") " pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.267877 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.267770 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4g8d\" (UniqueName: \"kubernetes.io/projected/55e30091-910e-4fab-9cad-4ef17aa7f6f6-kube-api-access-c4g8d\") pod \"telemeter-client-8d467dc86-ltlcr\" (UID: \"55e30091-910e-4fab-9cad-4ef17aa7f6f6\") " pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.267877 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.267820 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/55e30091-910e-4fab-9cad-4ef17aa7f6f6-federate-client-tls\") pod \"telemeter-client-8d467dc86-ltlcr\" (UID: \"55e30091-910e-4fab-9cad-4ef17aa7f6f6\") " pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.267877 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.267840 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55e30091-910e-4fab-9cad-4ef17aa7f6f6-telemeter-trusted-ca-bundle\") pod \"telemeter-client-8d467dc86-ltlcr\" (UID: \"55e30091-910e-4fab-9cad-4ef17aa7f6f6\") " pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.268020 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.267893 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/55e30091-910e-4fab-9cad-4ef17aa7f6f6-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-8d467dc86-ltlcr\" (UID: \"55e30091-910e-4fab-9cad-4ef17aa7f6f6\") " pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.268020 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.267940 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55e30091-910e-4fab-9cad-4ef17aa7f6f6-serving-certs-ca-bundle\") pod \"telemeter-client-8d467dc86-ltlcr\" (UID: \"55e30091-910e-4fab-9cad-4ef17aa7f6f6\") " pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.268020 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.267981 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/55e30091-910e-4fab-9cad-4ef17aa7f6f6-telemeter-client-tls\") pod \"telemeter-client-8d467dc86-ltlcr\" (UID: 
\"55e30091-910e-4fab-9cad-4ef17aa7f6f6\") " pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.268020 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.268014 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/55e30091-910e-4fab-9cad-4ef17aa7f6f6-metrics-client-ca\") pod \"telemeter-client-8d467dc86-ltlcr\" (UID: \"55e30091-910e-4fab-9cad-4ef17aa7f6f6\") " pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.369433 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.369381 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/55e30091-910e-4fab-9cad-4ef17aa7f6f6-federate-client-tls\") pod \"telemeter-client-8d467dc86-ltlcr\" (UID: \"55e30091-910e-4fab-9cad-4ef17aa7f6f6\") " pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.369672 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.369450 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55e30091-910e-4fab-9cad-4ef17aa7f6f6-telemeter-trusted-ca-bundle\") pod \"telemeter-client-8d467dc86-ltlcr\" (UID: \"55e30091-910e-4fab-9cad-4ef17aa7f6f6\") " pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.369672 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.369489 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/55e30091-910e-4fab-9cad-4ef17aa7f6f6-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-8d467dc86-ltlcr\" (UID: \"55e30091-910e-4fab-9cad-4ef17aa7f6f6\") " pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.369672 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.369563 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55e30091-910e-4fab-9cad-4ef17aa7f6f6-serving-certs-ca-bundle\") pod \"telemeter-client-8d467dc86-ltlcr\" (UID: \"55e30091-910e-4fab-9cad-4ef17aa7f6f6\") " pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.369672 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.369591 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/55e30091-910e-4fab-9cad-4ef17aa7f6f6-telemeter-client-tls\") pod \"telemeter-client-8d467dc86-ltlcr\" (UID: \"55e30091-910e-4fab-9cad-4ef17aa7f6f6\") " pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.369672 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.369618 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/55e30091-910e-4fab-9cad-4ef17aa7f6f6-metrics-client-ca\") pod \"telemeter-client-8d467dc86-ltlcr\" (UID: \"55e30091-910e-4fab-9cad-4ef17aa7f6f6\") " pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.369936 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.369683 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/55e30091-910e-4fab-9cad-4ef17aa7f6f6-secret-telemeter-client\") pod 
\"telemeter-client-8d467dc86-ltlcr\" (UID: \"55e30091-910e-4fab-9cad-4ef17aa7f6f6\") " pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.369936 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.369708 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c4g8d\" (UniqueName: \"kubernetes.io/projected/55e30091-910e-4fab-9cad-4ef17aa7f6f6-kube-api-access-c4g8d\") pod \"telemeter-client-8d467dc86-ltlcr\" (UID: \"55e30091-910e-4fab-9cad-4ef17aa7f6f6\") " pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.370633 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.370599 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55e30091-910e-4fab-9cad-4ef17aa7f6f6-serving-certs-ca-bundle\") pod \"telemeter-client-8d467dc86-ltlcr\" (UID: \"55e30091-910e-4fab-9cad-4ef17aa7f6f6\") " pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.370744 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.370695 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55e30091-910e-4fab-9cad-4ef17aa7f6f6-telemeter-trusted-ca-bundle\") pod \"telemeter-client-8d467dc86-ltlcr\" (UID: \"55e30091-910e-4fab-9cad-4ef17aa7f6f6\") " pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.370815 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.370771 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/55e30091-910e-4fab-9cad-4ef17aa7f6f6-metrics-client-ca\") pod \"telemeter-client-8d467dc86-ltlcr\" (UID: \"55e30091-910e-4fab-9cad-4ef17aa7f6f6\") " pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.373258 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.373235 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/55e30091-910e-4fab-9cad-4ef17aa7f6f6-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-8d467dc86-ltlcr\" (UID: \"55e30091-910e-4fab-9cad-4ef17aa7f6f6\") " pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.373515 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.373472 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/55e30091-910e-4fab-9cad-4ef17aa7f6f6-secret-telemeter-client\") pod \"telemeter-client-8d467dc86-ltlcr\" (UID: \"55e30091-910e-4fab-9cad-4ef17aa7f6f6\") " pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.374151 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.374079 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/55e30091-910e-4fab-9cad-4ef17aa7f6f6-federate-client-tls\") pod \"telemeter-client-8d467dc86-ltlcr\" (UID: \"55e30091-910e-4fab-9cad-4ef17aa7f6f6\") " pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.376082 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.376058 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/55e30091-910e-4fab-9cad-4ef17aa7f6f6-telemeter-client-tls\") pod \"telemeter-client-8d467dc86-ltlcr\" (UID: 
\"55e30091-910e-4fab-9cad-4ef17aa7f6f6\") " pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.379674 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.379648 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4g8d\" (UniqueName: \"kubernetes.io/projected/55e30091-910e-4fab-9cad-4ef17aa7f6f6-kube-api-access-c4g8d\") pod \"telemeter-client-8d467dc86-ltlcr\" (UID: \"55e30091-910e-4fab-9cad-4ef17aa7f6f6\") " pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:50.477491 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:50.477300 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" Apr 24 23:54:51.264871 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.264841 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 23:54:51.268738 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.268715 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.271828 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.271804 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 23:54:51.271959 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.271804 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 23:54:51.272136 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.272076 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 23:54:51.272292 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.272278 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 23:54:51.272367 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.272332 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 23:54:51.272553 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.272533 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 23:54:51.272671 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.272553 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 23:54:51.272941 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.272924 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 23:54:51.273702 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.273654 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 23:54:51.274893 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.274873 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-a51i7k9kpjgm0\"" Apr 24 23:54:51.275233 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.275214 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-tjhn5\"" Apr 24 23:54:51.275693 
ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.275667 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 23:54:51.277236 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.276717 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 23:54:51.288752 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.288728 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 23:54:51.291529 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.291506 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 23:54:51.363646 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:54:51.363619 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffe4cebf_4220_4ce3_bbbb_19bf7016f72a.slice/crio-4755c636739af2a032103660c3bebae4590d5f0630f55562dadf95c1ed069e3e WatchSource:0}: Error finding container 4755c636739af2a032103660c3bebae4590d5f0630f55562dadf95c1ed069e3e: Status 404 returned error can't find the container with id 4755c636739af2a032103660c3bebae4590d5f0630f55562dadf95c1ed069e3e Apr 24 23:54:51.378456 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.378430 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/84cf5627-5273-43e3-99ae-b6ab2371aa69-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.378556 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.378469 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.378556 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.378503 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/84cf5627-5273-43e3-99ae-b6ab2371aa69-config-out\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.378556 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.378533 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-config\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.378772 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.378562 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/84cf5627-5273-43e3-99ae-b6ab2371aa69-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.378772 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.378588 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.378772 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.378612 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.378772 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.378634 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.378772 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.378716 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-web-config\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.378772 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.378756 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.379016 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.378780 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.379016 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.378808 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.379016 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.378856 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.379016 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.378909 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.379016 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.378957 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhmnn\" (UniqueName: \"kubernetes.io/projected/84cf5627-5273-43e3-99ae-b6ab2371aa69-kube-api-access-hhmnn\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.379016 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.378993 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.379016 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.379010 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.379275 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.379032 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.480663 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.480632 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hhmnn\" (UniqueName: \"kubernetes.io/projected/84cf5627-5273-43e3-99ae-b6ab2371aa69-kube-api-access-hhmnn\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.482739 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.482237 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.482739 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.482304 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.482739 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.482344 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.482739 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.482391 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/84cf5627-5273-43e3-99ae-b6ab2371aa69-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.482739 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.482447 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.482739 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.482485 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/84cf5627-5273-43e3-99ae-b6ab2371aa69-config-out\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.482739 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.482522 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-config\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.482739 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.482552 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/84cf5627-5273-43e3-99ae-b6ab2371aa69-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.482739 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.482580 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.482739 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.482615 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.482739 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.482651 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.482739 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.482711 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-web-config\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.483582 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.482769 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.483582 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.482799 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.483582 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.482836 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.483582 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.482936 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.483582 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.482973 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.486081 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.483978 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/84cf5627-5273-43e3-99ae-b6ab2371aa69-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.486081 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.485109 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.486081 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.485955 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.487647 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.487217 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.487647 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.487471 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.489065 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.489038 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.491706 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.491684 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.494598 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.491807 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.494598 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.492052 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.494598 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.492719 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/84cf5627-5273-43e3-99ae-b6ab2371aa69-config-out\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.494598 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.493373 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.494598 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.493407 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.494598 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.493387 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/84cf5627-5273-43e3-99ae-b6ab2371aa69-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.494598 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.494440 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.495579 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.495555 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.496303 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.496261 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-config\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.496535 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.496504 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-web-config\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.500718 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.500688 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhmnn\" (UniqueName: \"kubernetes.io/projected/84cf5627-5273-43e3-99ae-b6ab2371aa69-kube-api-access-hhmnn\") pod \"prometheus-k8s-0\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.562283 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.562252 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-8d467dc86-ltlcr"] Apr 24 23:54:51.565820 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:54:51.565626 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55e30091_910e_4fab_9cad_4ef17aa7f6f6.slice/crio-0f1408aefb34a891c90adeed234941b25fb0532e726dc1cd9c2789d1090c24ce WatchSource:0}: Error finding container 0f1408aefb34a891c90adeed234941b25fb0532e726dc1cd9c2789d1090c24ce: Status 404 returned error can't find the container with id 0f1408aefb34a891c90adeed234941b25fb0532e726dc1cd9c2789d1090c24ce Apr 24 23:54:51.580006 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.579914 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:54:51.583845 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.583818 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret\") pod \"global-pull-secret-syncer-26g8h\" (UID: \"532cdfc7-fd38-495f-b85d-70daea2998a1\") " pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:54:51.586323 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.586300 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 23:54:51.597155 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.597126 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/532cdfc7-fd38-495f-b85d-70daea2998a1-original-pull-secret\") pod \"global-pull-secret-syncer-26g8h\" (UID: \"532cdfc7-fd38-495f-b85d-70daea2998a1\") " pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:54:51.717211 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.717170 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 23:54:51.719776 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:54:51.719747 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84cf5627_5273_43e3_99ae_b6ab2371aa69.slice/crio-ec014283f674c62ee42133eb239fee1a7936360ef62f5673e73245bea582e034 WatchSource:0}: Error finding container ec014283f674c62ee42133eb239fee1a7936360ef62f5673e73245bea582e034: Status 404 returned error can't find the container with id ec014283f674c62ee42133eb239fee1a7936360ef62f5673e73245bea582e034 Apr 24 23:54:51.838602 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.838569 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-26g8h" Apr 24 23:54:51.950183 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:51.950152 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-26g8h"] Apr 24 23:54:51.954164 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:54:51.954136 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod532cdfc7_fd38_495f_b85d_70daea2998a1.slice/crio-cf720d4e4e03a68afba9a58c07a120fb853df297b2bdb889fa6b05f01e8a14e7 WatchSource:0}: Error finding container cf720d4e4e03a68afba9a58c07a120fb853df297b2bdb889fa6b05f01e8a14e7: Status 404 returned error can't find the container with id cf720d4e4e03a68afba9a58c07a120fb853df297b2bdb889fa6b05f01e8a14e7 Apr 24 23:54:52.246216 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:52.246139 2578 generic.go:358] "Generic (PLEG): container finished" podID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerID="9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b" exitCode=0 Apr 24 23:54:52.246364 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:52.246217 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"84cf5627-5273-43e3-99ae-b6ab2371aa69","Type":"ContainerDied","Data":"9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b"} Apr 24 23:54:52.246364 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:52.246243 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"84cf5627-5273-43e3-99ae-b6ab2371aa69","Type":"ContainerStarted","Data":"ec014283f674c62ee42133eb239fee1a7936360ef62f5673e73245bea582e034"} Apr 24 23:54:52.247509 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:52.247401 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" event={"ID":"55e30091-910e-4fab-9cad-4ef17aa7f6f6","Type":"ContainerStarted","Data":"0f1408aefb34a891c90adeed234941b25fb0532e726dc1cd9c2789d1090c24ce"} Apr 24 23:54:52.250369 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:52.250348 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e07e2d64-a643-4e33-95e3-da4375fb0205","Type":"ContainerStarted","Data":"99294f0747c4126f8fe9006d9730b80faae6830a85b9343bc357aef8341732cf"} Apr 24 23:54:52.250490 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:52.250375 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e07e2d64-a643-4e33-95e3-da4375fb0205","Type":"ContainerStarted","Data":"a81c532dabecc5247289b65a6bf8e85e15fa634465a91fc1b43d3ff71e502536"} Apr 24 23:54:52.250490 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:52.250388 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e07e2d64-a643-4e33-95e3-da4375fb0205","Type":"ContainerStarted","Data":"548c16366e57ef7df4d8c50d6080ec4c0037725883316f998ce434c0fa70418a"} Apr 24 23:54:52.250490 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:52.250399 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e07e2d64-a643-4e33-95e3-da4375fb0205","Type":"ContainerStarted","Data":"86ad16442cc1e55cfc193dcd83a5d5229e04d198c6daed2b2f59117d3e2a31f5"} Apr 24 23:54:52.250490 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:52.250427 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e07e2d64-a643-4e33-95e3-da4375fb0205","Type":"ContainerStarted","Data":"bfb58deac7bb137f563fa61d2941426ef3cab29d69cd97ab4425562ec2b95d9f"} Apr 24 23:54:52.251775 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:52.251742 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" event={"ID":"ffe4cebf-4220-4ce3-bbbb-19bf7016f72a","Type":"ContainerStarted","Data":"4755c636739af2a032103660c3bebae4590d5f0630f55562dadf95c1ed069e3e"} Apr 24 23:54:52.253304 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:52.253283 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cbfc9cdc5-hxhrp" event={"ID":"996408a8-eb49-4662-835d-8d0ec08e9dfe","Type":"ContainerStarted","Data":"60704e5252805ae0b66a639a6d2d70a51f0873271ae7e4f475f017a0bdae61e5"} Apr 24 23:54:52.254454 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:52.254433 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-26g8h" event={"ID":"532cdfc7-fd38-495f-b85d-70daea2998a1","Type":"ContainerStarted","Data":"cf720d4e4e03a68afba9a58c07a120fb853df297b2bdb889fa6b05f01e8a14e7"} Apr 24 23:54:52.292855 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:52.292173 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6cbfc9cdc5-hxhrp" podStartSLOduration=1.911404675 podStartE2EDuration="5.292160249s" podCreationTimestamp="2026-04-24 23:54:47 +0000 UTC" firstStartedPulling="2026-04-24 23:54:48.045070603 +0000 UTC m=+70.679052516" lastFinishedPulling="2026-04-24 23:54:51.425826162 +0000 UTC m=+74.059808090" observedRunningTime="2026-04-24 23:54:52.290662475 +0000 UTC m=+74.924644412" watchObservedRunningTime="2026-04-24 23:54:52.292160249 +0000 UTC m=+74.926142186" Apr 24 23:54:53.177619 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:53.177589 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-946b4db85-hh7bl" Apr 24 23:54:54.264466 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:54.263568 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" event={"ID":"55e30091-910e-4fab-9cad-4ef17aa7f6f6","Type":"ContainerStarted","Data":"7869f9ebf381ba9b13c7108f58dcc1ee80546423e46f5eadcf864b8c66673968"} Apr 24 23:54:54.264466 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:54.263624 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" event={"ID":"55e30091-910e-4fab-9cad-4ef17aa7f6f6","Type":"ContainerStarted","Data":"57bdbd07ab146b99c76b27a06cbe54f036e00ae524473f30d165ad132c9c110c"} Apr 24 23:54:54.264466 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:54.263638 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" event={"ID":"55e30091-910e-4fab-9cad-4ef17aa7f6f6","Type":"ContainerStarted","Data":"3456168e74fd4c58677f5c5088f793db5a5918c3e7ac26c39cfab6e177fb6302"} Apr 24 23:54:54.270205 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:54.269364 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e07e2d64-a643-4e33-95e3-da4375fb0205","Type":"ContainerStarted","Data":"1786a9028ae6b3fcc402e6c94248d9c5a98605ff4f63d0bfa93f1d19e3bae21b"} Apr 24 23:54:54.273102 ip-10-0-129-98 kubenswrapper[2578]: I0424 
23:54:54.272605 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" event={"ID":"ffe4cebf-4220-4ce3-bbbb-19bf7016f72a","Type":"ContainerStarted","Data":"5df71c9b73c86b38bba9297de876dc3b8ae1e554e48d456dd24ec476f1b11e3c"} Apr 24 23:54:54.287462 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:54.285759 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-8d467dc86-ltlcr" podStartSLOduration=2.110184531 podStartE2EDuration="4.285742727s" podCreationTimestamp="2026-04-24 23:54:50 +0000 UTC" firstStartedPulling="2026-04-24 23:54:51.567831968 +0000 UTC m=+74.201813883" lastFinishedPulling="2026-04-24 23:54:53.743390152 +0000 UTC m=+76.377372079" observedRunningTime="2026-04-24 23:54:54.285579303 +0000 UTC m=+76.919561234" watchObservedRunningTime="2026-04-24 23:54:54.285742727 +0000 UTC m=+76.919724668" Apr 24 23:54:54.311903 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:54.310477 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.493868913 podStartE2EDuration="9.310460092s" podCreationTimestamp="2026-04-24 23:54:45 +0000 UTC" firstStartedPulling="2026-04-24 23:54:47.921284923 +0000 UTC m=+70.555266840" lastFinishedPulling="2026-04-24 23:54:53.73787609 +0000 UTC m=+76.371858019" observedRunningTime="2026-04-24 23:54:54.309028472 +0000 UTC m=+76.943010408" watchObservedRunningTime="2026-04-24 23:54:54.310460092 +0000 UTC m=+76.944442030" Apr 24 23:54:54.327561 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:54.326752 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" podStartSLOduration=2.978882676 podStartE2EDuration="5.326735562s" podCreationTimestamp="2026-04-24 23:54:49 +0000 UTC" firstStartedPulling="2026-04-24 23:54:51.389399683 +0000 UTC m=+74.023381597" lastFinishedPulling="2026-04-24 23:54:53.737252564 +0000 UTC m=+76.371234483" observedRunningTime="2026-04-24 23:54:54.325365772 +0000 UTC m=+76.959347709" watchObservedRunningTime="2026-04-24 23:54:54.326735562 +0000 UTC m=+76.960717500" Apr 24 23:54:57.284154 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:57.284110 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-26g8h" event={"ID":"532cdfc7-fd38-495f-b85d-70daea2998a1","Type":"ContainerStarted","Data":"06c451a3269d6082413eeb709132705c341a678a794133a46aae273016d713f4"} Apr 24 23:54:57.286026 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:57.286002 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"84cf5627-5273-43e3-99ae-b6ab2371aa69","Type":"ContainerStarted","Data":"037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982"} Apr 24 23:54:57.286139 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:57.286030 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"84cf5627-5273-43e3-99ae-b6ab2371aa69","Type":"ContainerStarted","Data":"4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926"} Apr 24 23:54:57.298409 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:57.298369 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-26g8h" podStartSLOduration=65.301952175 podStartE2EDuration="1m10.298357262s" podCreationTimestamp="2026-04-24 23:53:47 +0000 UTC" 
firstStartedPulling="2026-04-24 23:54:51.956116731 +0000 UTC m=+74.590098644" lastFinishedPulling="2026-04-24 23:54:56.952521814 +0000 UTC m=+79.586503731" observedRunningTime="2026-04-24 23:54:57.297785868 +0000 UTC m=+79.931767808" watchObservedRunningTime="2026-04-24 23:54:57.298357262 +0000 UTC m=+79.932339198" Apr 24 23:54:57.880804 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:57.880771 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6cbfc9cdc5-hxhrp" Apr 24 23:54:57.881233 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:57.881208 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6cbfc9cdc5-hxhrp" Apr 24 23:54:57.886392 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:57.886368 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6cbfc9cdc5-hxhrp" Apr 24 23:54:58.294257 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:58.294227 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6cbfc9cdc5-hxhrp" Apr 24 23:54:59.296048 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:59.296009 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"84cf5627-5273-43e3-99ae-b6ab2371aa69","Type":"ContainerStarted","Data":"90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600"} Apr 24 23:54:59.296048 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:59.296047 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"84cf5627-5273-43e3-99ae-b6ab2371aa69","Type":"ContainerStarted","Data":"13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574"} Apr 24 23:54:59.296048 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:59.296059 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"84cf5627-5273-43e3-99ae-b6ab2371aa69","Type":"ContainerStarted","Data":"f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f"} Apr 24 23:54:59.296626 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:59.296067 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"84cf5627-5273-43e3-99ae-b6ab2371aa69","Type":"ContainerStarted","Data":"a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe"} Apr 24 23:54:59.327565 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:54:59.327516 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.9499865029999999 podStartE2EDuration="8.327498441s" podCreationTimestamp="2026-04-24 23:54:51 +0000 UTC" firstStartedPulling="2026-04-24 23:54:52.247662745 +0000 UTC m=+74.881644674" lastFinishedPulling="2026-04-24 23:54:58.625174698 +0000 UTC m=+81.259156612" observedRunningTime="2026-04-24 23:54:59.325676674 +0000 UTC m=+81.959658610" watchObservedRunningTime="2026-04-24 23:54:59.327498441 +0000 UTC m=+81.961480389" Apr 24 23:55:01.580634 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:01.580597 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:55:07.108509 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:07.108470 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6cbfc9cdc5-hxhrp"] Apr 24 23:55:09.733899 ip-10-0-129-98 
kubenswrapper[2578]: I0424 23:55:09.733864 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:55:09.733899 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:09.733906 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:55:17.220688 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:17.220655 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-w2qd9" Apr 24 23:55:29.739430 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:29.739384 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:55:29.743338 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:29.743318 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5fb87d9599-nll7r" Apr 24 23:55:32.128622 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.128556 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6cbfc9cdc5-hxhrp" podUID="996408a8-eb49-4662-835d-8d0ec08e9dfe" containerName="console" containerID="cri-o://60704e5252805ae0b66a639a6d2d70a51f0873271ae7e4f475f017a0bdae61e5" gracePeriod=15 Apr 24 23:55:32.360142 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.360120 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cbfc9cdc5-hxhrp_996408a8-eb49-4662-835d-8d0ec08e9dfe/console/0.log" Apr 24 23:55:32.360252 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.360190 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cbfc9cdc5-hxhrp" Apr 24 23:55:32.388498 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.388438 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cbfc9cdc5-hxhrp_996408a8-eb49-4662-835d-8d0ec08e9dfe/console/0.log" Apr 24 23:55:32.388498 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.388485 2578 generic.go:358] "Generic (PLEG): container finished" podID="996408a8-eb49-4662-835d-8d0ec08e9dfe" containerID="60704e5252805ae0b66a639a6d2d70a51f0873271ae7e4f475f017a0bdae61e5" exitCode=2 Apr 24 23:55:32.388665 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.388544 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cbfc9cdc5-hxhrp" event={"ID":"996408a8-eb49-4662-835d-8d0ec08e9dfe","Type":"ContainerDied","Data":"60704e5252805ae0b66a639a6d2d70a51f0873271ae7e4f475f017a0bdae61e5"} Apr 24 23:55:32.388665 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.388595 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cbfc9cdc5-hxhrp" event={"ID":"996408a8-eb49-4662-835d-8d0ec08e9dfe","Type":"ContainerDied","Data":"89ca7570c912771afd0b8c106561003317602e1da4b70ab35be5df8efcae913c"} Apr 24 23:55:32.388665 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.388619 2578 scope.go:117] "RemoveContainer" containerID="60704e5252805ae0b66a639a6d2d70a51f0873271ae7e4f475f017a0bdae61e5" Apr 24 23:55:32.388665 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.388561 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6cbfc9cdc5-hxhrp" Apr 24 23:55:32.396943 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.396923 2578 scope.go:117] "RemoveContainer" containerID="60704e5252805ae0b66a639a6d2d70a51f0873271ae7e4f475f017a0bdae61e5" Apr 24 23:55:32.397206 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:55:32.397188 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60704e5252805ae0b66a639a6d2d70a51f0873271ae7e4f475f017a0bdae61e5\": container with ID starting with 60704e5252805ae0b66a639a6d2d70a51f0873271ae7e4f475f017a0bdae61e5 not found: ID does not exist" containerID="60704e5252805ae0b66a639a6d2d70a51f0873271ae7e4f475f017a0bdae61e5" Apr 24 23:55:32.397267 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.397216 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60704e5252805ae0b66a639a6d2d70a51f0873271ae7e4f475f017a0bdae61e5"} err="failed to get container status \"60704e5252805ae0b66a639a6d2d70a51f0873271ae7e4f475f017a0bdae61e5\": rpc error: code = NotFound desc = could not find container \"60704e5252805ae0b66a639a6d2d70a51f0873271ae7e4f475f017a0bdae61e5\": container with ID starting with 60704e5252805ae0b66a639a6d2d70a51f0873271ae7e4f475f017a0bdae61e5 not found: ID does not exist" Apr 24 23:55:32.399492 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.399479 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/996408a8-eb49-4662-835d-8d0ec08e9dfe-console-serving-cert\") pod \"996408a8-eb49-4662-835d-8d0ec08e9dfe\" (UID: \"996408a8-eb49-4662-835d-8d0ec08e9dfe\") " Apr 24 23:55:32.399560 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.399507 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt6c4\" (UniqueName: \"kubernetes.io/projected/996408a8-eb49-4662-835d-8d0ec08e9dfe-kube-api-access-jt6c4\") pod \"996408a8-eb49-4662-835d-8d0ec08e9dfe\" (UID: \"996408a8-eb49-4662-835d-8d0ec08e9dfe\") " Apr 24 23:55:32.399560 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.399537 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/996408a8-eb49-4662-835d-8d0ec08e9dfe-console-config\") pod \"996408a8-eb49-4662-835d-8d0ec08e9dfe\" (UID: \"996408a8-eb49-4662-835d-8d0ec08e9dfe\") " Apr 24 23:55:32.399634 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.399563 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/996408a8-eb49-4662-835d-8d0ec08e9dfe-oauth-serving-cert\") pod \"996408a8-eb49-4662-835d-8d0ec08e9dfe\" (UID: \"996408a8-eb49-4662-835d-8d0ec08e9dfe\") " Apr 24 23:55:32.399634 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.399607 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/996408a8-eb49-4662-835d-8d0ec08e9dfe-service-ca\") pod \"996408a8-eb49-4662-835d-8d0ec08e9dfe\" (UID: \"996408a8-eb49-4662-835d-8d0ec08e9dfe\") " Apr 24 23:55:32.399634 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.399629 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/996408a8-eb49-4662-835d-8d0ec08e9dfe-console-oauth-config\") pod 
\"996408a8-eb49-4662-835d-8d0ec08e9dfe\" (UID: \"996408a8-eb49-4662-835d-8d0ec08e9dfe\") " Apr 24 23:55:32.399934 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.399909 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/996408a8-eb49-4662-835d-8d0ec08e9dfe-console-config" (OuterVolumeSpecName: "console-config") pod "996408a8-eb49-4662-835d-8d0ec08e9dfe" (UID: "996408a8-eb49-4662-835d-8d0ec08e9dfe"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:55:32.400026 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.399959 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/996408a8-eb49-4662-835d-8d0ec08e9dfe-service-ca" (OuterVolumeSpecName: "service-ca") pod "996408a8-eb49-4662-835d-8d0ec08e9dfe" (UID: "996408a8-eb49-4662-835d-8d0ec08e9dfe"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:55:32.400026 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.399982 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/996408a8-eb49-4662-835d-8d0ec08e9dfe-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "996408a8-eb49-4662-835d-8d0ec08e9dfe" (UID: "996408a8-eb49-4662-835d-8d0ec08e9dfe"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:55:32.401903 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.401876 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/996408a8-eb49-4662-835d-8d0ec08e9dfe-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "996408a8-eb49-4662-835d-8d0ec08e9dfe" (UID: "996408a8-eb49-4662-835d-8d0ec08e9dfe"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:55:32.401994 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.401910 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/996408a8-eb49-4662-835d-8d0ec08e9dfe-kube-api-access-jt6c4" (OuterVolumeSpecName: "kube-api-access-jt6c4") pod "996408a8-eb49-4662-835d-8d0ec08e9dfe" (UID: "996408a8-eb49-4662-835d-8d0ec08e9dfe"). InnerVolumeSpecName "kube-api-access-jt6c4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:55:32.401994 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.401930 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/996408a8-eb49-4662-835d-8d0ec08e9dfe-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "996408a8-eb49-4662-835d-8d0ec08e9dfe" (UID: "996408a8-eb49-4662-835d-8d0ec08e9dfe"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:55:32.500326 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.500288 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/996408a8-eb49-4662-835d-8d0ec08e9dfe-service-ca\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 24 23:55:32.500326 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.500326 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/996408a8-eb49-4662-835d-8d0ec08e9dfe-console-oauth-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 24 23:55:32.500519 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.500336 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/996408a8-eb49-4662-835d-8d0ec08e9dfe-console-serving-cert\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 24 23:55:32.500519 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.500345 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jt6c4\" (UniqueName: \"kubernetes.io/projected/996408a8-eb49-4662-835d-8d0ec08e9dfe-kube-api-access-jt6c4\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 24 23:55:32.500519 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.500355 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/996408a8-eb49-4662-835d-8d0ec08e9dfe-console-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 24 23:55:32.500519 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.500364 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/996408a8-eb49-4662-835d-8d0ec08e9dfe-oauth-serving-cert\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 24 23:55:32.708299 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.708270 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6cbfc9cdc5-hxhrp"] Apr 24 23:55:32.711733 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:32.711708 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6cbfc9cdc5-hxhrp"] Apr 24 23:55:33.911525 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:33.911492 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="996408a8-eb49-4662-835d-8d0ec08e9dfe" path="/var/lib/kubelet/pods/996408a8-eb49-4662-835d-8d0ec08e9dfe/volumes" Apr 24 23:55:51.580669 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:51.580626 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:55:51.600671 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:51.600641 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:55:52.459568 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:55:52.459537 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:05.351880 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:05.351845 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 23:56:05.352388 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:05.352333 2578 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/alertmanager-main-0" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="alertmanager" containerID="cri-o://bfb58deac7bb137f563fa61d2941426ef3cab29d69cd97ab4425562ec2b95d9f" gracePeriod=120 Apr 24 23:56:05.352564 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:05.352373 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="kube-rbac-proxy-metric" containerID="cri-o://99294f0747c4126f8fe9006d9730b80faae6830a85b9343bc357aef8341732cf" gracePeriod=120 Apr 24 23:56:05.352564 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:05.352446 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="kube-rbac-proxy" containerID="cri-o://a81c532dabecc5247289b65a6bf8e85e15fa634465a91fc1b43d3ff71e502536" gracePeriod=120 Apr 24 23:56:05.352564 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:05.352371 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="kube-rbac-proxy-web" containerID="cri-o://548c16366e57ef7df4d8c50d6080ec4c0037725883316f998ce434c0fa70418a" gracePeriod=120 Apr 24 23:56:05.352564 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:05.352506 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="prom-label-proxy" containerID="cri-o://1786a9028ae6b3fcc402e6c94248d9c5a98605ff4f63d0bfa93f1d19e3bae21b" gracePeriod=120 Apr 24 23:56:05.352564 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:05.352430 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="config-reloader" containerID="cri-o://86ad16442cc1e55cfc193dcd83a5d5229e04d198c6daed2b2f59117d3e2a31f5" gracePeriod=120 Apr 24 23:56:05.482869 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:05.482843 2578 generic.go:358] "Generic (PLEG): container finished" podID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerID="1786a9028ae6b3fcc402e6c94248d9c5a98605ff4f63d0bfa93f1d19e3bae21b" exitCode=0 Apr 24 23:56:05.482869 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:05.482867 2578 generic.go:358] "Generic (PLEG): container finished" podID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerID="a81c532dabecc5247289b65a6bf8e85e15fa634465a91fc1b43d3ff71e502536" exitCode=0 Apr 24 23:56:05.482992 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:05.482875 2578 generic.go:358] "Generic (PLEG): container finished" podID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerID="86ad16442cc1e55cfc193dcd83a5d5229e04d198c6daed2b2f59117d3e2a31f5" exitCode=0 Apr 24 23:56:05.482992 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:05.482880 2578 generic.go:358] "Generic (PLEG): container finished" podID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerID="bfb58deac7bb137f563fa61d2941426ef3cab29d69cd97ab4425562ec2b95d9f" exitCode=0 Apr 24 23:56:05.482992 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:05.482910 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"e07e2d64-a643-4e33-95e3-da4375fb0205","Type":"ContainerDied","Data":"1786a9028ae6b3fcc402e6c94248d9c5a98605ff4f63d0bfa93f1d19e3bae21b"} Apr 24 23:56:05.482992 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:05.482942 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e07e2d64-a643-4e33-95e3-da4375fb0205","Type":"ContainerDied","Data":"a81c532dabecc5247289b65a6bf8e85e15fa634465a91fc1b43d3ff71e502536"} Apr 24 23:56:05.482992 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:05.482951 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e07e2d64-a643-4e33-95e3-da4375fb0205","Type":"ContainerDied","Data":"86ad16442cc1e55cfc193dcd83a5d5229e04d198c6daed2b2f59117d3e2a31f5"} Apr 24 23:56:05.482992 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:05.482960 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e07e2d64-a643-4e33-95e3-da4375fb0205","Type":"ContainerDied","Data":"bfb58deac7bb137f563fa61d2941426ef3cab29d69cd97ab4425562ec2b95d9f"} Apr 24 23:56:06.488586 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.488561 2578 generic.go:358] "Generic (PLEG): container finished" podID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerID="99294f0747c4126f8fe9006d9730b80faae6830a85b9343bc357aef8341732cf" exitCode=0 Apr 24 23:56:06.488586 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.488583 2578 generic.go:358] "Generic (PLEG): container finished" podID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerID="548c16366e57ef7df4d8c50d6080ec4c0037725883316f998ce434c0fa70418a" exitCode=0 Apr 24 23:56:06.488924 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.488629 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e07e2d64-a643-4e33-95e3-da4375fb0205","Type":"ContainerDied","Data":"99294f0747c4126f8fe9006d9730b80faae6830a85b9343bc357aef8341732cf"} Apr 24 23:56:06.488924 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.488665 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e07e2d64-a643-4e33-95e3-da4375fb0205","Type":"ContainerDied","Data":"548c16366e57ef7df4d8c50d6080ec4c0037725883316f998ce434c0fa70418a"} Apr 24 23:56:06.592645 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.592624 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:06.662043 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.661970 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e07e2d64-a643-4e33-95e3-da4375fb0205-tls-assets\") pod \"e07e2d64-a643-4e33-95e3-da4375fb0205\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " Apr 24 23:56:06.662043 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.662008 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-web-config\") pod \"e07e2d64-a643-4e33-95e3-da4375fb0205\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " Apr 24 23:56:06.662043 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.662036 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-kube-rbac-proxy-metric\") pod \"e07e2d64-a643-4e33-95e3-da4375fb0205\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " Apr 24 23:56:06.662291 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.662066 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e07e2d64-a643-4e33-95e3-da4375fb0205-config-out\") pod \"e07e2d64-a643-4e33-95e3-da4375fb0205\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " Apr 24 23:56:06.662291 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.662099 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-kube-rbac-proxy-web\") pod \"e07e2d64-a643-4e33-95e3-da4375fb0205\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " Apr 24 23:56:06.662291 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.662129 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e07e2d64-a643-4e33-95e3-da4375fb0205-alertmanager-trusted-ca-bundle\") pod \"e07e2d64-a643-4e33-95e3-da4375fb0205\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " Apr 24 23:56:06.662291 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.662160 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e07e2d64-a643-4e33-95e3-da4375fb0205-alertmanager-main-db\") pod \"e07e2d64-a643-4e33-95e3-da4375fb0205\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " Apr 24 23:56:06.662291 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.662189 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-cluster-tls-config\") pod \"e07e2d64-a643-4e33-95e3-da4375fb0205\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " Apr 24 23:56:06.662291 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.662228 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-config-volume\") pod \"e07e2d64-a643-4e33-95e3-da4375fb0205\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " Apr 24 
23:56:06.662291 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.662279 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-kube-rbac-proxy\") pod \"e07e2d64-a643-4e33-95e3-da4375fb0205\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " Apr 24 23:56:06.662651 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.662304 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85cqn\" (UniqueName: \"kubernetes.io/projected/e07e2d64-a643-4e33-95e3-da4375fb0205-kube-api-access-85cqn\") pod \"e07e2d64-a643-4e33-95e3-da4375fb0205\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " Apr 24 23:56:06.662651 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.662343 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-main-tls\") pod \"e07e2d64-a643-4e33-95e3-da4375fb0205\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " Apr 24 23:56:06.662651 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.662402 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e07e2d64-a643-4e33-95e3-da4375fb0205-metrics-client-ca\") pod \"e07e2d64-a643-4e33-95e3-da4375fb0205\" (UID: \"e07e2d64-a643-4e33-95e3-da4375fb0205\") " Apr 24 23:56:06.662997 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.662974 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e07e2d64-a643-4e33-95e3-da4375fb0205-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "e07e2d64-a643-4e33-95e3-da4375fb0205" (UID: "e07e2d64-a643-4e33-95e3-da4375fb0205"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:56:06.663095 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.662984 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e07e2d64-a643-4e33-95e3-da4375fb0205-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "e07e2d64-a643-4e33-95e3-da4375fb0205" (UID: "e07e2d64-a643-4e33-95e3-da4375fb0205"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 23:56:06.663095 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.662998 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e07e2d64-a643-4e33-95e3-da4375fb0205-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "e07e2d64-a643-4e33-95e3-da4375fb0205" (UID: "e07e2d64-a643-4e33-95e3-da4375fb0205"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:56:06.666364 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.666336 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e07e2d64-a643-4e33-95e3-da4375fb0205-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e07e2d64-a643-4e33-95e3-da4375fb0205" (UID: "e07e2d64-a643-4e33-95e3-da4375fb0205"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:56:06.666616 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.666578 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e07e2d64-a643-4e33-95e3-da4375fb0205-config-out" (OuterVolumeSpecName: "config-out") pod "e07e2d64-a643-4e33-95e3-da4375fb0205" (UID: "e07e2d64-a643-4e33-95e3-da4375fb0205"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 23:56:06.667096 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.667068 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e07e2d64-a643-4e33-95e3-da4375fb0205-kube-api-access-85cqn" (OuterVolumeSpecName: "kube-api-access-85cqn") pod "e07e2d64-a643-4e33-95e3-da4375fb0205" (UID: "e07e2d64-a643-4e33-95e3-da4375fb0205"). InnerVolumeSpecName "kube-api-access-85cqn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:56:06.667187 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.667073 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "e07e2d64-a643-4e33-95e3-da4375fb0205" (UID: "e07e2d64-a643-4e33-95e3-da4375fb0205"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:56:06.667251 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.667197 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "e07e2d64-a643-4e33-95e3-da4375fb0205" (UID: "e07e2d64-a643-4e33-95e3-da4375fb0205"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:56:06.667388 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.667371 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-config-volume" (OuterVolumeSpecName: "config-volume") pod "e07e2d64-a643-4e33-95e3-da4375fb0205" (UID: "e07e2d64-a643-4e33-95e3-da4375fb0205"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:56:06.667743 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.667720 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "e07e2d64-a643-4e33-95e3-da4375fb0205" (UID: "e07e2d64-a643-4e33-95e3-da4375fb0205"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:56:06.667933 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.667910 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "e07e2d64-a643-4e33-95e3-da4375fb0205" (UID: "e07e2d64-a643-4e33-95e3-da4375fb0205"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:56:06.670761 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.670693 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "e07e2d64-a643-4e33-95e3-da4375fb0205" (UID: "e07e2d64-a643-4e33-95e3-da4375fb0205"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:56:06.677653 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.677627 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-web-config" (OuterVolumeSpecName: "web-config") pod "e07e2d64-a643-4e33-95e3-da4375fb0205" (UID: "e07e2d64-a643-4e33-95e3-da4375fb0205"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:56:06.763818 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.763790 2578 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e07e2d64-a643-4e33-95e3-da4375fb0205-metrics-client-ca\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 24 23:56:06.763818 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.763815 2578 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e07e2d64-a643-4e33-95e3-da4375fb0205-tls-assets\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 24 23:56:06.763986 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.763829 2578 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-web-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 24 23:56:06.763986 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.763843 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 24 23:56:06.763986 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.763855 2578 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e07e2d64-a643-4e33-95e3-da4375fb0205-config-out\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 24 23:56:06.763986 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.763867 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 24 23:56:06.763986 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.763880 2578 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e07e2d64-a643-4e33-95e3-da4375fb0205-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 24 23:56:06.763986 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.763896 2578 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e07e2d64-a643-4e33-95e3-da4375fb0205-alertmanager-main-db\") on node \"ip-10-0-129-98.ec2.internal\" 
DevicePath \"\"" Apr 24 23:56:06.763986 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.763908 2578 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-cluster-tls-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 24 23:56:06.763986 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.763921 2578 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-config-volume\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 24 23:56:06.763986 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.763935 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 24 23:56:06.763986 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.763948 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-85cqn\" (UniqueName: \"kubernetes.io/projected/e07e2d64-a643-4e33-95e3-da4375fb0205-kube-api-access-85cqn\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 24 23:56:06.763986 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:06.763965 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e07e2d64-a643-4e33-95e3-da4375fb0205-secret-alertmanager-main-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 24 23:56:07.494044 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.494006 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e07e2d64-a643-4e33-95e3-da4375fb0205","Type":"ContainerDied","Data":"6fad1828a003441fcdfd819cca2337f56669b821b0b73dd4a3e993be5d828830"} Apr 24 23:56:07.494471 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.494054 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.494471 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.494060 2578 scope.go:117] "RemoveContainer" containerID="1786a9028ae6b3fcc402e6c94248d9c5a98605ff4f63d0bfa93f1d19e3bae21b" Apr 24 23:56:07.502006 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.501990 2578 scope.go:117] "RemoveContainer" containerID="99294f0747c4126f8fe9006d9730b80faae6830a85b9343bc357aef8341732cf" Apr 24 23:56:07.508502 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.508485 2578 scope.go:117] "RemoveContainer" containerID="a81c532dabecc5247289b65a6bf8e85e15fa634465a91fc1b43d3ff71e502536" Apr 24 23:56:07.516001 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.515986 2578 scope.go:117] "RemoveContainer" containerID="548c16366e57ef7df4d8c50d6080ec4c0037725883316f998ce434c0fa70418a" Apr 24 23:56:07.519798 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.519774 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 23:56:07.524764 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.524730 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 23:56:07.527944 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.527925 2578 scope.go:117] "RemoveContainer" containerID="86ad16442cc1e55cfc193dcd83a5d5229e04d198c6daed2b2f59117d3e2a31f5" Apr 24 23:56:07.534322 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.534292 2578 scope.go:117] "RemoveContainer" containerID="bfb58deac7bb137f563fa61d2941426ef3cab29d69cd97ab4425562ec2b95d9f" Apr 24 23:56:07.540663 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.540644 2578 scope.go:117] "RemoveContainer" containerID="077f0c4c632a1cb0f174986f3c986c3601614ac72bd4a4741b05f4f7c9b21e20" Apr 24 23:56:07.550444 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.550407 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 23:56:07.550704 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.550692 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="init-config-reloader" Apr 24 23:56:07.550749 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.550707 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="init-config-reloader" Apr 24 23:56:07.550749 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.550715 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="996408a8-eb49-4662-835d-8d0ec08e9dfe" containerName="console" Apr 24 23:56:07.550749 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.550720 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="996408a8-eb49-4662-835d-8d0ec08e9dfe" containerName="console" Apr 24 23:56:07.550749 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.550732 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="kube-rbac-proxy" Apr 24 23:56:07.550749 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.550738 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="kube-rbac-proxy" Apr 24 23:56:07.550889 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.550750 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" 
containerName="kube-rbac-proxy-web" Apr 24 23:56:07.550889 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.550758 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="kube-rbac-proxy-web" Apr 24 23:56:07.550889 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.550768 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="config-reloader" Apr 24 23:56:07.550889 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.550773 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="config-reloader" Apr 24 23:56:07.550889 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.550779 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="alertmanager" Apr 24 23:56:07.550889 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.550783 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="alertmanager" Apr 24 23:56:07.550889 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.550790 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="kube-rbac-proxy-metric" Apr 24 23:56:07.550889 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.550795 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="kube-rbac-proxy-metric" Apr 24 23:56:07.550889 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.550802 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="prom-label-proxy" Apr 24 23:56:07.550889 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.550807 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="prom-label-proxy" Apr 24 23:56:07.550889 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.550853 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="kube-rbac-proxy" Apr 24 23:56:07.550889 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.550861 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="config-reloader" Apr 24 23:56:07.550889 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.550868 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="996408a8-eb49-4662-835d-8d0ec08e9dfe" containerName="console" Apr 24 23:56:07.550889 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.550873 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="alertmanager" Apr 24 23:56:07.550889 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.550880 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="prom-label-proxy" Apr 24 23:56:07.550889 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.550889 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="kube-rbac-proxy-web" Apr 24 23:56:07.551388 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.550900 2578 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" containerName="kube-rbac-proxy-metric" Apr 24 23:56:07.555867 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.555852 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.558379 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.558361 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 23:56:07.558790 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.558771 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 23:56:07.558888 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.558797 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 23:56:07.558888 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.558839 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 23:56:07.558888 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.558848 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 23:56:07.558888 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.558861 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 23:56:07.559103 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.558893 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 23:56:07.559103 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.558942 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-b6hsp\"" Apr 24 23:56:07.559216 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.559203 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 23:56:07.564992 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.564974 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 23:56:07.571167 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.571141 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 23:56:07.671141 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.671072 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.671141 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.671103 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-web-config\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.671141 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.671124 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-config-volume\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.671345 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.671158 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.671345 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.671188 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj96g\" (UniqueName: \"kubernetes.io/projected/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-kube-api-access-zj96g\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.671345 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.671203 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-config-out\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.671345 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.671222 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.671345 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.671248 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.671345 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.671263 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.671582 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.671346 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.671582 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.671369 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.671582 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.671391 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.671582 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.671435 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.772619 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.772592 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.772770 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.772624 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-web-config\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.772770 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.772646 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-config-volume\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.772770 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.772765 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.772909 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.772792 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zj96g\" (UniqueName: \"kubernetes.io/projected/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-kube-api-access-zj96g\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.772909 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.772811 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-config-out\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.772909 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.772849 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.772909 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.772893 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.773122 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.772917 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.773122 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.772959 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.773122 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.772986 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.773122 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.773020 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.773122 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.773050 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 23:56:07.773540 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.773144 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0" 
Apr 24 23:56:07.773732 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.773707 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:56:07.774173 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.774147 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:56:07.775907 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.775882 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:56:07.776017 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.775971 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-config-volume\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:56:07.776138 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.776116 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-config-out\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:56:07.776332 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.776308 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:56:07.776653 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.776636 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:56:07.776717 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.776650 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:56:07.777308 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.777287 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-web-config\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:56:07.777619 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.777602 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:56:07.777944 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.777928 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:56:07.781670 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.781650 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj96g\" (UniqueName: \"kubernetes.io/projected/5ec9f71c-d45c-4be2-9915-8a57dfeb094d-kube-api-access-zj96g\") pod \"alertmanager-main-0\" (UID: \"5ec9f71c-d45c-4be2-9915-8a57dfeb094d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:56:07.865862 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.865831 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 23:56:07.914640 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:07.914048 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e07e2d64-a643-4e33-95e3-da4375fb0205" path="/var/lib/kubelet/pods/e07e2d64-a643-4e33-95e3-da4375fb0205/volumes"
Apr 24 23:56:08.000160 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:08.000136 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 23:56:08.002573 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:56:08.002546 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ec9f71c_d45c_4be2_9915_8a57dfeb094d.slice/crio-b26fc1e691aa0cbbe78dfb346457cd4bee12f0609d72e33b7448fa2c14523bf8 WatchSource:0}: Error finding container b26fc1e691aa0cbbe78dfb346457cd4bee12f0609d72e33b7448fa2c14523bf8: Status 404 returned error can't find the container with id b26fc1e691aa0cbbe78dfb346457cd4bee12f0609d72e33b7448fa2c14523bf8
Apr 24 23:56:08.498854 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:08.498820 2578 generic.go:358] "Generic (PLEG): container finished" podID="5ec9f71c-d45c-4be2-9915-8a57dfeb094d" containerID="5db34b73c56166916d997fa0c21d4d41501175ce03cb51b753e71d35d4a73ade" exitCode=0
Apr 24 23:56:08.498854 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:08.498856 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5ec9f71c-d45c-4be2-9915-8a57dfeb094d","Type":"ContainerDied","Data":"5db34b73c56166916d997fa0c21d4d41501175ce03cb51b753e71d35d4a73ade"}
Apr 24 23:56:08.499260 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:08.498876 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5ec9f71c-d45c-4be2-9915-8a57dfeb094d","Type":"ContainerStarted","Data":"b26fc1e691aa0cbbe78dfb346457cd4bee12f0609d72e33b7448fa2c14523bf8"}
event={"ID":"5ec9f71c-d45c-4be2-9915-8a57dfeb094d","Type":"ContainerStarted","Data":"b26fc1e691aa0cbbe78dfb346457cd4bee12f0609d72e33b7448fa2c14523bf8"} Apr 24 23:56:09.504178 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.504137 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5ec9f71c-d45c-4be2-9915-8a57dfeb094d","Type":"ContainerStarted","Data":"2f8aa7ae526e0d1e589ac961cf937989f30226b2f0e142e50a27d058b4a9fec8"} Apr 24 23:56:09.504178 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.504182 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5ec9f71c-d45c-4be2-9915-8a57dfeb094d","Type":"ContainerStarted","Data":"f22ce958407b980c43b6511acdc4fe384c866db74595b55461d221a40d388cb6"} Apr 24 23:56:09.504612 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.504192 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5ec9f71c-d45c-4be2-9915-8a57dfeb094d","Type":"ContainerStarted","Data":"630ef25fd9814a22e069ebc37cd0619a752df484cb2f385bb0a57c1491d28efc"} Apr 24 23:56:09.504612 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.504201 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5ec9f71c-d45c-4be2-9915-8a57dfeb094d","Type":"ContainerStarted","Data":"a6db05fc897b7e40891fca5c6f8fe09c7593b7b095c346bf5ff9b5310334a999"} Apr 24 23:56:09.504612 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.504209 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5ec9f71c-d45c-4be2-9915-8a57dfeb094d","Type":"ContainerStarted","Data":"51ba7a729f1da62d44566a10305699b826e1576efd7920a04ea271c5162b3c0a"} Apr 24 23:56:09.504612 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.504216 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5ec9f71c-d45c-4be2-9915-8a57dfeb094d","Type":"ContainerStarted","Data":"48e99b52072e7fcd94420823dc5844e96a8f3afead43d5595c5bda8a11e48f75"} Apr 24 23:56:09.537359 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.537316 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.537300816 podStartE2EDuration="2.537300816s" podCreationTimestamp="2026-04-24 23:56:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:56:09.535648036 +0000 UTC m=+152.169629972" watchObservedRunningTime="2026-04-24 23:56:09.537300816 +0000 UTC m=+152.171282753" Apr 24 23:56:09.554623 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.554595 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 23:56:09.555163 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.555006 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="prometheus" containerID="cri-o://4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926" gracePeriod=600 Apr 24 23:56:09.555163 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.555026 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" 
containerName="kube-rbac-proxy" containerID="cri-o://13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574" gracePeriod=600 Apr 24 23:56:09.555163 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.555048 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="thanos-sidecar" containerID="cri-o://a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe" gracePeriod=600 Apr 24 23:56:09.555163 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.555071 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="config-reloader" containerID="cri-o://037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982" gracePeriod=600 Apr 24 23:56:09.555163 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.555046 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="kube-rbac-proxy-web" containerID="cri-o://f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f" gracePeriod=600 Apr 24 23:56:09.555163 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.555127 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="kube-rbac-proxy-thanos" containerID="cri-o://90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600" gracePeriod=600 Apr 24 23:56:09.790980 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.790956 2578 util.go:48] "No ready sandbox for pod can be found. 
Apr 24 23:56:09.890099 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.890069 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-metrics-client-certs\") pod \"84cf5627-5273-43e3-99ae-b6ab2371aa69\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") "
Apr 24 23:56:09.890258 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.890114 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-prometheus-k8s-rulefiles-0\") pod \"84cf5627-5273-43e3-99ae-b6ab2371aa69\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") "
Apr 24 23:56:09.890258 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.890140 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-thanos-prometheus-http-client-file\") pod \"84cf5627-5273-43e3-99ae-b6ab2371aa69\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") "
Apr 24 23:56:09.890258 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.890168 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-web-config\") pod \"84cf5627-5273-43e3-99ae-b6ab2371aa69\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") "
Apr 24 23:56:09.890258 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.890195 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-configmap-serving-certs-ca-bundle\") pod \"84cf5627-5273-43e3-99ae-b6ab2371aa69\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") "
Apr 24 23:56:09.890258 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.890220 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-prometheus-trusted-ca-bundle\") pod \"84cf5627-5273-43e3-99ae-b6ab2371aa69\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") "
Apr 24 23:56:09.890258 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.890245 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/84cf5627-5273-43e3-99ae-b6ab2371aa69-tls-assets\") pod \"84cf5627-5273-43e3-99ae-b6ab2371aa69\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") "
Apr 24 23:56:09.890598 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.890268 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-prometheus-k8s-tls\") pod \"84cf5627-5273-43e3-99ae-b6ab2371aa69\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") "
Apr 24 23:56:09.890598 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.890299 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhmnn\" (UniqueName: \"kubernetes.io/projected/84cf5627-5273-43e3-99ae-b6ab2371aa69-kube-api-access-hhmnn\") pod \"84cf5627-5273-43e3-99ae-b6ab2371aa69\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") "
Apr 24 23:56:09.890598 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.890327 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"84cf5627-5273-43e3-99ae-b6ab2371aa69\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") "
Apr 24 23:56:09.890598 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.890367 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"84cf5627-5273-43e3-99ae-b6ab2371aa69\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") "
Apr 24 23:56:09.890598 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.890393 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-kube-rbac-proxy\") pod \"84cf5627-5273-43e3-99ae-b6ab2371aa69\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") "
Apr 24 23:56:09.890598 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.890450 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/84cf5627-5273-43e3-99ae-b6ab2371aa69-config-out\") pod \"84cf5627-5273-43e3-99ae-b6ab2371aa69\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") "
Apr 24 23:56:09.890598 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.890502 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-configmap-metrics-client-ca\") pod \"84cf5627-5273-43e3-99ae-b6ab2371aa69\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") "
Apr 24 23:56:09.890598 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.890537 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-configmap-kubelet-serving-ca-bundle\") pod \"84cf5627-5273-43e3-99ae-b6ab2371aa69\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") "
Apr 24 23:56:09.890598 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.890572 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-config\") pod \"84cf5627-5273-43e3-99ae-b6ab2371aa69\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") "
Apr 24 23:56:09.891002 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.890613 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-grpc-tls\") pod \"84cf5627-5273-43e3-99ae-b6ab2371aa69\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") "
Apr 24 23:56:09.891002 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.890647 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/84cf5627-5273-43e3-99ae-b6ab2371aa69-prometheus-k8s-db\") pod \"84cf5627-5273-43e3-99ae-b6ab2371aa69\" (UID: \"84cf5627-5273-43e3-99ae-b6ab2371aa69\") "
Apr 24 23:56:09.891002 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.890694 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "84cf5627-5273-43e3-99ae-b6ab2371aa69" (UID: "84cf5627-5273-43e3-99ae-b6ab2371aa69"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:56:09.892459 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.891552 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "84cf5627-5273-43e3-99ae-b6ab2371aa69" (UID: "84cf5627-5273-43e3-99ae-b6ab2371aa69"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:56:09.892459 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.892144 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\""
Apr 24 23:56:09.892459 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.892155 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "84cf5627-5273-43e3-99ae-b6ab2371aa69" (UID: "84cf5627-5273-43e3-99ae-b6ab2371aa69"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:56:09.892459 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.892167 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\""
Apr 24 23:56:09.892786 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.892755 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "84cf5627-5273-43e3-99ae-b6ab2371aa69" (UID: "84cf5627-5273-43e3-99ae-b6ab2371aa69"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:56:09.892917 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.892894 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "84cf5627-5273-43e3-99ae-b6ab2371aa69" (UID: "84cf5627-5273-43e3-99ae-b6ab2371aa69"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:56:09.893114 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.893097 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84cf5627-5273-43e3-99ae-b6ab2371aa69-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "84cf5627-5273-43e3-99ae-b6ab2371aa69" (UID: "84cf5627-5273-43e3-99ae-b6ab2371aa69"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 23:56:09.893379 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.893360 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "84cf5627-5273-43e3-99ae-b6ab2371aa69" (UID: "84cf5627-5273-43e3-99ae-b6ab2371aa69"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:56:09.893838 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.893809 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84cf5627-5273-43e3-99ae-b6ab2371aa69-config-out" (OuterVolumeSpecName: "config-out") pod "84cf5627-5273-43e3-99ae-b6ab2371aa69" (UID: "84cf5627-5273-43e3-99ae-b6ab2371aa69"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 23:56:09.894291 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.894258 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "84cf5627-5273-43e3-99ae-b6ab2371aa69" (UID: "84cf5627-5273-43e3-99ae-b6ab2371aa69"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:56:09.894503 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.894482 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "84cf5627-5273-43e3-99ae-b6ab2371aa69" (UID: "84cf5627-5273-43e3-99ae-b6ab2371aa69"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:56:09.895661 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.895640 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-config" (OuterVolumeSpecName: "config") pod "84cf5627-5273-43e3-99ae-b6ab2371aa69" (UID: "84cf5627-5273-43e3-99ae-b6ab2371aa69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:56:09.895783 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.895730 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "84cf5627-5273-43e3-99ae-b6ab2371aa69" (UID: "84cf5627-5273-43e3-99ae-b6ab2371aa69"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:56:09.895867 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.895843 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84cf5627-5273-43e3-99ae-b6ab2371aa69-kube-api-access-hhmnn" (OuterVolumeSpecName: "kube-api-access-hhmnn") pod "84cf5627-5273-43e3-99ae-b6ab2371aa69" (UID: "84cf5627-5273-43e3-99ae-b6ab2371aa69"). InnerVolumeSpecName "kube-api-access-hhmnn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:56:09.895923 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.895881 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "84cf5627-5273-43e3-99ae-b6ab2371aa69" (UID: "84cf5627-5273-43e3-99ae-b6ab2371aa69"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:56:09.896108 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.896088 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "84cf5627-5273-43e3-99ae-b6ab2371aa69" (UID: "84cf5627-5273-43e3-99ae-b6ab2371aa69"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:56:09.896326 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.896308 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84cf5627-5273-43e3-99ae-b6ab2371aa69-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "84cf5627-5273-43e3-99ae-b6ab2371aa69" (UID: "84cf5627-5273-43e3-99ae-b6ab2371aa69"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:56:09.897103 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.897084 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "84cf5627-5273-43e3-99ae-b6ab2371aa69" (UID: "84cf5627-5273-43e3-99ae-b6ab2371aa69"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:56:09.903366 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.903346 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-web-config" (OuterVolumeSpecName: "web-config") pod "84cf5627-5273-43e3-99ae-b6ab2371aa69" (UID: "84cf5627-5273-43e3-99ae-b6ab2371aa69"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:56:09.993128 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.993086 2578 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\""
Apr 24 23:56:09.993128 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.993124 2578 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-grpc-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\""
Apr 24 23:56:09.993298 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.993143 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/84cf5627-5273-43e3-99ae-b6ab2371aa69-prometheus-k8s-db\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\""
Apr 24 23:56:09.993298 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.993156 2578 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-metrics-client-certs\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\""
Apr 24 23:56:09.993298 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.993171 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\""
Apr 24 23:56:09.993298 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.993185 2578 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-thanos-prometheus-http-client-file\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\""
Apr 24 23:56:09.993298 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.993200 2578 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-web-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\""
Apr 24 23:56:09.993298 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.993214 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-prometheus-trusted-ca-bundle\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\""
Apr 24 23:56:09.993298 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.993232 2578 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/84cf5627-5273-43e3-99ae-b6ab2371aa69-tls-assets\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\""
Apr 24 23:56:09.993298 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.993248 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-prometheus-k8s-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\""
Apr 24 23:56:09.993298 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.993262 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hhmnn\" (UniqueName: \"kubernetes.io/projected/84cf5627-5273-43e3-99ae-b6ab2371aa69-kube-api-access-hhmnn\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\""
\"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 24 23:56:09.993298 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.993278 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 24 23:56:09.993652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.993304 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 24 23:56:09.993652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.993324 2578 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/84cf5627-5273-43e3-99ae-b6ab2371aa69-secret-kube-rbac-proxy\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 24 23:56:09.993652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.993339 2578 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/84cf5627-5273-43e3-99ae-b6ab2371aa69-config-out\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 24 23:56:09.993652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:09.993354 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/84cf5627-5273-43e3-99ae-b6ab2371aa69-configmap-metrics-client-ca\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 24 23:56:10.509841 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.509808 2578 generic.go:358] "Generic (PLEG): container finished" podID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerID="90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600" exitCode=0 Apr 24 23:56:10.509841 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.509838 2578 generic.go:358] "Generic (PLEG): container finished" podID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerID="13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574" exitCode=0 Apr 24 23:56:10.510275 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.509853 2578 generic.go:358] "Generic (PLEG): container finished" podID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerID="f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f" exitCode=0 Apr 24 23:56:10.510275 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.509862 2578 generic.go:358] "Generic (PLEG): container finished" podID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerID="a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe" exitCode=0 Apr 24 23:56:10.510275 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.509869 2578 generic.go:358] "Generic (PLEG): container finished" podID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerID="037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982" exitCode=0 Apr 24 23:56:10.510275 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.509877 2578 generic.go:358] "Generic (PLEG): container finished" podID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerID="4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926" exitCode=0 Apr 24 23:56:10.510275 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.509890 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"84cf5627-5273-43e3-99ae-b6ab2371aa69","Type":"ContainerDied","Data":"90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600"} Apr 24 23:56:10.510275 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.509926 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"84cf5627-5273-43e3-99ae-b6ab2371aa69","Type":"ContainerDied","Data":"13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574"} Apr 24 23:56:10.510275 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.509937 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"84cf5627-5273-43e3-99ae-b6ab2371aa69","Type":"ContainerDied","Data":"f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f"} Apr 24 23:56:10.510275 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.509947 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"84cf5627-5273-43e3-99ae-b6ab2371aa69","Type":"ContainerDied","Data":"a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe"} Apr 24 23:56:10.510275 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.509955 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"84cf5627-5273-43e3-99ae-b6ab2371aa69","Type":"ContainerDied","Data":"037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982"} Apr 24 23:56:10.510275 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.509964 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"84cf5627-5273-43e3-99ae-b6ab2371aa69","Type":"ContainerDied","Data":"4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926"} Apr 24 23:56:10.510275 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.509973 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"84cf5627-5273-43e3-99ae-b6ab2371aa69","Type":"ContainerDied","Data":"ec014283f674c62ee42133eb239fee1a7936360ef62f5673e73245bea582e034"} Apr 24 23:56:10.510275 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.509935 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.510275 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.509997 2578 scope.go:117] "RemoveContainer" containerID="90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600" Apr 24 23:56:10.517055 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.517039 2578 scope.go:117] "RemoveContainer" containerID="13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574" Apr 24 23:56:10.523246 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.523229 2578 scope.go:117] "RemoveContainer" containerID="f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f" Apr 24 23:56:10.528875 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.528827 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 23:56:10.529900 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.529885 2578 scope.go:117] "RemoveContainer" containerID="a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe" Apr 24 23:56:10.534039 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.534021 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 23:56:10.537048 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.537033 2578 scope.go:117] "RemoveContainer" containerID="037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982" Apr 24 23:56:10.543132 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.543116 2578 scope.go:117] "RemoveContainer" containerID="4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926" Apr 24 23:56:10.549792 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.549776 2578 scope.go:117] "RemoveContainer" containerID="9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b" Apr 24 23:56:10.555971 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.555820 2578 scope.go:117] "RemoveContainer" containerID="90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600" Apr 24 23:56:10.556077 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:56:10.556052 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600\": container with ID starting with 90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600 not found: ID does not exist" containerID="90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600" Apr 24 23:56:10.556117 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.556085 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600"} err="failed to get container status \"90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600\": rpc error: code = NotFound desc = could not find container \"90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600\": container with ID starting with 90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600 not found: ID does not exist" Apr 24 23:56:10.556117 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.556102 2578 scope.go:117] "RemoveContainer" containerID="13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574" Apr 24 23:56:10.556345 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:56:10.556328 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574\": container with ID starting with 13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574 not found: ID does not exist" containerID="13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574" Apr 24 23:56:10.556407 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.556355 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574"} err="failed to get container status \"13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574\": rpc error: code = NotFound desc = could not find container \"13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574\": container with ID starting with 13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574 not found: ID does not exist" Apr 24 23:56:10.556407 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.556378 2578 scope.go:117] "RemoveContainer" containerID="f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f" Apr 24 23:56:10.556736 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:56:10.556720 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f\": container with ID starting with f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f not found: ID does not exist" containerID="f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f" Apr 24 23:56:10.556788 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.556740 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f"} err="failed to get container status \"f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f\": rpc error: code = NotFound desc = could not find container \"f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f\": container with ID starting with f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f not found: ID does not exist" Apr 24 23:56:10.556788 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.556754 2578 scope.go:117] "RemoveContainer" containerID="a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe" Apr 24 23:56:10.556975 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:56:10.556958 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe\": container with ID starting with a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe not found: ID does not exist" containerID="a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe" Apr 24 23:56:10.557016 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.556981 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe"} err="failed to get container status \"a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe\": rpc error: code = NotFound desc = could not find container \"a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe\": container with ID starting with a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe not found: ID does not exist" Apr 24 23:56:10.557016 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.556996 2578 
scope.go:117] "RemoveContainer" containerID="037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982" Apr 24 23:56:10.557186 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:56:10.557162 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982\": container with ID starting with 037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982 not found: ID does not exist" containerID="037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982" Apr 24 23:56:10.557229 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.557191 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982"} err="failed to get container status \"037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982\": rpc error: code = NotFound desc = could not find container \"037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982\": container with ID starting with 037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982 not found: ID does not exist" Apr 24 23:56:10.557229 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.557204 2578 scope.go:117] "RemoveContainer" containerID="4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926" Apr 24 23:56:10.557399 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:56:10.557381 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926\": container with ID starting with 4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926 not found: ID does not exist" containerID="4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926" Apr 24 23:56:10.557487 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.557405 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926"} err="failed to get container status \"4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926\": rpc error: code = NotFound desc = could not find container \"4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926\": container with ID starting with 4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926 not found: ID does not exist" Apr 24 23:56:10.557487 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.557446 2578 scope.go:117] "RemoveContainer" containerID="9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b" Apr 24 23:56:10.557641 ip-10-0-129-98 kubenswrapper[2578]: E0424 23:56:10.557626 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b\": container with ID starting with 9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b not found: ID does not exist" containerID="9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b" Apr 24 23:56:10.557675 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.557646 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b"} err="failed to get container status \"9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b\": rpc error: code = 
NotFound desc = could not find container \"9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b\": container with ID starting with 9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b not found: ID does not exist" Apr 24 23:56:10.557675 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.557659 2578 scope.go:117] "RemoveContainer" containerID="90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600" Apr 24 23:56:10.557850 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.557832 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600"} err="failed to get container status \"90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600\": rpc error: code = NotFound desc = could not find container \"90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600\": container with ID starting with 90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600 not found: ID does not exist" Apr 24 23:56:10.557913 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.557852 2578 scope.go:117] "RemoveContainer" containerID="13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574" Apr 24 23:56:10.558059 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.558044 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574"} err="failed to get container status \"13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574\": rpc error: code = NotFound desc = could not find container \"13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574\": container with ID starting with 13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574 not found: ID does not exist" Apr 24 23:56:10.558118 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.558061 2578 scope.go:117] "RemoveContainer" containerID="f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f" Apr 24 23:56:10.558302 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.558280 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f"} err="failed to get container status \"f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f\": rpc error: code = NotFound desc = could not find container \"f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f\": container with ID starting with f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f not found: ID does not exist" Apr 24 23:56:10.558302 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.558300 2578 scope.go:117] "RemoveContainer" containerID="a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe" Apr 24 23:56:10.558652 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.558628 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe"} err="failed to get container status \"a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe\": rpc error: code = NotFound desc = could not find container \"a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe\": container with ID starting with a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe not found: ID does not exist" Apr 24 23:56:10.558729 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.558656 
2578 scope.go:117] "RemoveContainer" containerID="037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982" Apr 24 23:56:10.558929 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.558890 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982"} err="failed to get container status \"037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982\": rpc error: code = NotFound desc = could not find container \"037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982\": container with ID starting with 037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982 not found: ID does not exist" Apr 24 23:56:10.558929 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.558930 2578 scope.go:117] "RemoveContainer" containerID="4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926" Apr 24 23:56:10.559268 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.559244 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926"} err="failed to get container status \"4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926\": rpc error: code = NotFound desc = could not find container \"4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926\": container with ID starting with 4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926 not found: ID does not exist" Apr 24 23:56:10.559329 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.559270 2578 scope.go:117] "RemoveContainer" containerID="9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b" Apr 24 23:56:10.559551 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.559528 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b"} err="failed to get container status \"9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b\": rpc error: code = NotFound desc = could not find container \"9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b\": container with ID starting with 9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b not found: ID does not exist" Apr 24 23:56:10.559628 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.559554 2578 scope.go:117] "RemoveContainer" containerID="90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600" Apr 24 23:56:10.559785 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.559766 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600"} err="failed to get container status \"90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600\": rpc error: code = NotFound desc = could not find container \"90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600\": container with ID starting with 90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600 not found: ID does not exist" Apr 24 23:56:10.559857 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.559785 2578 scope.go:117] "RemoveContainer" containerID="13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574" Apr 24 23:56:10.560010 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.559989 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574"} err="failed to get container status \"13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574\": rpc error: code = NotFound desc = could not find container \"13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574\": container with ID starting with 13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574 not found: ID does not exist" Apr 24 23:56:10.560010 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.560009 2578 scope.go:117] "RemoveContainer" containerID="f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f" Apr 24 23:56:10.560273 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.560250 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f"} err="failed to get container status \"f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f\": rpc error: code = NotFound desc = could not find container \"f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f\": container with ID starting with f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f not found: ID does not exist" Apr 24 23:56:10.560330 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.560276 2578 scope.go:117] "RemoveContainer" containerID="a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe" Apr 24 23:56:10.560522 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.560500 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe"} err="failed to get container status \"a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe\": rpc error: code = NotFound desc = could not find container \"a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe\": container with ID starting with a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe not found: ID does not exist" Apr 24 23:56:10.560586 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.560525 2578 scope.go:117] "RemoveContainer" containerID="037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982" Apr 24 23:56:10.560686 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.560672 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 23:56:10.560784 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.560764 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982"} err="failed to get container status \"037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982\": rpc error: code = NotFound desc = could not find container \"037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982\": container with ID starting with 037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982 not found: ID does not exist" Apr 24 23:56:10.560837 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.560787 2578 scope.go:117] "RemoveContainer" containerID="4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926" Apr 24 23:56:10.560950 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.560933 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="kube-rbac-proxy" Apr 24 23:56:10.560950 ip-10-0-129-98 
kubenswrapper[2578]: I0424 23:56:10.560952 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="kube-rbac-proxy" Apr 24 23:56:10.561077 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.560964 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="prometheus" Apr 24 23:56:10.561077 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.560972 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="prometheus" Apr 24 23:56:10.561077 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.560984 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="config-reloader" Apr 24 23:56:10.561077 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.560992 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="config-reloader" Apr 24 23:56:10.561077 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.561003 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926"} err="failed to get container status \"4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926\": rpc error: code = NotFound desc = could not find container \"4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926\": container with ID starting with 4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926 not found: ID does not exist" Apr 24 23:56:10.561077 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.561026 2578 scope.go:117] "RemoveContainer" containerID="9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b" Apr 24 23:56:10.561077 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.561009 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="thanos-sidecar" Apr 24 23:56:10.561077 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.561074 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="thanos-sidecar" Apr 24 23:56:10.561393 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.561085 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="kube-rbac-proxy-thanos" Apr 24 23:56:10.561393 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.561091 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="kube-rbac-proxy-thanos" Apr 24 23:56:10.561393 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.561106 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="init-config-reloader" Apr 24 23:56:10.561393 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.561115 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="init-config-reloader" Apr 24 23:56:10.561393 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.561129 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="kube-rbac-proxy-web" Apr 24 23:56:10.561393 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.561138 2578 
state_mem.go:107] "Deleted CPUSet assignment" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="kube-rbac-proxy-web" Apr 24 23:56:10.561393 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.561217 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="thanos-sidecar" Apr 24 23:56:10.561393 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.561229 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="kube-rbac-proxy-thanos" Apr 24 23:56:10.561393 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.561242 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="kube-rbac-proxy" Apr 24 23:56:10.561393 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.561253 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="prometheus" Apr 24 23:56:10.561393 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.561260 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="kube-rbac-proxy-web" Apr 24 23:56:10.561393 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.561268 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" containerName="config-reloader" Apr 24 23:56:10.561393 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.561291 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b"} err="failed to get container status \"9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b\": rpc error: code = NotFound desc = could not find container \"9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b\": container with ID starting with 9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b not found: ID does not exist" Apr 24 23:56:10.561393 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.561314 2578 scope.go:117] "RemoveContainer" containerID="90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600" Apr 24 23:56:10.561918 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.561558 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600"} err="failed to get container status \"90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600\": rpc error: code = NotFound desc = could not find container \"90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600\": container with ID starting with 90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600 not found: ID does not exist" Apr 24 23:56:10.561918 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.561574 2578 scope.go:117] "RemoveContainer" containerID="13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574" Apr 24 23:56:10.561918 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.561808 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574"} err="failed to get container status \"13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574\": rpc error: code = NotFound desc = could not find container 
\"13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574\": container with ID starting with 13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574 not found: ID does not exist" Apr 24 23:56:10.561918 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.561833 2578 scope.go:117] "RemoveContainer" containerID="f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f" Apr 24 23:56:10.562060 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.562043 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f"} err="failed to get container status \"f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f\": rpc error: code = NotFound desc = could not find container \"f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f\": container with ID starting with f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f not found: ID does not exist" Apr 24 23:56:10.562104 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.562061 2578 scope.go:117] "RemoveContainer" containerID="a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe" Apr 24 23:56:10.562245 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.562227 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe"} err="failed to get container status \"a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe\": rpc error: code = NotFound desc = could not find container \"a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe\": container with ID starting with a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe not found: ID does not exist" Apr 24 23:56:10.562304 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.562245 2578 scope.go:117] "RemoveContainer" containerID="037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982" Apr 24 23:56:10.562450 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.562435 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982"} err="failed to get container status \"037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982\": rpc error: code = NotFound desc = could not find container \"037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982\": container with ID starting with 037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982 not found: ID does not exist" Apr 24 23:56:10.562487 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.562451 2578 scope.go:117] "RemoveContainer" containerID="4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926" Apr 24 23:56:10.562633 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.562616 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926"} err="failed to get container status \"4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926\": rpc error: code = NotFound desc = could not find container \"4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926\": container with ID starting with 4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926 not found: ID does not exist" Apr 24 23:56:10.562697 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.562634 2578 scope.go:117] "RemoveContainer" 
containerID="9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b" Apr 24 23:56:10.562868 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.562847 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b"} err="failed to get container status \"9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b\": rpc error: code = NotFound desc = could not find container \"9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b\": container with ID starting with 9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b not found: ID does not exist" Apr 24 23:56:10.562914 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.562869 2578 scope.go:117] "RemoveContainer" containerID="90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600" Apr 24 23:56:10.563066 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.563049 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600"} err="failed to get container status \"90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600\": rpc error: code = NotFound desc = could not find container \"90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600\": container with ID starting with 90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600 not found: ID does not exist" Apr 24 23:56:10.563130 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.563067 2578 scope.go:117] "RemoveContainer" containerID="13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574" Apr 24 23:56:10.563264 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.563246 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574"} err="failed to get container status \"13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574\": rpc error: code = NotFound desc = could not find container \"13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574\": container with ID starting with 13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574 not found: ID does not exist" Apr 24 23:56:10.563310 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.563265 2578 scope.go:117] "RemoveContainer" containerID="f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f" Apr 24 23:56:10.563476 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.563458 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f"} err="failed to get container status \"f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f\": rpc error: code = NotFound desc = could not find container \"f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f\": container with ID starting with f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f not found: ID does not exist" Apr 24 23:56:10.563540 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.563477 2578 scope.go:117] "RemoveContainer" containerID="a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe" Apr 24 23:56:10.563676 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.563652 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe"} err="failed to get container status \"a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe\": rpc error: code = NotFound desc = could not find container \"a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe\": container with ID starting with a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe not found: ID does not exist" Apr 24 23:56:10.563676 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.563675 2578 scope.go:117] "RemoveContainer" containerID="037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982" Apr 24 23:56:10.563863 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.563844 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982"} err="failed to get container status \"037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982\": rpc error: code = NotFound desc = could not find container \"037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982\": container with ID starting with 037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982 not found: ID does not exist" Apr 24 23:56:10.563912 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.563864 2578 scope.go:117] "RemoveContainer" containerID="4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926" Apr 24 23:56:10.564057 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.564039 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926"} err="failed to get container status \"4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926\": rpc error: code = NotFound desc = could not find container \"4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926\": container with ID starting with 4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926 not found: ID does not exist" Apr 24 23:56:10.564117 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.564059 2578 scope.go:117] "RemoveContainer" containerID="9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b" Apr 24 23:56:10.564293 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.564276 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b"} err="failed to get container status \"9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b\": rpc error: code = NotFound desc = could not find container \"9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b\": container with ID starting with 9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b not found: ID does not exist" Apr 24 23:56:10.564341 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.564294 2578 scope.go:117] "RemoveContainer" containerID="90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600" Apr 24 23:56:10.564502 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.564484 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600"} err="failed to get container status \"90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600\": rpc error: code = NotFound desc = could not find container 
\"90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600\": container with ID starting with 90e5596511cdf5282802c199c18215e535b8077bfb4ab732a57009943fa2c600 not found: ID does not exist" Apr 24 23:56:10.564550 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.564503 2578 scope.go:117] "RemoveContainer" containerID="13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574" Apr 24 23:56:10.564705 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.564671 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574"} err="failed to get container status \"13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574\": rpc error: code = NotFound desc = could not find container \"13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574\": container with ID starting with 13228ebfca44a63ffc18a86afaa581924478619231d49c3108f6eafc6fa02574 not found: ID does not exist" Apr 24 23:56:10.564783 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.564705 2578 scope.go:117] "RemoveContainer" containerID="f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f" Apr 24 23:56:10.564896 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.564881 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f"} err="failed to get container status \"f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f\": rpc error: code = NotFound desc = could not find container \"f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f\": container with ID starting with f4f518ce4211c6e85d50ffcf4501150e1e7f5364f53d0cbeccc69a9e4b49f03f not found: ID does not exist" Apr 24 23:56:10.564938 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.564896 2578 scope.go:117] "RemoveContainer" containerID="a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe" Apr 24 23:56:10.565083 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.565065 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe"} err="failed to get container status \"a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe\": rpc error: code = NotFound desc = could not find container \"a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe\": container with ID starting with a54cb504506964e1c1c9abe9b52e7b88b0c5e4d9720a09fa9fcb8fe618a68ffe not found: ID does not exist" Apr 24 23:56:10.565123 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.565084 2578 scope.go:117] "RemoveContainer" containerID="037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982" Apr 24 23:56:10.565239 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.565222 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982"} err="failed to get container status \"037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982\": rpc error: code = NotFound desc = could not find container \"037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982\": container with ID starting with 037044c43196c1cad728dbc8482db6cd6e480eb066ff2002a005885a94e59982 not found: ID does not exist" Apr 24 23:56:10.565275 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.565239 2578 scope.go:117] "RemoveContainer" 
containerID="4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926" Apr 24 23:56:10.565384 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.565369 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926"} err="failed to get container status \"4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926\": rpc error: code = NotFound desc = could not find container \"4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926\": container with ID starting with 4913be9894b17ed54b690cc715e9930743fe1cd15d564433295f14d7828e4926 not found: ID does not exist" Apr 24 23:56:10.565480 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.565384 2578 scope.go:117] "RemoveContainer" containerID="9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b" Apr 24 23:56:10.565574 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.565559 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b"} err="failed to get container status \"9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b\": rpc error: code = NotFound desc = could not find container \"9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b\": container with ID starting with 9476492d380d3c73acaf7320df12f13e2eab9899b56ff014529719937145ab0b not found: ID does not exist" Apr 24 23:56:10.566064 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.566051 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.571702 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.571673 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 23:56:10.571786 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.571721 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 23:56:10.571786 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.571742 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 23:56:10.572089 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.572070 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 23:56:10.572198 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.572182 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 23:56:10.572244 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.572184 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 23:56:10.572290 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.572229 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-tjhn5\"" Apr 24 23:56:10.572750 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.572735 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 23:56:10.572816 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.572803 2578 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 23:56:10.573003 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.572987 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 23:56:10.573084 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.573010 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 23:56:10.573084 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.573012 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-a51i7k9kpjgm0\"" Apr 24 23:56:10.575561 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.575543 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 23:56:10.579347 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.579331 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 23:56:10.581275 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.581254 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 23:56:10.697259 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.697179 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.697259 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.697216 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/85c5a684-d475-4029-ad11-b6e97d35d195-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.697259 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.697247 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-web-config\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.697523 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.697297 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.697523 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.697320 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.697523 
ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.697358 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.697523 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.697385 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/85c5a684-d475-4029-ad11-b6e97d35d195-config-out\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.697523 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.697436 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.697523 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.697482 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85c5a684-d475-4029-ad11-b6e97d35d195-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.697523 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.697510 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/85c5a684-d475-4029-ad11-b6e97d35d195-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.697523 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.697527 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tshsv\" (UniqueName: \"kubernetes.io/projected/85c5a684-d475-4029-ad11-b6e97d35d195-kube-api-access-tshsv\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.697815 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.697547 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85c5a684-d475-4029-ad11-b6e97d35d195-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.697815 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.697564 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/85c5a684-d475-4029-ad11-b6e97d35d195-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.697815 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.697581 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-config\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.697815 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.697597 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/85c5a684-d475-4029-ad11-b6e97d35d195-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.697815 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.697614 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.697815 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.697628 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.697815 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.697664 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85c5a684-d475-4029-ad11-b6e97d35d195-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.798848 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.798812 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.798848 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.798844 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/85c5a684-d475-4029-ad11-b6e97d35d195-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.799063 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.798869 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-web-config\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.799063 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.798896 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-secret-grpc-tls\") pod 
\"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.799063 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.798914 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.799063 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.798933 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.799063 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.798951 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/85c5a684-d475-4029-ad11-b6e97d35d195-config-out\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.799063 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.798968 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.799063 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.798998 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85c5a684-d475-4029-ad11-b6e97d35d195-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.799063 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.799030 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/85c5a684-d475-4029-ad11-b6e97d35d195-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.799063 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.799052 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tshsv\" (UniqueName: \"kubernetes.io/projected/85c5a684-d475-4029-ad11-b6e97d35d195-kube-api-access-tshsv\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.799515 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.799083 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85c5a684-d475-4029-ad11-b6e97d35d195-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.799515 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.799107 
2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/85c5a684-d475-4029-ad11-b6e97d35d195-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.799515 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.799147 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-config\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.799515 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.799172 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/85c5a684-d475-4029-ad11-b6e97d35d195-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.799515 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.799198 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.799515 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.799225 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.799515 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.799250 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85c5a684-d475-4029-ad11-b6e97d35d195-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.799515 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.799270 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/85c5a684-d475-4029-ad11-b6e97d35d195-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.800241 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.800214 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/85c5a684-d475-4029-ad11-b6e97d35d195-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.800348 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.800295 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85c5a684-d475-4029-ad11-b6e97d35d195-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.800905 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.800877 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85c5a684-d475-4029-ad11-b6e97d35d195-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.802184 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.802157 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-web-config\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.802257 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.802194 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85c5a684-d475-4029-ad11-b6e97d35d195-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.802672 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.802626 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.802911 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.802892 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/85c5a684-d475-4029-ad11-b6e97d35d195-config-out\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.803134 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.803113 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-config\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.803785 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.803720 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.803785 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.803741 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.803916 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.803850 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.804242 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.804165 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/85c5a684-d475-4029-ad11-b6e97d35d195-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.804723 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.804700 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/85c5a684-d475-4029-ad11-b6e97d35d195-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.805372 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.805350 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.805475 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.805434 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.805897 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.805878 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/85c5a684-d475-4029-ad11-b6e97d35d195-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.807676 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.807660 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tshsv\" (UniqueName: \"kubernetes.io/projected/85c5a684-d475-4029-ad11-b6e97d35d195-kube-api-access-tshsv\") pod \"prometheus-k8s-0\" (UID: \"85c5a684-d475-4029-ad11-b6e97d35d195\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:10.880984 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:10.880944 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:11.011000 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:11.010969 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 23:56:11.014037 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:56:11.014001 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85c5a684_d475_4029_ad11_b6e97d35d195.slice/crio-d743a3a247454852752e131e93c70c5e2a874fce5fd3064553581ea7dedbcef7 WatchSource:0}: Error finding container d743a3a247454852752e131e93c70c5e2a874fce5fd3064553581ea7dedbcef7: Status 404 returned error can't find the container with id d743a3a247454852752e131e93c70c5e2a874fce5fd3064553581ea7dedbcef7 Apr 24 23:56:11.513592 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:11.513552 2578 generic.go:358] "Generic (PLEG): container finished" podID="85c5a684-d475-4029-ad11-b6e97d35d195" containerID="8a45b6dfbcb6082c09f7f6d4a9c5822d4e56cccaccd6f93cd37523bfb7a1d665" exitCode=0 Apr 24 23:56:11.513959 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:11.513637 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"85c5a684-d475-4029-ad11-b6e97d35d195","Type":"ContainerDied","Data":"8a45b6dfbcb6082c09f7f6d4a9c5822d4e56cccaccd6f93cd37523bfb7a1d665"} Apr 24 23:56:11.513959 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:11.513674 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"85c5a684-d475-4029-ad11-b6e97d35d195","Type":"ContainerStarted","Data":"d743a3a247454852752e131e93c70c5e2a874fce5fd3064553581ea7dedbcef7"} Apr 24 23:56:11.916375 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:11.916336 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84cf5627-5273-43e3-99ae-b6ab2371aa69" path="/var/lib/kubelet/pods/84cf5627-5273-43e3-99ae-b6ab2371aa69/volumes" Apr 24 23:56:12.521948 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:12.521907 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"85c5a684-d475-4029-ad11-b6e97d35d195","Type":"ContainerStarted","Data":"e99cd2ff7df74a723c08ed2eb42db52cb7c67f8dbc6390a81421f6b1799bf2d6"} Apr 24 23:56:12.521948 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:12.521953 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"85c5a684-d475-4029-ad11-b6e97d35d195","Type":"ContainerStarted","Data":"c28c3ccca4806ef35a44edf86387d6351fb00120825cdacb2b789b5b7dfe3794"} Apr 24 23:56:12.522330 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:12.521967 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"85c5a684-d475-4029-ad11-b6e97d35d195","Type":"ContainerStarted","Data":"e75432111d8f7102a7253ad230a21875d8f78ef1ecd6375a0dde7b8c598a15e4"} Apr 24 23:56:12.522330 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:12.521980 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"85c5a684-d475-4029-ad11-b6e97d35d195","Type":"ContainerStarted","Data":"0a193cc2c90f3365eed1aadbed6dd258a0615d778c896643ced13d9b87dd5468"} Apr 24 23:56:12.522330 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:12.521991 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"85c5a684-d475-4029-ad11-b6e97d35d195","Type":"ContainerStarted","Data":"68aa1350ae053772ff2726785b5e483b17c145edc32d0808f85a1c36881ff4ee"} Apr 24 23:56:12.522330 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:12.522003 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"85c5a684-d475-4029-ad11-b6e97d35d195","Type":"ContainerStarted","Data":"21fd7ec4bc1a4445b1cb261b6de47f55528e979b0da23ff79d980b30fed7ca7f"} Apr 24 23:56:12.549103 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:12.549053 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.549032899 podStartE2EDuration="2.549032899s" podCreationTimestamp="2026-04-24 23:56:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:56:12.547294953 +0000 UTC m=+155.181276888" watchObservedRunningTime="2026-04-24 23:56:12.549032899 +0000 UTC m=+155.183014837" Apr 24 23:56:15.882104 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:56:15.882053 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:57:10.882073 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:57:10.882038 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:57:10.897624 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:57:10.897598 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:57:11.697835 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:57:11.697809 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:37.831166 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:58:37.831141 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/0.log" Apr 24 23:58:37.831813 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:58:37.831793 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/0.log" Apr 24 23:58:37.834440 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:58:37.834404 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 23:58:43.192770 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:58:43.192739 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-lxncq"] Apr 24 23:58:43.195835 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:58:43.195818 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-lxncq" Apr 24 23:58:43.197882 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:58:43.197862 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-mpkn4\"" Apr 24 23:58:43.198010 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:58:43.197950 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 24 23:58:43.198710 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:58:43.198688 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 23:58:43.198794 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:58:43.198699 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 23:58:43.205857 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:58:43.205836 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-lxncq"] Apr 24 23:58:43.293182 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:58:43.293139 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sdzw\" (UniqueName: \"kubernetes.io/projected/056ba31c-acb9-4e89-99bb-bf2a09c5d4fc-kube-api-access-4sdzw\") pod \"llmisvc-controller-manager-68cc5db7c4-lxncq\" (UID: \"056ba31c-acb9-4e89-99bb-bf2a09c5d4fc\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-lxncq" Apr 24 23:58:43.293329 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:58:43.293267 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/056ba31c-acb9-4e89-99bb-bf2a09c5d4fc-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-lxncq\" (UID: \"056ba31c-acb9-4e89-99bb-bf2a09c5d4fc\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-lxncq" Apr 24 23:58:43.394458 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:58:43.394405 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/056ba31c-acb9-4e89-99bb-bf2a09c5d4fc-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-lxncq\" (UID: \"056ba31c-acb9-4e89-99bb-bf2a09c5d4fc\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-lxncq" Apr 24 23:58:43.394612 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:58:43.394502 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4sdzw\" (UniqueName: \"kubernetes.io/projected/056ba31c-acb9-4e89-99bb-bf2a09c5d4fc-kube-api-access-4sdzw\") pod \"llmisvc-controller-manager-68cc5db7c4-lxncq\" (UID: \"056ba31c-acb9-4e89-99bb-bf2a09c5d4fc\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-lxncq" Apr 24 23:58:43.396909 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:58:43.396879 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/056ba31c-acb9-4e89-99bb-bf2a09c5d4fc-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-lxncq\" (UID: \"056ba31c-acb9-4e89-99bb-bf2a09c5d4fc\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-lxncq" Apr 24 23:58:43.402131 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:58:43.402108 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sdzw\" (UniqueName: \"kubernetes.io/projected/056ba31c-acb9-4e89-99bb-bf2a09c5d4fc-kube-api-access-4sdzw\") pod 
\"llmisvc-controller-manager-68cc5db7c4-lxncq\" (UID: \"056ba31c-acb9-4e89-99bb-bf2a09c5d4fc\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-lxncq" Apr 24 23:58:43.506422 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:58:43.506396 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-lxncq" Apr 24 23:58:43.623142 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:58:43.623041 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-lxncq"] Apr 24 23:58:43.625764 ip-10-0-129-98 kubenswrapper[2578]: W0424 23:58:43.625733 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod056ba31c_acb9_4e89_99bb_bf2a09c5d4fc.slice/crio-0de701b9efc8b0aa1d30e2af45045f6271adfd470f4b75686c0147b900c445f3 WatchSource:0}: Error finding container 0de701b9efc8b0aa1d30e2af45045f6271adfd470f4b75686c0147b900c445f3: Status 404 returned error can't find the container with id 0de701b9efc8b0aa1d30e2af45045f6271adfd470f4b75686c0147b900c445f3 Apr 24 23:58:43.627145 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:58:43.627127 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 23:58:43.923988 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:58:43.923915 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-lxncq" event={"ID":"056ba31c-acb9-4e89-99bb-bf2a09c5d4fc","Type":"ContainerStarted","Data":"0de701b9efc8b0aa1d30e2af45045f6271adfd470f4b75686c0147b900c445f3"} Apr 24 23:58:45.930306 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:58:45.930271 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-lxncq" event={"ID":"056ba31c-acb9-4e89-99bb-bf2a09c5d4fc","Type":"ContainerStarted","Data":"4729c830f1b9d21f66db43970ad4074e436a605f945aab7d03c49dbf7d2fc789"} Apr 24 23:58:45.930701 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:58:45.930392 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-lxncq" Apr 24 23:58:45.948273 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:58:45.948214 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-lxncq" podStartSLOduration=1.151217481 podStartE2EDuration="2.948186231s" podCreationTimestamp="2026-04-24 23:58:43 +0000 UTC" firstStartedPulling="2026-04-24 23:58:43.627250513 +0000 UTC m=+306.261232426" lastFinishedPulling="2026-04-24 23:58:45.424219259 +0000 UTC m=+308.058201176" observedRunningTime="2026-04-24 23:58:45.94631467 +0000 UTC m=+308.580296614" watchObservedRunningTime="2026-04-24 23:58:45.948186231 +0000 UTC m=+308.582168169" Apr 24 23:59:16.936623 ip-10-0-129-98 kubenswrapper[2578]: I0424 23:59:16.936594 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-lxncq" Apr 25 00:00:06.793718 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.793678 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6467d98968-v8qkd"] Apr 25 00:00:06.796192 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.796171 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:06.798458 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.798433 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 25 00:00:06.799280 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.799259 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 25 00:00:06.799385 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.799288 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 25 00:00:06.799385 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.799321 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 25 00:00:06.799385 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.799262 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-qmbm4\"" Apr 25 00:00:06.799385 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.799378 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 25 00:00:06.799691 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.799262 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 25 00:00:06.799691 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.799322 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 25 00:00:06.804301 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.804280 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 25 00:00:06.808489 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.808468 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6467d98968-v8qkd"] Apr 25 00:00:06.846583 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.846555 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e56e78d-5aae-468e-a90b-b0792d315656-console-oauth-config\") pod \"console-6467d98968-v8qkd\" (UID: \"6e56e78d-5aae-468e-a90b-b0792d315656\") " pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:06.846666 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.846587 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e56e78d-5aae-468e-a90b-b0792d315656-console-config\") pod \"console-6467d98968-v8qkd\" (UID: \"6e56e78d-5aae-468e-a90b-b0792d315656\") " pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:06.846666 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.846611 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e56e78d-5aae-468e-a90b-b0792d315656-oauth-serving-cert\") pod \"console-6467d98968-v8qkd\" (UID: \"6e56e78d-5aae-468e-a90b-b0792d315656\") " pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:06.846758 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.846688 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e56e78d-5aae-468e-a90b-b0792d315656-console-serving-cert\") pod \"console-6467d98968-v8qkd\" (UID: \"6e56e78d-5aae-468e-a90b-b0792d315656\") " pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:06.846758 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.846715 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm262\" (UniqueName: \"kubernetes.io/projected/6e56e78d-5aae-468e-a90b-b0792d315656-kube-api-access-mm262\") pod \"console-6467d98968-v8qkd\" (UID: \"6e56e78d-5aae-468e-a90b-b0792d315656\") " pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:06.846758 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.846750 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e56e78d-5aae-468e-a90b-b0792d315656-trusted-ca-bundle\") pod \"console-6467d98968-v8qkd\" (UID: \"6e56e78d-5aae-468e-a90b-b0792d315656\") " pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:06.846872 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.846811 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e56e78d-5aae-468e-a90b-b0792d315656-service-ca\") pod \"console-6467d98968-v8qkd\" (UID: \"6e56e78d-5aae-468e-a90b-b0792d315656\") " pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:06.947799 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.947775 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e56e78d-5aae-468e-a90b-b0792d315656-console-serving-cert\") pod \"console-6467d98968-v8qkd\" (UID: \"6e56e78d-5aae-468e-a90b-b0792d315656\") " pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:06.947799 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.947801 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mm262\" (UniqueName: \"kubernetes.io/projected/6e56e78d-5aae-468e-a90b-b0792d315656-kube-api-access-mm262\") pod \"console-6467d98968-v8qkd\" (UID: \"6e56e78d-5aae-468e-a90b-b0792d315656\") " pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:06.947958 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.947824 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e56e78d-5aae-468e-a90b-b0792d315656-trusted-ca-bundle\") pod \"console-6467d98968-v8qkd\" (UID: \"6e56e78d-5aae-468e-a90b-b0792d315656\") " pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:06.947958 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.947853 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e56e78d-5aae-468e-a90b-b0792d315656-service-ca\") pod \"console-6467d98968-v8qkd\" (UID: \"6e56e78d-5aae-468e-a90b-b0792d315656\") " pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:06.947958 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.947888 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e56e78d-5aae-468e-a90b-b0792d315656-console-oauth-config\") pod 
\"console-6467d98968-v8qkd\" (UID: \"6e56e78d-5aae-468e-a90b-b0792d315656\") " pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:06.947958 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.947902 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e56e78d-5aae-468e-a90b-b0792d315656-console-config\") pod \"console-6467d98968-v8qkd\" (UID: \"6e56e78d-5aae-468e-a90b-b0792d315656\") " pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:06.947958 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.947918 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e56e78d-5aae-468e-a90b-b0792d315656-oauth-serving-cert\") pod \"console-6467d98968-v8qkd\" (UID: \"6e56e78d-5aae-468e-a90b-b0792d315656\") " pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:06.948708 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.948686 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e56e78d-5aae-468e-a90b-b0792d315656-oauth-serving-cert\") pod \"console-6467d98968-v8qkd\" (UID: \"6e56e78d-5aae-468e-a90b-b0792d315656\") " pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:06.948708 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.948700 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e56e78d-5aae-468e-a90b-b0792d315656-trusted-ca-bundle\") pod \"console-6467d98968-v8qkd\" (UID: \"6e56e78d-5aae-468e-a90b-b0792d315656\") " pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:06.948708 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.948704 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e56e78d-5aae-468e-a90b-b0792d315656-console-config\") pod \"console-6467d98968-v8qkd\" (UID: \"6e56e78d-5aae-468e-a90b-b0792d315656\") " pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:06.948961 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.948940 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e56e78d-5aae-468e-a90b-b0792d315656-service-ca\") pod \"console-6467d98968-v8qkd\" (UID: \"6e56e78d-5aae-468e-a90b-b0792d315656\") " pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:06.950404 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.950378 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e56e78d-5aae-468e-a90b-b0792d315656-console-serving-cert\") pod \"console-6467d98968-v8qkd\" (UID: \"6e56e78d-5aae-468e-a90b-b0792d315656\") " pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:06.950404 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.950406 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e56e78d-5aae-468e-a90b-b0792d315656-console-oauth-config\") pod \"console-6467d98968-v8qkd\" (UID: \"6e56e78d-5aae-468e-a90b-b0792d315656\") " pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:06.958693 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:06.958673 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm262\" 
(UniqueName: \"kubernetes.io/projected/6e56e78d-5aae-468e-a90b-b0792d315656-kube-api-access-mm262\") pod \"console-6467d98968-v8qkd\" (UID: \"6e56e78d-5aae-468e-a90b-b0792d315656\") " pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:07.106098 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:07.106039 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:07.163378 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:07.163348 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-sk5sp"] Apr 25 00:00:07.167070 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:07.167044 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-sk5sp" Apr 25 00:00:07.169469 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:07.169266 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 25 00:00:07.169584 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:07.169553 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-6c5nr\"" Apr 25 00:00:07.173670 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:07.173651 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-sk5sp"] Apr 25 00:00:07.231650 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:07.231613 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6467d98968-v8qkd"] Apr 25 00:00:07.250679 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:07.250641 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqsx6\" (UniqueName: \"kubernetes.io/projected/ec7dfc0e-4b8f-4573-9e35-638b6bb8681e-kube-api-access-pqsx6\") pod \"s3-init-sk5sp\" (UID: \"ec7dfc0e-4b8f-4573-9e35-638b6bb8681e\") " pod="kserve/s3-init-sk5sp" Apr 25 00:00:07.351433 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:07.351392 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pqsx6\" (UniqueName: \"kubernetes.io/projected/ec7dfc0e-4b8f-4573-9e35-638b6bb8681e-kube-api-access-pqsx6\") pod \"s3-init-sk5sp\" (UID: \"ec7dfc0e-4b8f-4573-9e35-638b6bb8681e\") " pod="kserve/s3-init-sk5sp" Apr 25 00:00:07.360050 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:07.359995 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqsx6\" (UniqueName: \"kubernetes.io/projected/ec7dfc0e-4b8f-4573-9e35-638b6bb8681e-kube-api-access-pqsx6\") pod \"s3-init-sk5sp\" (UID: \"ec7dfc0e-4b8f-4573-9e35-638b6bb8681e\") " pod="kserve/s3-init-sk5sp" Apr 25 00:00:07.494911 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:07.494888 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-sk5sp" Apr 25 00:00:07.629338 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:07.629273 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-sk5sp"] Apr 25 00:00:07.632465 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:00:07.632437 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec7dfc0e_4b8f_4573_9e35_638b6bb8681e.slice/crio-f5aa9112ba8cc5db03616dff00b852302054969472be0bb441c4805c9d3d2246 WatchSource:0}: Error finding container f5aa9112ba8cc5db03616dff00b852302054969472be0bb441c4805c9d3d2246: Status 404 returned error can't find the container with id f5aa9112ba8cc5db03616dff00b852302054969472be0bb441c4805c9d3d2246 Apr 25 00:00:08.147383 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:08.147346 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sk5sp" event={"ID":"ec7dfc0e-4b8f-4573-9e35-638b6bb8681e","Type":"ContainerStarted","Data":"f5aa9112ba8cc5db03616dff00b852302054969472be0bb441c4805c9d3d2246"} Apr 25 00:00:08.148622 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:08.148585 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6467d98968-v8qkd" event={"ID":"6e56e78d-5aae-468e-a90b-b0792d315656","Type":"ContainerStarted","Data":"ca9bc5fee65868694f6657ea40d934ea5aad56c480376c66b6093fe1212aa85c"} Apr 25 00:00:08.148622 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:08.148616 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6467d98968-v8qkd" event={"ID":"6e56e78d-5aae-468e-a90b-b0792d315656","Type":"ContainerStarted","Data":"6b52c5d34fa6c9b7587fcaf13cb438c3ba1e33870724dffcbc73256969d727c2"} Apr 25 00:00:08.169800 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:08.169749 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6467d98968-v8qkd" podStartSLOduration=2.169734997 podStartE2EDuration="2.169734997s" podCreationTimestamp="2026-04-25 00:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:00:08.168053343 +0000 UTC m=+390.802035303" watchObservedRunningTime="2026-04-25 00:00:08.169734997 +0000 UTC m=+390.803716943" Apr 25 00:00:15.175665 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:15.175611 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sk5sp" event={"ID":"ec7dfc0e-4b8f-4573-9e35-638b6bb8681e","Type":"ContainerStarted","Data":"c7b8c89d95d1730640a24047925412387c5baf4e572d824671232b64080be71a"} Apr 25 00:00:15.192038 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:15.191988 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-sk5sp" podStartSLOduration=1.321452323 podStartE2EDuration="8.191975964s" podCreationTimestamp="2026-04-25 00:00:07 +0000 UTC" firstStartedPulling="2026-04-25 00:00:07.63416343 +0000 UTC m=+390.268145345" lastFinishedPulling="2026-04-25 00:00:14.504687072 +0000 UTC m=+397.138668986" observedRunningTime="2026-04-25 00:00:15.190189017 +0000 UTC m=+397.824170952" watchObservedRunningTime="2026-04-25 00:00:15.191975964 +0000 UTC m=+397.825957949" Apr 25 00:00:17.106967 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:17.106933 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:17.106967 ip-10-0-129-98 
kubenswrapper[2578]: I0425 00:00:17.106976 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:17.111714 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:17.111685 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:17.184750 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:17.184723 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6467d98968-v8qkd" Apr 25 00:00:18.184638 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:18.184607 2578 generic.go:358] "Generic (PLEG): container finished" podID="ec7dfc0e-4b8f-4573-9e35-638b6bb8681e" containerID="c7b8c89d95d1730640a24047925412387c5baf4e572d824671232b64080be71a" exitCode=0 Apr 25 00:00:18.184982 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:18.184683 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sk5sp" event={"ID":"ec7dfc0e-4b8f-4573-9e35-638b6bb8681e","Type":"ContainerDied","Data":"c7b8c89d95d1730640a24047925412387c5baf4e572d824671232b64080be71a"} Apr 25 00:00:19.308508 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:19.308486 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-sk5sp" Apr 25 00:00:19.449628 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:19.449556 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqsx6\" (UniqueName: \"kubernetes.io/projected/ec7dfc0e-4b8f-4573-9e35-638b6bb8681e-kube-api-access-pqsx6\") pod \"ec7dfc0e-4b8f-4573-9e35-638b6bb8681e\" (UID: \"ec7dfc0e-4b8f-4573-9e35-638b6bb8681e\") " Apr 25 00:00:19.451754 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:19.451726 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec7dfc0e-4b8f-4573-9e35-638b6bb8681e-kube-api-access-pqsx6" (OuterVolumeSpecName: "kube-api-access-pqsx6") pod "ec7dfc0e-4b8f-4573-9e35-638b6bb8681e" (UID: "ec7dfc0e-4b8f-4573-9e35-638b6bb8681e"). InnerVolumeSpecName "kube-api-access-pqsx6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:00:19.550202 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:19.550169 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pqsx6\" (UniqueName: \"kubernetes.io/projected/ec7dfc0e-4b8f-4573-9e35-638b6bb8681e-kube-api-access-pqsx6\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:00:20.191658 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:20.191569 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sk5sp" event={"ID":"ec7dfc0e-4b8f-4573-9e35-638b6bb8681e","Type":"ContainerDied","Data":"f5aa9112ba8cc5db03616dff00b852302054969472be0bb441c4805c9d3d2246"} Apr 25 00:00:20.191658 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:20.191606 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5aa9112ba8cc5db03616dff00b852302054969472be0bb441c4805c9d3d2246" Apr 25 00:00:20.191658 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:20.191612 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-sk5sp" Apr 25 00:00:29.083664 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.083634 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k"] Apr 25 00:00:29.084012 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.083931 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec7dfc0e-4b8f-4573-9e35-638b6bb8681e" containerName="s3-init" Apr 25 00:00:29.084012 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.083941 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec7dfc0e-4b8f-4573-9e35-638b6bb8681e" containerName="s3-init" Apr 25 00:00:29.084012 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.083990 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec7dfc0e-4b8f-4573-9e35-638b6bb8681e" containerName="s3-init" Apr 25 00:00:29.088054 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.088038 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" Apr 25 00:00:29.090203 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.090172 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-4fa95-predictor-serving-cert\"" Apr 25 00:00:29.090363 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.090347 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 25 00:00:29.091189 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.091162 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 25 00:00:29.091189 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.091183 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-b9x42\"" Apr 25 00:00:29.091387 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.091164 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-4fa95-kube-rbac-proxy-sar-config\"" Apr 25 00:00:29.096628 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.096606 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k"] Apr 25 00:00:29.224272 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.224236 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b15f40f7-5f84-4365-b2e1-28591524b6b7-proxy-tls\") pod \"success-200-isvc-4fa95-predictor-5756bcc86-4mc9k\" (UID: \"b15f40f7-5f84-4365-b2e1-28591524b6b7\") " pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" Apr 25 00:00:29.224272 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.224275 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c9wx\" (UniqueName: \"kubernetes.io/projected/b15f40f7-5f84-4365-b2e1-28591524b6b7-kube-api-access-7c9wx\") pod \"success-200-isvc-4fa95-predictor-5756bcc86-4mc9k\" (UID: \"b15f40f7-5f84-4365-b2e1-28591524b6b7\") " pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" Apr 25 00:00:29.224486 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.224299 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-4fa95-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b15f40f7-5f84-4365-b2e1-28591524b6b7-success-200-isvc-4fa95-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-4fa95-predictor-5756bcc86-4mc9k\" (UID: \"b15f40f7-5f84-4365-b2e1-28591524b6b7\") " pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" Apr 25 00:00:29.324910 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.324884 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b15f40f7-5f84-4365-b2e1-28591524b6b7-proxy-tls\") pod \"success-200-isvc-4fa95-predictor-5756bcc86-4mc9k\" (UID: \"b15f40f7-5f84-4365-b2e1-28591524b6b7\") " pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" Apr 25 00:00:29.325030 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.324912 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7c9wx\" (UniqueName: \"kubernetes.io/projected/b15f40f7-5f84-4365-b2e1-28591524b6b7-kube-api-access-7c9wx\") pod \"success-200-isvc-4fa95-predictor-5756bcc86-4mc9k\" (UID: \"b15f40f7-5f84-4365-b2e1-28591524b6b7\") " pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" Apr 25 00:00:29.325030 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.324934 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-4fa95-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b15f40f7-5f84-4365-b2e1-28591524b6b7-success-200-isvc-4fa95-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-4fa95-predictor-5756bcc86-4mc9k\" (UID: \"b15f40f7-5f84-4365-b2e1-28591524b6b7\") " pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" Apr 25 00:00:29.325563 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.325531 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-4fa95-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b15f40f7-5f84-4365-b2e1-28591524b6b7-success-200-isvc-4fa95-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-4fa95-predictor-5756bcc86-4mc9k\" (UID: \"b15f40f7-5f84-4365-b2e1-28591524b6b7\") " pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" Apr 25 00:00:29.327506 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.327482 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b15f40f7-5f84-4365-b2e1-28591524b6b7-proxy-tls\") pod \"success-200-isvc-4fa95-predictor-5756bcc86-4mc9k\" (UID: \"b15f40f7-5f84-4365-b2e1-28591524b6b7\") " pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" Apr 25 00:00:29.332801 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.332773 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c9wx\" (UniqueName: \"kubernetes.io/projected/b15f40f7-5f84-4365-b2e1-28591524b6b7-kube-api-access-7c9wx\") pod \"success-200-isvc-4fa95-predictor-5756bcc86-4mc9k\" (UID: \"b15f40f7-5f84-4365-b2e1-28591524b6b7\") " pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" Apr 25 00:00:29.380997 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.380943 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt"] Apr 25 00:00:29.383352 ip-10-0-129-98 kubenswrapper[2578]: 
I0425 00:00:29.383338 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" Apr 25 00:00:29.385753 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.385734 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-4fa95-predictor-serving-cert\"" Apr 25 00:00:29.386466 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.386450 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-4fa95-kube-rbac-proxy-sar-config\"" Apr 25 00:00:29.394710 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.394693 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt"] Apr 25 00:00:29.398728 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.398708 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" Apr 25 00:00:29.527144 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.527111 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k"] Apr 25 00:00:29.527364 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.527346 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fdcb0f25-048c-4797-9750-b37969b8cb48-proxy-tls\") pod \"error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt\" (UID: \"fdcb0f25-048c-4797-9750-b37969b8cb48\") " pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" Apr 25 00:00:29.527429 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.527401 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-4fa95-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fdcb0f25-048c-4797-9750-b37969b8cb48-error-404-isvc-4fa95-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt\" (UID: \"fdcb0f25-048c-4797-9750-b37969b8cb48\") " pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" Apr 25 00:00:29.527531 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.527512 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swsvb\" (UniqueName: \"kubernetes.io/projected/fdcb0f25-048c-4797-9750-b37969b8cb48-kube-api-access-swsvb\") pod \"error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt\" (UID: \"fdcb0f25-048c-4797-9750-b37969b8cb48\") " pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" Apr 25 00:00:29.530187 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:00:29.530165 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb15f40f7_5f84_4365_b2e1_28591524b6b7.slice/crio-0c90cc9233b9d3c4f0247aafa2093c65dcc80c67d015e0d1c18128b1cddd1458 WatchSource:0}: Error finding container 0c90cc9233b9d3c4f0247aafa2093c65dcc80c67d015e0d1c18128b1cddd1458: Status 404 returned error can't find the container with id 0c90cc9233b9d3c4f0247aafa2093c65dcc80c67d015e0d1c18128b1cddd1458 Apr 25 00:00:29.628293 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.628258 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-4fa95-kube-rbac-proxy-sar-config\" 
(UniqueName: \"kubernetes.io/configmap/fdcb0f25-048c-4797-9750-b37969b8cb48-error-404-isvc-4fa95-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt\" (UID: \"fdcb0f25-048c-4797-9750-b37969b8cb48\") " pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" Apr 25 00:00:29.628409 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.628301 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swsvb\" (UniqueName: \"kubernetes.io/projected/fdcb0f25-048c-4797-9750-b37969b8cb48-kube-api-access-swsvb\") pod \"error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt\" (UID: \"fdcb0f25-048c-4797-9750-b37969b8cb48\") " pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" Apr 25 00:00:29.628409 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.628334 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fdcb0f25-048c-4797-9750-b37969b8cb48-proxy-tls\") pod \"error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt\" (UID: \"fdcb0f25-048c-4797-9750-b37969b8cb48\") " pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" Apr 25 00:00:29.628927 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.628907 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-4fa95-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fdcb0f25-048c-4797-9750-b37969b8cb48-error-404-isvc-4fa95-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt\" (UID: \"fdcb0f25-048c-4797-9750-b37969b8cb48\") " pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" Apr 25 00:00:29.631170 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.631123 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fdcb0f25-048c-4797-9750-b37969b8cb48-proxy-tls\") pod \"error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt\" (UID: \"fdcb0f25-048c-4797-9750-b37969b8cb48\") " pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" Apr 25 00:00:29.636732 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.636714 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swsvb\" (UniqueName: \"kubernetes.io/projected/fdcb0f25-048c-4797-9750-b37969b8cb48-kube-api-access-swsvb\") pod \"error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt\" (UID: \"fdcb0f25-048c-4797-9750-b37969b8cb48\") " pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" Apr 25 00:00:29.694704 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.694682 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" Apr 25 00:00:29.811921 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:29.811886 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt"] Apr 25 00:00:29.815272 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:00:29.815243 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdcb0f25_048c_4797_9750_b37969b8cb48.slice/crio-a82c16aa5763999a82754247377dbd5c64cf3e41603406e9546b8ee1388238e5 WatchSource:0}: Error finding container a82c16aa5763999a82754247377dbd5c64cf3e41603406e9546b8ee1388238e5: Status 404 returned error can't find the container with id a82c16aa5763999a82754247377dbd5c64cf3e41603406e9546b8ee1388238e5 Apr 25 00:00:30.222770 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:30.222736 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" event={"ID":"fdcb0f25-048c-4797-9750-b37969b8cb48","Type":"ContainerStarted","Data":"a82c16aa5763999a82754247377dbd5c64cf3e41603406e9546b8ee1388238e5"} Apr 25 00:00:30.225022 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:30.224990 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" event={"ID":"b15f40f7-5f84-4365-b2e1-28591524b6b7","Type":"ContainerStarted","Data":"0c90cc9233b9d3c4f0247aafa2093c65dcc80c67d015e0d1c18128b1cddd1458"} Apr 25 00:00:45.289177 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:45.289112 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" event={"ID":"fdcb0f25-048c-4797-9750-b37969b8cb48","Type":"ContainerStarted","Data":"eb477d6281ee975410f117247beee1c638698ceb8179a1b8b2c87b6d060e93e9"} Apr 25 00:00:45.291761 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:45.291731 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" event={"ID":"b15f40f7-5f84-4365-b2e1-28591524b6b7","Type":"ContainerStarted","Data":"f9628f2c3a50bfe4c84d4d7a4a9e62990e225f19106a6df3f121d2df372110f7"} Apr 25 00:00:47.299772 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:47.299718 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" event={"ID":"fdcb0f25-048c-4797-9750-b37969b8cb48","Type":"ContainerStarted","Data":"0eb1d59c68f699c5bbb62c8c3d96967492b6ed74c0b8e85b7202d370499b5879"} Apr 25 00:00:47.300235 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:47.299924 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" Apr 25 00:00:47.300235 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:47.299958 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" Apr 25 00:00:47.301315 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:47.301256 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" podUID="fdcb0f25-048c-4797-9750-b37969b8cb48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 25 00:00:47.302043 ip-10-0-129-98 
kubenswrapper[2578]: I0425 00:00:47.302018 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" event={"ID":"b15f40f7-5f84-4365-b2e1-28591524b6b7","Type":"ContainerStarted","Data":"a02d554d67294d91810aa1afffbe598c93d61f58b57f041e476ded998dd3923e"} Apr 25 00:00:47.302194 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:47.302175 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" Apr 25 00:00:47.317505 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:47.317460 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" podStartSLOduration=0.987684863 podStartE2EDuration="18.317447156s" podCreationTimestamp="2026-04-25 00:00:29 +0000 UTC" firstStartedPulling="2026-04-25 00:00:29.817070185 +0000 UTC m=+412.451052098" lastFinishedPulling="2026-04-25 00:00:47.146832469 +0000 UTC m=+429.780814391" observedRunningTime="2026-04-25 00:00:47.316927893 +0000 UTC m=+429.950909829" watchObservedRunningTime="2026-04-25 00:00:47.317447156 +0000 UTC m=+429.951429092" Apr 25 00:00:48.305655 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:48.305607 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" podUID="fdcb0f25-048c-4797-9750-b37969b8cb48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 25 00:00:48.306031 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:48.305666 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" Apr 25 00:00:48.307011 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:48.306987 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" podUID="b15f40f7-5f84-4365-b2e1-28591524b6b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 25 00:00:49.307934 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:49.307891 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" podUID="b15f40f7-5f84-4365-b2e1-28591524b6b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 25 00:00:53.309680 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:53.309615 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" Apr 25 00:00:53.310165 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:53.310137 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" podUID="fdcb0f25-048c-4797-9750-b37969b8cb48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 25 00:00:53.326937 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:53.326891 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" podStartSLOduration=6.721741746 podStartE2EDuration="24.326879257s" podCreationTimestamp="2026-04-25 00:00:29 +0000 UTC" 
firstStartedPulling="2026-04-25 00:00:29.532490502 +0000 UTC m=+412.166472420" lastFinishedPulling="2026-04-25 00:00:47.137628017 +0000 UTC m=+429.771609931" observedRunningTime="2026-04-25 00:00:47.334931049 +0000 UTC m=+429.968912985" watchObservedRunningTime="2026-04-25 00:00:53.326879257 +0000 UTC m=+435.960861186" Apr 25 00:00:54.312488 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:54.312459 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" Apr 25 00:00:54.313019 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:00:54.312992 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" podUID="b15f40f7-5f84-4365-b2e1-28591524b6b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 25 00:01:03.310349 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:03.310311 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" podUID="fdcb0f25-048c-4797-9750-b37969b8cb48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 25 00:01:04.313648 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:04.313612 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" podUID="b15f40f7-5f84-4365-b2e1-28591524b6b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 25 00:01:13.310222 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:13.310183 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" podUID="fdcb0f25-048c-4797-9750-b37969b8cb48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 25 00:01:14.313058 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:14.313024 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" podUID="b15f40f7-5f84-4365-b2e1-28591524b6b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 25 00:01:23.310224 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:23.310187 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" podUID="fdcb0f25-048c-4797-9750-b37969b8cb48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 25 00:01:24.313625 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:24.313588 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" podUID="b15f40f7-5f84-4365-b2e1-28591524b6b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 25 00:01:33.310962 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:33.310937 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" Apr 25 00:01:34.314190 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:34.314165 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" Apr 25 00:01:49.133301 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:49.133264 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6"] Apr 25 00:01:49.136679 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:49.136662 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" Apr 25 00:01:49.138875 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:49.138853 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-4fa95-serving-cert\"" Apr 25 00:01:49.138959 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:49.138878 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-4fa95-kube-rbac-proxy-sar-config\"" Apr 25 00:01:49.145693 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:49.145671 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6"] Apr 25 00:01:49.202518 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:49.202488 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63952d9a-11c4-4a41-83df-09777283aeef-proxy-tls\") pod \"switch-graph-4fa95-549f8ff959-xxjk6\" (UID: \"63952d9a-11c4-4a41-83df-09777283aeef\") " pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" Apr 25 00:01:49.202654 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:49.202618 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63952d9a-11c4-4a41-83df-09777283aeef-openshift-service-ca-bundle\") pod \"switch-graph-4fa95-549f8ff959-xxjk6\" (UID: \"63952d9a-11c4-4a41-83df-09777283aeef\") " pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" Apr 25 00:01:49.303160 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:49.303134 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63952d9a-11c4-4a41-83df-09777283aeef-proxy-tls\") pod \"switch-graph-4fa95-549f8ff959-xxjk6\" (UID: \"63952d9a-11c4-4a41-83df-09777283aeef\") " pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" Apr 25 00:01:49.303299 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:49.303233 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63952d9a-11c4-4a41-83df-09777283aeef-openshift-service-ca-bundle\") pod \"switch-graph-4fa95-549f8ff959-xxjk6\" (UID: \"63952d9a-11c4-4a41-83df-09777283aeef\") " pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" Apr 25 00:01:49.303916 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:49.303896 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63952d9a-11c4-4a41-83df-09777283aeef-openshift-service-ca-bundle\") pod \"switch-graph-4fa95-549f8ff959-xxjk6\" (UID: \"63952d9a-11c4-4a41-83df-09777283aeef\") " pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" Apr 25 00:01:49.305583 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:49.305564 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/63952d9a-11c4-4a41-83df-09777283aeef-proxy-tls\") pod \"switch-graph-4fa95-549f8ff959-xxjk6\" (UID: \"63952d9a-11c4-4a41-83df-09777283aeef\") " pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" Apr 25 00:01:49.447016 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:49.446937 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" Apr 25 00:01:49.569082 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:49.569052 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6"] Apr 25 00:01:49.571841 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:01:49.571812 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63952d9a_11c4_4a41_83df_09777283aeef.slice/crio-0c33c12c10288eea7978da582463ae7b3597370eecffe888a71aac8d2b827601 WatchSource:0}: Error finding container 0c33c12c10288eea7978da582463ae7b3597370eecffe888a71aac8d2b827601: Status 404 returned error can't find the container with id 0c33c12c10288eea7978da582463ae7b3597370eecffe888a71aac8d2b827601 Apr 25 00:01:50.491584 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:50.491512 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" event={"ID":"63952d9a-11c4-4a41-83df-09777283aeef","Type":"ContainerStarted","Data":"0c33c12c10288eea7978da582463ae7b3597370eecffe888a71aac8d2b827601"} Apr 25 00:01:53.503386 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:53.503354 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" event={"ID":"63952d9a-11c4-4a41-83df-09777283aeef","Type":"ContainerStarted","Data":"8e79e537e39b60b2b42ae6aecbf622c86b2b6f95020b45a1b54f128a055a464b"} Apr 25 00:01:53.503760 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:53.503401 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" Apr 25 00:01:53.521440 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:53.521373 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" podStartSLOduration=1.309982845 podStartE2EDuration="4.521358663s" podCreationTimestamp="2026-04-25 00:01:49 +0000 UTC" firstStartedPulling="2026-04-25 00:01:49.573559937 +0000 UTC m=+492.207541851" lastFinishedPulling="2026-04-25 00:01:52.784935742 +0000 UTC m=+495.418917669" observedRunningTime="2026-04-25 00:01:53.519636847 +0000 UTC m=+496.153618783" watchObservedRunningTime="2026-04-25 00:01:53.521358663 +0000 UTC m=+496.155340598" Apr 25 00:01:59.510920 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:01:59.510889 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" Apr 25 00:02:03.394643 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.394611 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6"] Apr 25 00:02:03.395044 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.394829 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" podUID="63952d9a-11c4-4a41-83df-09777283aeef" containerName="switch-graph-4fa95" 
containerID="cri-o://8e79e537e39b60b2b42ae6aecbf622c86b2b6f95020b45a1b54f128a055a464b" gracePeriod=30 Apr 25 00:02:03.503321 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.503290 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k"] Apr 25 00:02:03.503657 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.503624 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" podUID="b15f40f7-5f84-4365-b2e1-28591524b6b7" containerName="kserve-container" containerID="cri-o://f9628f2c3a50bfe4c84d4d7a4a9e62990e225f19106a6df3f121d2df372110f7" gracePeriod=30 Apr 25 00:02:03.503657 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.503636 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" podUID="b15f40f7-5f84-4365-b2e1-28591524b6b7" containerName="kube-rbac-proxy" containerID="cri-o://a02d554d67294d91810aa1afffbe598c93d61f58b57f041e476ded998dd3923e" gracePeriod=30 Apr 25 00:02:03.567644 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.567612 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt"] Apr 25 00:02:03.567934 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.567908 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" podUID="fdcb0f25-048c-4797-9750-b37969b8cb48" containerName="kserve-container" containerID="cri-o://eb477d6281ee975410f117247beee1c638698ceb8179a1b8b2c87b6d060e93e9" gracePeriod=30 Apr 25 00:02:03.568025 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.567981 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" podUID="fdcb0f25-048c-4797-9750-b37969b8cb48" containerName="kube-rbac-proxy" containerID="cri-o://0eb1d59c68f699c5bbb62c8c3d96967492b6ed74c0b8e85b7202d370499b5879" gracePeriod=30 Apr 25 00:02:03.581595 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.581572 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk"] Apr 25 00:02:03.583938 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.583922 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" Apr 25 00:02:03.585828 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.585807 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-b9319-predictor-serving-cert\"" Apr 25 00:02:03.585933 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.585917 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-b9319-kube-rbac-proxy-sar-config\"" Apr 25 00:02:03.592596 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.592574 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk"] Apr 25 00:02:03.620579 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.620549 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-b9319-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f4d06f33-056d-47a6-a3c4-4957f99749d0-success-200-isvc-b9319-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-b9319-predictor-67fcdd6676-f9tsk\" (UID: \"f4d06f33-056d-47a6-a3c4-4957f99749d0\") " pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" Apr 25 00:02:03.620719 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.620599 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6jv7\" (UniqueName: \"kubernetes.io/projected/f4d06f33-056d-47a6-a3c4-4957f99749d0-kube-api-access-k6jv7\") pod \"success-200-isvc-b9319-predictor-67fcdd6676-f9tsk\" (UID: \"f4d06f33-056d-47a6-a3c4-4957f99749d0\") " pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" Apr 25 00:02:03.620719 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.620662 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4d06f33-056d-47a6-a3c4-4957f99749d0-proxy-tls\") pod \"success-200-isvc-b9319-predictor-67fcdd6676-f9tsk\" (UID: \"f4d06f33-056d-47a6-a3c4-4957f99749d0\") " pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" Apr 25 00:02:03.648292 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.648228 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22"] Apr 25 00:02:03.650624 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.650610 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" Apr 25 00:02:03.652838 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.652818 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-b9319-predictor-serving-cert\"" Apr 25 00:02:03.652940 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.652822 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-b9319-kube-rbac-proxy-sar-config\"" Apr 25 00:02:03.661223 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.661202 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22"] Apr 25 00:02:03.721300 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.721271 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-b9319-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f4d06f33-056d-47a6-a3c4-4957f99749d0-success-200-isvc-b9319-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-b9319-predictor-67fcdd6676-f9tsk\" (UID: \"f4d06f33-056d-47a6-a3c4-4957f99749d0\") " pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" Apr 25 00:02:03.721454 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.721318 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k6jv7\" (UniqueName: \"kubernetes.io/projected/f4d06f33-056d-47a6-a3c4-4957f99749d0-kube-api-access-k6jv7\") pod \"success-200-isvc-b9319-predictor-67fcdd6676-f9tsk\" (UID: \"f4d06f33-056d-47a6-a3c4-4957f99749d0\") " pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" Apr 25 00:02:03.721454 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.721344 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4d06f33-056d-47a6-a3c4-4957f99749d0-proxy-tls\") pod \"success-200-isvc-b9319-predictor-67fcdd6676-f9tsk\" (UID: \"f4d06f33-056d-47a6-a3c4-4957f99749d0\") " pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" Apr 25 00:02:03.721454 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.721371 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbz8b\" (UniqueName: \"kubernetes.io/projected/d130d938-29c6-47d1-a675-1cd75b0d26c2-kube-api-access-vbz8b\") pod \"error-404-isvc-b9319-predictor-65db6866c5-fqs22\" (UID: \"d130d938-29c6-47d1-a675-1cd75b0d26c2\") " pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" Apr 25 00:02:03.721454 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.721399 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-b9319-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d130d938-29c6-47d1-a675-1cd75b0d26c2-error-404-isvc-b9319-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-b9319-predictor-65db6866c5-fqs22\" (UID: \"d130d938-29c6-47d1-a675-1cd75b0d26c2\") " pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" Apr 25 00:02:03.721454 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.721451 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d130d938-29c6-47d1-a675-1cd75b0d26c2-proxy-tls\") pod 
\"error-404-isvc-b9319-predictor-65db6866c5-fqs22\" (UID: \"d130d938-29c6-47d1-a675-1cd75b0d26c2\") " pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" Apr 25 00:02:03.721991 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.721969 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-b9319-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f4d06f33-056d-47a6-a3c4-4957f99749d0-success-200-isvc-b9319-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-b9319-predictor-67fcdd6676-f9tsk\" (UID: \"f4d06f33-056d-47a6-a3c4-4957f99749d0\") " pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" Apr 25 00:02:03.723750 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.723733 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4d06f33-056d-47a6-a3c4-4957f99749d0-proxy-tls\") pod \"success-200-isvc-b9319-predictor-67fcdd6676-f9tsk\" (UID: \"f4d06f33-056d-47a6-a3c4-4957f99749d0\") " pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" Apr 25 00:02:03.729934 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.729910 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6jv7\" (UniqueName: \"kubernetes.io/projected/f4d06f33-056d-47a6-a3c4-4957f99749d0-kube-api-access-k6jv7\") pod \"success-200-isvc-b9319-predictor-67fcdd6676-f9tsk\" (UID: \"f4d06f33-056d-47a6-a3c4-4957f99749d0\") " pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" Apr 25 00:02:03.822287 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.822254 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vbz8b\" (UniqueName: \"kubernetes.io/projected/d130d938-29c6-47d1-a675-1cd75b0d26c2-kube-api-access-vbz8b\") pod \"error-404-isvc-b9319-predictor-65db6866c5-fqs22\" (UID: \"d130d938-29c6-47d1-a675-1cd75b0d26c2\") " pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" Apr 25 00:02:03.822638 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.822612 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-b9319-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d130d938-29c6-47d1-a675-1cd75b0d26c2-error-404-isvc-b9319-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-b9319-predictor-65db6866c5-fqs22\" (UID: \"d130d938-29c6-47d1-a675-1cd75b0d26c2\") " pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" Apr 25 00:02:03.822772 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.822664 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d130d938-29c6-47d1-a675-1cd75b0d26c2-proxy-tls\") pod \"error-404-isvc-b9319-predictor-65db6866c5-fqs22\" (UID: \"d130d938-29c6-47d1-a675-1cd75b0d26c2\") " pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" Apr 25 00:02:03.823717 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.823573 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-b9319-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d130d938-29c6-47d1-a675-1cd75b0d26c2-error-404-isvc-b9319-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-b9319-predictor-65db6866c5-fqs22\" (UID: \"d130d938-29c6-47d1-a675-1cd75b0d26c2\") " pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" Apr 25 
00:02:03.825457 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.825435 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d130d938-29c6-47d1-a675-1cd75b0d26c2-proxy-tls\") pod \"error-404-isvc-b9319-predictor-65db6866c5-fqs22\" (UID: \"d130d938-29c6-47d1-a675-1cd75b0d26c2\") " pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" Apr 25 00:02:03.829766 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.829747 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbz8b\" (UniqueName: \"kubernetes.io/projected/d130d938-29c6-47d1-a675-1cd75b0d26c2-kube-api-access-vbz8b\") pod \"error-404-isvc-b9319-predictor-65db6866c5-fqs22\" (UID: \"d130d938-29c6-47d1-a675-1cd75b0d26c2\") " pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" Apr 25 00:02:03.893624 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.893594 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" Apr 25 00:02:03.961017 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:03.960990 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" Apr 25 00:02:04.019998 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:04.019938 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk"] Apr 25 00:02:04.024998 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:02:04.024944 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4d06f33_056d_47a6_a3c4_4957f99749d0.slice/crio-6fbb7498812be40ee8285391c4edc871d5e760f972f3c95b924f69e1a524fb44 WatchSource:0}: Error finding container 6fbb7498812be40ee8285391c4edc871d5e760f972f3c95b924f69e1a524fb44: Status 404 returned error can't find the container with id 6fbb7498812be40ee8285391c4edc871d5e760f972f3c95b924f69e1a524fb44 Apr 25 00:02:04.089916 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:04.089889 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22"] Apr 25 00:02:04.092471 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:02:04.092444 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd130d938_29c6_47d1_a675_1cd75b0d26c2.slice/crio-a9b4b5000348a9637b3e17cd6688c77b6b9c452dfd3ed76db44b81dae90dbf02 WatchSource:0}: Error finding container a9b4b5000348a9637b3e17cd6688c77b6b9c452dfd3ed76db44b81dae90dbf02: Status 404 returned error can't find the container with id a9b4b5000348a9637b3e17cd6688c77b6b9c452dfd3ed76db44b81dae90dbf02 Apr 25 00:02:04.308973 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:04.308937 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" podUID="b15f40f7-5f84-4365-b2e1-28591524b6b7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.20:8643/healthz\": dial tcp 10.133.0.20:8643: connect: connection refused" Apr 25 00:02:04.313324 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:04.313301 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" 
podUID="b15f40f7-5f84-4365-b2e1-28591524b6b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 25 00:02:04.510154 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:04.510119 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" podUID="63952d9a-11c4-4a41-83df-09777283aeef" containerName="switch-graph-4fa95" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:02:04.536613 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:04.536580 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" event={"ID":"d130d938-29c6-47d1-a675-1cd75b0d26c2","Type":"ContainerStarted","Data":"8f06465da540ee3ed7bc7ff97f8f9f4bea086482a708fe08cc5ff2e24bb42baa"} Apr 25 00:02:04.536734 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:04.536618 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" event={"ID":"d130d938-29c6-47d1-a675-1cd75b0d26c2","Type":"ContainerStarted","Data":"272cf4c1e175b1ea6f7d4bad6ec965af6d4f0ff2dca353b93c04be61f7946a0a"} Apr 25 00:02:04.536734 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:04.536632 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" event={"ID":"d130d938-29c6-47d1-a675-1cd75b0d26c2","Type":"ContainerStarted","Data":"a9b4b5000348a9637b3e17cd6688c77b6b9c452dfd3ed76db44b81dae90dbf02"} Apr 25 00:02:04.536734 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:04.536712 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" Apr 25 00:02:04.538205 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:04.538185 2578 generic.go:358] "Generic (PLEG): container finished" podID="fdcb0f25-048c-4797-9750-b37969b8cb48" containerID="0eb1d59c68f699c5bbb62c8c3d96967492b6ed74c0b8e85b7202d370499b5879" exitCode=2 Apr 25 00:02:04.538278 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:04.538261 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" event={"ID":"fdcb0f25-048c-4797-9750-b37969b8cb48","Type":"ContainerDied","Data":"0eb1d59c68f699c5bbb62c8c3d96967492b6ed74c0b8e85b7202d370499b5879"} Apr 25 00:02:04.539730 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:04.539709 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" event={"ID":"f4d06f33-056d-47a6-a3c4-4957f99749d0","Type":"ContainerStarted","Data":"7b735685ec2e2ab9527c04efd565369e1b5cf3012729768f99c14ff87225a3cb"} Apr 25 00:02:04.539827 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:04.539739 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" event={"ID":"f4d06f33-056d-47a6-a3c4-4957f99749d0","Type":"ContainerStarted","Data":"8cab42edf19dcdcc0599e6b34b2ce282fba82057a5edec3786f1b3d7305d811e"} Apr 25 00:02:04.539827 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:04.539752 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" event={"ID":"f4d06f33-056d-47a6-a3c4-4957f99749d0","Type":"ContainerStarted","Data":"6fbb7498812be40ee8285391c4edc871d5e760f972f3c95b924f69e1a524fb44"} Apr 
25 00:02:04.539920 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:04.539845 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" Apr 25 00:02:04.541302 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:04.541285 2578 generic.go:358] "Generic (PLEG): container finished" podID="b15f40f7-5f84-4365-b2e1-28591524b6b7" containerID="a02d554d67294d91810aa1afffbe598c93d61f58b57f041e476ded998dd3923e" exitCode=2 Apr 25 00:02:04.541389 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:04.541337 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" event={"ID":"b15f40f7-5f84-4365-b2e1-28591524b6b7","Type":"ContainerDied","Data":"a02d554d67294d91810aa1afffbe598c93d61f58b57f041e476ded998dd3923e"} Apr 25 00:02:04.554568 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:04.554528 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" podStartSLOduration=1.554516763 podStartE2EDuration="1.554516763s" podCreationTimestamp="2026-04-25 00:02:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:02:04.551838507 +0000 UTC m=+507.185820466" watchObservedRunningTime="2026-04-25 00:02:04.554516763 +0000 UTC m=+507.188498702" Apr 25 00:02:04.568664 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:04.568624 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" podStartSLOduration=1.568614452 podStartE2EDuration="1.568614452s" podCreationTimestamp="2026-04-25 00:02:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:02:04.56776933 +0000 UTC m=+507.201751266" watchObservedRunningTime="2026-04-25 00:02:04.568614452 +0000 UTC m=+507.202596388" Apr 25 00:02:05.544499 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:05.544466 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" Apr 25 00:02:05.544499 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:05.544504 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" Apr 25 00:02:05.545275 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:05.545252 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" podUID="d130d938-29c6-47d1-a675-1cd75b0d26c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 25 00:02:05.545328 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:05.545276 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" podUID="f4d06f33-056d-47a6-a3c4-4957f99749d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 25 00:02:06.547680 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:06.547634 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" 
podUID="d130d938-29c6-47d1-a675-1cd75b0d26c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 25 00:02:06.548107 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:06.547752 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" podUID="f4d06f33-056d-47a6-a3c4-4957f99749d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 25 00:02:06.748355 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:06.748334 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" Apr 25 00:02:06.848212 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:06.848136 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b15f40f7-5f84-4365-b2e1-28591524b6b7-proxy-tls\") pod \"b15f40f7-5f84-4365-b2e1-28591524b6b7\" (UID: \"b15f40f7-5f84-4365-b2e1-28591524b6b7\") " Apr 25 00:02:06.848212 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:06.848195 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c9wx\" (UniqueName: \"kubernetes.io/projected/b15f40f7-5f84-4365-b2e1-28591524b6b7-kube-api-access-7c9wx\") pod \"b15f40f7-5f84-4365-b2e1-28591524b6b7\" (UID: \"b15f40f7-5f84-4365-b2e1-28591524b6b7\") " Apr 25 00:02:06.848378 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:06.848244 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-4fa95-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b15f40f7-5f84-4365-b2e1-28591524b6b7-success-200-isvc-4fa95-kube-rbac-proxy-sar-config\") pod \"b15f40f7-5f84-4365-b2e1-28591524b6b7\" (UID: \"b15f40f7-5f84-4365-b2e1-28591524b6b7\") " Apr 25 00:02:06.848616 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:06.848591 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b15f40f7-5f84-4365-b2e1-28591524b6b7-success-200-isvc-4fa95-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-4fa95-kube-rbac-proxy-sar-config") pod "b15f40f7-5f84-4365-b2e1-28591524b6b7" (UID: "b15f40f7-5f84-4365-b2e1-28591524b6b7"). InnerVolumeSpecName "success-200-isvc-4fa95-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:02:06.882780 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:06.882752 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b15f40f7-5f84-4365-b2e1-28591524b6b7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b15f40f7-5f84-4365-b2e1-28591524b6b7" (UID: "b15f40f7-5f84-4365-b2e1-28591524b6b7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:02:06.882914 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:06.882801 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b15f40f7-5f84-4365-b2e1-28591524b6b7-kube-api-access-7c9wx" (OuterVolumeSpecName: "kube-api-access-7c9wx") pod "b15f40f7-5f84-4365-b2e1-28591524b6b7" (UID: "b15f40f7-5f84-4365-b2e1-28591524b6b7"). InnerVolumeSpecName "kube-api-access-7c9wx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:02:06.949231 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:06.949210 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b15f40f7-5f84-4365-b2e1-28591524b6b7-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:02:06.949231 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:06.949233 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7c9wx\" (UniqueName: \"kubernetes.io/projected/b15f40f7-5f84-4365-b2e1-28591524b6b7-kube-api-access-7c9wx\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:02:06.949353 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:06.949244 2578 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-4fa95-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b15f40f7-5f84-4365-b2e1-28591524b6b7-success-200-isvc-4fa95-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:02:07.093974 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.093952 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" Apr 25 00:02:07.150599 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.150529 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-4fa95-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fdcb0f25-048c-4797-9750-b37969b8cb48-error-404-isvc-4fa95-kube-rbac-proxy-sar-config\") pod \"fdcb0f25-048c-4797-9750-b37969b8cb48\" (UID: \"fdcb0f25-048c-4797-9750-b37969b8cb48\") " Apr 25 00:02:07.150599 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.150565 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swsvb\" (UniqueName: \"kubernetes.io/projected/fdcb0f25-048c-4797-9750-b37969b8cb48-kube-api-access-swsvb\") pod \"fdcb0f25-048c-4797-9750-b37969b8cb48\" (UID: \"fdcb0f25-048c-4797-9750-b37969b8cb48\") " Apr 25 00:02:07.150599 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.150589 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fdcb0f25-048c-4797-9750-b37969b8cb48-proxy-tls\") pod \"fdcb0f25-048c-4797-9750-b37969b8cb48\" (UID: \"fdcb0f25-048c-4797-9750-b37969b8cb48\") " Apr 25 00:02:07.150878 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.150856 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdcb0f25-048c-4797-9750-b37969b8cb48-error-404-isvc-4fa95-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-4fa95-kube-rbac-proxy-sar-config") pod "fdcb0f25-048c-4797-9750-b37969b8cb48" (UID: "fdcb0f25-048c-4797-9750-b37969b8cb48"). InnerVolumeSpecName "error-404-isvc-4fa95-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:02:07.152803 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.152780 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdcb0f25-048c-4797-9750-b37969b8cb48-kube-api-access-swsvb" (OuterVolumeSpecName: "kube-api-access-swsvb") pod "fdcb0f25-048c-4797-9750-b37969b8cb48" (UID: "fdcb0f25-048c-4797-9750-b37969b8cb48"). InnerVolumeSpecName "kube-api-access-swsvb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:02:07.152894 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.152811 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdcb0f25-048c-4797-9750-b37969b8cb48-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fdcb0f25-048c-4797-9750-b37969b8cb48" (UID: "fdcb0f25-048c-4797-9750-b37969b8cb48"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:02:07.251995 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.251971 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-4fa95-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fdcb0f25-048c-4797-9750-b37969b8cb48-error-404-isvc-4fa95-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:02:07.251995 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.251991 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-swsvb\" (UniqueName: \"kubernetes.io/projected/fdcb0f25-048c-4797-9750-b37969b8cb48-kube-api-access-swsvb\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:02:07.252144 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.252001 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fdcb0f25-048c-4797-9750-b37969b8cb48-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:02:07.551684 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.551654 2578 generic.go:358] "Generic (PLEG): container finished" podID="b15f40f7-5f84-4365-b2e1-28591524b6b7" containerID="f9628f2c3a50bfe4c84d4d7a4a9e62990e225f19106a6df3f121d2df372110f7" exitCode=0 Apr 25 00:02:07.552106 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.551734 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" Apr 25 00:02:07.552106 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.551737 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" event={"ID":"b15f40f7-5f84-4365-b2e1-28591524b6b7","Type":"ContainerDied","Data":"f9628f2c3a50bfe4c84d4d7a4a9e62990e225f19106a6df3f121d2df372110f7"} Apr 25 00:02:07.552106 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.551769 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k" event={"ID":"b15f40f7-5f84-4365-b2e1-28591524b6b7","Type":"ContainerDied","Data":"0c90cc9233b9d3c4f0247aafa2093c65dcc80c67d015e0d1c18128b1cddd1458"} Apr 25 00:02:07.552106 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.551784 2578 scope.go:117] "RemoveContainer" containerID="a02d554d67294d91810aa1afffbe598c93d61f58b57f041e476ded998dd3923e" Apr 25 00:02:07.553309 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.553218 2578 generic.go:358] "Generic (PLEG): container finished" podID="fdcb0f25-048c-4797-9750-b37969b8cb48" containerID="eb477d6281ee975410f117247beee1c638698ceb8179a1b8b2c87b6d060e93e9" exitCode=0 Apr 25 00:02:07.553309 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.553265 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" event={"ID":"fdcb0f25-048c-4797-9750-b37969b8cb48","Type":"ContainerDied","Data":"eb477d6281ee975410f117247beee1c638698ceb8179a1b8b2c87b6d060e93e9"} Apr 25 00:02:07.553309 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.553285 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" event={"ID":"fdcb0f25-048c-4797-9750-b37969b8cb48","Type":"ContainerDied","Data":"a82c16aa5763999a82754247377dbd5c64cf3e41603406e9546b8ee1388238e5"} Apr 25 00:02:07.553309 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.553299 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt" Apr 25 00:02:07.559715 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.559684 2578 scope.go:117] "RemoveContainer" containerID="f9628f2c3a50bfe4c84d4d7a4a9e62990e225f19106a6df3f121d2df372110f7" Apr 25 00:02:07.567174 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.567159 2578 scope.go:117] "RemoveContainer" containerID="a02d554d67294d91810aa1afffbe598c93d61f58b57f041e476ded998dd3923e" Apr 25 00:02:07.567393 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:02:07.567378 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a02d554d67294d91810aa1afffbe598c93d61f58b57f041e476ded998dd3923e\": container with ID starting with a02d554d67294d91810aa1afffbe598c93d61f58b57f041e476ded998dd3923e not found: ID does not exist" containerID="a02d554d67294d91810aa1afffbe598c93d61f58b57f041e476ded998dd3923e" Apr 25 00:02:07.567484 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.567404 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a02d554d67294d91810aa1afffbe598c93d61f58b57f041e476ded998dd3923e"} err="failed to get container status \"a02d554d67294d91810aa1afffbe598c93d61f58b57f041e476ded998dd3923e\": rpc error: code = NotFound desc = could not find container \"a02d554d67294d91810aa1afffbe598c93d61f58b57f041e476ded998dd3923e\": container with ID starting with a02d554d67294d91810aa1afffbe598c93d61f58b57f041e476ded998dd3923e not found: ID does not exist" Apr 25 00:02:07.567484 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.567444 2578 scope.go:117] "RemoveContainer" containerID="f9628f2c3a50bfe4c84d4d7a4a9e62990e225f19106a6df3f121d2df372110f7" Apr 25 00:02:07.567701 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:02:07.567684 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9628f2c3a50bfe4c84d4d7a4a9e62990e225f19106a6df3f121d2df372110f7\": container with ID starting with f9628f2c3a50bfe4c84d4d7a4a9e62990e225f19106a6df3f121d2df372110f7 not found: ID does not exist" containerID="f9628f2c3a50bfe4c84d4d7a4a9e62990e225f19106a6df3f121d2df372110f7" Apr 25 00:02:07.567740 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.567707 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9628f2c3a50bfe4c84d4d7a4a9e62990e225f19106a6df3f121d2df372110f7"} err="failed to get container status \"f9628f2c3a50bfe4c84d4d7a4a9e62990e225f19106a6df3f121d2df372110f7\": rpc error: code = NotFound desc = could not find container \"f9628f2c3a50bfe4c84d4d7a4a9e62990e225f19106a6df3f121d2df372110f7\": container with ID starting with f9628f2c3a50bfe4c84d4d7a4a9e62990e225f19106a6df3f121d2df372110f7 not found: ID does not exist" Apr 25 00:02:07.567740 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.567723 2578 scope.go:117] "RemoveContainer" containerID="0eb1d59c68f699c5bbb62c8c3d96967492b6ed74c0b8e85b7202d370499b5879" Apr 25 00:02:07.573281 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.573260 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k"] Apr 25 00:02:07.574778 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.574760 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4fa95-predictor-5756bcc86-4mc9k"] Apr 25 00:02:07.575500 ip-10-0-129-98 kubenswrapper[2578]: 
I0425 00:02:07.575483 2578 scope.go:117] "RemoveContainer" containerID="eb477d6281ee975410f117247beee1c638698ceb8179a1b8b2c87b6d060e93e9" Apr 25 00:02:07.584545 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.584529 2578 scope.go:117] "RemoveContainer" containerID="0eb1d59c68f699c5bbb62c8c3d96967492b6ed74c0b8e85b7202d370499b5879" Apr 25 00:02:07.584821 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:02:07.584803 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eb1d59c68f699c5bbb62c8c3d96967492b6ed74c0b8e85b7202d370499b5879\": container with ID starting with 0eb1d59c68f699c5bbb62c8c3d96967492b6ed74c0b8e85b7202d370499b5879 not found: ID does not exist" containerID="0eb1d59c68f699c5bbb62c8c3d96967492b6ed74c0b8e85b7202d370499b5879" Apr 25 00:02:07.584899 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.584829 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eb1d59c68f699c5bbb62c8c3d96967492b6ed74c0b8e85b7202d370499b5879"} err="failed to get container status \"0eb1d59c68f699c5bbb62c8c3d96967492b6ed74c0b8e85b7202d370499b5879\": rpc error: code = NotFound desc = could not find container \"0eb1d59c68f699c5bbb62c8c3d96967492b6ed74c0b8e85b7202d370499b5879\": container with ID starting with 0eb1d59c68f699c5bbb62c8c3d96967492b6ed74c0b8e85b7202d370499b5879 not found: ID does not exist" Apr 25 00:02:07.584899 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.584852 2578 scope.go:117] "RemoveContainer" containerID="eb477d6281ee975410f117247beee1c638698ceb8179a1b8b2c87b6d060e93e9" Apr 25 00:02:07.585080 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:02:07.585062 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb477d6281ee975410f117247beee1c638698ceb8179a1b8b2c87b6d060e93e9\": container with ID starting with eb477d6281ee975410f117247beee1c638698ceb8179a1b8b2c87b6d060e93e9 not found: ID does not exist" containerID="eb477d6281ee975410f117247beee1c638698ceb8179a1b8b2c87b6d060e93e9" Apr 25 00:02:07.585135 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.585087 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb477d6281ee975410f117247beee1c638698ceb8179a1b8b2c87b6d060e93e9"} err="failed to get container status \"eb477d6281ee975410f117247beee1c638698ceb8179a1b8b2c87b6d060e93e9\": rpc error: code = NotFound desc = could not find container \"eb477d6281ee975410f117247beee1c638698ceb8179a1b8b2c87b6d060e93e9\": container with ID starting with eb477d6281ee975410f117247beee1c638698ceb8179a1b8b2c87b6d060e93e9 not found: ID does not exist" Apr 25 00:02:07.585775 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.585756 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt"] Apr 25 00:02:07.590523 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.590504 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4fa95-predictor-7fbb84df9c-ktxjt"] Apr 25 00:02:07.911546 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.911478 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b15f40f7-5f84-4365-b2e1-28591524b6b7" path="/var/lib/kubelet/pods/b15f40f7-5f84-4365-b2e1-28591524b6b7/volumes" Apr 25 00:02:07.911890 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:07.911878 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="fdcb0f25-048c-4797-9750-b37969b8cb48" path="/var/lib/kubelet/pods/fdcb0f25-048c-4797-9750-b37969b8cb48/volumes" Apr 25 00:02:09.510698 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:09.510663 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" podUID="63952d9a-11c4-4a41-83df-09777283aeef" containerName="switch-graph-4fa95" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:02:11.552328 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:11.552298 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" Apr 25 00:02:11.552787 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:11.552665 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" Apr 25 00:02:11.552787 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:11.552689 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" podUID="d130d938-29c6-47d1-a675-1cd75b0d26c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 25 00:02:11.553151 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:11.553127 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" podUID="f4d06f33-056d-47a6-a3c4-4957f99749d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 25 00:02:14.510003 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:14.509967 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" podUID="63952d9a-11c4-4a41-83df-09777283aeef" containerName="switch-graph-4fa95" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:02:14.510338 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:14.510082 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" Apr 25 00:02:19.510607 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:19.510564 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" podUID="63952d9a-11c4-4a41-83df-09777283aeef" containerName="switch-graph-4fa95" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:02:21.553381 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:21.553297 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" podUID="d130d938-29c6-47d1-a675-1cd75b0d26c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 25 00:02:21.553818 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:21.553308 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" podUID="f4d06f33-056d-47a6-a3c4-4957f99749d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 25 00:02:24.510675 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:24.510638 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" podUID="63952d9a-11c4-4a41-83df-09777283aeef" containerName="switch-graph-4fa95" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:02:29.130150 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.130118 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs"] Apr 25 00:02:29.130630 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.130612 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b15f40f7-5f84-4365-b2e1-28591524b6b7" containerName="kube-rbac-proxy" Apr 25 00:02:29.130699 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.130633 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15f40f7-5f84-4365-b2e1-28591524b6b7" containerName="kube-rbac-proxy" Apr 25 00:02:29.130699 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.130646 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fdcb0f25-048c-4797-9750-b37969b8cb48" containerName="kserve-container" Apr 25 00:02:29.130699 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.130654 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdcb0f25-048c-4797-9750-b37969b8cb48" containerName="kserve-container" Apr 25 00:02:29.130699 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.130674 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fdcb0f25-048c-4797-9750-b37969b8cb48" containerName="kube-rbac-proxy" Apr 25 00:02:29.130699 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.130683 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdcb0f25-048c-4797-9750-b37969b8cb48" containerName="kube-rbac-proxy" Apr 25 00:02:29.130699 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.130697 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b15f40f7-5f84-4365-b2e1-28591524b6b7" containerName="kserve-container" Apr 25 00:02:29.130975 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.130705 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15f40f7-5f84-4365-b2e1-28591524b6b7" containerName="kserve-container" Apr 25 00:02:29.130975 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.130778 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="fdcb0f25-048c-4797-9750-b37969b8cb48" containerName="kube-rbac-proxy" Apr 25 00:02:29.130975 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.130791 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b15f40f7-5f84-4365-b2e1-28591524b6b7" containerName="kube-rbac-proxy" Apr 25 00:02:29.130975 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.130803 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="fdcb0f25-048c-4797-9750-b37969b8cb48" containerName="kserve-container" Apr 25 00:02:29.130975 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.130812 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b15f40f7-5f84-4365-b2e1-28591524b6b7" containerName="kserve-container" Apr 25 00:02:29.137954 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.137931 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" Apr 25 00:02:29.140343 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.140308 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\"" Apr 25 00:02:29.140482 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.140319 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\"" Apr 25 00:02:29.141765 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.141744 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs"] Apr 25 00:02:29.235671 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.235641 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a6936f6-263c-4a1b-acd1-cb33518be1ad-openshift-service-ca-bundle\") pod \"model-chainer-6fc6549cd8-x4sbs\" (UID: \"7a6936f6-263c-4a1b-acd1-cb33518be1ad\") " pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" Apr 25 00:02:29.235810 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.235692 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a6936f6-263c-4a1b-acd1-cb33518be1ad-proxy-tls\") pod \"model-chainer-6fc6549cd8-x4sbs\" (UID: \"7a6936f6-263c-4a1b-acd1-cb33518be1ad\") " pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" Apr 25 00:02:29.337050 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.337020 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a6936f6-263c-4a1b-acd1-cb33518be1ad-openshift-service-ca-bundle\") pod \"model-chainer-6fc6549cd8-x4sbs\" (UID: \"7a6936f6-263c-4a1b-acd1-cb33518be1ad\") " pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" Apr 25 00:02:29.337183 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.337078 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a6936f6-263c-4a1b-acd1-cb33518be1ad-proxy-tls\") pod \"model-chainer-6fc6549cd8-x4sbs\" (UID: \"7a6936f6-263c-4a1b-acd1-cb33518be1ad\") " pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" Apr 25 00:02:29.337662 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.337640 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a6936f6-263c-4a1b-acd1-cb33518be1ad-openshift-service-ca-bundle\") pod \"model-chainer-6fc6549cd8-x4sbs\" (UID: \"7a6936f6-263c-4a1b-acd1-cb33518be1ad\") " pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" Apr 25 00:02:29.339507 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.339480 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a6936f6-263c-4a1b-acd1-cb33518be1ad-proxy-tls\") pod \"model-chainer-6fc6549cd8-x4sbs\" (UID: \"7a6936f6-263c-4a1b-acd1-cb33518be1ad\") " pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" Apr 25 00:02:29.448542 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.448482 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" Apr 25 00:02:29.510452 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.510224 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" podUID="63952d9a-11c4-4a41-83df-09777283aeef" containerName="switch-graph-4fa95" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:02:29.573262 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.573241 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs"] Apr 25 00:02:29.575792 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:02:29.575760 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a6936f6_263c_4a1b_acd1_cb33518be1ad.slice/crio-b016772fc8b0b59ab636157a30b82c2c7809a86b08bad46c5393399df18a98c8 WatchSource:0}: Error finding container b016772fc8b0b59ab636157a30b82c2c7809a86b08bad46c5393399df18a98c8: Status 404 returned error can't find the container with id b016772fc8b0b59ab636157a30b82c2c7809a86b08bad46c5393399df18a98c8 Apr 25 00:02:29.620442 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:29.620395 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" event={"ID":"7a6936f6-263c-4a1b-acd1-cb33518be1ad","Type":"ContainerStarted","Data":"b016772fc8b0b59ab636157a30b82c2c7809a86b08bad46c5393399df18a98c8"} Apr 25 00:02:30.624487 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:30.624450 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" event={"ID":"7a6936f6-263c-4a1b-acd1-cb33518be1ad","Type":"ContainerStarted","Data":"84528027353ff4cc53cb424657ab0da06cadd63c00296cda533804e725c9f202"} Apr 25 00:02:30.624866 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:30.624562 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" Apr 25 00:02:30.640167 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:30.640114 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" podStartSLOduration=1.640101056 podStartE2EDuration="1.640101056s" podCreationTimestamp="2026-04-25 00:02:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:02:30.638357479 +0000 UTC m=+533.272339414" watchObservedRunningTime="2026-04-25 00:02:30.640101056 +0000 UTC m=+533.274082991" Apr 25 00:02:31.552974 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:31.552927 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" podUID="d130d938-29c6-47d1-a675-1cd75b0d26c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 25 00:02:31.553239 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:31.553092 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" podUID="f4d06f33-056d-47a6-a3c4-4957f99749d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 25 00:02:33.533481 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:33.533455 2578 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" Apr 25 00:02:33.636218 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:33.636185 2578 generic.go:358] "Generic (PLEG): container finished" podID="63952d9a-11c4-4a41-83df-09777283aeef" containerID="8e79e537e39b60b2b42ae6aecbf622c86b2b6f95020b45a1b54f128a055a464b" exitCode=0 Apr 25 00:02:33.636368 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:33.636232 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" event={"ID":"63952d9a-11c4-4a41-83df-09777283aeef","Type":"ContainerDied","Data":"8e79e537e39b60b2b42ae6aecbf622c86b2b6f95020b45a1b54f128a055a464b"} Apr 25 00:02:33.636368 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:33.636243 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" Apr 25 00:02:33.636368 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:33.636253 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6" event={"ID":"63952d9a-11c4-4a41-83df-09777283aeef","Type":"ContainerDied","Data":"0c33c12c10288eea7978da582463ae7b3597370eecffe888a71aac8d2b827601"} Apr 25 00:02:33.636368 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:33.636277 2578 scope.go:117] "RemoveContainer" containerID="8e79e537e39b60b2b42ae6aecbf622c86b2b6f95020b45a1b54f128a055a464b" Apr 25 00:02:33.643831 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:33.643804 2578 scope.go:117] "RemoveContainer" containerID="8e79e537e39b60b2b42ae6aecbf622c86b2b6f95020b45a1b54f128a055a464b" Apr 25 00:02:33.644063 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:02:33.644045 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e79e537e39b60b2b42ae6aecbf622c86b2b6f95020b45a1b54f128a055a464b\": container with ID starting with 8e79e537e39b60b2b42ae6aecbf622c86b2b6f95020b45a1b54f128a055a464b not found: ID does not exist" containerID="8e79e537e39b60b2b42ae6aecbf622c86b2b6f95020b45a1b54f128a055a464b" Apr 25 00:02:33.644138 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:33.644072 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e79e537e39b60b2b42ae6aecbf622c86b2b6f95020b45a1b54f128a055a464b"} err="failed to get container status \"8e79e537e39b60b2b42ae6aecbf622c86b2b6f95020b45a1b54f128a055a464b\": rpc error: code = NotFound desc = could not find container \"8e79e537e39b60b2b42ae6aecbf622c86b2b6f95020b45a1b54f128a055a464b\": container with ID starting with 8e79e537e39b60b2b42ae6aecbf622c86b2b6f95020b45a1b54f128a055a464b not found: ID does not exist" Apr 25 00:02:33.671557 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:33.671506 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63952d9a-11c4-4a41-83df-09777283aeef-proxy-tls\") pod \"63952d9a-11c4-4a41-83df-09777283aeef\" (UID: \"63952d9a-11c4-4a41-83df-09777283aeef\") " Apr 25 00:02:33.671557 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:33.671544 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63952d9a-11c4-4a41-83df-09777283aeef-openshift-service-ca-bundle\") pod \"63952d9a-11c4-4a41-83df-09777283aeef\" (UID: \"63952d9a-11c4-4a41-83df-09777283aeef\") " Apr 25 
00:02:33.671904 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:33.671881 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63952d9a-11c4-4a41-83df-09777283aeef-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "63952d9a-11c4-4a41-83df-09777283aeef" (UID: "63952d9a-11c4-4a41-83df-09777283aeef"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:02:33.673504 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:33.673483 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63952d9a-11c4-4a41-83df-09777283aeef-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "63952d9a-11c4-4a41-83df-09777283aeef" (UID: "63952d9a-11c4-4a41-83df-09777283aeef"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:02:33.772280 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:33.772252 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63952d9a-11c4-4a41-83df-09777283aeef-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:02:33.772280 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:33.772278 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63952d9a-11c4-4a41-83df-09777283aeef-openshift-service-ca-bundle\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:02:33.951053 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:33.951001 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6"] Apr 25 00:02:33.955084 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:33.955062 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-4fa95-549f8ff959-xxjk6"] Apr 25 00:02:35.911757 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:35.911724 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63952d9a-11c4-4a41-83df-09777283aeef" path="/var/lib/kubelet/pods/63952d9a-11c4-4a41-83df-09777283aeef/volumes" Apr 25 00:02:36.635809 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:36.635780 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" Apr 25 00:02:39.233612 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.233579 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs"] Apr 25 00:02:39.234000 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.233790 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" podUID="7a6936f6-263c-4a1b-acd1-cb33518be1ad" containerName="model-chainer" containerID="cri-o://84528027353ff4cc53cb424657ab0da06cadd63c00296cda533804e725c9f202" gracePeriod=30 Apr 25 00:02:39.402197 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.402160 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z"] Apr 25 00:02:39.402539 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.402526 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63952d9a-11c4-4a41-83df-09777283aeef" containerName="switch-graph-4fa95" Apr 25 00:02:39.402597 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.402541 
2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="63952d9a-11c4-4a41-83df-09777283aeef" containerName="switch-graph-4fa95" Apr 25 00:02:39.402639 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.402602 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="63952d9a-11c4-4a41-83df-09777283aeef" containerName="switch-graph-4fa95" Apr 25 00:02:39.407001 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.406984 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" Apr 25 00:02:39.409072 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.409052 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-b9ddf-kube-rbac-proxy-sar-config\"" Apr 25 00:02:39.409468 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.409450 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-b9ddf-predictor-serving-cert\"" Apr 25 00:02:39.416263 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.416238 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z"] Apr 25 00:02:39.521055 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.521020 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/64f2aa52-331d-430a-9dcd-ac49d6f610e4-proxy-tls\") pod \"success-200-isvc-b9ddf-predictor-75548d844b-lf92z\" (UID: \"64f2aa52-331d-430a-9dcd-ac49d6f610e4\") " pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" Apr 25 00:02:39.521219 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.521059 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-b9ddf-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/64f2aa52-331d-430a-9dcd-ac49d6f610e4-success-200-isvc-b9ddf-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-b9ddf-predictor-75548d844b-lf92z\" (UID: \"64f2aa52-331d-430a-9dcd-ac49d6f610e4\") " pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" Apr 25 00:02:39.521219 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.521191 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sk5n\" (UniqueName: \"kubernetes.io/projected/64f2aa52-331d-430a-9dcd-ac49d6f610e4-kube-api-access-7sk5n\") pod \"success-200-isvc-b9ddf-predictor-75548d844b-lf92z\" (UID: \"64f2aa52-331d-430a-9dcd-ac49d6f610e4\") " pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" Apr 25 00:02:39.525641 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.525618 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k"] Apr 25 00:02:39.529020 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.529006 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" Apr 25 00:02:39.531757 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.531742 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-b9ddf-predictor-serving-cert\"" Apr 25 00:02:39.531835 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.531816 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-b9ddf-kube-rbac-proxy-sar-config\"" Apr 25 00:02:39.540096 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.540077 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k"] Apr 25 00:02:39.622541 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.622514 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7sk5n\" (UniqueName: \"kubernetes.io/projected/64f2aa52-331d-430a-9dcd-ac49d6f610e4-kube-api-access-7sk5n\") pod \"success-200-isvc-b9ddf-predictor-75548d844b-lf92z\" (UID: \"64f2aa52-331d-430a-9dcd-ac49d6f610e4\") " pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" Apr 25 00:02:39.622669 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.622567 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/64f2aa52-331d-430a-9dcd-ac49d6f610e4-proxy-tls\") pod \"success-200-isvc-b9ddf-predictor-75548d844b-lf92z\" (UID: \"64f2aa52-331d-430a-9dcd-ac49d6f610e4\") " pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" Apr 25 00:02:39.622669 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.622612 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-b9ddf-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/64f2aa52-331d-430a-9dcd-ac49d6f610e4-success-200-isvc-b9ddf-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-b9ddf-predictor-75548d844b-lf92z\" (UID: \"64f2aa52-331d-430a-9dcd-ac49d6f610e4\") " pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" Apr 25 00:02:39.622669 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.622653 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-b9ddf-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c70adfca-a5c8-4ccb-86b4-b51e8b0a7084-error-404-isvc-b9ddf-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-b9ddf-predictor-66459c556f-np74k\" (UID: \"c70adfca-a5c8-4ccb-86b4-b51e8b0a7084\") " pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" Apr 25 00:02:39.622814 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.622702 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c70adfca-a5c8-4ccb-86b4-b51e8b0a7084-proxy-tls\") pod \"error-404-isvc-b9ddf-predictor-66459c556f-np74k\" (UID: \"c70adfca-a5c8-4ccb-86b4-b51e8b0a7084\") " pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" Apr 25 00:02:39.622814 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.622719 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vth7\" (UniqueName: \"kubernetes.io/projected/c70adfca-a5c8-4ccb-86b4-b51e8b0a7084-kube-api-access-9vth7\") pod 
\"error-404-isvc-b9ddf-predictor-66459c556f-np74k\" (UID: \"c70adfca-a5c8-4ccb-86b4-b51e8b0a7084\") " pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" Apr 25 00:02:39.623188 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.623168 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-b9ddf-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/64f2aa52-331d-430a-9dcd-ac49d6f610e4-success-200-isvc-b9ddf-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-b9ddf-predictor-75548d844b-lf92z\" (UID: \"64f2aa52-331d-430a-9dcd-ac49d6f610e4\") " pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" Apr 25 00:02:39.625142 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.625122 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/64f2aa52-331d-430a-9dcd-ac49d6f610e4-proxy-tls\") pod \"success-200-isvc-b9ddf-predictor-75548d844b-lf92z\" (UID: \"64f2aa52-331d-430a-9dcd-ac49d6f610e4\") " pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" Apr 25 00:02:39.631799 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.631781 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sk5n\" (UniqueName: \"kubernetes.io/projected/64f2aa52-331d-430a-9dcd-ac49d6f610e4-kube-api-access-7sk5n\") pod \"success-200-isvc-b9ddf-predictor-75548d844b-lf92z\" (UID: \"64f2aa52-331d-430a-9dcd-ac49d6f610e4\") " pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" Apr 25 00:02:39.718861 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.718835 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" Apr 25 00:02:39.723740 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.723717 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-b9ddf-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c70adfca-a5c8-4ccb-86b4-b51e8b0a7084-error-404-isvc-b9ddf-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-b9ddf-predictor-66459c556f-np74k\" (UID: \"c70adfca-a5c8-4ccb-86b4-b51e8b0a7084\") " pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" Apr 25 00:02:39.723842 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.723764 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c70adfca-a5c8-4ccb-86b4-b51e8b0a7084-proxy-tls\") pod \"error-404-isvc-b9ddf-predictor-66459c556f-np74k\" (UID: \"c70adfca-a5c8-4ccb-86b4-b51e8b0a7084\") " pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" Apr 25 00:02:39.723902 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.723870 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vth7\" (UniqueName: \"kubernetes.io/projected/c70adfca-a5c8-4ccb-86b4-b51e8b0a7084-kube-api-access-9vth7\") pod \"error-404-isvc-b9ddf-predictor-66459c556f-np74k\" (UID: \"c70adfca-a5c8-4ccb-86b4-b51e8b0a7084\") " pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" Apr 25 00:02:39.724392 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.724375 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-b9ddf-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/c70adfca-a5c8-4ccb-86b4-b51e8b0a7084-error-404-isvc-b9ddf-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-b9ddf-predictor-66459c556f-np74k\" (UID: \"c70adfca-a5c8-4ccb-86b4-b51e8b0a7084\") " pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" Apr 25 00:02:39.726233 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.726212 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c70adfca-a5c8-4ccb-86b4-b51e8b0a7084-proxy-tls\") pod \"error-404-isvc-b9ddf-predictor-66459c556f-np74k\" (UID: \"c70adfca-a5c8-4ccb-86b4-b51e8b0a7084\") " pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" Apr 25 00:02:39.733223 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.733202 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vth7\" (UniqueName: \"kubernetes.io/projected/c70adfca-a5c8-4ccb-86b4-b51e8b0a7084-kube-api-access-9vth7\") pod \"error-404-isvc-b9ddf-predictor-66459c556f-np74k\" (UID: \"c70adfca-a5c8-4ccb-86b4-b51e8b0a7084\") " pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" Apr 25 00:02:39.838963 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.838932 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" Apr 25 00:02:39.840978 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.840955 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z"] Apr 25 00:02:39.845587 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:02:39.845563 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64f2aa52_331d_430a_9dcd_ac49d6f610e4.slice/crio-3dbd45727dbd68eb05889c749d9438f47d0e355d170bd0b3b13487efc1b63e18 WatchSource:0}: Error finding container 3dbd45727dbd68eb05889c749d9438f47d0e355d170bd0b3b13487efc1b63e18: Status 404 returned error can't find the container with id 3dbd45727dbd68eb05889c749d9438f47d0e355d170bd0b3b13487efc1b63e18 Apr 25 00:02:39.963522 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:39.963501 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k"] Apr 25 00:02:39.965738 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:02:39.965710 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc70adfca_a5c8_4ccb_86b4_b51e8b0a7084.slice/crio-d6f80bce6a2551a6f18658ba65d1a056e695a9f99b2009c286a98073234ec2b6 WatchSource:0}: Error finding container d6f80bce6a2551a6f18658ba65d1a056e695a9f99b2009c286a98073234ec2b6: Status 404 returned error can't find the container with id d6f80bce6a2551a6f18658ba65d1a056e695a9f99b2009c286a98073234ec2b6 Apr 25 00:02:40.659235 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:40.659190 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" event={"ID":"c70adfca-a5c8-4ccb-86b4-b51e8b0a7084","Type":"ContainerStarted","Data":"1b1f24e375f55941e3e58ffd84bf1b361081bd50e7e80c3e2adb86dc2f6d4b2d"} Apr 25 00:02:40.659235 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:40.659239 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" 
event={"ID":"c70adfca-a5c8-4ccb-86b4-b51e8b0a7084","Type":"ContainerStarted","Data":"54bff80a3a2c7aa81f08fe16d25f1270064c78e8c1bfade2ec9319ee32a91278"} Apr 25 00:02:40.659719 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:40.659251 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" event={"ID":"c70adfca-a5c8-4ccb-86b4-b51e8b0a7084","Type":"ContainerStarted","Data":"d6f80bce6a2551a6f18658ba65d1a056e695a9f99b2009c286a98073234ec2b6"} Apr 25 00:02:40.659719 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:40.659336 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" Apr 25 00:02:40.660718 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:40.660699 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" event={"ID":"64f2aa52-331d-430a-9dcd-ac49d6f610e4","Type":"ContainerStarted","Data":"0d92a951d415182a6a6354e3c09600935a1ec07b912eea0222c67fa243eea4a0"} Apr 25 00:02:40.660718 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:40.660720 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" event={"ID":"64f2aa52-331d-430a-9dcd-ac49d6f610e4","Type":"ContainerStarted","Data":"2c3c142c4d92c29a77469d8c3246eb7913f79e7895aca1b07774d2c6c5b11d3b"} Apr 25 00:02:40.660718 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:40.660731 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" event={"ID":"64f2aa52-331d-430a-9dcd-ac49d6f610e4","Type":"ContainerStarted","Data":"3dbd45727dbd68eb05889c749d9438f47d0e355d170bd0b3b13487efc1b63e18"} Apr 25 00:02:40.660905 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:40.660821 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" Apr 25 00:02:40.677803 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:40.677762 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" podStartSLOduration=1.67774913 podStartE2EDuration="1.67774913s" podCreationTimestamp="2026-04-25 00:02:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:02:40.67552781 +0000 UTC m=+543.309509768" watchObservedRunningTime="2026-04-25 00:02:40.67774913 +0000 UTC m=+543.311731066" Apr 25 00:02:40.693327 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:40.693276 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" podStartSLOduration=1.693258052 podStartE2EDuration="1.693258052s" podCreationTimestamp="2026-04-25 00:02:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:02:40.692384328 +0000 UTC m=+543.326366267" watchObservedRunningTime="2026-04-25 00:02:40.693258052 +0000 UTC m=+543.327239990" Apr 25 00:02:41.553204 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:41.553153 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" podUID="f4d06f33-056d-47a6-a3c4-4957f99749d0" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 25 00:02:41.553362 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:41.553228 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" podUID="d130d938-29c6-47d1-a675-1cd75b0d26c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 25 00:02:41.634710 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:41.634676 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" podUID="7a6936f6-263c-4a1b-acd1-cb33518be1ad" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:02:41.664095 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:41.664071 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" Apr 25 00:02:41.664561 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:41.664105 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" Apr 25 00:02:41.665134 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:41.665111 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" podUID="c70adfca-a5c8-4ccb-86b4-b51e8b0a7084" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 25 00:02:41.665134 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:41.665124 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" podUID="64f2aa52-331d-430a-9dcd-ac49d6f610e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 25 00:02:42.667875 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:42.667833 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" podUID="64f2aa52-331d-430a-9dcd-ac49d6f610e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 25 00:02:42.668233 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:42.667997 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" podUID="c70adfca-a5c8-4ccb-86b4-b51e8b0a7084" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 25 00:02:46.634796 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:46.634755 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" podUID="7a6936f6-263c-4a1b-acd1-cb33518be1ad" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:02:47.671744 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:47.671707 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" Apr 25 00:02:47.672198 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:47.672096 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" Apr 25 00:02:47.672268 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:47.672197 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" podUID="c70adfca-a5c8-4ccb-86b4-b51e8b0a7084" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 25 00:02:47.672547 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:47.672514 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" podUID="64f2aa52-331d-430a-9dcd-ac49d6f610e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 25 00:02:51.553486 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:51.553458 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" Apr 25 00:02:51.553892 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:51.553521 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" Apr 25 00:02:51.633916 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:51.633884 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" podUID="7a6936f6-263c-4a1b-acd1-cb33518be1ad" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:02:51.634068 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:51.633971 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" Apr 25 00:02:56.634533 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:56.634494 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" podUID="7a6936f6-263c-4a1b-acd1-cb33518be1ad" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:02:57.672308 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:57.672272 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" podUID="c70adfca-a5c8-4ccb-86b4-b51e8b0a7084" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 25 00:02:57.672716 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:02:57.672523 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" podUID="64f2aa52-331d-430a-9dcd-ac49d6f610e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 25 00:03:01.634699 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:01.634654 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" podUID="7a6936f6-263c-4a1b-acd1-cb33518be1ad" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:03:03.631539 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:03.631505 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt"] Apr 25 00:03:03.634790 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:03.634773 
2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" Apr 25 00:03:03.637135 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:03.637114 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-b9319-kube-rbac-proxy-sar-config\"" Apr 25 00:03:03.637259 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:03.637129 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-b9319-serving-cert\"" Apr 25 00:03:03.641700 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:03.641680 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt"] Apr 25 00:03:03.734213 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:03.734187 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b7d90b4-2de2-4df2-a9c2-acf6501f9c14-openshift-service-ca-bundle\") pod \"switch-graph-b9319-859cc9dd58-k7xmt\" (UID: \"8b7d90b4-2de2-4df2-a9c2-acf6501f9c14\") " pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" Apr 25 00:03:03.734334 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:03.734225 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8b7d90b4-2de2-4df2-a9c2-acf6501f9c14-proxy-tls\") pod \"switch-graph-b9319-859cc9dd58-k7xmt\" (UID: \"8b7d90b4-2de2-4df2-a9c2-acf6501f9c14\") " pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" Apr 25 00:03:03.834594 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:03.834564 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b7d90b4-2de2-4df2-a9c2-acf6501f9c14-openshift-service-ca-bundle\") pod \"switch-graph-b9319-859cc9dd58-k7xmt\" (UID: \"8b7d90b4-2de2-4df2-a9c2-acf6501f9c14\") " pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" Apr 25 00:03:03.834739 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:03.834615 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8b7d90b4-2de2-4df2-a9c2-acf6501f9c14-proxy-tls\") pod \"switch-graph-b9319-859cc9dd58-k7xmt\" (UID: \"8b7d90b4-2de2-4df2-a9c2-acf6501f9c14\") " pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" Apr 25 00:03:03.835181 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:03.835146 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b7d90b4-2de2-4df2-a9c2-acf6501f9c14-openshift-service-ca-bundle\") pod \"switch-graph-b9319-859cc9dd58-k7xmt\" (UID: \"8b7d90b4-2de2-4df2-a9c2-acf6501f9c14\") " pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" Apr 25 00:03:03.837104 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:03.837068 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8b7d90b4-2de2-4df2-a9c2-acf6501f9c14-proxy-tls\") pod \"switch-graph-b9319-859cc9dd58-k7xmt\" (UID: \"8b7d90b4-2de2-4df2-a9c2-acf6501f9c14\") " pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" Apr 25 00:03:03.946176 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:03.946108 2578 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" Apr 25 00:03:04.064393 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:04.064371 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt"] Apr 25 00:03:04.067016 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:03:04.066990 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b7d90b4_2de2_4df2_a9c2_acf6501f9c14.slice/crio-09712ef590be41af06b9f02c58bb70bb985ffd07c4a88faea48a5767bb642aa4 WatchSource:0}: Error finding container 09712ef590be41af06b9f02c58bb70bb985ffd07c4a88faea48a5767bb642aa4: Status 404 returned error can't find the container with id 09712ef590be41af06b9f02c58bb70bb985ffd07c4a88faea48a5767bb642aa4 Apr 25 00:03:04.735338 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:04.735301 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" event={"ID":"8b7d90b4-2de2-4df2-a9c2-acf6501f9c14","Type":"ContainerStarted","Data":"a257bd4991ac539fae24a6c7a1a2765ab7d8b2532279ed2d83e5305dab4cb564"} Apr 25 00:03:04.735338 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:04.735336 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" event={"ID":"8b7d90b4-2de2-4df2-a9c2-acf6501f9c14","Type":"ContainerStarted","Data":"09712ef590be41af06b9f02c58bb70bb985ffd07c4a88faea48a5767bb642aa4"} Apr 25 00:03:04.735831 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:04.735359 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" Apr 25 00:03:04.755833 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:04.755787 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" podStartSLOduration=1.7557695070000001 podStartE2EDuration="1.755769507s" podCreationTimestamp="2026-04-25 00:03:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:03:04.753918288 +0000 UTC m=+567.387900235" watchObservedRunningTime="2026-04-25 00:03:04.755769507 +0000 UTC m=+567.389751443" Apr 25 00:03:06.634540 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:06.634508 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" podUID="7a6936f6-263c-4a1b-acd1-cb33518be1ad" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:03:07.672257 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:07.672209 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" podUID="c70adfca-a5c8-4ccb-86b4-b51e8b0a7084" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 25 00:03:07.672724 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:07.672488 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" podUID="64f2aa52-331d-430a-9dcd-ac49d6f610e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 25 00:03:09.385479 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:09.385449 
2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" Apr 25 00:03:09.483100 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:09.483063 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a6936f6-263c-4a1b-acd1-cb33518be1ad-openshift-service-ca-bundle\") pod \"7a6936f6-263c-4a1b-acd1-cb33518be1ad\" (UID: \"7a6936f6-263c-4a1b-acd1-cb33518be1ad\") " Apr 25 00:03:09.483304 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:09.483147 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a6936f6-263c-4a1b-acd1-cb33518be1ad-proxy-tls\") pod \"7a6936f6-263c-4a1b-acd1-cb33518be1ad\" (UID: \"7a6936f6-263c-4a1b-acd1-cb33518be1ad\") " Apr 25 00:03:09.483505 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:09.483466 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a6936f6-263c-4a1b-acd1-cb33518be1ad-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "7a6936f6-263c-4a1b-acd1-cb33518be1ad" (UID: "7a6936f6-263c-4a1b-acd1-cb33518be1ad"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:03:09.485345 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:09.485322 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a6936f6-263c-4a1b-acd1-cb33518be1ad-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7a6936f6-263c-4a1b-acd1-cb33518be1ad" (UID: "7a6936f6-263c-4a1b-acd1-cb33518be1ad"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:03:09.583826 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:09.583788 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a6936f6-263c-4a1b-acd1-cb33518be1ad-openshift-service-ca-bundle\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:03:09.583826 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:09.583817 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a6936f6-263c-4a1b-acd1-cb33518be1ad-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:03:09.751292 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:09.751198 2578 generic.go:358] "Generic (PLEG): container finished" podID="7a6936f6-263c-4a1b-acd1-cb33518be1ad" containerID="84528027353ff4cc53cb424657ab0da06cadd63c00296cda533804e725c9f202" exitCode=0 Apr 25 00:03:09.751292 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:09.751252 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" event={"ID":"7a6936f6-263c-4a1b-acd1-cb33518be1ad","Type":"ContainerDied","Data":"84528027353ff4cc53cb424657ab0da06cadd63c00296cda533804e725c9f202"} Apr 25 00:03:09.751292 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:09.751257 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" Apr 25 00:03:09.751292 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:09.751286 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs" event={"ID":"7a6936f6-263c-4a1b-acd1-cb33518be1ad","Type":"ContainerDied","Data":"b016772fc8b0b59ab636157a30b82c2c7809a86b08bad46c5393399df18a98c8"} Apr 25 00:03:09.751683 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:09.751307 2578 scope.go:117] "RemoveContainer" containerID="84528027353ff4cc53cb424657ab0da06cadd63c00296cda533804e725c9f202" Apr 25 00:03:09.759645 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:09.759626 2578 scope.go:117] "RemoveContainer" containerID="84528027353ff4cc53cb424657ab0da06cadd63c00296cda533804e725c9f202" Apr 25 00:03:09.759914 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:03:09.759897 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84528027353ff4cc53cb424657ab0da06cadd63c00296cda533804e725c9f202\": container with ID starting with 84528027353ff4cc53cb424657ab0da06cadd63c00296cda533804e725c9f202 not found: ID does not exist" containerID="84528027353ff4cc53cb424657ab0da06cadd63c00296cda533804e725c9f202" Apr 25 00:03:09.759959 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:09.759923 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84528027353ff4cc53cb424657ab0da06cadd63c00296cda533804e725c9f202"} err="failed to get container status \"84528027353ff4cc53cb424657ab0da06cadd63c00296cda533804e725c9f202\": rpc error: code = NotFound desc = could not find container \"84528027353ff4cc53cb424657ab0da06cadd63c00296cda533804e725c9f202\": container with ID starting with 84528027353ff4cc53cb424657ab0da06cadd63c00296cda533804e725c9f202 not found: ID does not exist" Apr 25 00:03:09.770467 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:09.770442 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs"] Apr 25 00:03:09.774520 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:09.774498 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-6fc6549cd8-x4sbs"] Apr 25 00:03:09.911642 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:09.911609 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a6936f6-263c-4a1b-acd1-cb33518be1ad" path="/var/lib/kubelet/pods/7a6936f6-263c-4a1b-acd1-cb33518be1ad/volumes" Apr 25 00:03:10.744395 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:10.744371 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" Apr 25 00:03:17.672525 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:17.672488 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" podUID="64f2aa52-331d-430a-9dcd-ac49d6f610e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 25 00:03:17.672904 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:17.672478 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" podUID="c70adfca-a5c8-4ccb-86b4-b51e8b0a7084" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 25 
00:03:27.672556 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:27.672528 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" Apr 25 00:03:27.673149 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:27.673132 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" Apr 25 00:03:37.854914 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:37.854888 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/0.log" Apr 25 00:03:37.855395 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:37.855232 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/0.log" Apr 25 00:03:39.406748 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:39.406711 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx"] Apr 25 00:03:39.407240 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:39.407221 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a6936f6-263c-4a1b-acd1-cb33518be1ad" containerName="model-chainer" Apr 25 00:03:39.407322 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:39.407244 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a6936f6-263c-4a1b-acd1-cb33518be1ad" containerName="model-chainer" Apr 25 00:03:39.407374 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:39.407350 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="7a6936f6-263c-4a1b-acd1-cb33518be1ad" containerName="model-chainer" Apr 25 00:03:39.411663 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:39.411644 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" Apr 25 00:03:39.413876 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:39.413855 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-b9ddf-serving-cert\"" Apr 25 00:03:39.413997 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:39.413981 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-b9ddf-kube-rbac-proxy-sar-config\"" Apr 25 00:03:39.419709 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:39.419683 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx"] Apr 25 00:03:39.521916 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:39.521891 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6972d18-0bab-4ab8-ad84-f7ccf650b8ad-proxy-tls\") pod \"sequence-graph-b9ddf-6dfbc67fd6-5m5nx\" (UID: \"b6972d18-0bab-4ab8-ad84-f7ccf650b8ad\") " pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" Apr 25 00:03:39.522051 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:39.521945 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6972d18-0bab-4ab8-ad84-f7ccf650b8ad-openshift-service-ca-bundle\") pod \"sequence-graph-b9ddf-6dfbc67fd6-5m5nx\" (UID: \"b6972d18-0bab-4ab8-ad84-f7ccf650b8ad\") " pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" Apr 25 00:03:39.622709 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:39.622682 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6972d18-0bab-4ab8-ad84-f7ccf650b8ad-proxy-tls\") pod \"sequence-graph-b9ddf-6dfbc67fd6-5m5nx\" (UID: \"b6972d18-0bab-4ab8-ad84-f7ccf650b8ad\") " pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" Apr 25 00:03:39.622859 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:39.622737 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6972d18-0bab-4ab8-ad84-f7ccf650b8ad-openshift-service-ca-bundle\") pod \"sequence-graph-b9ddf-6dfbc67fd6-5m5nx\" (UID: \"b6972d18-0bab-4ab8-ad84-f7ccf650b8ad\") " pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" Apr 25 00:03:39.623250 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:39.623231 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6972d18-0bab-4ab8-ad84-f7ccf650b8ad-openshift-service-ca-bundle\") pod \"sequence-graph-b9ddf-6dfbc67fd6-5m5nx\" (UID: \"b6972d18-0bab-4ab8-ad84-f7ccf650b8ad\") " pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" Apr 25 00:03:39.625164 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:39.625144 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6972d18-0bab-4ab8-ad84-f7ccf650b8ad-proxy-tls\") pod \"sequence-graph-b9ddf-6dfbc67fd6-5m5nx\" (UID: \"b6972d18-0bab-4ab8-ad84-f7ccf650b8ad\") " pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" Apr 25 00:03:39.721876 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:39.721817 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" Apr 25 00:03:39.865836 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:39.865805 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx"] Apr 25 00:03:39.868832 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:03:39.868809 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6972d18_0bab_4ab8_ad84_f7ccf650b8ad.slice/crio-a6ddf1816d66043b41459b8085a83ea2ae2e4400515df9182e9cbf4fc100be6e WatchSource:0}: Error finding container a6ddf1816d66043b41459b8085a83ea2ae2e4400515df9182e9cbf4fc100be6e: Status 404 returned error can't find the container with id a6ddf1816d66043b41459b8085a83ea2ae2e4400515df9182e9cbf4fc100be6e Apr 25 00:03:40.848563 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:40.848528 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" event={"ID":"b6972d18-0bab-4ab8-ad84-f7ccf650b8ad","Type":"ContainerStarted","Data":"38c364fc2f6dba60c2e6ae5b51f7715e97b5330997b25d775f65f07bac301671"} Apr 25 00:03:40.848563 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:40.848563 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" event={"ID":"b6972d18-0bab-4ab8-ad84-f7ccf650b8ad","Type":"ContainerStarted","Data":"a6ddf1816d66043b41459b8085a83ea2ae2e4400515df9182e9cbf4fc100be6e"} Apr 25 00:03:40.848982 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:40.848658 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" Apr 25 00:03:40.863952 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:40.863912 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" podStartSLOduration=1.863900592 podStartE2EDuration="1.863900592s" podCreationTimestamp="2026-04-25 00:03:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:03:40.862646997 +0000 UTC m=+603.496628938" watchObservedRunningTime="2026-04-25 00:03:40.863900592 +0000 UTC m=+603.497882528" Apr 25 00:03:46.857551 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:03:46.857517 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" Apr 25 00:08:37.880028 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:08:37.879997 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/0.log" Apr 25 00:08:37.881141 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:08:37.881119 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/0.log" Apr 25 00:11:18.204701 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.204667 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt"] Apr 25 00:11:18.207366 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.204977 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" podUID="8b7d90b4-2de2-4df2-a9c2-acf6501f9c14" 
containerName="switch-graph-b9319" containerID="cri-o://a257bd4991ac539fae24a6c7a1a2765ab7d8b2532279ed2d83e5305dab4cb564" gracePeriod=30 Apr 25 00:11:18.309337 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.309307 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk"] Apr 25 00:11:18.309627 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.309595 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" podUID="f4d06f33-056d-47a6-a3c4-4957f99749d0" containerName="kserve-container" containerID="cri-o://8cab42edf19dcdcc0599e6b34b2ce282fba82057a5edec3786f1b3d7305d811e" gracePeriod=30 Apr 25 00:11:18.309780 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.309626 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" podUID="f4d06f33-056d-47a6-a3c4-4957f99749d0" containerName="kube-rbac-proxy" containerID="cri-o://7b735685ec2e2ab9527c04efd565369e1b5cf3012729768f99c14ff87225a3cb" gracePeriod=30 Apr 25 00:11:18.363920 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.363888 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22"] Apr 25 00:11:18.364169 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.364142 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" podUID="d130d938-29c6-47d1-a675-1cd75b0d26c2" containerName="kserve-container" containerID="cri-o://272cf4c1e175b1ea6f7d4bad6ec965af6d4f0ff2dca353b93c04be61f7946a0a" gracePeriod=30 Apr 25 00:11:18.364302 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.364181 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" podUID="d130d938-29c6-47d1-a675-1cd75b0d26c2" containerName="kube-rbac-proxy" containerID="cri-o://8f06465da540ee3ed7bc7ff97f8f9f4bea086482a708fe08cc5ff2e24bb42baa" gracePeriod=30 Apr 25 00:11:18.383357 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.383331 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j"] Apr 25 00:11:18.386965 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.386945 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" Apr 25 00:11:18.389227 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.389209 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-d627c-kube-rbac-proxy-sar-config\"" Apr 25 00:11:18.389227 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.389215 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-d627c-predictor-serving-cert\"" Apr 25 00:11:18.404916 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.404894 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j"] Apr 25 00:11:18.429508 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.429482 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq"] Apr 25 00:11:18.432699 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.432685 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" Apr 25 00:11:18.434869 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.434846 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-d627c-kube-rbac-proxy-sar-config\"" Apr 25 00:11:18.434975 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.434934 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-d627c-predictor-serving-cert\"" Apr 25 00:11:18.441758 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.441735 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq"] Apr 25 00:11:18.476562 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.476542 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e196dbd6-90c3-4051-8b86-dc0bbe6b98cf-proxy-tls\") pod \"error-404-isvc-d627c-predictor-859777b8f9-r2nqq\" (UID: \"e196dbd6-90c3-4051-8b86-dc0bbe6b98cf\") " pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" Apr 25 00:11:18.476640 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.476577 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv7dn\" (UniqueName: \"kubernetes.io/projected/0e78737f-9719-418f-8287-6126f486855e-kube-api-access-fv7dn\") pod \"success-200-isvc-d627c-predictor-549977b56d-df86j\" (UID: \"0e78737f-9719-418f-8287-6126f486855e\") " pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" Apr 25 00:11:18.476640 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.476605 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-d627c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0e78737f-9719-418f-8287-6126f486855e-success-200-isvc-d627c-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d627c-predictor-549977b56d-df86j\" (UID: \"0e78737f-9719-418f-8287-6126f486855e\") " pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" Apr 25 00:11:18.476640 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.476623 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96s72\" (UniqueName: \"kubernetes.io/projected/e196dbd6-90c3-4051-8b86-dc0bbe6b98cf-kube-api-access-96s72\") pod \"error-404-isvc-d627c-predictor-859777b8f9-r2nqq\" (UID: \"e196dbd6-90c3-4051-8b86-dc0bbe6b98cf\") " pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" Apr 25 00:11:18.476765 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.476670 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0e78737f-9719-418f-8287-6126f486855e-proxy-tls\") pod \"success-200-isvc-d627c-predictor-549977b56d-df86j\" (UID: \"0e78737f-9719-418f-8287-6126f486855e\") " pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" Apr 25 00:11:18.476765 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.476711 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-d627c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e196dbd6-90c3-4051-8b86-dc0bbe6b98cf-error-404-isvc-d627c-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d627c-predictor-859777b8f9-r2nqq\" (UID: \"e196dbd6-90c3-4051-8b86-dc0bbe6b98cf\") " pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" Apr 25 00:11:18.578067 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.578040 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e196dbd6-90c3-4051-8b86-dc0bbe6b98cf-proxy-tls\") pod \"error-404-isvc-d627c-predictor-859777b8f9-r2nqq\" (UID: \"e196dbd6-90c3-4051-8b86-dc0bbe6b98cf\") " pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" Apr 25 00:11:18.578202 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.578080 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fv7dn\" (UniqueName: \"kubernetes.io/projected/0e78737f-9719-418f-8287-6126f486855e-kube-api-access-fv7dn\") pod \"success-200-isvc-d627c-predictor-549977b56d-df86j\" (UID: \"0e78737f-9719-418f-8287-6126f486855e\") " pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" Apr 25 00:11:18.578202 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.578123 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-d627c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0e78737f-9719-418f-8287-6126f486855e-success-200-isvc-d627c-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d627c-predictor-549977b56d-df86j\" (UID: \"0e78737f-9719-418f-8287-6126f486855e\") " pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" Apr 25 00:11:18.578202 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.578146 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96s72\" (UniqueName: \"kubernetes.io/projected/e196dbd6-90c3-4051-8b86-dc0bbe6b98cf-kube-api-access-96s72\") pod \"error-404-isvc-d627c-predictor-859777b8f9-r2nqq\" (UID: \"e196dbd6-90c3-4051-8b86-dc0bbe6b98cf\") " pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" Apr 25 00:11:18.578202 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.578177 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0e78737f-9719-418f-8287-6126f486855e-proxy-tls\") pod 
\"success-200-isvc-d627c-predictor-549977b56d-df86j\" (UID: \"0e78737f-9719-418f-8287-6126f486855e\") " pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" Apr 25 00:11:18.578202 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:11:18.578185 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-d627c-predictor-serving-cert: secret "error-404-isvc-d627c-predictor-serving-cert" not found Apr 25 00:11:18.578471 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.578239 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-d627c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e196dbd6-90c3-4051-8b86-dc0bbe6b98cf-error-404-isvc-d627c-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d627c-predictor-859777b8f9-r2nqq\" (UID: \"e196dbd6-90c3-4051-8b86-dc0bbe6b98cf\") " pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" Apr 25 00:11:18.578471 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:11:18.578263 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e196dbd6-90c3-4051-8b86-dc0bbe6b98cf-proxy-tls podName:e196dbd6-90c3-4051-8b86-dc0bbe6b98cf nodeName:}" failed. No retries permitted until 2026-04-25 00:11:19.078241134 +0000 UTC m=+1061.712223054 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e196dbd6-90c3-4051-8b86-dc0bbe6b98cf-proxy-tls") pod "error-404-isvc-d627c-predictor-859777b8f9-r2nqq" (UID: "e196dbd6-90c3-4051-8b86-dc0bbe6b98cf") : secret "error-404-isvc-d627c-predictor-serving-cert" not found Apr 25 00:11:18.578854 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.578835 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-d627c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0e78737f-9719-418f-8287-6126f486855e-success-200-isvc-d627c-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d627c-predictor-549977b56d-df86j\" (UID: \"0e78737f-9719-418f-8287-6126f486855e\") " pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" Apr 25 00:11:18.578919 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.578859 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-d627c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e196dbd6-90c3-4051-8b86-dc0bbe6b98cf-error-404-isvc-d627c-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d627c-predictor-859777b8f9-r2nqq\" (UID: \"e196dbd6-90c3-4051-8b86-dc0bbe6b98cf\") " pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" Apr 25 00:11:18.580638 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.580617 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0e78737f-9719-418f-8287-6126f486855e-proxy-tls\") pod \"success-200-isvc-d627c-predictor-549977b56d-df86j\" (UID: \"0e78737f-9719-418f-8287-6126f486855e\") " pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" Apr 25 00:11:18.586271 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.586228 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv7dn\" (UniqueName: \"kubernetes.io/projected/0e78737f-9719-418f-8287-6126f486855e-kube-api-access-fv7dn\") pod \"success-200-isvc-d627c-predictor-549977b56d-df86j\" (UID: \"0e78737f-9719-418f-8287-6126f486855e\") " 
pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" Apr 25 00:11:18.586779 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.586756 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96s72\" (UniqueName: \"kubernetes.io/projected/e196dbd6-90c3-4051-8b86-dc0bbe6b98cf-kube-api-access-96s72\") pod \"error-404-isvc-d627c-predictor-859777b8f9-r2nqq\" (UID: \"e196dbd6-90c3-4051-8b86-dc0bbe6b98cf\") " pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" Apr 25 00:11:18.697476 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.697432 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" Apr 25 00:11:18.823807 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.823782 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j"] Apr 25 00:11:18.825771 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:11:18.825742 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e78737f_9719_418f_8287_6126f486855e.slice/crio-d7ed0756ce880c69349a34cb287e90bb4ac10f4d49bd99d6ca89978976c65136 WatchSource:0}: Error finding container d7ed0756ce880c69349a34cb287e90bb4ac10f4d49bd99d6ca89978976c65136: Status 404 returned error can't find the container with id d7ed0756ce880c69349a34cb287e90bb4ac10f4d49bd99d6ca89978976c65136 Apr 25 00:11:18.827491 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:18.827475 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:11:19.083303 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:19.083266 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e196dbd6-90c3-4051-8b86-dc0bbe6b98cf-proxy-tls\") pod \"error-404-isvc-d627c-predictor-859777b8f9-r2nqq\" (UID: \"e196dbd6-90c3-4051-8b86-dc0bbe6b98cf\") " pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" Apr 25 00:11:19.086150 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:19.086112 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e196dbd6-90c3-4051-8b86-dc0bbe6b98cf-proxy-tls\") pod \"error-404-isvc-d627c-predictor-859777b8f9-r2nqq\" (UID: \"e196dbd6-90c3-4051-8b86-dc0bbe6b98cf\") " pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" Apr 25 00:11:19.176342 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:19.176310 2578 generic.go:358] "Generic (PLEG): container finished" podID="d130d938-29c6-47d1-a675-1cd75b0d26c2" containerID="8f06465da540ee3ed7bc7ff97f8f9f4bea086482a708fe08cc5ff2e24bb42baa" exitCode=2 Apr 25 00:11:19.176524 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:19.176384 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" event={"ID":"d130d938-29c6-47d1-a675-1cd75b0d26c2","Type":"ContainerDied","Data":"8f06465da540ee3ed7bc7ff97f8f9f4bea086482a708fe08cc5ff2e24bb42baa"} Apr 25 00:11:19.177969 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:19.177948 2578 generic.go:358] "Generic (PLEG): container finished" podID="f4d06f33-056d-47a6-a3c4-4957f99749d0" containerID="7b735685ec2e2ab9527c04efd565369e1b5cf3012729768f99c14ff87225a3cb" exitCode=2 Apr 25 00:11:19.178060 ip-10-0-129-98 
kubenswrapper[2578]: I0425 00:11:19.178023 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" event={"ID":"f4d06f33-056d-47a6-a3c4-4957f99749d0","Type":"ContainerDied","Data":"7b735685ec2e2ab9527c04efd565369e1b5cf3012729768f99c14ff87225a3cb"} Apr 25 00:11:19.179340 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:19.179319 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" event={"ID":"0e78737f-9719-418f-8287-6126f486855e","Type":"ContainerStarted","Data":"f68ee2d1d9f115e6ce17b987f1eb999486e7208a8ea12aac4f1178ed0d6ab5d0"} Apr 25 00:11:19.179432 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:19.179348 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" event={"ID":"0e78737f-9719-418f-8287-6126f486855e","Type":"ContainerStarted","Data":"55651d9b6384f4c2cb612171cb2aab3a9bf9b0978b1e13ce6212ea25595bab3a"} Apr 25 00:11:19.179432 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:19.179361 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" event={"ID":"0e78737f-9719-418f-8287-6126f486855e","Type":"ContainerStarted","Data":"d7ed0756ce880c69349a34cb287e90bb4ac10f4d49bd99d6ca89978976c65136"} Apr 25 00:11:19.179638 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:19.179611 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" Apr 25 00:11:19.179771 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:19.179757 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" Apr 25 00:11:19.180975 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:19.180954 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" podUID="0e78737f-9719-418f-8287-6126f486855e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 25 00:11:19.196099 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:19.196048 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" podStartSLOduration=1.196031383 podStartE2EDuration="1.196031383s" podCreationTimestamp="2026-04-25 00:11:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:11:19.19483227 +0000 UTC m=+1061.828814232" watchObservedRunningTime="2026-04-25 00:11:19.196031383 +0000 UTC m=+1061.830013320" Apr 25 00:11:19.343441 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:19.343337 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" Apr 25 00:11:19.467370 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:19.467310 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq"] Apr 25 00:11:19.469443 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:11:19.469400 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode196dbd6_90c3_4051_8b86_dc0bbe6b98cf.slice/crio-d2ed47823f15816ecce904609056dc6ed0783d90f509065e1caf1ff37d2881ff WatchSource:0}: Error finding container d2ed47823f15816ecce904609056dc6ed0783d90f509065e1caf1ff37d2881ff: Status 404 returned error can't find the container with id d2ed47823f15816ecce904609056dc6ed0783d90f509065e1caf1ff37d2881ff Apr 25 00:11:20.184797 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:20.184758 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" event={"ID":"e196dbd6-90c3-4051-8b86-dc0bbe6b98cf","Type":"ContainerStarted","Data":"f50b476428d3569bcb9ab6e0f547eb5ae355b5815703665db2dbb36c25f9d6fa"} Apr 25 00:11:20.184797 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:20.184802 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" event={"ID":"e196dbd6-90c3-4051-8b86-dc0bbe6b98cf","Type":"ContainerStarted","Data":"da1334e9f2c3471a957429422676e5d6adc3a51822da51b254cc4f000abeed65"} Apr 25 00:11:20.185062 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:20.184817 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" event={"ID":"e196dbd6-90c3-4051-8b86-dc0bbe6b98cf","Type":"ContainerStarted","Data":"d2ed47823f15816ecce904609056dc6ed0783d90f509065e1caf1ff37d2881ff"} Apr 25 00:11:20.185124 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:20.185104 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" podUID="0e78737f-9719-418f-8287-6126f486855e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 25 00:11:20.185472 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:20.185351 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" Apr 25 00:11:20.186799 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:20.186769 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" podUID="e196dbd6-90c3-4051-8b86-dc0bbe6b98cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 25 00:11:20.200781 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:20.200723 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" podStartSLOduration=2.200702688 podStartE2EDuration="2.200702688s" podCreationTimestamp="2026-04-25 00:11:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:11:20.200159739 +0000 UTC m=+1062.834141675" watchObservedRunningTime="2026-04-25 00:11:20.200702688 +0000 UTC m=+1062.834684626" Apr 
25 00:11:20.742586 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:20.742499 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" podUID="8b7d90b4-2de2-4df2-a9c2-acf6501f9c14" containerName="switch-graph-b9319" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:11:21.193708 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:21.193666 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" Apr 25 00:11:21.193913 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:21.193775 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" podUID="e196dbd6-90c3-4051-8b86-dc0bbe6b98cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 25 00:11:21.548625 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:21.548579 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" podUID="f4d06f33-056d-47a6-a3c4-4957f99749d0" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.23:8643/healthz\": dial tcp 10.133.0.23:8643: connect: connection refused" Apr 25 00:11:21.548789 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:21.548589 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" podUID="d130d938-29c6-47d1-a675-1cd75b0d26c2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.24:8643/healthz\": dial tcp 10.133.0.24:8643: connect: connection refused" Apr 25 00:11:21.552873 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:21.552845 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" podUID="d130d938-29c6-47d1-a675-1cd75b0d26c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 25 00:11:21.553085 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:21.553067 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" podUID="f4d06f33-056d-47a6-a3c4-4957f99749d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 25 00:11:22.037299 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.037275 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" Apr 25 00:11:22.040424 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.040395 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" Apr 25 00:11:22.108981 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.108889 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6jv7\" (UniqueName: \"kubernetes.io/projected/f4d06f33-056d-47a6-a3c4-4957f99749d0-kube-api-access-k6jv7\") pod \"f4d06f33-056d-47a6-a3c4-4957f99749d0\" (UID: \"f4d06f33-056d-47a6-a3c4-4957f99749d0\") " Apr 25 00:11:22.108981 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.108967 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbz8b\" (UniqueName: \"kubernetes.io/projected/d130d938-29c6-47d1-a675-1cd75b0d26c2-kube-api-access-vbz8b\") pod \"d130d938-29c6-47d1-a675-1cd75b0d26c2\" (UID: \"d130d938-29c6-47d1-a675-1cd75b0d26c2\") " Apr 25 00:11:22.109178 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.108995 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-b9319-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d130d938-29c6-47d1-a675-1cd75b0d26c2-error-404-isvc-b9319-kube-rbac-proxy-sar-config\") pod \"d130d938-29c6-47d1-a675-1cd75b0d26c2\" (UID: \"d130d938-29c6-47d1-a675-1cd75b0d26c2\") " Apr 25 00:11:22.109178 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.109107 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-b9319-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f4d06f33-056d-47a6-a3c4-4957f99749d0-success-200-isvc-b9319-kube-rbac-proxy-sar-config\") pod \"f4d06f33-056d-47a6-a3c4-4957f99749d0\" (UID: \"f4d06f33-056d-47a6-a3c4-4957f99749d0\") " Apr 25 00:11:22.109178 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.109152 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d130d938-29c6-47d1-a675-1cd75b0d26c2-proxy-tls\") pod \"d130d938-29c6-47d1-a675-1cd75b0d26c2\" (UID: \"d130d938-29c6-47d1-a675-1cd75b0d26c2\") " Apr 25 00:11:22.109331 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.109195 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4d06f33-056d-47a6-a3c4-4957f99749d0-proxy-tls\") pod \"f4d06f33-056d-47a6-a3c4-4957f99749d0\" (UID: \"f4d06f33-056d-47a6-a3c4-4957f99749d0\") " Apr 25 00:11:22.109430 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.109378 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d130d938-29c6-47d1-a675-1cd75b0d26c2-error-404-isvc-b9319-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-b9319-kube-rbac-proxy-sar-config") pod "d130d938-29c6-47d1-a675-1cd75b0d26c2" (UID: "d130d938-29c6-47d1-a675-1cd75b0d26c2"). InnerVolumeSpecName "error-404-isvc-b9319-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:11:22.109540 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.109523 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-b9319-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d130d938-29c6-47d1-a675-1cd75b0d26c2-error-404-isvc-b9319-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:11:22.109589 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.109519 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4d06f33-056d-47a6-a3c4-4957f99749d0-success-200-isvc-b9319-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-b9319-kube-rbac-proxy-sar-config") pod "f4d06f33-056d-47a6-a3c4-4957f99749d0" (UID: "f4d06f33-056d-47a6-a3c4-4957f99749d0"). InnerVolumeSpecName "success-200-isvc-b9319-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:11:22.111335 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.111309 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d130d938-29c6-47d1-a675-1cd75b0d26c2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d130d938-29c6-47d1-a675-1cd75b0d26c2" (UID: "d130d938-29c6-47d1-a675-1cd75b0d26c2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:11:22.111475 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.111388 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4d06f33-056d-47a6-a3c4-4957f99749d0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f4d06f33-056d-47a6-a3c4-4957f99749d0" (UID: "f4d06f33-056d-47a6-a3c4-4957f99749d0"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:11:22.111475 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.111405 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4d06f33-056d-47a6-a3c4-4957f99749d0-kube-api-access-k6jv7" (OuterVolumeSpecName: "kube-api-access-k6jv7") pod "f4d06f33-056d-47a6-a3c4-4957f99749d0" (UID: "f4d06f33-056d-47a6-a3c4-4957f99749d0"). InnerVolumeSpecName "kube-api-access-k6jv7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:11:22.111592 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.111547 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d130d938-29c6-47d1-a675-1cd75b0d26c2-kube-api-access-vbz8b" (OuterVolumeSpecName: "kube-api-access-vbz8b") pod "d130d938-29c6-47d1-a675-1cd75b0d26c2" (UID: "d130d938-29c6-47d1-a675-1cd75b0d26c2"). InnerVolumeSpecName "kube-api-access-vbz8b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:11:22.197478 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.197442 2578 generic.go:358] "Generic (PLEG): container finished" podID="d130d938-29c6-47d1-a675-1cd75b0d26c2" containerID="272cf4c1e175b1ea6f7d4bad6ec965af6d4f0ff2dca353b93c04be61f7946a0a" exitCode=0 Apr 25 00:11:22.197628 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.197530 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" Apr 25 00:11:22.197628 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.197539 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" event={"ID":"d130d938-29c6-47d1-a675-1cd75b0d26c2","Type":"ContainerDied","Data":"272cf4c1e175b1ea6f7d4bad6ec965af6d4f0ff2dca353b93c04be61f7946a0a"} Apr 25 00:11:22.197628 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.197590 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22" event={"ID":"d130d938-29c6-47d1-a675-1cd75b0d26c2","Type":"ContainerDied","Data":"a9b4b5000348a9637b3e17cd6688c77b6b9c452dfd3ed76db44b81dae90dbf02"} Apr 25 00:11:22.197628 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.197611 2578 scope.go:117] "RemoveContainer" containerID="8f06465da540ee3ed7bc7ff97f8f9f4bea086482a708fe08cc5ff2e24bb42baa" Apr 25 00:11:22.199119 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.199017 2578 generic.go:358] "Generic (PLEG): container finished" podID="f4d06f33-056d-47a6-a3c4-4957f99749d0" containerID="8cab42edf19dcdcc0599e6b34b2ce282fba82057a5edec3786f1b3d7305d811e" exitCode=0 Apr 25 00:11:22.199119 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.199067 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" event={"ID":"f4d06f33-056d-47a6-a3c4-4957f99749d0","Type":"ContainerDied","Data":"8cab42edf19dcdcc0599e6b34b2ce282fba82057a5edec3786f1b3d7305d811e"} Apr 25 00:11:22.199119 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.199079 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" Apr 25 00:11:22.199119 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.199096 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk" event={"ID":"f4d06f33-056d-47a6-a3c4-4957f99749d0","Type":"ContainerDied","Data":"6fbb7498812be40ee8285391c4edc871d5e760f972f3c95b924f69e1a524fb44"} Apr 25 00:11:22.199599 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.199560 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" podUID="e196dbd6-90c3-4051-8b86-dc0bbe6b98cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 25 00:11:22.207571 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.207554 2578 scope.go:117] "RemoveContainer" containerID="272cf4c1e175b1ea6f7d4bad6ec965af6d4f0ff2dca353b93c04be61f7946a0a" Apr 25 00:11:22.209960 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.209942 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k6jv7\" (UniqueName: \"kubernetes.io/projected/f4d06f33-056d-47a6-a3c4-4957f99749d0-kube-api-access-k6jv7\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:11:22.209960 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.209963 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vbz8b\" (UniqueName: \"kubernetes.io/projected/d130d938-29c6-47d1-a675-1cd75b0d26c2-kube-api-access-vbz8b\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:11:22.210109 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.209974 2578 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-b9319-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f4d06f33-056d-47a6-a3c4-4957f99749d0-success-200-isvc-b9319-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:11:22.210109 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.209985 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d130d938-29c6-47d1-a675-1cd75b0d26c2-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:11:22.210109 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.209994 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4d06f33-056d-47a6-a3c4-4957f99749d0-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:11:22.215201 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.215182 2578 scope.go:117] "RemoveContainer" containerID="8f06465da540ee3ed7bc7ff97f8f9f4bea086482a708fe08cc5ff2e24bb42baa" Apr 25 00:11:22.215500 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:11:22.215482 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f06465da540ee3ed7bc7ff97f8f9f4bea086482a708fe08cc5ff2e24bb42baa\": container with ID starting with 8f06465da540ee3ed7bc7ff97f8f9f4bea086482a708fe08cc5ff2e24bb42baa not found: ID does not exist" containerID="8f06465da540ee3ed7bc7ff97f8f9f4bea086482a708fe08cc5ff2e24bb42baa" Apr 25 00:11:22.215578 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.215512 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8f06465da540ee3ed7bc7ff97f8f9f4bea086482a708fe08cc5ff2e24bb42baa"} err="failed to get container status \"8f06465da540ee3ed7bc7ff97f8f9f4bea086482a708fe08cc5ff2e24bb42baa\": rpc error: code = NotFound desc = could not find container \"8f06465da540ee3ed7bc7ff97f8f9f4bea086482a708fe08cc5ff2e24bb42baa\": container with ID starting with 8f06465da540ee3ed7bc7ff97f8f9f4bea086482a708fe08cc5ff2e24bb42baa not found: ID does not exist" Apr 25 00:11:22.215578 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.215537 2578 scope.go:117] "RemoveContainer" containerID="272cf4c1e175b1ea6f7d4bad6ec965af6d4f0ff2dca353b93c04be61f7946a0a" Apr 25 00:11:22.215795 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:11:22.215776 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"272cf4c1e175b1ea6f7d4bad6ec965af6d4f0ff2dca353b93c04be61f7946a0a\": container with ID starting with 272cf4c1e175b1ea6f7d4bad6ec965af6d4f0ff2dca353b93c04be61f7946a0a not found: ID does not exist" containerID="272cf4c1e175b1ea6f7d4bad6ec965af6d4f0ff2dca353b93c04be61f7946a0a" Apr 25 00:11:22.215854 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.215803 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"272cf4c1e175b1ea6f7d4bad6ec965af6d4f0ff2dca353b93c04be61f7946a0a"} err="failed to get container status \"272cf4c1e175b1ea6f7d4bad6ec965af6d4f0ff2dca353b93c04be61f7946a0a\": rpc error: code = NotFound desc = could not find container \"272cf4c1e175b1ea6f7d4bad6ec965af6d4f0ff2dca353b93c04be61f7946a0a\": container with ID starting with 272cf4c1e175b1ea6f7d4bad6ec965af6d4f0ff2dca353b93c04be61f7946a0a not found: ID does not exist" Apr 25 00:11:22.215854 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.215822 2578 scope.go:117] "RemoveContainer" containerID="7b735685ec2e2ab9527c04efd565369e1b5cf3012729768f99c14ff87225a3cb" Apr 25 00:11:22.219192 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.219166 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22"] Apr 25 00:11:22.222226 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.222207 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b9319-predictor-65db6866c5-fqs22"] Apr 25 00:11:22.223968 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.223947 2578 scope.go:117] "RemoveContainer" containerID="8cab42edf19dcdcc0599e6b34b2ce282fba82057a5edec3786f1b3d7305d811e" Apr 25 00:11:22.231897 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.231875 2578 scope.go:117] "RemoveContainer" containerID="7b735685ec2e2ab9527c04efd565369e1b5cf3012729768f99c14ff87225a3cb" Apr 25 00:11:22.232072 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.232051 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk"] Apr 25 00:11:22.232175 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:11:22.232160 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b735685ec2e2ab9527c04efd565369e1b5cf3012729768f99c14ff87225a3cb\": container with ID starting with 7b735685ec2e2ab9527c04efd565369e1b5cf3012729768f99c14ff87225a3cb not found: ID does not exist" containerID="7b735685ec2e2ab9527c04efd565369e1b5cf3012729768f99c14ff87225a3cb" Apr 25 00:11:22.232237 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.232184 2578 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b735685ec2e2ab9527c04efd565369e1b5cf3012729768f99c14ff87225a3cb"} err="failed to get container status \"7b735685ec2e2ab9527c04efd565369e1b5cf3012729768f99c14ff87225a3cb\": rpc error: code = NotFound desc = could not find container \"7b735685ec2e2ab9527c04efd565369e1b5cf3012729768f99c14ff87225a3cb\": container with ID starting with 7b735685ec2e2ab9527c04efd565369e1b5cf3012729768f99c14ff87225a3cb not found: ID does not exist" Apr 25 00:11:22.232237 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.232204 2578 scope.go:117] "RemoveContainer" containerID="8cab42edf19dcdcc0599e6b34b2ce282fba82057a5edec3786f1b3d7305d811e" Apr 25 00:11:22.232481 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:11:22.232464 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cab42edf19dcdcc0599e6b34b2ce282fba82057a5edec3786f1b3d7305d811e\": container with ID starting with 8cab42edf19dcdcc0599e6b34b2ce282fba82057a5edec3786f1b3d7305d811e not found: ID does not exist" containerID="8cab42edf19dcdcc0599e6b34b2ce282fba82057a5edec3786f1b3d7305d811e" Apr 25 00:11:22.232541 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.232490 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cab42edf19dcdcc0599e6b34b2ce282fba82057a5edec3786f1b3d7305d811e"} err="failed to get container status \"8cab42edf19dcdcc0599e6b34b2ce282fba82057a5edec3786f1b3d7305d811e\": rpc error: code = NotFound desc = could not find container \"8cab42edf19dcdcc0599e6b34b2ce282fba82057a5edec3786f1b3d7305d811e\": container with ID starting with 8cab42edf19dcdcc0599e6b34b2ce282fba82057a5edec3786f1b3d7305d811e not found: ID does not exist" Apr 25 00:11:22.234921 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:22.234901 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b9319-predictor-67fcdd6676-f9tsk"] Apr 25 00:11:23.911696 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:23.911662 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d130d938-29c6-47d1-a675-1cd75b0d26c2" path="/var/lib/kubelet/pods/d130d938-29c6-47d1-a675-1cd75b0d26c2/volumes" Apr 25 00:11:23.912255 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:23.912235 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4d06f33-056d-47a6-a3c4-4957f99749d0" path="/var/lib/kubelet/pods/f4d06f33-056d-47a6-a3c4-4957f99749d0/volumes" Apr 25 00:11:25.189298 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:25.189267 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" Apr 25 00:11:25.189881 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:25.189856 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" podUID="0e78737f-9719-418f-8287-6126f486855e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 25 00:11:25.744627 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:25.744590 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" podUID="8b7d90b4-2de2-4df2-a9c2-acf6501f9c14" containerName="switch-graph-b9319" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:11:27.203766 
ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:27.203733 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" Apr 25 00:11:27.204339 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:27.204309 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" podUID="e196dbd6-90c3-4051-8b86-dc0bbe6b98cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 25 00:11:30.742687 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:30.742651 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" podUID="8b7d90b4-2de2-4df2-a9c2-acf6501f9c14" containerName="switch-graph-b9319" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:11:30.743094 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:30.742755 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" Apr 25 00:11:35.190453 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:35.190397 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" podUID="0e78737f-9719-418f-8287-6126f486855e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 25 00:11:35.742621 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:35.742582 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" podUID="8b7d90b4-2de2-4df2-a9c2-acf6501f9c14" containerName="switch-graph-b9319" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:11:37.204867 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:37.204821 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" podUID="e196dbd6-90c3-4051-8b86-dc0bbe6b98cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 25 00:11:40.742996 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:40.742957 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" podUID="8b7d90b4-2de2-4df2-a9c2-acf6501f9c14" containerName="switch-graph-b9319" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:11:45.189994 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:45.189946 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" podUID="0e78737f-9719-418f-8287-6126f486855e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 25 00:11:45.742425 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:45.742371 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" podUID="8b7d90b4-2de2-4df2-a9c2-acf6501f9c14" containerName="switch-graph-b9319" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:11:47.204511 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:47.204471 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" 
podUID="e196dbd6-90c3-4051-8b86-dc0bbe6b98cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 25 00:11:48.278696 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:48.278666 2578 generic.go:358] "Generic (PLEG): container finished" podID="8b7d90b4-2de2-4df2-a9c2-acf6501f9c14" containerID="a257bd4991ac539fae24a6c7a1a2765ab7d8b2532279ed2d83e5305dab4cb564" exitCode=0 Apr 25 00:11:48.279017 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:48.278740 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" event={"ID":"8b7d90b4-2de2-4df2-a9c2-acf6501f9c14","Type":"ContainerDied","Data":"a257bd4991ac539fae24a6c7a1a2765ab7d8b2532279ed2d83e5305dab4cb564"} Apr 25 00:11:48.850088 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:48.850067 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" Apr 25 00:11:48.950564 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:48.950537 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b7d90b4-2de2-4df2-a9c2-acf6501f9c14-openshift-service-ca-bundle\") pod \"8b7d90b4-2de2-4df2-a9c2-acf6501f9c14\" (UID: \"8b7d90b4-2de2-4df2-a9c2-acf6501f9c14\") " Apr 25 00:11:48.950719 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:48.950580 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8b7d90b4-2de2-4df2-a9c2-acf6501f9c14-proxy-tls\") pod \"8b7d90b4-2de2-4df2-a9c2-acf6501f9c14\" (UID: \"8b7d90b4-2de2-4df2-a9c2-acf6501f9c14\") " Apr 25 00:11:48.950878 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:48.950855 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b7d90b4-2de2-4df2-a9c2-acf6501f9c14-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "8b7d90b4-2de2-4df2-a9c2-acf6501f9c14" (UID: "8b7d90b4-2de2-4df2-a9c2-acf6501f9c14"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:11:48.952897 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:48.952877 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b7d90b4-2de2-4df2-a9c2-acf6501f9c14-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8b7d90b4-2de2-4df2-a9c2-acf6501f9c14" (UID: "8b7d90b4-2de2-4df2-a9c2-acf6501f9c14"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:11:49.051793 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:49.051767 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b7d90b4-2de2-4df2-a9c2-acf6501f9c14-openshift-service-ca-bundle\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:11:49.051793 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:49.051793 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8b7d90b4-2de2-4df2-a9c2-acf6501f9c14-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:11:49.283155 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:49.283120 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" Apr 25 00:11:49.283750 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:49.283118 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt" event={"ID":"8b7d90b4-2de2-4df2-a9c2-acf6501f9c14","Type":"ContainerDied","Data":"09712ef590be41af06b9f02c58bb70bb985ffd07c4a88faea48a5767bb642aa4"} Apr 25 00:11:49.283750 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:49.283261 2578 scope.go:117] "RemoveContainer" containerID="a257bd4991ac539fae24a6c7a1a2765ab7d8b2532279ed2d83e5305dab4cb564" Apr 25 00:11:49.308046 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:49.307990 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt"] Apr 25 00:11:49.311898 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:49.311878 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b9319-859cc9dd58-k7xmt"] Apr 25 00:11:49.911812 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:49.911781 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b7d90b4-2de2-4df2-a9c2-acf6501f9c14" path="/var/lib/kubelet/pods/8b7d90b4-2de2-4df2-a9c2-acf6501f9c14/volumes" Apr 25 00:11:54.197329 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.197299 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx"] Apr 25 00:11:54.197769 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.197544 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" podUID="b6972d18-0bab-4ab8-ad84-f7ccf650b8ad" containerName="sequence-graph-b9ddf" containerID="cri-o://38c364fc2f6dba60c2e6ae5b51f7715e97b5330997b25d775f65f07bac301671" gracePeriod=30 Apr 25 00:11:54.290258 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.290222 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z"] Apr 25 00:11:54.290840 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.290780 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" podUID="64f2aa52-331d-430a-9dcd-ac49d6f610e4" containerName="kserve-container" containerID="cri-o://2c3c142c4d92c29a77469d8c3246eb7913f79e7895aca1b07774d2c6c5b11d3b" gracePeriod=30 Apr 25 00:11:54.290988 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.290861 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" podUID="64f2aa52-331d-430a-9dcd-ac49d6f610e4" containerName="kube-rbac-proxy" containerID="cri-o://0d92a951d415182a6a6354e3c09600935a1ec07b912eea0222c67fa243eea4a0" gracePeriod=30 Apr 25 00:11:54.348462 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.348399 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk"] Apr 25 00:11:54.348862 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.348844 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8b7d90b4-2de2-4df2-a9c2-acf6501f9c14" containerName="switch-graph-b9319" Apr 25 00:11:54.348954 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.348864 2578 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8b7d90b4-2de2-4df2-a9c2-acf6501f9c14" containerName="switch-graph-b9319" Apr 25 00:11:54.348954 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.348882 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4d06f33-056d-47a6-a3c4-4957f99749d0" containerName="kube-rbac-proxy" Apr 25 00:11:54.348954 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.348890 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4d06f33-056d-47a6-a3c4-4957f99749d0" containerName="kube-rbac-proxy" Apr 25 00:11:54.348954 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.348902 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d130d938-29c6-47d1-a675-1cd75b0d26c2" containerName="kube-rbac-proxy" Apr 25 00:11:54.348954 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.348912 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d130d938-29c6-47d1-a675-1cd75b0d26c2" containerName="kube-rbac-proxy" Apr 25 00:11:54.348954 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.348931 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4d06f33-056d-47a6-a3c4-4957f99749d0" containerName="kserve-container" Apr 25 00:11:54.348954 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.348940 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4d06f33-056d-47a6-a3c4-4957f99749d0" containerName="kserve-container" Apr 25 00:11:54.349284 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.348963 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d130d938-29c6-47d1-a675-1cd75b0d26c2" containerName="kserve-container" Apr 25 00:11:54.349284 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.348971 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d130d938-29c6-47d1-a675-1cd75b0d26c2" containerName="kserve-container" Apr 25 00:11:54.349284 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.349049 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4d06f33-056d-47a6-a3c4-4957f99749d0" containerName="kserve-container" Apr 25 00:11:54.349284 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.349063 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d130d938-29c6-47d1-a675-1cd75b0d26c2" containerName="kserve-container" Apr 25 00:11:54.349284 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.349073 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d130d938-29c6-47d1-a675-1cd75b0d26c2" containerName="kube-rbac-proxy" Apr 25 00:11:54.349284 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.349086 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4d06f33-056d-47a6-a3c4-4957f99749d0" containerName="kube-rbac-proxy" Apr 25 00:11:54.349284 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.349096 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="8b7d90b4-2de2-4df2-a9c2-acf6501f9c14" containerName="switch-graph-b9319" Apr 25 00:11:54.353663 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.353641 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" Apr 25 00:11:54.355562 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.355537 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-c9f9f-predictor-serving-cert\"" Apr 25 00:11:54.355661 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.355570 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-c9f9f-kube-rbac-proxy-sar-config\"" Apr 25 00:11:54.358829 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.358799 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk"] Apr 25 00:11:54.365632 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.365610 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k"] Apr 25 00:11:54.365911 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.365888 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" podUID="c70adfca-a5c8-4ccb-86b4-b51e8b0a7084" containerName="kserve-container" containerID="cri-o://54bff80a3a2c7aa81f08fe16d25f1270064c78e8c1bfade2ec9319ee32a91278" gracePeriod=30 Apr 25 00:11:54.366014 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.365981 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" podUID="c70adfca-a5c8-4ccb-86b4-b51e8b0a7084" containerName="kube-rbac-proxy" containerID="cri-o://1b1f24e375f55941e3e58ffd84bf1b361081bd50e7e80c3e2adb86dc2f6d4b2d" gracePeriod=30 Apr 25 00:11:54.441524 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.441485 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf"] Apr 25 00:11:54.445015 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.444994 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" Apr 25 00:11:54.447273 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.447246 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-c9f9f-predictor-serving-cert\"" Apr 25 00:11:54.447379 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.447254 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-c9f9f-kube-rbac-proxy-sar-config\"" Apr 25 00:11:54.454442 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.454376 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf"] Apr 25 00:11:54.500276 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.500248 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsrgj\" (UniqueName: \"kubernetes.io/projected/eaa96e56-ae3f-4672-b36f-2043818f851d-kube-api-access-fsrgj\") pod \"success-200-isvc-c9f9f-predictor-8b8564854-4sftk\" (UID: \"eaa96e56-ae3f-4672-b36f-2043818f851d\") " pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" Apr 25 00:11:54.500455 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.500301 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eaa96e56-ae3f-4672-b36f-2043818f851d-proxy-tls\") pod \"success-200-isvc-c9f9f-predictor-8b8564854-4sftk\" (UID: \"eaa96e56-ae3f-4672-b36f-2043818f851d\") " pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" Apr 25 00:11:54.500455 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.500325 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-c9f9f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eaa96e56-ae3f-4672-b36f-2043818f851d-success-200-isvc-c9f9f-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-c9f9f-predictor-8b8564854-4sftk\" (UID: \"eaa96e56-ae3f-4672-b36f-2043818f851d\") " pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" Apr 25 00:11:54.601831 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.601794 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-c9f9f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a71639bf-f015-4465-933c-b9a7b152f57d-error-404-isvc-c9f9f-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-c9f9f-predictor-75597dd9f7-czccf\" (UID: \"a71639bf-f015-4465-933c-b9a7b152f57d\") " pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" Apr 25 00:11:54.601994 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.601851 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsrgj\" (UniqueName: \"kubernetes.io/projected/eaa96e56-ae3f-4672-b36f-2043818f851d-kube-api-access-fsrgj\") pod \"success-200-isvc-c9f9f-predictor-8b8564854-4sftk\" (UID: \"eaa96e56-ae3f-4672-b36f-2043818f851d\") " pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" Apr 25 00:11:54.601994 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.601882 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpzmv\" (UniqueName: 
\"kubernetes.io/projected/a71639bf-f015-4465-933c-b9a7b152f57d-kube-api-access-dpzmv\") pod \"error-404-isvc-c9f9f-predictor-75597dd9f7-czccf\" (UID: \"a71639bf-f015-4465-933c-b9a7b152f57d\") " pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" Apr 25 00:11:54.601994 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.601908 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a71639bf-f015-4465-933c-b9a7b152f57d-proxy-tls\") pod \"error-404-isvc-c9f9f-predictor-75597dd9f7-czccf\" (UID: \"a71639bf-f015-4465-933c-b9a7b152f57d\") " pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" Apr 25 00:11:54.601994 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.601945 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eaa96e56-ae3f-4672-b36f-2043818f851d-proxy-tls\") pod \"success-200-isvc-c9f9f-predictor-8b8564854-4sftk\" (UID: \"eaa96e56-ae3f-4672-b36f-2043818f851d\") " pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" Apr 25 00:11:54.601994 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.601971 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-c9f9f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eaa96e56-ae3f-4672-b36f-2043818f851d-success-200-isvc-c9f9f-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-c9f9f-predictor-8b8564854-4sftk\" (UID: \"eaa96e56-ae3f-4672-b36f-2043818f851d\") " pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" Apr 25 00:11:54.602179 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:11:54.602029 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-serving-cert: secret "success-200-isvc-c9f9f-predictor-serving-cert" not found Apr 25 00:11:54.602179 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:11:54.602091 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaa96e56-ae3f-4672-b36f-2043818f851d-proxy-tls podName:eaa96e56-ae3f-4672-b36f-2043818f851d nodeName:}" failed. No retries permitted until 2026-04-25 00:11:55.102074882 +0000 UTC m=+1097.736056796 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/eaa96e56-ae3f-4672-b36f-2043818f851d-proxy-tls") pod "success-200-isvc-c9f9f-predictor-8b8564854-4sftk" (UID: "eaa96e56-ae3f-4672-b36f-2043818f851d") : secret "success-200-isvc-c9f9f-predictor-serving-cert" not found Apr 25 00:11:54.602617 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.602598 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-c9f9f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eaa96e56-ae3f-4672-b36f-2043818f851d-success-200-isvc-c9f9f-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-c9f9f-predictor-8b8564854-4sftk\" (UID: \"eaa96e56-ae3f-4672-b36f-2043818f851d\") " pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" Apr 25 00:11:54.610615 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.610584 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsrgj\" (UniqueName: \"kubernetes.io/projected/eaa96e56-ae3f-4672-b36f-2043818f851d-kube-api-access-fsrgj\") pod \"success-200-isvc-c9f9f-predictor-8b8564854-4sftk\" (UID: \"eaa96e56-ae3f-4672-b36f-2043818f851d\") " pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" Apr 25 00:11:54.702569 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.702475 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpzmv\" (UniqueName: \"kubernetes.io/projected/a71639bf-f015-4465-933c-b9a7b152f57d-kube-api-access-dpzmv\") pod \"error-404-isvc-c9f9f-predictor-75597dd9f7-czccf\" (UID: \"a71639bf-f015-4465-933c-b9a7b152f57d\") " pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" Apr 25 00:11:54.702569 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.702519 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a71639bf-f015-4465-933c-b9a7b152f57d-proxy-tls\") pod \"error-404-isvc-c9f9f-predictor-75597dd9f7-czccf\" (UID: \"a71639bf-f015-4465-933c-b9a7b152f57d\") " pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" Apr 25 00:11:54.702767 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.702627 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-c9f9f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a71639bf-f015-4465-933c-b9a7b152f57d-error-404-isvc-c9f9f-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-c9f9f-predictor-75597dd9f7-czccf\" (UID: \"a71639bf-f015-4465-933c-b9a7b152f57d\") " pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" Apr 25 00:11:54.703357 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.703279 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-c9f9f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a71639bf-f015-4465-933c-b9a7b152f57d-error-404-isvc-c9f9f-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-c9f9f-predictor-75597dd9f7-czccf\" (UID: \"a71639bf-f015-4465-933c-b9a7b152f57d\") " pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" Apr 25 00:11:54.705055 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.705032 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a71639bf-f015-4465-933c-b9a7b152f57d-proxy-tls\") pod \"error-404-isvc-c9f9f-predictor-75597dd9f7-czccf\" (UID: 
\"a71639bf-f015-4465-933c-b9a7b152f57d\") " pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" Apr 25 00:11:54.711012 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.710991 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpzmv\" (UniqueName: \"kubernetes.io/projected/a71639bf-f015-4465-933c-b9a7b152f57d-kube-api-access-dpzmv\") pod \"error-404-isvc-c9f9f-predictor-75597dd9f7-czccf\" (UID: \"a71639bf-f015-4465-933c-b9a7b152f57d\") " pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" Apr 25 00:11:54.756437 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.756397 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" Apr 25 00:11:54.879647 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:54.879619 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf"] Apr 25 00:11:54.882077 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:11:54.882050 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda71639bf_f015_4465_933c_b9a7b152f57d.slice/crio-8ebbddc809acb6eafa959fa1af9808b791fd586e36bc8f6985cf607c3889e08b WatchSource:0}: Error finding container 8ebbddc809acb6eafa959fa1af9808b791fd586e36bc8f6985cf607c3889e08b: Status 404 returned error can't find the container with id 8ebbddc809acb6eafa959fa1af9808b791fd586e36bc8f6985cf607c3889e08b Apr 25 00:11:55.107163 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:55.107123 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eaa96e56-ae3f-4672-b36f-2043818f851d-proxy-tls\") pod \"success-200-isvc-c9f9f-predictor-8b8564854-4sftk\" (UID: \"eaa96e56-ae3f-4672-b36f-2043818f851d\") " pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" Apr 25 00:11:55.109660 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:55.109640 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eaa96e56-ae3f-4672-b36f-2043818f851d-proxy-tls\") pod \"success-200-isvc-c9f9f-predictor-8b8564854-4sftk\" (UID: \"eaa96e56-ae3f-4672-b36f-2043818f851d\") " pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" Apr 25 00:11:55.191152 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:55.191036 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" podUID="0e78737f-9719-418f-8287-6126f486855e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 25 00:11:55.264379 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:55.264342 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" Apr 25 00:11:55.305283 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:55.305244 2578 generic.go:358] "Generic (PLEG): container finished" podID="c70adfca-a5c8-4ccb-86b4-b51e8b0a7084" containerID="1b1f24e375f55941e3e58ffd84bf1b361081bd50e7e80c3e2adb86dc2f6d4b2d" exitCode=2 Apr 25 00:11:55.305478 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:55.305325 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" event={"ID":"c70adfca-a5c8-4ccb-86b4-b51e8b0a7084","Type":"ContainerDied","Data":"1b1f24e375f55941e3e58ffd84bf1b361081bd50e7e80c3e2adb86dc2f6d4b2d"} Apr 25 00:11:55.308020 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:55.307993 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" event={"ID":"a71639bf-f015-4465-933c-b9a7b152f57d","Type":"ContainerStarted","Data":"358738c5e173297603868a9a9c8c99e4d0b5fa09ed6b314fc7e83af7a06ee692"} Apr 25 00:11:55.308136 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:55.308029 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" event={"ID":"a71639bf-f015-4465-933c-b9a7b152f57d","Type":"ContainerStarted","Data":"b9a33f99bd39df47029a8e6c8b7d85c3568b0adae0b52528351f6d8b59d68803"} Apr 25 00:11:55.308136 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:55.308043 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" event={"ID":"a71639bf-f015-4465-933c-b9a7b152f57d","Type":"ContainerStarted","Data":"8ebbddc809acb6eafa959fa1af9808b791fd586e36bc8f6985cf607c3889e08b"} Apr 25 00:11:55.308270 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:55.308139 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" Apr 25 00:11:55.314644 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:55.314607 2578 generic.go:358] "Generic (PLEG): container finished" podID="64f2aa52-331d-430a-9dcd-ac49d6f610e4" containerID="0d92a951d415182a6a6354e3c09600935a1ec07b912eea0222c67fa243eea4a0" exitCode=2 Apr 25 00:11:55.314918 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:55.314897 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" event={"ID":"64f2aa52-331d-430a-9dcd-ac49d6f610e4","Type":"ContainerDied","Data":"0d92a951d415182a6a6354e3c09600935a1ec07b912eea0222c67fa243eea4a0"} Apr 25 00:11:55.330756 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:55.330614 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" podStartSLOduration=1.3305933699999999 podStartE2EDuration="1.33059337s" podCreationTimestamp="2026-04-25 00:11:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:11:55.329236697 +0000 UTC m=+1097.963218644" watchObservedRunningTime="2026-04-25 00:11:55.33059337 +0000 UTC m=+1097.964575307" Apr 25 00:11:55.398651 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:55.398613 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk"] Apr 25 00:11:55.402379 ip-10-0-129-98 
kubenswrapper[2578]: W0425 00:11:55.402333 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaa96e56_ae3f_4672_b36f_2043818f851d.slice/crio-a736187680aa474ec13eacfdf795c58a63bbb54a4eb814270bdde214558f6efb WatchSource:0}: Error finding container a736187680aa474ec13eacfdf795c58a63bbb54a4eb814270bdde214558f6efb: Status 404 returned error can't find the container with id a736187680aa474ec13eacfdf795c58a63bbb54a4eb814270bdde214558f6efb Apr 25 00:11:56.319097 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:56.319058 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" event={"ID":"eaa96e56-ae3f-4672-b36f-2043818f851d","Type":"ContainerStarted","Data":"d8eb0e6d8cc28eec926ee6a4f94be5f694a2861b3a8191341299fb5bc10f1a59"} Apr 25 00:11:56.319097 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:56.319103 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" event={"ID":"eaa96e56-ae3f-4672-b36f-2043818f851d","Type":"ContainerStarted","Data":"354c0f600ff0f0c6faec688dacaea2d6932328ee3ba272c1332d751f672c2ef9"} Apr 25 00:11:56.319606 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:56.319117 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" event={"ID":"eaa96e56-ae3f-4672-b36f-2043818f851d","Type":"ContainerStarted","Data":"a736187680aa474ec13eacfdf795c58a63bbb54a4eb814270bdde214558f6efb"} Apr 25 00:11:56.319676 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:56.319640 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" Apr 25 00:11:56.319676 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:56.319667 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" Apr 25 00:11:56.319676 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:56.319676 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" Apr 25 00:11:56.320521 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:56.320493 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" podUID="eaa96e56-ae3f-4672-b36f-2043818f851d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 25 00:11:56.320639 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:56.320615 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" podUID="a71639bf-f015-4465-933c-b9a7b152f57d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 25 00:11:56.335830 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:56.335787 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" podStartSLOduration=2.335773771 podStartE2EDuration="2.335773771s" podCreationTimestamp="2026-04-25 00:11:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 
00:11:56.334430368 +0000 UTC m=+1098.968412298" watchObservedRunningTime="2026-04-25 00:11:56.335773771 +0000 UTC m=+1098.969755707" Apr 25 00:11:56.856182 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:56.856140 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" podUID="b6972d18-0bab-4ab8-ad84-f7ccf650b8ad" containerName="sequence-graph-b9ddf" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:11:57.205079 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:57.204985 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" podUID="e196dbd6-90c3-4051-8b86-dc0bbe6b98cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 25 00:11:57.322840 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:57.322799 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" podUID="a71639bf-f015-4465-933c-b9a7b152f57d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 25 00:11:57.323217 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:57.322889 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" podUID="eaa96e56-ae3f-4672-b36f-2043818f851d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 25 00:11:57.668028 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:57.667983 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" podUID="c70adfca-a5c8-4ccb-86b4-b51e8b0a7084" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.27:8643/healthz\": dial tcp 10.133.0.27:8643: connect: connection refused" Apr 25 00:11:57.668216 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:57.667988 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" podUID="64f2aa52-331d-430a-9dcd-ac49d6f610e4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.26:8643/healthz\": dial tcp 10.133.0.26:8643: connect: connection refused" Apr 25 00:11:57.673141 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:57.673115 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" podUID="64f2aa52-331d-430a-9dcd-ac49d6f610e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 25 00:11:57.673216 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:57.673118 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" podUID="c70adfca-a5c8-4ccb-86b4-b51e8b0a7084" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 25 00:11:58.328813 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.328783 2578 generic.go:358] "Generic (PLEG): container finished" podID="64f2aa52-331d-430a-9dcd-ac49d6f610e4" containerID="2c3c142c4d92c29a77469d8c3246eb7913f79e7895aca1b07774d2c6c5b11d3b" exitCode=0 Apr 25 00:11:58.329169 ip-10-0-129-98 kubenswrapper[2578]: I0425 
00:11:58.328843 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" event={"ID":"64f2aa52-331d-430a-9dcd-ac49d6f610e4","Type":"ContainerDied","Data":"2c3c142c4d92c29a77469d8c3246eb7913f79e7895aca1b07774d2c6c5b11d3b"} Apr 25 00:11:58.329169 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.328887 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" event={"ID":"64f2aa52-331d-430a-9dcd-ac49d6f610e4","Type":"ContainerDied","Data":"3dbd45727dbd68eb05889c749d9438f47d0e355d170bd0b3b13487efc1b63e18"} Apr 25 00:11:58.329169 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.328901 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dbd45727dbd68eb05889c749d9438f47d0e355d170bd0b3b13487efc1b63e18" Apr 25 00:11:58.330648 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.330624 2578 generic.go:358] "Generic (PLEG): container finished" podID="c70adfca-a5c8-4ccb-86b4-b51e8b0a7084" containerID="54bff80a3a2c7aa81f08fe16d25f1270064c78e8c1bfade2ec9319ee32a91278" exitCode=0 Apr 25 00:11:58.330761 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.330699 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" event={"ID":"c70adfca-a5c8-4ccb-86b4-b51e8b0a7084","Type":"ContainerDied","Data":"54bff80a3a2c7aa81f08fe16d25f1270064c78e8c1bfade2ec9319ee32a91278"} Apr 25 00:11:58.331059 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.331020 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" podUID="eaa96e56-ae3f-4672-b36f-2043818f851d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 25 00:11:58.340389 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.340372 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" Apr 25 00:11:58.401350 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.401327 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" Apr 25 00:11:58.534793 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.534755 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-b9ddf-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c70adfca-a5c8-4ccb-86b4-b51e8b0a7084-error-404-isvc-b9ddf-kube-rbac-proxy-sar-config\") pod \"c70adfca-a5c8-4ccb-86b4-b51e8b0a7084\" (UID: \"c70adfca-a5c8-4ccb-86b4-b51e8b0a7084\") " Apr 25 00:11:58.534971 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.534814 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vth7\" (UniqueName: \"kubernetes.io/projected/c70adfca-a5c8-4ccb-86b4-b51e8b0a7084-kube-api-access-9vth7\") pod \"c70adfca-a5c8-4ccb-86b4-b51e8b0a7084\" (UID: \"c70adfca-a5c8-4ccb-86b4-b51e8b0a7084\") " Apr 25 00:11:58.534971 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.534860 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-b9ddf-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/64f2aa52-331d-430a-9dcd-ac49d6f610e4-success-200-isvc-b9ddf-kube-rbac-proxy-sar-config\") pod \"64f2aa52-331d-430a-9dcd-ac49d6f610e4\" (UID: \"64f2aa52-331d-430a-9dcd-ac49d6f610e4\") " Apr 25 00:11:58.534971 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.534885 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c70adfca-a5c8-4ccb-86b4-b51e8b0a7084-proxy-tls\") pod \"c70adfca-a5c8-4ccb-86b4-b51e8b0a7084\" (UID: \"c70adfca-a5c8-4ccb-86b4-b51e8b0a7084\") " Apr 25 00:11:58.534971 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.534915 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sk5n\" (UniqueName: \"kubernetes.io/projected/64f2aa52-331d-430a-9dcd-ac49d6f610e4-kube-api-access-7sk5n\") pod \"64f2aa52-331d-430a-9dcd-ac49d6f610e4\" (UID: \"64f2aa52-331d-430a-9dcd-ac49d6f610e4\") " Apr 25 00:11:58.535199 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.535023 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/64f2aa52-331d-430a-9dcd-ac49d6f610e4-proxy-tls\") pod \"64f2aa52-331d-430a-9dcd-ac49d6f610e4\" (UID: \"64f2aa52-331d-430a-9dcd-ac49d6f610e4\") " Apr 25 00:11:58.535199 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.535172 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c70adfca-a5c8-4ccb-86b4-b51e8b0a7084-error-404-isvc-b9ddf-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-b9ddf-kube-rbac-proxy-sar-config") pod "c70adfca-a5c8-4ccb-86b4-b51e8b0a7084" (UID: "c70adfca-a5c8-4ccb-86b4-b51e8b0a7084"). InnerVolumeSpecName "error-404-isvc-b9ddf-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:11:58.535312 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.535257 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64f2aa52-331d-430a-9dcd-ac49d6f610e4-success-200-isvc-b9ddf-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-b9ddf-kube-rbac-proxy-sar-config") pod "64f2aa52-331d-430a-9dcd-ac49d6f610e4" (UID: "64f2aa52-331d-430a-9dcd-ac49d6f610e4"). InnerVolumeSpecName "success-200-isvc-b9ddf-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:11:58.535461 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.535339 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-b9ddf-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c70adfca-a5c8-4ccb-86b4-b51e8b0a7084-error-404-isvc-b9ddf-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:11:58.535461 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.535356 2578 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-b9ddf-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/64f2aa52-331d-430a-9dcd-ac49d6f610e4-success-200-isvc-b9ddf-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:11:58.537370 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.537338 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64f2aa52-331d-430a-9dcd-ac49d6f610e4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "64f2aa52-331d-430a-9dcd-ac49d6f610e4" (UID: "64f2aa52-331d-430a-9dcd-ac49d6f610e4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:11:58.537481 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.537371 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70adfca-a5c8-4ccb-86b4-b51e8b0a7084-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c70adfca-a5c8-4ccb-86b4-b51e8b0a7084" (UID: "c70adfca-a5c8-4ccb-86b4-b51e8b0a7084"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:11:58.537481 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.537385 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64f2aa52-331d-430a-9dcd-ac49d6f610e4-kube-api-access-7sk5n" (OuterVolumeSpecName: "kube-api-access-7sk5n") pod "64f2aa52-331d-430a-9dcd-ac49d6f610e4" (UID: "64f2aa52-331d-430a-9dcd-ac49d6f610e4"). InnerVolumeSpecName "kube-api-access-7sk5n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:11:58.537481 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.537456 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c70adfca-a5c8-4ccb-86b4-b51e8b0a7084-kube-api-access-9vth7" (OuterVolumeSpecName: "kube-api-access-9vth7") pod "c70adfca-a5c8-4ccb-86b4-b51e8b0a7084" (UID: "c70adfca-a5c8-4ccb-86b4-b51e8b0a7084"). InnerVolumeSpecName "kube-api-access-9vth7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:11:58.636015 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.635980 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9vth7\" (UniqueName: \"kubernetes.io/projected/c70adfca-a5c8-4ccb-86b4-b51e8b0a7084-kube-api-access-9vth7\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:11:58.636015 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.636009 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c70adfca-a5c8-4ccb-86b4-b51e8b0a7084-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:11:58.636015 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.636022 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7sk5n\" (UniqueName: \"kubernetes.io/projected/64f2aa52-331d-430a-9dcd-ac49d6f610e4-kube-api-access-7sk5n\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:11:58.636239 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:58.636030 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/64f2aa52-331d-430a-9dcd-ac49d6f610e4-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:11:59.334938 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:59.334911 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" Apr 25 00:11:59.334938 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:59.334937 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z" Apr 25 00:11:59.335440 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:59.334910 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k" event={"ID":"c70adfca-a5c8-4ccb-86b4-b51e8b0a7084","Type":"ContainerDied","Data":"d6f80bce6a2551a6f18658ba65d1a056e695a9f99b2009c286a98073234ec2b6"} Apr 25 00:11:59.335440 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:59.335020 2578 scope.go:117] "RemoveContainer" containerID="1b1f24e375f55941e3e58ffd84bf1b361081bd50e7e80c3e2adb86dc2f6d4b2d" Apr 25 00:11:59.344364 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:59.344344 2578 scope.go:117] "RemoveContainer" containerID="54bff80a3a2c7aa81f08fe16d25f1270064c78e8c1bfade2ec9319ee32a91278" Apr 25 00:11:59.358463 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:59.358436 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z"] Apr 25 00:11:59.360162 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:59.360142 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b9ddf-predictor-75548d844b-lf92z"] Apr 25 00:11:59.369904 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:59.369878 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k"] Apr 25 00:11:59.372317 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:59.372297 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b9ddf-predictor-66459c556f-np74k"] Apr 25 00:11:59.912340 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:59.912306 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="64f2aa52-331d-430a-9dcd-ac49d6f610e4" path="/var/lib/kubelet/pods/64f2aa52-331d-430a-9dcd-ac49d6f610e4/volumes" Apr 25 00:11:59.912773 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:11:59.912760 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c70adfca-a5c8-4ccb-86b4-b51e8b0a7084" path="/var/lib/kubelet/pods/c70adfca-a5c8-4ccb-86b4-b51e8b0a7084/volumes" Apr 25 00:12:01.856038 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:01.856000 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" podUID="b6972d18-0bab-4ab8-ad84-f7ccf650b8ad" containerName="sequence-graph-b9ddf" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:12:02.327355 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:02.327329 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" Apr 25 00:12:02.327910 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:02.327883 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" podUID="a71639bf-f015-4465-933c-b9a7b152f57d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 25 00:12:03.335456 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:03.335409 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" Apr 25 00:12:03.335944 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:03.335919 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" podUID="eaa96e56-ae3f-4672-b36f-2043818f851d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 25 00:12:05.190179 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:05.190132 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" podUID="0e78737f-9719-418f-8287-6126f486855e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 25 00:12:06.856435 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:06.856388 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" podUID="b6972d18-0bab-4ab8-ad84-f7ccf650b8ad" containerName="sequence-graph-b9ddf" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:12:06.856926 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:06.856503 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" Apr 25 00:12:07.205155 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:07.205076 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" Apr 25 00:12:11.856103 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:11.856060 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" podUID="b6972d18-0bab-4ab8-ad84-f7ccf650b8ad" containerName="sequence-graph-b9ddf" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:12:12.328331 ip-10-0-129-98 kubenswrapper[2578]: I0425 
00:12:12.328292 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" podUID="a71639bf-f015-4465-933c-b9a7b152f57d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 25 00:12:13.336533 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:13.336491 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" podUID="eaa96e56-ae3f-4672-b36f-2043818f851d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 25 00:12:15.190339 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:15.190303 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" Apr 25 00:12:16.856594 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:16.856545 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" podUID="b6972d18-0bab-4ab8-ad84-f7ccf650b8ad" containerName="sequence-graph-b9ddf" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:12:21.856449 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:21.856388 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" podUID="b6972d18-0bab-4ab8-ad84-f7ccf650b8ad" containerName="sequence-graph-b9ddf" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:12:22.327992 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:22.327955 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" podUID="a71639bf-f015-4465-933c-b9a7b152f57d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 25 00:12:23.335945 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:23.335908 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" podUID="eaa96e56-ae3f-4672-b36f-2043818f851d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 25 00:12:24.350862 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:24.350838 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" Apr 25 00:12:24.412839 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:24.412806 2578 generic.go:358] "Generic (PLEG): container finished" podID="b6972d18-0bab-4ab8-ad84-f7ccf650b8ad" containerID="38c364fc2f6dba60c2e6ae5b51f7715e97b5330997b25d775f65f07bac301671" exitCode=0 Apr 25 00:12:24.412997 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:24.412866 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" Apr 25 00:12:24.412997 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:24.412890 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" event={"ID":"b6972d18-0bab-4ab8-ad84-f7ccf650b8ad","Type":"ContainerDied","Data":"38c364fc2f6dba60c2e6ae5b51f7715e97b5330997b25d775f65f07bac301671"} Apr 25 00:12:24.412997 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:24.412930 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx" event={"ID":"b6972d18-0bab-4ab8-ad84-f7ccf650b8ad","Type":"ContainerDied","Data":"a6ddf1816d66043b41459b8085a83ea2ae2e4400515df9182e9cbf4fc100be6e"} Apr 25 00:12:24.412997 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:24.412947 2578 scope.go:117] "RemoveContainer" containerID="38c364fc2f6dba60c2e6ae5b51f7715e97b5330997b25d775f65f07bac301671" Apr 25 00:12:24.420992 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:24.420977 2578 scope.go:117] "RemoveContainer" containerID="38c364fc2f6dba60c2e6ae5b51f7715e97b5330997b25d775f65f07bac301671" Apr 25 00:12:24.421235 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:12:24.421218 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38c364fc2f6dba60c2e6ae5b51f7715e97b5330997b25d775f65f07bac301671\": container with ID starting with 38c364fc2f6dba60c2e6ae5b51f7715e97b5330997b25d775f65f07bac301671 not found: ID does not exist" containerID="38c364fc2f6dba60c2e6ae5b51f7715e97b5330997b25d775f65f07bac301671" Apr 25 00:12:24.421279 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:24.421246 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38c364fc2f6dba60c2e6ae5b51f7715e97b5330997b25d775f65f07bac301671"} err="failed to get container status \"38c364fc2f6dba60c2e6ae5b51f7715e97b5330997b25d775f65f07bac301671\": rpc error: code = NotFound desc = could not find container \"38c364fc2f6dba60c2e6ae5b51f7715e97b5330997b25d775f65f07bac301671\": container with ID starting with 38c364fc2f6dba60c2e6ae5b51f7715e97b5330997b25d775f65f07bac301671 not found: ID does not exist" Apr 25 00:12:24.463096 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:24.463032 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6972d18-0bab-4ab8-ad84-f7ccf650b8ad-openshift-service-ca-bundle\") pod \"b6972d18-0bab-4ab8-ad84-f7ccf650b8ad\" (UID: \"b6972d18-0bab-4ab8-ad84-f7ccf650b8ad\") " Apr 25 00:12:24.463201 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:24.463120 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6972d18-0bab-4ab8-ad84-f7ccf650b8ad-proxy-tls\") pod \"b6972d18-0bab-4ab8-ad84-f7ccf650b8ad\" (UID: \"b6972d18-0bab-4ab8-ad84-f7ccf650b8ad\") " Apr 25 00:12:24.463459 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:24.463408 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6972d18-0bab-4ab8-ad84-f7ccf650b8ad-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "b6972d18-0bab-4ab8-ad84-f7ccf650b8ad" (UID: "b6972d18-0bab-4ab8-ad84-f7ccf650b8ad"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:12:24.465294 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:24.465277 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6972d18-0bab-4ab8-ad84-f7ccf650b8ad-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b6972d18-0bab-4ab8-ad84-f7ccf650b8ad" (UID: "b6972d18-0bab-4ab8-ad84-f7ccf650b8ad"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:12:24.563815 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:24.563767 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6972d18-0bab-4ab8-ad84-f7ccf650b8ad-openshift-service-ca-bundle\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:12:24.563815 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:24.563810 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6972d18-0bab-4ab8-ad84-f7ccf650b8ad-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:12:24.733807 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:24.733761 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx"] Apr 25 00:12:24.735273 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:24.735250 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-b9ddf-6dfbc67fd6-5m5nx"] Apr 25 00:12:25.912050 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:25.912019 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6972d18-0bab-4ab8-ad84-f7ccf650b8ad" path="/var/lib/kubelet/pods/b6972d18-0bab-4ab8-ad84-f7ccf650b8ad/volumes" Apr 25 00:12:28.422880 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.422846 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc"] Apr 25 00:12:28.423239 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.423212 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6972d18-0bab-4ab8-ad84-f7ccf650b8ad" containerName="sequence-graph-b9ddf" Apr 25 00:12:28.423239 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.423226 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6972d18-0bab-4ab8-ad84-f7ccf650b8ad" containerName="sequence-graph-b9ddf" Apr 25 00:12:28.423239 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.423238 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="64f2aa52-331d-430a-9dcd-ac49d6f610e4" containerName="kube-rbac-proxy" Apr 25 00:12:28.423338 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.423244 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="64f2aa52-331d-430a-9dcd-ac49d6f610e4" containerName="kube-rbac-proxy" Apr 25 00:12:28.423338 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.423251 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c70adfca-a5c8-4ccb-86b4-b51e8b0a7084" containerName="kserve-container" Apr 25 00:12:28.423338 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.423257 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c70adfca-a5c8-4ccb-86b4-b51e8b0a7084" containerName="kserve-container" Apr 25 00:12:28.423338 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.423268 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="64f2aa52-331d-430a-9dcd-ac49d6f610e4" 
containerName="kserve-container"
Apr 25 00:12:28.423338 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.423273 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="64f2aa52-331d-430a-9dcd-ac49d6f610e4" containerName="kserve-container"
Apr 25 00:12:28.423338 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.423281 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c70adfca-a5c8-4ccb-86b4-b51e8b0a7084" containerName="kube-rbac-proxy"
Apr 25 00:12:28.423338 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.423286 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c70adfca-a5c8-4ccb-86b4-b51e8b0a7084" containerName="kube-rbac-proxy"
Apr 25 00:12:28.423338 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.423338 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="64f2aa52-331d-430a-9dcd-ac49d6f610e4" containerName="kube-rbac-proxy"
Apr 25 00:12:28.423621 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.423347 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c70adfca-a5c8-4ccb-86b4-b51e8b0a7084" containerName="kube-rbac-proxy"
Apr 25 00:12:28.423621 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.423355 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c70adfca-a5c8-4ccb-86b4-b51e8b0a7084" containerName="kserve-container"
Apr 25 00:12:28.423621 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.423364 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b6972d18-0bab-4ab8-ad84-f7ccf650b8ad" containerName="sequence-graph-b9ddf"
Apr 25 00:12:28.423621 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.423371 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="64f2aa52-331d-430a-9dcd-ac49d6f610e4" containerName="kserve-container"
Apr 25 00:12:28.426295 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.426275 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc"
Apr 25 00:12:28.428561 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.428540 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-d627c-kube-rbac-proxy-sar-config\""
Apr 25 00:12:28.428753 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.428574 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-d627c-serving-cert\""
Apr 25 00:12:28.433489 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.433277 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc"]
Apr 25 00:12:28.495762 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.495726 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5aaf2d5-bec8-4036-9be1-94e6fafc9910-proxy-tls\") pod \"ensemble-graph-d627c-66c94ccd96-ml4rc\" (UID: \"d5aaf2d5-bec8-4036-9be1-94e6fafc9910\") " pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc"
Apr 25 00:12:28.495921 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.495790 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5aaf2d5-bec8-4036-9be1-94e6fafc9910-openshift-service-ca-bundle\") pod \"ensemble-graph-d627c-66c94ccd96-ml4rc\" (UID: \"d5aaf2d5-bec8-4036-9be1-94e6fafc9910\") " pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc"
Apr 25 00:12:28.596503 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.596468 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5aaf2d5-bec8-4036-9be1-94e6fafc9910-proxy-tls\") pod \"ensemble-graph-d627c-66c94ccd96-ml4rc\" (UID: \"d5aaf2d5-bec8-4036-9be1-94e6fafc9910\") " pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc"
Apr 25 00:12:28.596644 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.596535 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5aaf2d5-bec8-4036-9be1-94e6fafc9910-openshift-service-ca-bundle\") pod \"ensemble-graph-d627c-66c94ccd96-ml4rc\" (UID: \"d5aaf2d5-bec8-4036-9be1-94e6fafc9910\") " pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc"
Apr 25 00:12:28.596644 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:12:28.596615 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-d627c-serving-cert: secret "ensemble-graph-d627c-serving-cert" not found
Apr 25 00:12:28.596713 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:12:28.596686 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5aaf2d5-bec8-4036-9be1-94e6fafc9910-proxy-tls podName:d5aaf2d5-bec8-4036-9be1-94e6fafc9910 nodeName:}" failed. No retries permitted until 2026-04-25 00:12:29.096668099 +0000 UTC m=+1131.730650015 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d5aaf2d5-bec8-4036-9be1-94e6fafc9910-proxy-tls") pod "ensemble-graph-d627c-66c94ccd96-ml4rc" (UID: "d5aaf2d5-bec8-4036-9be1-94e6fafc9910") : secret "ensemble-graph-d627c-serving-cert" not found
Apr 25 00:12:28.597129 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:28.597109 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5aaf2d5-bec8-4036-9be1-94e6fafc9910-openshift-service-ca-bundle\") pod \"ensemble-graph-d627c-66c94ccd96-ml4rc\" (UID: \"d5aaf2d5-bec8-4036-9be1-94e6fafc9910\") " pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc"
Apr 25 00:12:29.101006 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:29.100960 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5aaf2d5-bec8-4036-9be1-94e6fafc9910-proxy-tls\") pod \"ensemble-graph-d627c-66c94ccd96-ml4rc\" (UID: \"d5aaf2d5-bec8-4036-9be1-94e6fafc9910\") " pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc"
Apr 25 00:12:29.101187 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:12:29.101108 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-d627c-serving-cert: secret "ensemble-graph-d627c-serving-cert" not found
Apr 25 00:12:29.101187 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:12:29.101184 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5aaf2d5-bec8-4036-9be1-94e6fafc9910-proxy-tls podName:d5aaf2d5-bec8-4036-9be1-94e6fafc9910 nodeName:}" failed. No retries permitted until 2026-04-25 00:12:30.101165574 +0000 UTC m=+1132.735147488 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d5aaf2d5-bec8-4036-9be1-94e6fafc9910-proxy-tls") pod "ensemble-graph-d627c-66c94ccd96-ml4rc" (UID: "d5aaf2d5-bec8-4036-9be1-94e6fafc9910") : secret "ensemble-graph-d627c-serving-cert" not found
Apr 25 00:12:30.110671 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:30.110627 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5aaf2d5-bec8-4036-9be1-94e6fafc9910-proxy-tls\") pod \"ensemble-graph-d627c-66c94ccd96-ml4rc\" (UID: \"d5aaf2d5-bec8-4036-9be1-94e6fafc9910\") " pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc"
Apr 25 00:12:30.113325 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:30.113299 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5aaf2d5-bec8-4036-9be1-94e6fafc9910-proxy-tls\") pod \"ensemble-graph-d627c-66c94ccd96-ml4rc\" (UID: \"d5aaf2d5-bec8-4036-9be1-94e6fafc9910\") " pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc"
Apr 25 00:12:30.237745 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:30.237707 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc"
Apr 25 00:12:30.358190 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:30.358167 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc"]
Apr 25 00:12:30.360844 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:12:30.360764 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5aaf2d5_bec8_4036_9be1_94e6fafc9910.slice/crio-8bce6038206526d1a115c2fea9e66f5f3f437bdcfa1f6a574a81c2dec6b5ebaf WatchSource:0}: Error finding container 8bce6038206526d1a115c2fea9e66f5f3f437bdcfa1f6a574a81c2dec6b5ebaf: Status 404 returned error can't find the container with id 8bce6038206526d1a115c2fea9e66f5f3f437bdcfa1f6a574a81c2dec6b5ebaf
Apr 25 00:12:30.433583 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:30.433541 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc" event={"ID":"d5aaf2d5-bec8-4036-9be1-94e6fafc9910","Type":"ContainerStarted","Data":"e2435470ddd87807dbfdda1d236d69ee4a69e18e3f30a1a9a194d4d1867d85af"}
Apr 25 00:12:30.433732 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:30.433587 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc" event={"ID":"d5aaf2d5-bec8-4036-9be1-94e6fafc9910","Type":"ContainerStarted","Data":"8bce6038206526d1a115c2fea9e66f5f3f437bdcfa1f6a574a81c2dec6b5ebaf"}
Apr 25 00:12:30.433732 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:30.433633 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc"
Apr 25 00:12:30.447666 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:30.447612 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc" podStartSLOduration=2.447595767 podStartE2EDuration="2.447595767s" podCreationTimestamp="2026-04-25 00:12:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:12:30.447297677 +0000 UTC m=+1133.081279623" watchObservedRunningTime="2026-04-25 00:12:30.447595767 +0000 UTC m=+1133.081577694"
Apr 25 00:12:32.328906 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:32.328867 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" podUID="a71639bf-f015-4465-933c-b9a7b152f57d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 25 00:12:33.336125 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:33.336084 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" podUID="eaa96e56-ae3f-4672-b36f-2043818f851d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 25 00:12:36.444292 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:36.444262 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc"
Apr 25 00:12:37.911199 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:37.911163 2578 scope.go:117] "RemoveContainer" containerID="2c3c142c4d92c29a77469d8c3246eb7913f79e7895aca1b07774d2c6c5b11d3b"
Apr 25 00:12:37.919161 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:37.919140 2578 scope.go:117] "RemoveContainer" containerID="0d92a951d415182a6a6354e3c09600935a1ec07b912eea0222c67fa243eea4a0"
Apr 25 00:12:38.477224 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.477194 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc"]
Apr 25 00:12:38.477466 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.477444 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc" podUID="d5aaf2d5-bec8-4036-9be1-94e6fafc9910" containerName="ensemble-graph-d627c" containerID="cri-o://e2435470ddd87807dbfdda1d236d69ee4a69e18e3f30a1a9a194d4d1867d85af" gracePeriod=30
Apr 25 00:12:38.580504 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.580468 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j"]
Apr 25 00:12:38.580841 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.580813 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" podUID="0e78737f-9719-418f-8287-6126f486855e" containerName="kserve-container" containerID="cri-o://55651d9b6384f4c2cb612171cb2aab3a9bf9b0978b1e13ce6212ea25595bab3a" gracePeriod=30
Apr 25 00:12:38.580925 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.580842 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" podUID="0e78737f-9719-418f-8287-6126f486855e" containerName="kube-rbac-proxy" containerID="cri-o://f68ee2d1d9f115e6ce17b987f1eb999486e7208a8ea12aac4f1178ed0d6ab5d0" gracePeriod=30
Apr 25 00:12:38.601470 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.601435 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc"]
Apr 25 00:12:38.606554 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.606537 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc"
Apr 25 00:12:38.608727 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.608707 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-ac6c8-predictor-serving-cert\""
Apr 25 00:12:38.608727 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.608718 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-ac6c8-kube-rbac-proxy-sar-config\""
Apr 25 00:12:38.615170 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.615149 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc"]
Apr 25 00:12:38.673958 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.673912 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq"]
Apr 25 00:12:38.674228 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.674206 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" podUID="e196dbd6-90c3-4051-8b86-dc0bbe6b98cf" containerName="kserve-container" containerID="cri-o://da1334e9f2c3471a957429422676e5d6adc3a51822da51b254cc4f000abeed65" gracePeriod=30
Apr 25 00:12:38.674324 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.674300 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" podUID="e196dbd6-90c3-4051-8b86-dc0bbe6b98cf" containerName="kube-rbac-proxy" containerID="cri-o://f50b476428d3569bcb9ab6e0f547eb5ae355b5815703665db2dbb36c25f9d6fa" gracePeriod=30
Apr 25 00:12:38.681308 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.681282 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3043916b-aef2-46c0-b90a-f63cfec5c4e1-proxy-tls\") pod \"success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc\" (UID: \"3043916b-aef2-46c0-b90a-f63cfec5c4e1\") " pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc"
Apr 25 00:12:38.681479 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.681361 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx5nk\" (UniqueName: \"kubernetes.io/projected/3043916b-aef2-46c0-b90a-f63cfec5c4e1-kube-api-access-dx5nk\") pod \"success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc\" (UID: \"3043916b-aef2-46c0-b90a-f63cfec5c4e1\") " pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc"
Apr 25 00:12:38.681479 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.681460 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-ac6c8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3043916b-aef2-46c0-b90a-f63cfec5c4e1-success-200-isvc-ac6c8-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc\" (UID: \"3043916b-aef2-46c0-b90a-f63cfec5c4e1\") " pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc"
Apr 25 00:12:38.712770 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.712742 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf"]
Apr 25 00:12:38.716165 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.716150 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf"
Apr 25 00:12:38.718438 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.718403 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-ac6c8-kube-rbac-proxy-sar-config\""
Apr 25 00:12:38.718560 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.718506 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-ac6c8-predictor-serving-cert\""
Apr 25 00:12:38.724145 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.724122 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf"]
Apr 25 00:12:38.782248 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.782217 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3043916b-aef2-46c0-b90a-f63cfec5c4e1-proxy-tls\") pod \"success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc\" (UID: \"3043916b-aef2-46c0-b90a-f63cfec5c4e1\") " pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc"
Apr 25 00:12:38.782378 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.782314 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dx5nk\" (UniqueName: \"kubernetes.io/projected/3043916b-aef2-46c0-b90a-f63cfec5c4e1-kube-api-access-dx5nk\") pod \"success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc\" (UID: \"3043916b-aef2-46c0-b90a-f63cfec5c4e1\") " pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc"
Apr 25 00:12:38.782378 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:12:38.782362 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-serving-cert: secret "success-200-isvc-ac6c8-predictor-serving-cert" not found
Apr 25 00:12:38.782522 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.782390 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-ac6c8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3043916b-aef2-46c0-b90a-f63cfec5c4e1-success-200-isvc-ac6c8-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc\" (UID: \"3043916b-aef2-46c0-b90a-f63cfec5c4e1\") " pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc"
Apr 25 00:12:38.782522 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:12:38.782468 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3043916b-aef2-46c0-b90a-f63cfec5c4e1-proxy-tls podName:3043916b-aef2-46c0-b90a-f63cfec5c4e1 nodeName:}" failed. No retries permitted until 2026-04-25 00:12:39.282442593 +0000 UTC m=+1141.916424511 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/3043916b-aef2-46c0-b90a-f63cfec5c4e1-proxy-tls") pod "success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc" (UID: "3043916b-aef2-46c0-b90a-f63cfec5c4e1") : secret "success-200-isvc-ac6c8-predictor-serving-cert" not found
Apr 25 00:12:38.783113 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.783087 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-ac6c8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3043916b-aef2-46c0-b90a-f63cfec5c4e1-success-200-isvc-ac6c8-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc\" (UID: \"3043916b-aef2-46c0-b90a-f63cfec5c4e1\") " pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc"
Apr 25 00:12:38.791486 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.791456 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx5nk\" (UniqueName: \"kubernetes.io/projected/3043916b-aef2-46c0-b90a-f63cfec5c4e1-kube-api-access-dx5nk\") pod \"success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc\" (UID: \"3043916b-aef2-46c0-b90a-f63cfec5c4e1\") " pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc"
Apr 25 00:12:38.883515 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.883473 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a07e833-145e-41dc-bec3-c09231467d16-proxy-tls\") pod \"error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf\" (UID: \"3a07e833-145e-41dc-bec3-c09231467d16\") " pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf"
Apr 25 00:12:38.883705 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.883612 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-ac6c8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3a07e833-145e-41dc-bec3-c09231467d16-error-404-isvc-ac6c8-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf\" (UID: \"3a07e833-145e-41dc-bec3-c09231467d16\") " pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf"
Apr 25 00:12:38.883705 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.883694 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sml2h\" (UniqueName: \"kubernetes.io/projected/3a07e833-145e-41dc-bec3-c09231467d16-kube-api-access-sml2h\") pod \"error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf\" (UID: \"3a07e833-145e-41dc-bec3-c09231467d16\") " pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf"
Apr 25 00:12:38.984518 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.984476 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a07e833-145e-41dc-bec3-c09231467d16-proxy-tls\") pod \"error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf\" (UID: \"3a07e833-145e-41dc-bec3-c09231467d16\") " pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf"
Apr 25 00:12:38.984975 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.984579 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-ac6c8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3a07e833-145e-41dc-bec3-c09231467d16-error-404-isvc-ac6c8-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf\" (UID: \"3a07e833-145e-41dc-bec3-c09231467d16\") " pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf"
Apr 25 00:12:38.984975 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.984634 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sml2h\" (UniqueName: \"kubernetes.io/projected/3a07e833-145e-41dc-bec3-c09231467d16-kube-api-access-sml2h\") pod \"error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf\" (UID: \"3a07e833-145e-41dc-bec3-c09231467d16\") " pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf"
Apr 25 00:12:38.985908 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.985879 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-ac6c8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3a07e833-145e-41dc-bec3-c09231467d16-error-404-isvc-ac6c8-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf\" (UID: \"3a07e833-145e-41dc-bec3-c09231467d16\") " pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf"
Apr 25 00:12:38.988218 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.988190 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a07e833-145e-41dc-bec3-c09231467d16-proxy-tls\") pod \"error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf\" (UID: \"3a07e833-145e-41dc-bec3-c09231467d16\") " pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf"
Apr 25 00:12:38.993593 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:38.993571 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sml2h\" (UniqueName: \"kubernetes.io/projected/3a07e833-145e-41dc-bec3-c09231467d16-kube-api-access-sml2h\") pod \"error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf\" (UID: \"3a07e833-145e-41dc-bec3-c09231467d16\") " pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf"
Apr 25 00:12:39.028132 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:39.028102 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf"
Apr 25 00:12:39.153816 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:39.153792 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf"]
Apr 25 00:12:39.155739 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:12:39.155717 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a07e833_145e_41dc_bec3_c09231467d16.slice/crio-1f309ebb92b2027af56f6f2c3e45c35dc31c26094774c6726f4de7924b506f0a WatchSource:0}: Error finding container 1f309ebb92b2027af56f6f2c3e45c35dc31c26094774c6726f4de7924b506f0a: Status 404 returned error can't find the container with id 1f309ebb92b2027af56f6f2c3e45c35dc31c26094774c6726f4de7924b506f0a
Apr 25 00:12:39.287806 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:39.287722 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3043916b-aef2-46c0-b90a-f63cfec5c4e1-proxy-tls\") pod \"success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc\" (UID: \"3043916b-aef2-46c0-b90a-f63cfec5c4e1\") " pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc"
Apr 25 00:12:39.290867 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:39.290840 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3043916b-aef2-46c0-b90a-f63cfec5c4e1-proxy-tls\") pod \"success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc\" (UID: \"3043916b-aef2-46c0-b90a-f63cfec5c4e1\") " pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc"
Apr 25 00:12:39.465121 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:39.465079 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf" event={"ID":"3a07e833-145e-41dc-bec3-c09231467d16","Type":"ContainerStarted","Data":"30014a48c33fa8753ebbf40ae1644b3dd09bf791003add40c14ae55441331b77"}
Apr 25 00:12:39.465121 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:39.465124 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf" event={"ID":"3a07e833-145e-41dc-bec3-c09231467d16","Type":"ContainerStarted","Data":"e0e20dadf1b35bb1666d3394eed92e4f6c1b15079ee3d33656ead05235f82722"}
Apr 25 00:12:39.465380 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:39.465137 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf" event={"ID":"3a07e833-145e-41dc-bec3-c09231467d16","Type":"ContainerStarted","Data":"1f309ebb92b2027af56f6f2c3e45c35dc31c26094774c6726f4de7924b506f0a"}
Apr 25 00:12:39.465380 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:39.465221 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf"
Apr 25 00:12:39.466671 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:39.466643 2578 generic.go:358] "Generic (PLEG): container finished" podID="0e78737f-9719-418f-8287-6126f486855e" containerID="f68ee2d1d9f115e6ce17b987f1eb999486e7208a8ea12aac4f1178ed0d6ab5d0" exitCode=2
Apr 25 00:12:39.466812 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:39.466715 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" event={"ID":"0e78737f-9719-418f-8287-6126f486855e","Type":"ContainerDied","Data":"f68ee2d1d9f115e6ce17b987f1eb999486e7208a8ea12aac4f1178ed0d6ab5d0"}
Apr 25 00:12:39.468205 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:39.468183 2578 generic.go:358] "Generic (PLEG): container finished" podID="e196dbd6-90c3-4051-8b86-dc0bbe6b98cf" containerID="f50b476428d3569bcb9ab6e0f547eb5ae355b5815703665db2dbb36c25f9d6fa" exitCode=2
Apr 25 00:12:39.468332 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:39.468211 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" event={"ID":"e196dbd6-90c3-4051-8b86-dc0bbe6b98cf","Type":"ContainerDied","Data":"f50b476428d3569bcb9ab6e0f547eb5ae355b5815703665db2dbb36c25f9d6fa"}
Apr 25 00:12:39.483136 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:39.483080 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf" podStartSLOduration=1.483062883 podStartE2EDuration="1.483062883s" podCreationTimestamp="2026-04-25 00:12:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:12:39.480695452 +0000 UTC m=+1142.114677389" watchObservedRunningTime="2026-04-25 00:12:39.483062883 +0000 UTC m=+1142.117044816"
Apr 25 00:12:39.516590 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:39.516554 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc"
Apr 25 00:12:39.643313 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:39.643288 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc"]
Apr 25 00:12:39.645669 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:12:39.645631 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3043916b_aef2_46c0_b90a_f63cfec5c4e1.slice/crio-c3e9a08c2d42334e5ffe763c6b5420679baac92276209f870dee0aa434f2ad54 WatchSource:0}: Error finding container c3e9a08c2d42334e5ffe763c6b5420679baac92276209f870dee0aa434f2ad54: Status 404 returned error can't find the container with id c3e9a08c2d42334e5ffe763c6b5420679baac92276209f870dee0aa434f2ad54
Apr 25 00:12:40.186244 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:40.186185 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" podUID="0e78737f-9719-418f-8287-6126f486855e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.30:8643/healthz\": dial tcp 10.133.0.30:8643: connect: connection refused"
Apr 25 00:12:40.473670 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:40.473578 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc" event={"ID":"3043916b-aef2-46c0-b90a-f63cfec5c4e1","Type":"ContainerStarted","Data":"77bf0be5beb38dbcd1196700152ace65684a8e839a8df1bb0c9f08bcea679162"}
Apr 25 00:12:40.473670 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:40.473618 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc" event={"ID":"3043916b-aef2-46c0-b90a-f63cfec5c4e1","Type":"ContainerStarted","Data":"fd27b8134b6511bc61b98a2731afe713e6a6e4a8282c597cca4fb3e4d2d6e7aa"}
Apr 25 00:12:40.473670 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:40.473627 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc" event={"ID":"3043916b-aef2-46c0-b90a-f63cfec5c4e1","Type":"ContainerStarted","Data":"c3e9a08c2d42334e5ffe763c6b5420679baac92276209f870dee0aa434f2ad54"}
Apr 25 00:12:40.473938 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:40.473697 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc"
Apr 25 00:12:40.474024 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:40.473994 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf"
Apr 25 00:12:40.475308 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:40.475284 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf" podUID="3a07e833-145e-41dc-bec3-c09231467d16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 25 00:12:40.490216 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:40.490163 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc" podStartSLOduration=2.490145972 podStartE2EDuration="2.490145972s" podCreationTimestamp="2026-04-25 00:12:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:12:40.489209008 +0000 UTC m=+1143.123190944" watchObservedRunningTime="2026-04-25 00:12:40.490145972 +0000 UTC m=+1143.124127909"
Apr 25 00:12:41.442980 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:41.442937 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc" podUID="d5aaf2d5-bec8-4036-9be1-94e6fafc9910" containerName="ensemble-graph-d627c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 25 00:12:41.477267 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:41.477219 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc"
Apr 25 00:12:41.477467 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:41.477259 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf" podUID="3a07e833-145e-41dc-bec3-c09231467d16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 25 00:12:41.478523 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:41.478499 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc" podUID="3043916b-aef2-46c0-b90a-f63cfec5c4e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 25 00:12:42.199961 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.199926 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" podUID="e196dbd6-90c3-4051-8b86-dc0bbe6b98cf" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.31:8643/healthz\": dial tcp 10.133.0.31:8643: connect: connection refused"
Apr 25 00:12:42.317436 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.317390 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq"
Apr 25 00:12:42.328723 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.328702 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf"
Apr 25 00:12:42.418253 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.418224 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96s72\" (UniqueName: \"kubernetes.io/projected/e196dbd6-90c3-4051-8b86-dc0bbe6b98cf-kube-api-access-96s72\") pod \"e196dbd6-90c3-4051-8b86-dc0bbe6b98cf\" (UID: \"e196dbd6-90c3-4051-8b86-dc0bbe6b98cf\") "
Apr 25 00:12:42.418401 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.418269 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-d627c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e196dbd6-90c3-4051-8b86-dc0bbe6b98cf-error-404-isvc-d627c-kube-rbac-proxy-sar-config\") pod \"e196dbd6-90c3-4051-8b86-dc0bbe6b98cf\" (UID: \"e196dbd6-90c3-4051-8b86-dc0bbe6b98cf\") "
Apr 25 00:12:42.418401 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.418299 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e196dbd6-90c3-4051-8b86-dc0bbe6b98cf-proxy-tls\") pod \"e196dbd6-90c3-4051-8b86-dc0bbe6b98cf\" (UID: \"e196dbd6-90c3-4051-8b86-dc0bbe6b98cf\") "
Apr 25 00:12:42.418711 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.418681 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e196dbd6-90c3-4051-8b86-dc0bbe6b98cf-error-404-isvc-d627c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-d627c-kube-rbac-proxy-sar-config") pod "e196dbd6-90c3-4051-8b86-dc0bbe6b98cf" (UID: "e196dbd6-90c3-4051-8b86-dc0bbe6b98cf"). InnerVolumeSpecName "error-404-isvc-d627c-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:12:42.420551 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.420531 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e196dbd6-90c3-4051-8b86-dc0bbe6b98cf-kube-api-access-96s72" (OuterVolumeSpecName: "kube-api-access-96s72") pod "e196dbd6-90c3-4051-8b86-dc0bbe6b98cf" (UID: "e196dbd6-90c3-4051-8b86-dc0bbe6b98cf"). InnerVolumeSpecName "kube-api-access-96s72". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:12:42.420633 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.420530 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e196dbd6-90c3-4051-8b86-dc0bbe6b98cf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e196dbd6-90c3-4051-8b86-dc0bbe6b98cf" (UID: "e196dbd6-90c3-4051-8b86-dc0bbe6b98cf"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:12:42.481186 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.481154 2578 generic.go:358] "Generic (PLEG): container finished" podID="e196dbd6-90c3-4051-8b86-dc0bbe6b98cf" containerID="da1334e9f2c3471a957429422676e5d6adc3a51822da51b254cc4f000abeed65" exitCode=0
Apr 25 00:12:42.481624 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.481224 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq"
Apr 25 00:12:42.481624 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.481236 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" event={"ID":"e196dbd6-90c3-4051-8b86-dc0bbe6b98cf","Type":"ContainerDied","Data":"da1334e9f2c3471a957429422676e5d6adc3a51822da51b254cc4f000abeed65"}
Apr 25 00:12:42.481624 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.481276 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq" event={"ID":"e196dbd6-90c3-4051-8b86-dc0bbe6b98cf","Type":"ContainerDied","Data":"d2ed47823f15816ecce904609056dc6ed0783d90f509065e1caf1ff37d2881ff"}
Apr 25 00:12:42.481624 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.481291 2578 scope.go:117] "RemoveContainer" containerID="f50b476428d3569bcb9ab6e0f547eb5ae355b5815703665db2dbb36c25f9d6fa"
Apr 25 00:12:42.481822 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.481773 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc" podUID="3043916b-aef2-46c0-b90a-f63cfec5c4e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 25 00:12:42.493894 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.493875 2578 scope.go:117] "RemoveContainer" containerID="da1334e9f2c3471a957429422676e5d6adc3a51822da51b254cc4f000abeed65"
Apr 25 00:12:42.501877 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.501812 2578 scope.go:117] "RemoveContainer" containerID="f50b476428d3569bcb9ab6e0f547eb5ae355b5815703665db2dbb36c25f9d6fa"
Apr 25 00:12:42.502144 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:12:42.502124 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f50b476428d3569bcb9ab6e0f547eb5ae355b5815703665db2dbb36c25f9d6fa\": container with ID starting with f50b476428d3569bcb9ab6e0f547eb5ae355b5815703665db2dbb36c25f9d6fa not found: ID does not exist" containerID="f50b476428d3569bcb9ab6e0f547eb5ae355b5815703665db2dbb36c25f9d6fa"
Apr 25 00:12:42.502200 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.502153 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f50b476428d3569bcb9ab6e0f547eb5ae355b5815703665db2dbb36c25f9d6fa"} err="failed to get container status \"f50b476428d3569bcb9ab6e0f547eb5ae355b5815703665db2dbb36c25f9d6fa\": rpc error: code = NotFound desc = could not find container \"f50b476428d3569bcb9ab6e0f547eb5ae355b5815703665db2dbb36c25f9d6fa\": container with ID starting with f50b476428d3569bcb9ab6e0f547eb5ae355b5815703665db2dbb36c25f9d6fa not found: ID does not exist"
Apr 25 00:12:42.502200 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.502169 2578 scope.go:117] "RemoveContainer" containerID="da1334e9f2c3471a957429422676e5d6adc3a51822da51b254cc4f000abeed65"
Apr 25 00:12:42.502427 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:12:42.502393 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da1334e9f2c3471a957429422676e5d6adc3a51822da51b254cc4f000abeed65\": container with ID starting with da1334e9f2c3471a957429422676e5d6adc3a51822da51b254cc4f000abeed65 not found: ID does not exist" containerID="da1334e9f2c3471a957429422676e5d6adc3a51822da51b254cc4f000abeed65"
Apr 25 00:12:42.502491 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.502450 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da1334e9f2c3471a957429422676e5d6adc3a51822da51b254cc4f000abeed65"} err="failed to get container status \"da1334e9f2c3471a957429422676e5d6adc3a51822da51b254cc4f000abeed65\": rpc error: code = NotFound desc = could not find container \"da1334e9f2c3471a957429422676e5d6adc3a51822da51b254cc4f000abeed65\": container with ID starting with da1334e9f2c3471a957429422676e5d6adc3a51822da51b254cc4f000abeed65 not found: ID does not exist"
Apr 25 00:12:42.506661 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.506623 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq"]
Apr 25 00:12:42.511802 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.511777 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d627c-predictor-859777b8f9-r2nqq"]
Apr 25 00:12:42.519432 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.519395 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-96s72\" (UniqueName: \"kubernetes.io/projected/e196dbd6-90c3-4051-8b86-dc0bbe6b98cf-kube-api-access-96s72\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\""
Apr 25 00:12:42.519504 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.519435 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-d627c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e196dbd6-90c3-4051-8b86-dc0bbe6b98cf-error-404-isvc-d627c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\""
Apr 25 00:12:42.519504 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.519448 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e196dbd6-90c3-4051-8b86-dc0bbe6b98cf-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\""
Apr 25 00:12:42.614387 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.614365 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j"
Apr 25 00:12:42.721165 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.721063 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0e78737f-9719-418f-8287-6126f486855e-proxy-tls\") pod \"0e78737f-9719-418f-8287-6126f486855e\" (UID: \"0e78737f-9719-418f-8287-6126f486855e\") "
Apr 25 00:12:42.721165 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.721164 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv7dn\" (UniqueName: \"kubernetes.io/projected/0e78737f-9719-418f-8287-6126f486855e-kube-api-access-fv7dn\") pod \"0e78737f-9719-418f-8287-6126f486855e\" (UID: \"0e78737f-9719-418f-8287-6126f486855e\") "
Apr 25 00:12:42.721390 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.721214 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-d627c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0e78737f-9719-418f-8287-6126f486855e-success-200-isvc-d627c-kube-rbac-proxy-sar-config\") pod \"0e78737f-9719-418f-8287-6126f486855e\" (UID: \"0e78737f-9719-418f-8287-6126f486855e\") "
Apr 25 00:12:42.721751 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.721682 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e78737f-9719-418f-8287-6126f486855e-success-200-isvc-d627c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-d627c-kube-rbac-proxy-sar-config") pod "0e78737f-9719-418f-8287-6126f486855e" (UID: "0e78737f-9719-418f-8287-6126f486855e"). InnerVolumeSpecName "success-200-isvc-d627c-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:12:42.723389 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.723362 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e78737f-9719-418f-8287-6126f486855e-kube-api-access-fv7dn" (OuterVolumeSpecName: "kube-api-access-fv7dn") pod "0e78737f-9719-418f-8287-6126f486855e" (UID: "0e78737f-9719-418f-8287-6126f486855e"). InnerVolumeSpecName "kube-api-access-fv7dn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:12:42.723389 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.723372 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e78737f-9719-418f-8287-6126f486855e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0e78737f-9719-418f-8287-6126f486855e" (UID: "0e78737f-9719-418f-8287-6126f486855e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:12:42.822865 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.822822 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fv7dn\" (UniqueName: \"kubernetes.io/projected/0e78737f-9719-418f-8287-6126f486855e-kube-api-access-fv7dn\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\""
Apr 25 00:12:42.822865 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.822854 2578 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-d627c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0e78737f-9719-418f-8287-6126f486855e-success-200-isvc-d627c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\""
Apr 25 00:12:42.822865 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:42.822865 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0e78737f-9719-418f-8287-6126f486855e-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\""
Apr 25 00:12:43.337240 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:43.337203 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk"
Apr 25 00:12:43.487149 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:43.487063 2578 generic.go:358] "Generic (PLEG): container finished" podID="0e78737f-9719-418f-8287-6126f486855e" containerID="55651d9b6384f4c2cb612171cb2aab3a9bf9b0978b1e13ce6212ea25595bab3a" exitCode=0
Apr 25 00:12:43.487149 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:43.487104 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" event={"ID":"0e78737f-9719-418f-8287-6126f486855e","Type":"ContainerDied","Data":"55651d9b6384f4c2cb612171cb2aab3a9bf9b0978b1e13ce6212ea25595bab3a"}
Apr 25 00:12:43.487149 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:43.487127 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j" event={"ID":"0e78737f-9719-418f-8287-6126f486855e","Type":"ContainerDied","Data":"d7ed0756ce880c69349a34cb287e90bb4ac10f4d49bd99d6ca89978976c65136"}
Apr 25 00:12:43.487149 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:43.487149 2578 scope.go:117] "RemoveContainer" containerID="f68ee2d1d9f115e6ce17b987f1eb999486e7208a8ea12aac4f1178ed0d6ab5d0"
Apr 25 00:12:43.487754 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:43.487193 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j"
Apr 25 00:12:43.495406 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:43.495390 2578 scope.go:117] "RemoveContainer" containerID="55651d9b6384f4c2cb612171cb2aab3a9bf9b0978b1e13ce6212ea25595bab3a"
Apr 25 00:12:43.502872 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:43.502849 2578 scope.go:117] "RemoveContainer" containerID="f68ee2d1d9f115e6ce17b987f1eb999486e7208a8ea12aac4f1178ed0d6ab5d0"
Apr 25 00:12:43.503110 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:12:43.503088 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f68ee2d1d9f115e6ce17b987f1eb999486e7208a8ea12aac4f1178ed0d6ab5d0\": container with ID starting with f68ee2d1d9f115e6ce17b987f1eb999486e7208a8ea12aac4f1178ed0d6ab5d0 not found: ID does not exist" containerID="f68ee2d1d9f115e6ce17b987f1eb999486e7208a8ea12aac4f1178ed0d6ab5d0"
Apr 25 00:12:43.503187 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:43.503122 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f68ee2d1d9f115e6ce17b987f1eb999486e7208a8ea12aac4f1178ed0d6ab5d0"} err="failed to get container status \"f68ee2d1d9f115e6ce17b987f1eb999486e7208a8ea12aac4f1178ed0d6ab5d0\": rpc error: code = NotFound desc = could not find container \"f68ee2d1d9f115e6ce17b987f1eb999486e7208a8ea12aac4f1178ed0d6ab5d0\": container with ID starting with f68ee2d1d9f115e6ce17b987f1eb999486e7208a8ea12aac4f1178ed0d6ab5d0 not found: ID does not exist"
Apr 25 00:12:43.503187 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:43.503148 2578 scope.go:117] "RemoveContainer" containerID="55651d9b6384f4c2cb612171cb2aab3a9bf9b0978b1e13ce6212ea25595bab3a"
Apr 25 00:12:43.503385 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:12:43.503369 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55651d9b6384f4c2cb612171cb2aab3a9bf9b0978b1e13ce6212ea25595bab3a\": container with ID starting with 55651d9b6384f4c2cb612171cb2aab3a9bf9b0978b1e13ce6212ea25595bab3a not found: ID does not exist" containerID="55651d9b6384f4c2cb612171cb2aab3a9bf9b0978b1e13ce6212ea25595bab3a"
Apr 25 00:12:43.503447 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:43.503391 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55651d9b6384f4c2cb612171cb2aab3a9bf9b0978b1e13ce6212ea25595bab3a"} err="failed to get container status \"55651d9b6384f4c2cb612171cb2aab3a9bf9b0978b1e13ce6212ea25595bab3a\": rpc error: code = NotFound desc = could not find container \"55651d9b6384f4c2cb612171cb2aab3a9bf9b0978b1e13ce6212ea25595bab3a\": container with ID starting with 55651d9b6384f4c2cb612171cb2aab3a9bf9b0978b1e13ce6212ea25595bab3a not found: ID does not exist"
Apr 25 00:12:43.509728 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:43.509705 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j"]
Apr 25 00:12:43.513855 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:43.513832 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d627c-predictor-549977b56d-df86j"]
Apr 25 00:12:43.912825 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:43.912783 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e78737f-9719-418f-8287-6126f486855e" path="/var/lib/kubelet/pods/0e78737f-9719-418f-8287-6126f486855e/volumes"
Apr 25 00:12:43.913258 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:43.913243 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e196dbd6-90c3-4051-8b86-dc0bbe6b98cf" path="/var/lib/kubelet/pods/e196dbd6-90c3-4051-8b86-dc0bbe6b98cf/volumes"
Apr 25 00:12:46.442841 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:46.442806 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc" podUID="d5aaf2d5-bec8-4036-9be1-94e6fafc9910" containerName="ensemble-graph-d627c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 25 00:12:46.481300 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:46.481272 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf"
Apr 25 00:12:46.481765 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:46.481735 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf" podUID="3a07e833-145e-41dc-bec3-c09231467d16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 25 00:12:47.487068 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:47.487040 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc"
Apr 25 00:12:47.487660 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:47.487632 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc" podUID="3043916b-aef2-46c0-b90a-f63cfec5c4e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 25 00:12:51.443356 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:51.443267 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc" podUID="d5aaf2d5-bec8-4036-9be1-94e6fafc9910" containerName="ensemble-graph-d627c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 25 00:12:51.443921 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:51.443400 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc"
Apr 25 00:12:54.347151 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.347114 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s"]
Apr 25 00:12:54.348539 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.348513 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e196dbd6-90c3-4051-8b86-dc0bbe6b98cf" containerName="kserve-container"
Apr 25 00:12:54.348701 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.348689 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e196dbd6-90c3-4051-8b86-dc0bbe6b98cf" containerName="kserve-container"
Apr 25 00:12:54.348831 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.348820 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e78737f-9719-418f-8287-6126f486855e" containerName="kserve-container"
Apr 25 00:12:54.348903 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.348895 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e78737f-9719-418f-8287-6126f486855e" containerName="kserve-container"
Apr 25 00:12:54.348985 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.348976 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e78737f-9719-418f-8287-6126f486855e" containerName="kube-rbac-proxy"
Apr 25 00:12:54.349051 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.349042 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e78737f-9719-418f-8287-6126f486855e" containerName="kube-rbac-proxy"
Apr 25 00:12:54.349133 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.349124 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e196dbd6-90c3-4051-8b86-dc0bbe6b98cf" containerName="kube-rbac-proxy"
Apr 25 00:12:54.349210 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.349202 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e196dbd6-90c3-4051-8b86-dc0bbe6b98cf" containerName="kube-rbac-proxy"
Apr 25 00:12:54.349489 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.349476 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e196dbd6-90c3-4051-8b86-dc0bbe6b98cf" containerName="kube-rbac-proxy"
Apr 25 00:12:54.349602 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.349592 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e78737f-9719-418f-8287-6126f486855e" containerName="kube-rbac-proxy"
Apr 25 00:12:54.349682 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.349674 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e196dbd6-90c3-4051-8b86-dc0bbe6b98cf" containerName="kserve-container"
Apr 25 00:12:54.349762 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.349753 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e78737f-9719-418f-8287-6126f486855e" containerName="kserve-container"
Apr 25 00:12:54.354050 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.354028 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s"
Apr 25 00:12:54.355558 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.355534 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s"]
Apr 25 00:12:54.356025 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.356009 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-c9f9f-kube-rbac-proxy-sar-config\""
Apr 25 00:12:54.356181 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.356166 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-c9f9f-serving-cert\""
Apr 25 00:12:54.420929 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.420891 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3-openshift-service-ca-bundle\") pod \"sequence-graph-c9f9f-5dd8ccddf4-44z4s\" (UID: \"51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3\") " pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s"
Apr 25 00:12:54.421102 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.421006 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3-proxy-tls\") pod \"sequence-graph-c9f9f-5dd8ccddf4-44z4s\" (UID: \"51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3\") " pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s"
Apr 25 00:12:54.521877 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.521841 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3-proxy-tls\") pod \"sequence-graph-c9f9f-5dd8ccddf4-44z4s\" (UID: \"51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3\") " pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s"
Apr 25 00:12:54.522046 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.521899 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3-openshift-service-ca-bundle\") pod \"sequence-graph-c9f9f-5dd8ccddf4-44z4s\" (UID: \"51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3\") " pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s"
Apr 25 00:12:54.522598 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.522571 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3-openshift-service-ca-bundle\") pod \"sequence-graph-c9f9f-5dd8ccddf4-44z4s\" (UID: \"51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3\") " pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s"
Apr 25 00:12:54.524359 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.524340 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3-proxy-tls\") pod \"sequence-graph-c9f9f-5dd8ccddf4-44z4s\" (UID: \"51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3\") " pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s"
Apr 25 00:12:54.666598 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.666513 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s"
Apr 25 00:12:54.790945 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:54.790764 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s"]
Apr 25 00:12:54.793998 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:12:54.793971 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51ff0f0e_8ec3_46d3_b0f7_2431460ff6d3.slice/crio-4de1198269835ee5f3b79b219b93671b251b67b89eb9d5f264c5d96972504d76 WatchSource:0}: Error finding container 4de1198269835ee5f3b79b219b93671b251b67b89eb9d5f264c5d96972504d76: Status 404 returned error can't find the container with id 4de1198269835ee5f3b79b219b93671b251b67b89eb9d5f264c5d96972504d76
Apr 25 00:12:55.524937 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:55.524901 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s" event={"ID":"51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3","Type":"ContainerStarted","Data":"37c76e5936438c1184b1105baf7447f0fd6a37c523f4fc01a14492bc7b31aa7d"}
Apr 25 00:12:55.524937 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:55.524945 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s" event={"ID":"51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3","Type":"ContainerStarted","Data":"4de1198269835ee5f3b79b219b93671b251b67b89eb9d5f264c5d96972504d76"}
Apr 25 00:12:55.525481 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:55.525011 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s"
Apr 25 00:12:55.540090 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:55.540040 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s" podStartSLOduration=1.540025862 podStartE2EDuration="1.540025862s" podCreationTimestamp="2026-04-25 00:12:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:12:55.538734653 +0000 UTC m=+1158.172716588" watchObservedRunningTime="2026-04-25 00:12:55.540025862 +0000 UTC m=+1158.174007809"
Apr 25 00:12:56.442089 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:56.442043 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc" podUID="d5aaf2d5-bec8-4036-9be1-94e6fafc9910" containerName="ensemble-graph-d627c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 25 00:12:56.482608 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:56.482570 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf" podUID="3a07e833-145e-41dc-bec3-c09231467d16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 25 00:12:57.487743 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:12:57.487694 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc" podUID="3043916b-aef2-46c0-b90a-f63cfec5c4e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 25 00:13:01.442912 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:01.442864 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc" podUID="d5aaf2d5-bec8-4036-9be1-94e6fafc9910" containerName="ensemble-graph-d627c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 25 00:13:01.535045 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:01.535015 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s"
Apr 25 00:13:04.416682 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.416646 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s"]
Apr 25 00:13:04.417050 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.416861 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s" podUID="51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3" containerName="sequence-graph-c9f9f" containerID="cri-o://37c76e5936438c1184b1105baf7447f0fd6a37c523f4fc01a14492bc7b31aa7d" gracePeriod=30
Apr 25 00:13:04.513441 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.512376 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk"]
Apr 25 00:13:04.513441 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.512758 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" podUID="eaa96e56-ae3f-4672-b36f-2043818f851d" containerName="kserve-container" containerID="cri-o://354c0f600ff0f0c6faec688dacaea2d6932328ee3ba272c1332d751f672c2ef9" gracePeriod=30
Apr 25 00:13:04.513441 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.513082 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" podUID="eaa96e56-ae3f-4672-b36f-2043818f851d" containerName="kube-rbac-proxy" containerID="cri-o://d8eb0e6d8cc28eec926ee6a4f94be5f694a2861b3a8191341299fb5bc10f1a59" gracePeriod=30
Apr 25 00:13:04.536020 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.535993 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp"]
Apr 25 00:13:04.539637 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.539619 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" Apr 25 00:13:04.541918 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.541898 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-e1f95-predictor-serving-cert\"" Apr 25 00:13:04.542025 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.541916 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-e1f95-kube-rbac-proxy-sar-config\"" Apr 25 00:13:04.548778 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.548756 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp"] Apr 25 00:13:04.595316 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.595284 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf"] Apr 25 00:13:04.595661 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.595629 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" podUID="a71639bf-f015-4465-933c-b9a7b152f57d" containerName="kserve-container" containerID="cri-o://b9a33f99bd39df47029a8e6c8b7d85c3568b0adae0b52528351f6d8b59d68803" gracePeriod=30 Apr 25 00:13:04.595887 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.595668 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" podUID="a71639bf-f015-4465-933c-b9a7b152f57d" containerName="kube-rbac-proxy" containerID="cri-o://358738c5e173297603868a9a9c8c99e4d0b5fa09ed6b314fc7e83af7a06ee692" gracePeriod=30 Apr 25 00:13:04.620188 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.620158 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdm7w\" (UniqueName: \"kubernetes.io/projected/ccf9b311-61f4-49c6-b521-0cc24798e111-kube-api-access-vdm7w\") pod \"success-200-isvc-e1f95-predictor-78c8c57784-68slp\" (UID: \"ccf9b311-61f4-49c6-b521-0cc24798e111\") " pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" Apr 25 00:13:04.620294 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.620221 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-e1f95-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ccf9b311-61f4-49c6-b521-0cc24798e111-success-200-isvc-e1f95-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-e1f95-predictor-78c8c57784-68slp\" (UID: \"ccf9b311-61f4-49c6-b521-0cc24798e111\") " pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" Apr 25 00:13:04.620342 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.620300 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ccf9b311-61f4-49c6-b521-0cc24798e111-proxy-tls\") pod \"success-200-isvc-e1f95-predictor-78c8c57784-68slp\" (UID: \"ccf9b311-61f4-49c6-b521-0cc24798e111\") " pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" Apr 25 00:13:04.652544 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.652516 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm"] Apr 25 00:13:04.659021 
ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.656944 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" Apr 25 00:13:04.659753 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.659712 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-e1f95-kube-rbac-proxy-sar-config\"" Apr 25 00:13:04.659831 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.659808 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-e1f95-predictor-serving-cert\"" Apr 25 00:13:04.664358 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.664334 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm"] Apr 25 00:13:04.721389 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.721364 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdm7w\" (UniqueName: \"kubernetes.io/projected/ccf9b311-61f4-49c6-b521-0cc24798e111-kube-api-access-vdm7w\") pod \"success-200-isvc-e1f95-predictor-78c8c57784-68slp\" (UID: \"ccf9b311-61f4-49c6-b521-0cc24798e111\") " pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" Apr 25 00:13:04.721512 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.721465 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/698b477f-a5ba-428e-bac0-96f0f0ee89fc-proxy-tls\") pod \"error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm\" (UID: \"698b477f-a5ba-428e-bac0-96f0f0ee89fc\") " pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" Apr 25 00:13:04.721512 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.721502 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-e1f95-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ccf9b311-61f4-49c6-b521-0cc24798e111-success-200-isvc-e1f95-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-e1f95-predictor-78c8c57784-68slp\" (UID: \"ccf9b311-61f4-49c6-b521-0cc24798e111\") " pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" Apr 25 00:13:04.721638 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.721613 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-e1f95-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/698b477f-a5ba-428e-bac0-96f0f0ee89fc-error-404-isvc-e1f95-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm\" (UID: \"698b477f-a5ba-428e-bac0-96f0f0ee89fc\") " pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" Apr 25 00:13:04.721738 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.721716 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ccf9b311-61f4-49c6-b521-0cc24798e111-proxy-tls\") pod \"success-200-isvc-e1f95-predictor-78c8c57784-68slp\" (UID: \"ccf9b311-61f4-49c6-b521-0cc24798e111\") " pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" Apr 25 00:13:04.721802 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.721769 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55db6\" (UniqueName: 
\"kubernetes.io/projected/698b477f-a5ba-428e-bac0-96f0f0ee89fc-kube-api-access-55db6\") pod \"error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm\" (UID: \"698b477f-a5ba-428e-bac0-96f0f0ee89fc\") " pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" Apr 25 00:13:04.721857 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:13:04.721836 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-serving-cert: secret "success-200-isvc-e1f95-predictor-serving-cert" not found Apr 25 00:13:04.721907 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:13:04.721901 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccf9b311-61f4-49c6-b521-0cc24798e111-proxy-tls podName:ccf9b311-61f4-49c6-b521-0cc24798e111 nodeName:}" failed. No retries permitted until 2026-04-25 00:13:05.221884639 +0000 UTC m=+1167.855866552 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ccf9b311-61f4-49c6-b521-0cc24798e111-proxy-tls") pod "success-200-isvc-e1f95-predictor-78c8c57784-68slp" (UID: "ccf9b311-61f4-49c6-b521-0cc24798e111") : secret "success-200-isvc-e1f95-predictor-serving-cert" not found Apr 25 00:13:04.722120 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.722104 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-e1f95-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ccf9b311-61f4-49c6-b521-0cc24798e111-success-200-isvc-e1f95-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-e1f95-predictor-78c8c57784-68slp\" (UID: \"ccf9b311-61f4-49c6-b521-0cc24798e111\") " pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" Apr 25 00:13:04.730116 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.730092 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdm7w\" (UniqueName: \"kubernetes.io/projected/ccf9b311-61f4-49c6-b521-0cc24798e111-kube-api-access-vdm7w\") pod \"success-200-isvc-e1f95-predictor-78c8c57784-68slp\" (UID: \"ccf9b311-61f4-49c6-b521-0cc24798e111\") " pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" Apr 25 00:13:04.822483 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.822448 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/698b477f-a5ba-428e-bac0-96f0f0ee89fc-proxy-tls\") pod \"error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm\" (UID: \"698b477f-a5ba-428e-bac0-96f0f0ee89fc\") " pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" Apr 25 00:13:04.822671 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.822491 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-e1f95-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/698b477f-a5ba-428e-bac0-96f0f0ee89fc-error-404-isvc-e1f95-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm\" (UID: \"698b477f-a5ba-428e-bac0-96f0f0ee89fc\") " pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" Apr 25 00:13:04.822671 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.822541 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55db6\" (UniqueName: \"kubernetes.io/projected/698b477f-a5ba-428e-bac0-96f0f0ee89fc-kube-api-access-55db6\") pod \"error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm\" (UID: 
\"698b477f-a5ba-428e-bac0-96f0f0ee89fc\") " pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" Apr 25 00:13:04.823318 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.823298 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-e1f95-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/698b477f-a5ba-428e-bac0-96f0f0ee89fc-error-404-isvc-e1f95-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm\" (UID: \"698b477f-a5ba-428e-bac0-96f0f0ee89fc\") " pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" Apr 25 00:13:04.825179 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.825155 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/698b477f-a5ba-428e-bac0-96f0f0ee89fc-proxy-tls\") pod \"error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm\" (UID: \"698b477f-a5ba-428e-bac0-96f0f0ee89fc\") " pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" Apr 25 00:13:04.831126 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.831101 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55db6\" (UniqueName: \"kubernetes.io/projected/698b477f-a5ba-428e-bac0-96f0f0ee89fc-kube-api-access-55db6\") pod \"error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm\" (UID: \"698b477f-a5ba-428e-bac0-96f0f0ee89fc\") " pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" Apr 25 00:13:04.971056 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:04.970967 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" Apr 25 00:13:05.094991 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:05.094965 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm"] Apr 25 00:13:05.096959 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:13:05.096931 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod698b477f_a5ba_428e_bac0_96f0f0ee89fc.slice/crio-03d13a91ab4b92f8e51650684cfbf032d1fc85e6ead5e776400f3982cdb48bb9 WatchSource:0}: Error finding container 03d13a91ab4b92f8e51650684cfbf032d1fc85e6ead5e776400f3982cdb48bb9: Status 404 returned error can't find the container with id 03d13a91ab4b92f8e51650684cfbf032d1fc85e6ead5e776400f3982cdb48bb9 Apr 25 00:13:05.227617 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:05.227558 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ccf9b311-61f4-49c6-b521-0cc24798e111-proxy-tls\") pod \"success-200-isvc-e1f95-predictor-78c8c57784-68slp\" (UID: \"ccf9b311-61f4-49c6-b521-0cc24798e111\") " pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" Apr 25 00:13:05.229780 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:05.229748 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ccf9b311-61f4-49c6-b521-0cc24798e111-proxy-tls\") pod \"success-200-isvc-e1f95-predictor-78c8c57784-68slp\" (UID: \"ccf9b311-61f4-49c6-b521-0cc24798e111\") " pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" Apr 25 00:13:05.450290 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:05.450253 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" Apr 25 00:13:05.559748 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:05.559720 2578 generic.go:358] "Generic (PLEG): container finished" podID="eaa96e56-ae3f-4672-b36f-2043818f851d" containerID="d8eb0e6d8cc28eec926ee6a4f94be5f694a2861b3a8191341299fb5bc10f1a59" exitCode=2 Apr 25 00:13:05.559891 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:05.559794 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" event={"ID":"eaa96e56-ae3f-4672-b36f-2043818f851d","Type":"ContainerDied","Data":"d8eb0e6d8cc28eec926ee6a4f94be5f694a2861b3a8191341299fb5bc10f1a59"} Apr 25 00:13:05.561385 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:05.561364 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" event={"ID":"698b477f-a5ba-428e-bac0-96f0f0ee89fc","Type":"ContainerStarted","Data":"e4133c2e49c252b13a198725a5da87386fed76a5774bb38f24b3ccf3f28cceb3"} Apr 25 00:13:05.561543 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:05.561392 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" event={"ID":"698b477f-a5ba-428e-bac0-96f0f0ee89fc","Type":"ContainerStarted","Data":"064456d4971b2838ca5ad99cb47524ab1989ed6e41ab458e6f40433ec1d4772c"} Apr 25 00:13:05.561543 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:05.561405 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" event={"ID":"698b477f-a5ba-428e-bac0-96f0f0ee89fc","Type":"ContainerStarted","Data":"03d13a91ab4b92f8e51650684cfbf032d1fc85e6ead5e776400f3982cdb48bb9"} Apr 25 00:13:05.562981 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:05.562963 2578 generic.go:358] "Generic (PLEG): container finished" podID="a71639bf-f015-4465-933c-b9a7b152f57d" containerID="358738c5e173297603868a9a9c8c99e4d0b5fa09ed6b314fc7e83af7a06ee692" exitCode=2 Apr 25 00:13:05.563081 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:05.563017 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" event={"ID":"a71639bf-f015-4465-933c-b9a7b152f57d","Type":"ContainerDied","Data":"358738c5e173297603868a9a9c8c99e4d0b5fa09ed6b314fc7e83af7a06ee692"} Apr 25 00:13:05.574016 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:05.573999 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp"] Apr 25 00:13:05.576497 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:13:05.576456 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccf9b311_61f4_49c6_b521_0cc24798e111.slice/crio-e1f5e988e767cfbf57871c43102651d53fa03b195f112082c6e3459a5384122c WatchSource:0}: Error finding container e1f5e988e767cfbf57871c43102651d53fa03b195f112082c6e3459a5384122c: Status 404 returned error can't find the container with id e1f5e988e767cfbf57871c43102651d53fa03b195f112082c6e3459a5384122c Apr 25 00:13:05.578687 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:05.578652 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" podStartSLOduration=1.578640718 podStartE2EDuration="1.578640718s" podCreationTimestamp="2026-04-25 00:13:04 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:13:05.576357472 +0000 UTC m=+1168.210339409" watchObservedRunningTime="2026-04-25 00:13:05.578640718 +0000 UTC m=+1168.212622654" Apr 25 00:13:06.442270 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:06.442231 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc" podUID="d5aaf2d5-bec8-4036-9be1-94e6fafc9910" containerName="ensemble-graph-d627c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:13:06.482147 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:06.482116 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf" podUID="3a07e833-145e-41dc-bec3-c09231467d16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 25 00:13:06.533774 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:06.533731 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s" podUID="51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3" containerName="sequence-graph-c9f9f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:13:06.567856 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:06.567817 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" event={"ID":"ccf9b311-61f4-49c6-b521-0cc24798e111","Type":"ContainerStarted","Data":"5814d9f2982cdd6bcf5c8d2b5c7b24bffedcbfc99cbbb1881ee2728227daccba"} Apr 25 00:13:06.568016 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:06.567862 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" event={"ID":"ccf9b311-61f4-49c6-b521-0cc24798e111","Type":"ContainerStarted","Data":"46b70bf1f37b88e1084c5c2b0e9ba6491fe8a492a11f135f573ecb78b87276f2"} Apr 25 00:13:06.568016 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:06.567875 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" event={"ID":"ccf9b311-61f4-49c6-b521-0cc24798e111","Type":"ContainerStarted","Data":"e1f5e988e767cfbf57871c43102651d53fa03b195f112082c6e3459a5384122c"} Apr 25 00:13:06.568243 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:06.568216 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" Apr 25 00:13:06.568311 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:06.568256 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" Apr 25 00:13:06.568311 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:06.568270 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" Apr 25 00:13:06.569275 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:06.569251 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" podUID="698b477f-a5ba-428e-bac0-96f0f0ee89fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 25 00:13:06.583991 
ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:06.583943 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" podStartSLOduration=2.583927159 podStartE2EDuration="2.583927159s" podCreationTimestamp="2026-04-25 00:13:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:13:06.583086691 +0000 UTC m=+1169.217068627" watchObservedRunningTime="2026-04-25 00:13:06.583927159 +0000 UTC m=+1169.217909097" Apr 25 00:13:07.323313 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:07.323270 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" podUID="a71639bf-f015-4465-933c-b9a7b152f57d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.33:8643/healthz\": dial tcp 10.133.0.33:8643: connect: connection refused" Apr 25 00:13:07.488474 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:07.488409 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc" podUID="3043916b-aef2-46c0-b90a-f63cfec5c4e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 25 00:13:07.574473 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:07.574356 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" Apr 25 00:13:07.574473 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:07.574450 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" podUID="698b477f-a5ba-428e-bac0-96f0f0ee89fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 25 00:13:07.575565 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:07.575540 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" podUID="ccf9b311-61f4-49c6-b521-0cc24798e111" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 25 00:13:08.043477 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.043456 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" Apr 25 00:13:08.154740 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.154704 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-c9f9f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a71639bf-f015-4465-933c-b9a7b152f57d-error-404-isvc-c9f9f-kube-rbac-proxy-sar-config\") pod \"a71639bf-f015-4465-933c-b9a7b152f57d\" (UID: \"a71639bf-f015-4465-933c-b9a7b152f57d\") " Apr 25 00:13:08.154892 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.154818 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpzmv\" (UniqueName: \"kubernetes.io/projected/a71639bf-f015-4465-933c-b9a7b152f57d-kube-api-access-dpzmv\") pod \"a71639bf-f015-4465-933c-b9a7b152f57d\" (UID: \"a71639bf-f015-4465-933c-b9a7b152f57d\") " Apr 25 00:13:08.154892 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.154872 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a71639bf-f015-4465-933c-b9a7b152f57d-proxy-tls\") pod \"a71639bf-f015-4465-933c-b9a7b152f57d\" (UID: \"a71639bf-f015-4465-933c-b9a7b152f57d\") " Apr 25 00:13:08.155085 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.155064 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a71639bf-f015-4465-933c-b9a7b152f57d-error-404-isvc-c9f9f-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-c9f9f-kube-rbac-proxy-sar-config") pod "a71639bf-f015-4465-933c-b9a7b152f57d" (UID: "a71639bf-f015-4465-933c-b9a7b152f57d"). InnerVolumeSpecName "error-404-isvc-c9f9f-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:13:08.155193 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.155169 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-c9f9f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a71639bf-f015-4465-933c-b9a7b152f57d-error-404-isvc-c9f9f-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:13:08.157014 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.156986 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a71639bf-f015-4465-933c-b9a7b152f57d-kube-api-access-dpzmv" (OuterVolumeSpecName: "kube-api-access-dpzmv") pod "a71639bf-f015-4465-933c-b9a7b152f57d" (UID: "a71639bf-f015-4465-933c-b9a7b152f57d"). InnerVolumeSpecName "kube-api-access-dpzmv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:13:08.157119 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.157005 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a71639bf-f015-4465-933c-b9a7b152f57d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a71639bf-f015-4465-933c-b9a7b152f57d" (UID: "a71639bf-f015-4465-933c-b9a7b152f57d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:13:08.247330 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.247307 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" Apr 25 00:13:08.255805 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.255782 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dpzmv\" (UniqueName: \"kubernetes.io/projected/a71639bf-f015-4465-933c-b9a7b152f57d-kube-api-access-dpzmv\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:13:08.255898 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.255807 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a71639bf-f015-4465-933c-b9a7b152f57d-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:13:08.356310 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.356276 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eaa96e56-ae3f-4672-b36f-2043818f851d-proxy-tls\") pod \"eaa96e56-ae3f-4672-b36f-2043818f851d\" (UID: \"eaa96e56-ae3f-4672-b36f-2043818f851d\") " Apr 25 00:13:08.356499 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.356374 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-c9f9f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eaa96e56-ae3f-4672-b36f-2043818f851d-success-200-isvc-c9f9f-kube-rbac-proxy-sar-config\") pod \"eaa96e56-ae3f-4672-b36f-2043818f851d\" (UID: \"eaa96e56-ae3f-4672-b36f-2043818f851d\") " Apr 25 00:13:08.356499 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.356404 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsrgj\" (UniqueName: \"kubernetes.io/projected/eaa96e56-ae3f-4672-b36f-2043818f851d-kube-api-access-fsrgj\") pod \"eaa96e56-ae3f-4672-b36f-2043818f851d\" (UID: \"eaa96e56-ae3f-4672-b36f-2043818f851d\") " Apr 25 00:13:08.356803 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.356774 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaa96e56-ae3f-4672-b36f-2043818f851d-success-200-isvc-c9f9f-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-c9f9f-kube-rbac-proxy-sar-config") pod "eaa96e56-ae3f-4672-b36f-2043818f851d" (UID: "eaa96e56-ae3f-4672-b36f-2043818f851d"). InnerVolumeSpecName "success-200-isvc-c9f9f-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:13:08.358558 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.358535 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaa96e56-ae3f-4672-b36f-2043818f851d-kube-api-access-fsrgj" (OuterVolumeSpecName: "kube-api-access-fsrgj") pod "eaa96e56-ae3f-4672-b36f-2043818f851d" (UID: "eaa96e56-ae3f-4672-b36f-2043818f851d"). InnerVolumeSpecName "kube-api-access-fsrgj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:13:08.358558 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.358539 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa96e56-ae3f-4672-b36f-2043818f851d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "eaa96e56-ae3f-4672-b36f-2043818f851d" (UID: "eaa96e56-ae3f-4672-b36f-2043818f851d"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:13:08.457437 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.457349 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eaa96e56-ae3f-4672-b36f-2043818f851d-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:13:08.457437 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.457375 2578 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-c9f9f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eaa96e56-ae3f-4672-b36f-2043818f851d-success-200-isvc-c9f9f-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:13:08.457437 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.457385 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fsrgj\" (UniqueName: \"kubernetes.io/projected/eaa96e56-ae3f-4672-b36f-2043818f851d-kube-api-access-fsrgj\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:13:08.579507 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.579467 2578 generic.go:358] "Generic (PLEG): container finished" podID="eaa96e56-ae3f-4672-b36f-2043818f851d" containerID="354c0f600ff0f0c6faec688dacaea2d6932328ee3ba272c1332d751f672c2ef9" exitCode=0 Apr 25 00:13:08.579906 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.579558 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" Apr 25 00:13:08.579906 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.579547 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" event={"ID":"eaa96e56-ae3f-4672-b36f-2043818f851d","Type":"ContainerDied","Data":"354c0f600ff0f0c6faec688dacaea2d6932328ee3ba272c1332d751f672c2ef9"} Apr 25 00:13:08.579906 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.579605 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk" event={"ID":"eaa96e56-ae3f-4672-b36f-2043818f851d","Type":"ContainerDied","Data":"a736187680aa474ec13eacfdf795c58a63bbb54a4eb814270bdde214558f6efb"} Apr 25 00:13:08.579906 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.579622 2578 scope.go:117] "RemoveContainer" containerID="d8eb0e6d8cc28eec926ee6a4f94be5f694a2861b3a8191341299fb5bc10f1a59" Apr 25 00:13:08.581434 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.581389 2578 generic.go:358] "Generic (PLEG): container finished" podID="a71639bf-f015-4465-933c-b9a7b152f57d" containerID="b9a33f99bd39df47029a8e6c8b7d85c3568b0adae0b52528351f6d8b59d68803" exitCode=0 Apr 25 00:13:08.581540 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.581504 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" Apr 25 00:13:08.581540 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.581460 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" event={"ID":"a71639bf-f015-4465-933c-b9a7b152f57d","Type":"ContainerDied","Data":"b9a33f99bd39df47029a8e6c8b7d85c3568b0adae0b52528351f6d8b59d68803"} Apr 25 00:13:08.581649 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.581560 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf" event={"ID":"a71639bf-f015-4465-933c-b9a7b152f57d","Type":"ContainerDied","Data":"8ebbddc809acb6eafa959fa1af9808b791fd586e36bc8f6985cf607c3889e08b"} Apr 25 00:13:08.584450 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.583500 2578 generic.go:358] "Generic (PLEG): container finished" podID="d5aaf2d5-bec8-4036-9be1-94e6fafc9910" containerID="e2435470ddd87807dbfdda1d236d69ee4a69e18e3f30a1a9a194d4d1867d85af" exitCode=0 Apr 25 00:13:08.584450 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.584139 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc" event={"ID":"d5aaf2d5-bec8-4036-9be1-94e6fafc9910","Type":"ContainerDied","Data":"e2435470ddd87807dbfdda1d236d69ee4a69e18e3f30a1a9a194d4d1867d85af"} Apr 25 00:13:08.586215 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.585065 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" podUID="ccf9b311-61f4-49c6-b521-0cc24798e111" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 25 00:13:08.593963 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.593940 2578 scope.go:117] "RemoveContainer" containerID="354c0f600ff0f0c6faec688dacaea2d6932328ee3ba272c1332d751f672c2ef9" Apr 25 00:13:08.607184 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.607165 2578 scope.go:117] "RemoveContainer" containerID="d8eb0e6d8cc28eec926ee6a4f94be5f694a2861b3a8191341299fb5bc10f1a59" Apr 25 00:13:08.607467 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:13:08.607446 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8eb0e6d8cc28eec926ee6a4f94be5f694a2861b3a8191341299fb5bc10f1a59\": container with ID starting with d8eb0e6d8cc28eec926ee6a4f94be5f694a2861b3a8191341299fb5bc10f1a59 not found: ID does not exist" containerID="d8eb0e6d8cc28eec926ee6a4f94be5f694a2861b3a8191341299fb5bc10f1a59" Apr 25 00:13:08.607556 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.607476 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8eb0e6d8cc28eec926ee6a4f94be5f694a2861b3a8191341299fb5bc10f1a59"} err="failed to get container status \"d8eb0e6d8cc28eec926ee6a4f94be5f694a2861b3a8191341299fb5bc10f1a59\": rpc error: code = NotFound desc = could not find container \"d8eb0e6d8cc28eec926ee6a4f94be5f694a2861b3a8191341299fb5bc10f1a59\": container with ID starting with d8eb0e6d8cc28eec926ee6a4f94be5f694a2861b3a8191341299fb5bc10f1a59 not found: ID does not exist" Apr 25 00:13:08.607556 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.607493 2578 scope.go:117] "RemoveContainer" containerID="354c0f600ff0f0c6faec688dacaea2d6932328ee3ba272c1332d751f672c2ef9" Apr 25 00:13:08.607733 ip-10-0-129-98 
kubenswrapper[2578]: E0425 00:13:08.607718 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"354c0f600ff0f0c6faec688dacaea2d6932328ee3ba272c1332d751f672c2ef9\": container with ID starting with 354c0f600ff0f0c6faec688dacaea2d6932328ee3ba272c1332d751f672c2ef9 not found: ID does not exist" containerID="354c0f600ff0f0c6faec688dacaea2d6932328ee3ba272c1332d751f672c2ef9" Apr 25 00:13:08.607793 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.607738 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"354c0f600ff0f0c6faec688dacaea2d6932328ee3ba272c1332d751f672c2ef9"} err="failed to get container status \"354c0f600ff0f0c6faec688dacaea2d6932328ee3ba272c1332d751f672c2ef9\": rpc error: code = NotFound desc = could not find container \"354c0f600ff0f0c6faec688dacaea2d6932328ee3ba272c1332d751f672c2ef9\": container with ID starting with 354c0f600ff0f0c6faec688dacaea2d6932328ee3ba272c1332d751f672c2ef9 not found: ID does not exist" Apr 25 00:13:08.607793 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.607753 2578 scope.go:117] "RemoveContainer" containerID="358738c5e173297603868a9a9c8c99e4d0b5fa09ed6b314fc7e83af7a06ee692" Apr 25 00:13:08.611627 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.611604 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk"] Apr 25 00:13:08.615237 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.615214 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c9f9f-predictor-8b8564854-4sftk"] Apr 25 00:13:08.617039 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.617022 2578 scope.go:117] "RemoveContainer" containerID="b9a33f99bd39df47029a8e6c8b7d85c3568b0adae0b52528351f6d8b59d68803" Apr 25 00:13:08.624706 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.624684 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf"] Apr 25 00:13:08.624848 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.624784 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc" Apr 25 00:13:08.624891 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.624859 2578 scope.go:117] "RemoveContainer" containerID="358738c5e173297603868a9a9c8c99e4d0b5fa09ed6b314fc7e83af7a06ee692" Apr 25 00:13:08.625096 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:13:08.625080 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"358738c5e173297603868a9a9c8c99e4d0b5fa09ed6b314fc7e83af7a06ee692\": container with ID starting with 358738c5e173297603868a9a9c8c99e4d0b5fa09ed6b314fc7e83af7a06ee692 not found: ID does not exist" containerID="358738c5e173297603868a9a9c8c99e4d0b5fa09ed6b314fc7e83af7a06ee692" Apr 25 00:13:08.625153 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.625103 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"358738c5e173297603868a9a9c8c99e4d0b5fa09ed6b314fc7e83af7a06ee692"} err="failed to get container status \"358738c5e173297603868a9a9c8c99e4d0b5fa09ed6b314fc7e83af7a06ee692\": rpc error: code = NotFound desc = could not find container \"358738c5e173297603868a9a9c8c99e4d0b5fa09ed6b314fc7e83af7a06ee692\": container with ID starting with 358738c5e173297603868a9a9c8c99e4d0b5fa09ed6b314fc7e83af7a06ee692 not found: ID does not exist" Apr 25 00:13:08.625153 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.625120 2578 scope.go:117] "RemoveContainer" containerID="b9a33f99bd39df47029a8e6c8b7d85c3568b0adae0b52528351f6d8b59d68803" Apr 25 00:13:08.625326 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:13:08.625310 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9a33f99bd39df47029a8e6c8b7d85c3568b0adae0b52528351f6d8b59d68803\": container with ID starting with b9a33f99bd39df47029a8e6c8b7d85c3568b0adae0b52528351f6d8b59d68803 not found: ID does not exist" containerID="b9a33f99bd39df47029a8e6c8b7d85c3568b0adae0b52528351f6d8b59d68803" Apr 25 00:13:08.625371 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.625333 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9a33f99bd39df47029a8e6c8b7d85c3568b0adae0b52528351f6d8b59d68803"} err="failed to get container status \"b9a33f99bd39df47029a8e6c8b7d85c3568b0adae0b52528351f6d8b59d68803\": rpc error: code = NotFound desc = could not find container \"b9a33f99bd39df47029a8e6c8b7d85c3568b0adae0b52528351f6d8b59d68803\": container with ID starting with b9a33f99bd39df47029a8e6c8b7d85c3568b0adae0b52528351f6d8b59d68803 not found: ID does not exist" Apr 25 00:13:08.628104 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.628084 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c9f9f-predictor-75597dd9f7-czccf"] Apr 25 00:13:08.759321 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.759299 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5aaf2d5-bec8-4036-9be1-94e6fafc9910-proxy-tls\") pod \"d5aaf2d5-bec8-4036-9be1-94e6fafc9910\" (UID: \"d5aaf2d5-bec8-4036-9be1-94e6fafc9910\") " Apr 25 00:13:08.759494 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.759390 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5aaf2d5-bec8-4036-9be1-94e6fafc9910-openshift-service-ca-bundle\") pod 
\"d5aaf2d5-bec8-4036-9be1-94e6fafc9910\" (UID: \"d5aaf2d5-bec8-4036-9be1-94e6fafc9910\") " Apr 25 00:13:08.759764 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.759736 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5aaf2d5-bec8-4036-9be1-94e6fafc9910-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "d5aaf2d5-bec8-4036-9be1-94e6fafc9910" (UID: "d5aaf2d5-bec8-4036-9be1-94e6fafc9910"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:13:08.761436 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.761401 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5aaf2d5-bec8-4036-9be1-94e6fafc9910-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d5aaf2d5-bec8-4036-9be1-94e6fafc9910" (UID: "d5aaf2d5-bec8-4036-9be1-94e6fafc9910"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:13:08.859977 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.859949 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5aaf2d5-bec8-4036-9be1-94e6fafc9910-openshift-service-ca-bundle\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:13:08.859977 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:08.859975 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5aaf2d5-bec8-4036-9be1-94e6fafc9910-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:13:09.589558 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:09.589524 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc" event={"ID":"d5aaf2d5-bec8-4036-9be1-94e6fafc9910","Type":"ContainerDied","Data":"8bce6038206526d1a115c2fea9e66f5f3f437bdcfa1f6a574a81c2dec6b5ebaf"} Apr 25 00:13:09.590093 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:09.590072 2578 scope.go:117] "RemoveContainer" containerID="e2435470ddd87807dbfdda1d236d69ee4a69e18e3f30a1a9a194d4d1867d85af" Apr 25 00:13:09.590259 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:09.589533 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc" Apr 25 00:13:09.610828 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:09.610804 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc"] Apr 25 00:13:09.614374 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:09.614353 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-d627c-66c94ccd96-ml4rc"] Apr 25 00:13:09.912522 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:09.912438 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a71639bf-f015-4465-933c-b9a7b152f57d" path="/var/lib/kubelet/pods/a71639bf-f015-4465-933c-b9a7b152f57d/volumes" Apr 25 00:13:09.913009 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:09.912980 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5aaf2d5-bec8-4036-9be1-94e6fafc9910" path="/var/lib/kubelet/pods/d5aaf2d5-bec8-4036-9be1-94e6fafc9910/volumes" Apr 25 00:13:09.913381 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:09.913369 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaa96e56-ae3f-4672-b36f-2043818f851d" path="/var/lib/kubelet/pods/eaa96e56-ae3f-4672-b36f-2043818f851d/volumes" Apr 25 00:13:11.534323 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:11.534283 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s" podUID="51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3" containerName="sequence-graph-c9f9f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:13:12.579014 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:12.578990 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" Apr 25 00:13:12.579572 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:12.579545 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" podUID="698b477f-a5ba-428e-bac0-96f0f0ee89fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 25 00:13:13.589171 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:13.589137 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" Apr 25 00:13:13.589785 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:13.589755 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" podUID="ccf9b311-61f4-49c6-b521-0cc24798e111" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 25 00:13:16.482346 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:16.482304 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf" podUID="3a07e833-145e-41dc-bec3-c09231467d16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 25 00:13:16.534455 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:16.534392 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s" podUID="51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3" containerName="sequence-graph-c9f9f" probeResult="failure" output="HTTP 
probe failed with statuscode: 503" Apr 25 00:13:16.534598 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:16.534530 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s" Apr 25 00:13:17.488687 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:17.488635 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc" podUID="3043916b-aef2-46c0-b90a-f63cfec5c4e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 25 00:13:21.534050 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:21.533999 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s" podUID="51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3" containerName="sequence-graph-c9f9f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:13:22.579665 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:22.579628 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" podUID="698b477f-a5ba-428e-bac0-96f0f0ee89fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 25 00:13:23.589809 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:23.589762 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" podUID="ccf9b311-61f4-49c6-b521-0cc24798e111" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 25 00:13:26.482718 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:26.482659 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf" podUID="3a07e833-145e-41dc-bec3-c09231467d16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 25 00:13:26.534065 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:26.534024 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s" podUID="51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3" containerName="sequence-graph-c9f9f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:13:27.488379 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:27.488334 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc" podUID="3043916b-aef2-46c0-b90a-f63cfec5c4e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 25 00:13:31.533957 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:31.533903 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s" podUID="51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3" containerName="sequence-graph-c9f9f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:13:32.580441 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:32.580382 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" podUID="698b477f-a5ba-428e-bac0-96f0f0ee89fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: 
connect: connection refused" Apr 25 00:13:33.590648 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:33.590608 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" podUID="ccf9b311-61f4-49c6-b521-0cc24798e111" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 25 00:13:34.576811 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:34.576790 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s" Apr 25 00:13:34.676408 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:34.676333 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3-proxy-tls\") pod \"51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3\" (UID: \"51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3\") " Apr 25 00:13:34.676408 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:34.676373 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3-openshift-service-ca-bundle\") pod \"51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3\" (UID: \"51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3\") " Apr 25 00:13:34.676847 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:34.676772 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3" (UID: "51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:13:34.678694 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:34.678668 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3" (UID: "51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:13:34.680527 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:34.680498 2578 generic.go:358] "Generic (PLEG): container finished" podID="51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3" containerID="37c76e5936438c1184b1105baf7447f0fd6a37c523f4fc01a14492bc7b31aa7d" exitCode=0 Apr 25 00:13:34.680638 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:34.680565 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s" Apr 25 00:13:34.680638 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:34.680583 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s" event={"ID":"51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3","Type":"ContainerDied","Data":"37c76e5936438c1184b1105baf7447f0fd6a37c523f4fc01a14492bc7b31aa7d"} Apr 25 00:13:34.680638 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:34.680631 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s" event={"ID":"51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3","Type":"ContainerDied","Data":"4de1198269835ee5f3b79b219b93671b251b67b89eb9d5f264c5d96972504d76"} Apr 25 00:13:34.680789 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:34.680651 2578 scope.go:117] "RemoveContainer" containerID="37c76e5936438c1184b1105baf7447f0fd6a37c523f4fc01a14492bc7b31aa7d" Apr 25 00:13:34.689385 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:34.689368 2578 scope.go:117] "RemoveContainer" containerID="37c76e5936438c1184b1105baf7447f0fd6a37c523f4fc01a14492bc7b31aa7d" Apr 25 00:13:34.689700 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:13:34.689676 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37c76e5936438c1184b1105baf7447f0fd6a37c523f4fc01a14492bc7b31aa7d\": container with ID starting with 37c76e5936438c1184b1105baf7447f0fd6a37c523f4fc01a14492bc7b31aa7d not found: ID does not exist" containerID="37c76e5936438c1184b1105baf7447f0fd6a37c523f4fc01a14492bc7b31aa7d" Apr 25 00:13:34.689809 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:34.689707 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37c76e5936438c1184b1105baf7447f0fd6a37c523f4fc01a14492bc7b31aa7d"} err="failed to get container status \"37c76e5936438c1184b1105baf7447f0fd6a37c523f4fc01a14492bc7b31aa7d\": rpc error: code = NotFound desc = could not find container \"37c76e5936438c1184b1105baf7447f0fd6a37c523f4fc01a14492bc7b31aa7d\": container with ID starting with 37c76e5936438c1184b1105baf7447f0fd6a37c523f4fc01a14492bc7b31aa7d not found: ID does not exist" Apr 25 00:13:34.703461 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:34.703439 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s"] Apr 25 00:13:34.709404 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:34.709378 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c9f9f-5dd8ccddf4-44z4s"] Apr 25 00:13:34.777848 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:34.777825 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:13:34.777848 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:34.777849 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3-openshift-service-ca-bundle\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:13:35.911517 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:35.911479 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3" 
path="/var/lib/kubelet/pods/51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3/volumes" Apr 25 00:13:36.482615 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:36.482583 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf" Apr 25 00:13:37.489240 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:37.489208 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc" Apr 25 00:13:37.903999 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:37.903971 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/0.log" Apr 25 00:13:37.906194 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:37.906174 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/0.log" Apr 25 00:13:42.579709 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:42.579670 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" podUID="698b477f-a5ba-428e-bac0-96f0f0ee89fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 25 00:13:43.590061 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:43.590007 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" podUID="ccf9b311-61f4-49c6-b521-0cc24798e111" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 25 00:13:48.675651 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.675611 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9"] Apr 25 00:13:48.676122 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.676103 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a71639bf-f015-4465-933c-b9a7b152f57d" containerName="kserve-container" Apr 25 00:13:48.676207 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.676126 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a71639bf-f015-4465-933c-b9a7b152f57d" containerName="kserve-container" Apr 25 00:13:48.676207 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.676150 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3" containerName="sequence-graph-c9f9f" Apr 25 00:13:48.676207 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.676159 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3" containerName="sequence-graph-c9f9f" Apr 25 00:13:48.676207 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.676172 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eaa96e56-ae3f-4672-b36f-2043818f851d" containerName="kube-rbac-proxy" Apr 25 00:13:48.676207 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.676181 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaa96e56-ae3f-4672-b36f-2043818f851d" containerName="kube-rbac-proxy" Apr 25 00:13:48.676207 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.676195 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a71639bf-f015-4465-933c-b9a7b152f57d" 
containerName="kube-rbac-proxy" Apr 25 00:13:48.676207 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.676203 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a71639bf-f015-4465-933c-b9a7b152f57d" containerName="kube-rbac-proxy" Apr 25 00:13:48.676629 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.676214 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eaa96e56-ae3f-4672-b36f-2043818f851d" containerName="kserve-container" Apr 25 00:13:48.676629 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.676231 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaa96e56-ae3f-4672-b36f-2043818f851d" containerName="kserve-container" Apr 25 00:13:48.676629 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.676250 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5aaf2d5-bec8-4036-9be1-94e6fafc9910" containerName="ensemble-graph-d627c" Apr 25 00:13:48.676629 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.676258 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5aaf2d5-bec8-4036-9be1-94e6fafc9910" containerName="ensemble-graph-d627c" Apr 25 00:13:48.676629 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.676336 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="eaa96e56-ae3f-4672-b36f-2043818f851d" containerName="kserve-container" Apr 25 00:13:48.676629 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.676352 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a71639bf-f015-4465-933c-b9a7b152f57d" containerName="kserve-container" Apr 25 00:13:48.676629 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.676362 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d5aaf2d5-bec8-4036-9be1-94e6fafc9910" containerName="ensemble-graph-d627c" Apr 25 00:13:48.676629 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.676375 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a71639bf-f015-4465-933c-b9a7b152f57d" containerName="kube-rbac-proxy" Apr 25 00:13:48.676629 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.676383 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="51ff0f0e-8ec3-46d3-b0f7-2431460ff6d3" containerName="sequence-graph-c9f9f" Apr 25 00:13:48.676629 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.676392 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="eaa96e56-ae3f-4672-b36f-2043818f851d" containerName="kube-rbac-proxy" Apr 25 00:13:48.681652 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.681632 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" Apr 25 00:13:48.683721 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.683696 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-ac6c8-serving-cert\"" Apr 25 00:13:48.683859 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.683701 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-ac6c8-kube-rbac-proxy-sar-config\"" Apr 25 00:13:48.687996 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.687931 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9"] Apr 25 00:13:48.704640 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.704602 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0aad686e-898e-4b26-87e0-3dff2ad29a46-openshift-service-ca-bundle\") pod \"ensemble-graph-ac6c8-87cf995b8-z6wk9\" (UID: \"0aad686e-898e-4b26-87e0-3dff2ad29a46\") " pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" Apr 25 00:13:48.704768 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.704681 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0aad686e-898e-4b26-87e0-3dff2ad29a46-proxy-tls\") pod \"ensemble-graph-ac6c8-87cf995b8-z6wk9\" (UID: \"0aad686e-898e-4b26-87e0-3dff2ad29a46\") " pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" Apr 25 00:13:48.805574 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.805541 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0aad686e-898e-4b26-87e0-3dff2ad29a46-proxy-tls\") pod \"ensemble-graph-ac6c8-87cf995b8-z6wk9\" (UID: \"0aad686e-898e-4b26-87e0-3dff2ad29a46\") " pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" Apr 25 00:13:48.805767 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.805617 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0aad686e-898e-4b26-87e0-3dff2ad29a46-openshift-service-ca-bundle\") pod \"ensemble-graph-ac6c8-87cf995b8-z6wk9\" (UID: \"0aad686e-898e-4b26-87e0-3dff2ad29a46\") " pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" Apr 25 00:13:48.806187 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.806168 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0aad686e-898e-4b26-87e0-3dff2ad29a46-openshift-service-ca-bundle\") pod \"ensemble-graph-ac6c8-87cf995b8-z6wk9\" (UID: \"0aad686e-898e-4b26-87e0-3dff2ad29a46\") " pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" Apr 25 00:13:48.808167 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.808144 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0aad686e-898e-4b26-87e0-3dff2ad29a46-proxy-tls\") pod \"ensemble-graph-ac6c8-87cf995b8-z6wk9\" (UID: \"0aad686e-898e-4b26-87e0-3dff2ad29a46\") " pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" Apr 25 00:13:48.993335 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:48.993253 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" Apr 25 00:13:49.113952 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:49.113918 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9"] Apr 25 00:13:49.117227 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:13:49.117203 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0aad686e_898e_4b26_87e0_3dff2ad29a46.slice/crio-e8e212462f80da9ec2f1a452fa10a2f33199ba280ffa2ca33e4ee4e0a79bc6d4 WatchSource:0}: Error finding container e8e212462f80da9ec2f1a452fa10a2f33199ba280ffa2ca33e4ee4e0a79bc6d4: Status 404 returned error can't find the container with id e8e212462f80da9ec2f1a452fa10a2f33199ba280ffa2ca33e4ee4e0a79bc6d4 Apr 25 00:13:49.729915 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:49.729877 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" event={"ID":"0aad686e-898e-4b26-87e0-3dff2ad29a46","Type":"ContainerStarted","Data":"2d3d210739a247da851053276b7f8de178d966eb8c34bd99939ddab2659d5345"} Apr 25 00:13:49.729915 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:49.729914 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" event={"ID":"0aad686e-898e-4b26-87e0-3dff2ad29a46","Type":"ContainerStarted","Data":"e8e212462f80da9ec2f1a452fa10a2f33199ba280ffa2ca33e4ee4e0a79bc6d4"} Apr 25 00:13:49.730337 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:49.730135 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" Apr 25 00:13:49.745569 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:49.745531 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" podStartSLOduration=1.7455194330000001 podStartE2EDuration="1.745519433s" podCreationTimestamp="2026-04-25 00:13:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:13:49.744631094 +0000 UTC m=+1212.378613052" watchObservedRunningTime="2026-04-25 00:13:49.745519433 +0000 UTC m=+1212.379501411" Apr 25 00:13:52.580122 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:52.580094 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" Apr 25 00:13:53.590525 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:53.590496 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" Apr 25 00:13:55.740166 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:13:55.740132 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" Apr 25 00:14:04.592751 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:14:04.592715 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g"] Apr 25 00:14:04.596145 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:14:04.596129 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" Apr 25 00:14:04.598340 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:14:04.598314 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-e1f95-kube-rbac-proxy-sar-config\"" Apr 25 00:14:04.598485 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:14:04.598319 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-e1f95-serving-cert\"" Apr 25 00:14:04.605284 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:14:04.605260 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g"] Apr 25 00:14:04.753253 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:14:04.753218 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce-proxy-tls\") pod \"sequence-graph-e1f95-6b4654b7f9-bt48g\" (UID: \"c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce\") " pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" Apr 25 00:14:04.753448 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:14:04.753306 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce-openshift-service-ca-bundle\") pod \"sequence-graph-e1f95-6b4654b7f9-bt48g\" (UID: \"c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce\") " pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" Apr 25 00:14:04.854571 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:14:04.854485 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce-proxy-tls\") pod \"sequence-graph-e1f95-6b4654b7f9-bt48g\" (UID: \"c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce\") " pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" Apr 25 00:14:04.854726 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:14:04.854592 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce-openshift-service-ca-bundle\") pod \"sequence-graph-e1f95-6b4654b7f9-bt48g\" (UID: \"c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce\") " pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" Apr 25 00:14:04.855323 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:14:04.855259 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce-openshift-service-ca-bundle\") pod \"sequence-graph-e1f95-6b4654b7f9-bt48g\" (UID: \"c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce\") " pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" Apr 25 00:14:04.857142 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:14:04.857118 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce-proxy-tls\") pod \"sequence-graph-e1f95-6b4654b7f9-bt48g\" (UID: \"c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce\") " pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" Apr 25 00:14:04.907346 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:14:04.907312 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" Apr 25 00:14:05.028818 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:14:05.028792 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g"] Apr 25 00:14:05.030981 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:14:05.030946 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6cb4fdc_9ffd_4291_89f5_c4ff8dc483ce.slice/crio-1d37ff0bc6f95cd6404d0f862fa684ef232ad807e790d22d45df3f26bb9e1c20 WatchSource:0}: Error finding container 1d37ff0bc6f95cd6404d0f862fa684ef232ad807e790d22d45df3f26bb9e1c20: Status 404 returned error can't find the container with id 1d37ff0bc6f95cd6404d0f862fa684ef232ad807e790d22d45df3f26bb9e1c20 Apr 25 00:14:05.779194 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:14:05.779157 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" event={"ID":"c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce","Type":"ContainerStarted","Data":"7d8c7a68ecaecb560cdcc65354abd36181876c3b9763e78d0cf2907dd572beec"} Apr 25 00:14:05.779194 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:14:05.779196 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" event={"ID":"c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce","Type":"ContainerStarted","Data":"1d37ff0bc6f95cd6404d0f862fa684ef232ad807e790d22d45df3f26bb9e1c20"} Apr 25 00:14:05.779628 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:14:05.779286 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" Apr 25 00:14:05.798118 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:14:05.798058 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" podStartSLOduration=1.798037183 podStartE2EDuration="1.798037183s" podCreationTimestamp="2026-04-25 00:14:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:14:05.795354799 +0000 UTC m=+1228.429336746" watchObservedRunningTime="2026-04-25 00:14:05.798037183 +0000 UTC m=+1228.432019122" Apr 25 00:14:11.788199 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:14:11.788167 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" Apr 25 00:18:37.929019 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:18:37.928987 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/0.log" Apr 25 00:18:37.931874 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:18:37.931852 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/0.log" Apr 25 00:22:03.221282 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.221207 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9"] Apr 25 00:22:03.223745 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.221475 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" podUID="0aad686e-898e-4b26-87e0-3dff2ad29a46" 
containerName="ensemble-graph-ac6c8" containerID="cri-o://2d3d210739a247da851053276b7f8de178d966eb8c34bd99939ddab2659d5345" gracePeriod=30 Apr 25 00:22:03.322041 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.322006 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc"] Apr 25 00:22:03.322639 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.322305 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc" podUID="3043916b-aef2-46c0-b90a-f63cfec5c4e1" containerName="kserve-container" containerID="cri-o://fd27b8134b6511bc61b98a2731afe713e6a6e4a8282c597cca4fb3e4d2d6e7aa" gracePeriod=30 Apr 25 00:22:03.322639 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.322338 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc" podUID="3043916b-aef2-46c0-b90a-f63cfec5c4e1" containerName="kube-rbac-proxy" containerID="cri-o://77bf0be5beb38dbcd1196700152ace65684a8e839a8df1bb0c9f08bcea679162" gracePeriod=30 Apr 25 00:22:03.384922 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.384887 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf"] Apr 25 00:22:03.385200 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.385163 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf" podUID="3a07e833-145e-41dc-bec3-c09231467d16" containerName="kserve-container" containerID="cri-o://e0e20dadf1b35bb1666d3394eed92e4f6c1b15079ee3d33656ead05235f82722" gracePeriod=30 Apr 25 00:22:03.385271 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.385215 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf" podUID="3a07e833-145e-41dc-bec3-c09231467d16" containerName="kube-rbac-proxy" containerID="cri-o://30014a48c33fa8753ebbf40ae1644b3dd09bf791003add40c14ae55441331b77" gracePeriod=30 Apr 25 00:22:03.407572 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.407547 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf"] Apr 25 00:22:03.411011 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.410982 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" Apr 25 00:22:03.413178 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.413147 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-7621b-predictor-serving-cert\"" Apr 25 00:22:03.413178 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.413163 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-7621b-kube-rbac-proxy-sar-config\"" Apr 25 00:22:03.428095 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.428068 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf"] Apr 25 00:22:03.466430 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.466386 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt"] Apr 25 00:22:03.469748 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.469728 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" Apr 25 00:22:03.471850 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.471794 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-7621b-kube-rbac-proxy-sar-config\"" Apr 25 00:22:03.471850 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.471799 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-7621b-predictor-serving-cert\"" Apr 25 00:22:03.476791 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.476752 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt"] Apr 25 00:22:03.552148 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.552117 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzb59\" (UniqueName: \"kubernetes.io/projected/e5fb70f9-c86b-41ae-a60b-60c0574f3ccf-kube-api-access-fzb59\") pod \"success-200-isvc-7621b-predictor-5f89d47788-7mplf\" (UID: \"e5fb70f9-c86b-41ae-a60b-60c0574f3ccf\") " pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" Apr 25 00:22:03.552315 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.552173 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5fb70f9-c86b-41ae-a60b-60c0574f3ccf-proxy-tls\") pod \"success-200-isvc-7621b-predictor-5f89d47788-7mplf\" (UID: \"e5fb70f9-c86b-41ae-a60b-60c0574f3ccf\") " pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" Apr 25 00:22:03.552315 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.552214 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-7621b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e5fb70f9-c86b-41ae-a60b-60c0574f3ccf-success-200-isvc-7621b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-7621b-predictor-5f89d47788-7mplf\" (UID: \"e5fb70f9-c86b-41ae-a60b-60c0574f3ccf\") " pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" Apr 25 00:22:03.652957 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.652925 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b73d0fa-8973-4fa6-9893-f71b996c8fe0-proxy-tls\") pod \"error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt\" (UID: \"7b73d0fa-8973-4fa6-9893-f71b996c8fe0\") " pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" Apr 25 00:22:03.653160 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.652992 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-7621b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e5fb70f9-c86b-41ae-a60b-60c0574f3ccf-success-200-isvc-7621b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-7621b-predictor-5f89d47788-7mplf\" (UID: \"e5fb70f9-c86b-41ae-a60b-60c0574f3ccf\") " pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" Apr 25 00:22:03.653160 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.653045 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-7621b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7b73d0fa-8973-4fa6-9893-f71b996c8fe0-error-404-isvc-7621b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt\" (UID: \"7b73d0fa-8973-4fa6-9893-f71b996c8fe0\") " pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" Apr 25 00:22:03.653160 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.653109 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fzb59\" (UniqueName: \"kubernetes.io/projected/e5fb70f9-c86b-41ae-a60b-60c0574f3ccf-kube-api-access-fzb59\") pod \"success-200-isvc-7621b-predictor-5f89d47788-7mplf\" (UID: \"e5fb70f9-c86b-41ae-a60b-60c0574f3ccf\") " pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" Apr 25 00:22:03.653160 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.653145 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6p58\" (UniqueName: \"kubernetes.io/projected/7b73d0fa-8973-4fa6-9893-f71b996c8fe0-kube-api-access-x6p58\") pod \"error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt\" (UID: \"7b73d0fa-8973-4fa6-9893-f71b996c8fe0\") " pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" Apr 25 00:22:03.653337 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.653186 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5fb70f9-c86b-41ae-a60b-60c0574f3ccf-proxy-tls\") pod \"success-200-isvc-7621b-predictor-5f89d47788-7mplf\" (UID: \"e5fb70f9-c86b-41ae-a60b-60c0574f3ccf\") " pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" Apr 25 00:22:03.654067 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.653992 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-7621b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e5fb70f9-c86b-41ae-a60b-60c0574f3ccf-success-200-isvc-7621b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-7621b-predictor-5f89d47788-7mplf\" (UID: \"e5fb70f9-c86b-41ae-a60b-60c0574f3ccf\") " pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" Apr 25 00:22:03.655785 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.655765 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/e5fb70f9-c86b-41ae-a60b-60c0574f3ccf-proxy-tls\") pod \"success-200-isvc-7621b-predictor-5f89d47788-7mplf\" (UID: \"e5fb70f9-c86b-41ae-a60b-60c0574f3ccf\") " pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" Apr 25 00:22:03.660682 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.660661 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzb59\" (UniqueName: \"kubernetes.io/projected/e5fb70f9-c86b-41ae-a60b-60c0574f3ccf-kube-api-access-fzb59\") pod \"success-200-isvc-7621b-predictor-5f89d47788-7mplf\" (UID: \"e5fb70f9-c86b-41ae-a60b-60c0574f3ccf\") " pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" Apr 25 00:22:03.724587 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.724504 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" Apr 25 00:22:03.754556 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.754528 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6p58\" (UniqueName: \"kubernetes.io/projected/7b73d0fa-8973-4fa6-9893-f71b996c8fe0-kube-api-access-x6p58\") pod \"error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt\" (UID: \"7b73d0fa-8973-4fa6-9893-f71b996c8fe0\") " pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" Apr 25 00:22:03.754708 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.754596 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b73d0fa-8973-4fa6-9893-f71b996c8fe0-proxy-tls\") pod \"error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt\" (UID: \"7b73d0fa-8973-4fa6-9893-f71b996c8fe0\") " pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" Apr 25 00:22:03.754708 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.754646 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-7621b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7b73d0fa-8973-4fa6-9893-f71b996c8fe0-error-404-isvc-7621b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt\" (UID: \"7b73d0fa-8973-4fa6-9893-f71b996c8fe0\") " pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" Apr 25 00:22:03.755288 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.755256 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-7621b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7b73d0fa-8973-4fa6-9893-f71b996c8fe0-error-404-isvc-7621b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt\" (UID: \"7b73d0fa-8973-4fa6-9893-f71b996c8fe0\") " pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" Apr 25 00:22:03.757232 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.757209 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b73d0fa-8973-4fa6-9893-f71b996c8fe0-proxy-tls\") pod \"error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt\" (UID: \"7b73d0fa-8973-4fa6-9893-f71b996c8fe0\") " pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" Apr 25 00:22:03.762917 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.762892 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6p58\" (UniqueName: 
\"kubernetes.io/projected/7b73d0fa-8973-4fa6-9893-f71b996c8fe0-kube-api-access-x6p58\") pod \"error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt\" (UID: \"7b73d0fa-8973-4fa6-9893-f71b996c8fe0\") " pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" Apr 25 00:22:03.780765 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.780717 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" Apr 25 00:22:03.855942 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.855917 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf"] Apr 25 00:22:03.858308 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:22:03.858276 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5fb70f9_c86b_41ae_a60b_60c0574f3ccf.slice/crio-0524f667cfff384027aa2fd418a9fb68bbd1ffb3ed1080a820158641e78c56f1 WatchSource:0}: Error finding container 0524f667cfff384027aa2fd418a9fb68bbd1ffb3ed1080a820158641e78c56f1: Status 404 returned error can't find the container with id 0524f667cfff384027aa2fd418a9fb68bbd1ffb3ed1080a820158641e78c56f1 Apr 25 00:22:03.860283 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.860269 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:22:03.917544 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:03.915881 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt"] Apr 25 00:22:03.920086 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:22:03.920054 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b73d0fa_8973_4fa6_9893_f71b996c8fe0.slice/crio-6853df002fa621eeb354658a32ea9a9629fddfd956fcc63e32a52d2ee2a70b74 WatchSource:0}: Error finding container 6853df002fa621eeb354658a32ea9a9629fddfd956fcc63e32a52d2ee2a70b74: Status 404 returned error can't find the container with id 6853df002fa621eeb354658a32ea9a9629fddfd956fcc63e32a52d2ee2a70b74 Apr 25 00:22:04.225350 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:04.225313 2578 generic.go:358] "Generic (PLEG): container finished" podID="3a07e833-145e-41dc-bec3-c09231467d16" containerID="30014a48c33fa8753ebbf40ae1644b3dd09bf791003add40c14ae55441331b77" exitCode=2 Apr 25 00:22:04.225841 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:04.225379 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf" event={"ID":"3a07e833-145e-41dc-bec3-c09231467d16","Type":"ContainerDied","Data":"30014a48c33fa8753ebbf40ae1644b3dd09bf791003add40c14ae55441331b77"} Apr 25 00:22:04.226878 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:04.226858 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" event={"ID":"7b73d0fa-8973-4fa6-9893-f71b996c8fe0","Type":"ContainerStarted","Data":"fc9277bfe0796ed395fefbf74565aa6e8a7cf328794c5a1ac24c975062e8ff0e"} Apr 25 00:22:04.226999 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:04.226885 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" 
event={"ID":"7b73d0fa-8973-4fa6-9893-f71b996c8fe0","Type":"ContainerStarted","Data":"40dedf9164970811c38cedead00e978199217561127f6f0d719d4cdf932520b8"} Apr 25 00:22:04.226999 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:04.226899 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" event={"ID":"7b73d0fa-8973-4fa6-9893-f71b996c8fe0","Type":"ContainerStarted","Data":"6853df002fa621eeb354658a32ea9a9629fddfd956fcc63e32a52d2ee2a70b74"} Apr 25 00:22:04.227102 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:04.227086 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" Apr 25 00:22:04.228503 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:04.228480 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" event={"ID":"e5fb70f9-c86b-41ae-a60b-60c0574f3ccf","Type":"ContainerStarted","Data":"33922d3d86674608630d8aec3fe8b685535da0b37c0374cdc9a6dc2e3b60b4a4"} Apr 25 00:22:04.228615 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:04.228506 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" event={"ID":"e5fb70f9-c86b-41ae-a60b-60c0574f3ccf","Type":"ContainerStarted","Data":"e354d4a29e23fcbbb847fe6cb8cf9c74d5afcc34cb00e1f6ef5bf514dc1428e3"} Apr 25 00:22:04.228615 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:04.228516 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" event={"ID":"e5fb70f9-c86b-41ae-a60b-60c0574f3ccf","Type":"ContainerStarted","Data":"0524f667cfff384027aa2fd418a9fb68bbd1ffb3ed1080a820158641e78c56f1"} Apr 25 00:22:04.228732 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:04.228690 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" Apr 25 00:22:04.228732 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:04.228707 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" Apr 25 00:22:04.229953 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:04.229926 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" podUID="e5fb70f9-c86b-41ae-a60b-60c0574f3ccf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 25 00:22:04.230154 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:04.230134 2578 generic.go:358] "Generic (PLEG): container finished" podID="3043916b-aef2-46c0-b90a-f63cfec5c4e1" containerID="77bf0be5beb38dbcd1196700152ace65684a8e839a8df1bb0c9f08bcea679162" exitCode=2 Apr 25 00:22:04.230235 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:04.230188 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc" event={"ID":"3043916b-aef2-46c0-b90a-f63cfec5c4e1","Type":"ContainerDied","Data":"77bf0be5beb38dbcd1196700152ace65684a8e839a8df1bb0c9f08bcea679162"} Apr 25 00:22:04.243812 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:04.243774 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" 
podStartSLOduration=1.243763804 podStartE2EDuration="1.243763804s" podCreationTimestamp="2026-04-25 00:22:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:22:04.242361667 +0000 UTC m=+1706.876343625" watchObservedRunningTime="2026-04-25 00:22:04.243763804 +0000 UTC m=+1706.877745817" Apr 25 00:22:04.258536 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:04.258492 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" podStartSLOduration=1.258479463 podStartE2EDuration="1.258479463s" podCreationTimestamp="2026-04-25 00:22:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:22:04.257052382 +0000 UTC m=+1706.891034317" watchObservedRunningTime="2026-04-25 00:22:04.258479463 +0000 UTC m=+1706.892461398" Apr 25 00:22:05.233565 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:05.233525 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" Apr 25 00:22:05.233565 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:05.233549 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" podUID="e5fb70f9-c86b-41ae-a60b-60c0574f3ccf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 25 00:22:05.234832 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:05.234805 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" podUID="7b73d0fa-8973-4fa6-9893-f71b996c8fe0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 25 00:22:05.737579 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:05.737540 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" podUID="0aad686e-898e-4b26-87e0-3dff2ad29a46" containerName="ensemble-graph-ac6c8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:22:06.236527 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:06.236481 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" podUID="7b73d0fa-8973-4fa6-9893-f71b996c8fe0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 25 00:22:06.478274 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:06.478229 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf" podUID="3a07e833-145e-41dc-bec3-c09231467d16" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.36:8643/healthz\": dial tcp 10.133.0.36:8643: connect: connection refused" Apr 25 00:22:06.482733 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:06.482694 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf" podUID="3a07e833-145e-41dc-bec3-c09231467d16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 25 00:22:06.771211 ip-10-0-129-98 
kubenswrapper[2578]: I0425 00:22:06.771187 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc" Apr 25 00:22:06.835854 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:06.835825 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf" Apr 25 00:22:06.882448 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:06.882342 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-ac6c8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3043916b-aef2-46c0-b90a-f63cfec5c4e1-success-200-isvc-ac6c8-kube-rbac-proxy-sar-config\") pod \"3043916b-aef2-46c0-b90a-f63cfec5c4e1\" (UID: \"3043916b-aef2-46c0-b90a-f63cfec5c4e1\") " Apr 25 00:22:06.882613 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:06.882446 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3043916b-aef2-46c0-b90a-f63cfec5c4e1-proxy-tls\") pod \"3043916b-aef2-46c0-b90a-f63cfec5c4e1\" (UID: \"3043916b-aef2-46c0-b90a-f63cfec5c4e1\") " Apr 25 00:22:06.882613 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:06.882529 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx5nk\" (UniqueName: \"kubernetes.io/projected/3043916b-aef2-46c0-b90a-f63cfec5c4e1-kube-api-access-dx5nk\") pod \"3043916b-aef2-46c0-b90a-f63cfec5c4e1\" (UID: \"3043916b-aef2-46c0-b90a-f63cfec5c4e1\") " Apr 25 00:22:06.882786 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:06.882758 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3043916b-aef2-46c0-b90a-f63cfec5c4e1-success-200-isvc-ac6c8-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-ac6c8-kube-rbac-proxy-sar-config") pod "3043916b-aef2-46c0-b90a-f63cfec5c4e1" (UID: "3043916b-aef2-46c0-b90a-f63cfec5c4e1"). InnerVolumeSpecName "success-200-isvc-ac6c8-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:22:06.884764 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:06.884736 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3043916b-aef2-46c0-b90a-f63cfec5c4e1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3043916b-aef2-46c0-b90a-f63cfec5c4e1" (UID: "3043916b-aef2-46c0-b90a-f63cfec5c4e1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:22:06.884871 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:06.884829 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3043916b-aef2-46c0-b90a-f63cfec5c4e1-kube-api-access-dx5nk" (OuterVolumeSpecName: "kube-api-access-dx5nk") pod "3043916b-aef2-46c0-b90a-f63cfec5c4e1" (UID: "3043916b-aef2-46c0-b90a-f63cfec5c4e1"). InnerVolumeSpecName "kube-api-access-dx5nk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:22:06.983047 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:06.983010 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sml2h\" (UniqueName: \"kubernetes.io/projected/3a07e833-145e-41dc-bec3-c09231467d16-kube-api-access-sml2h\") pod \"3a07e833-145e-41dc-bec3-c09231467d16\" (UID: \"3a07e833-145e-41dc-bec3-c09231467d16\") " Apr 25 00:22:06.983226 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:06.983131 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-ac6c8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3a07e833-145e-41dc-bec3-c09231467d16-error-404-isvc-ac6c8-kube-rbac-proxy-sar-config\") pod \"3a07e833-145e-41dc-bec3-c09231467d16\" (UID: \"3a07e833-145e-41dc-bec3-c09231467d16\") " Apr 25 00:22:06.983226 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:06.983167 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a07e833-145e-41dc-bec3-c09231467d16-proxy-tls\") pod \"3a07e833-145e-41dc-bec3-c09231467d16\" (UID: \"3a07e833-145e-41dc-bec3-c09231467d16\") " Apr 25 00:22:06.983432 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:06.983396 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3043916b-aef2-46c0-b90a-f63cfec5c4e1-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:22:06.983552 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:06.983448 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dx5nk\" (UniqueName: \"kubernetes.io/projected/3043916b-aef2-46c0-b90a-f63cfec5c4e1-kube-api-access-dx5nk\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:22:06.983552 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:06.983465 2578 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-ac6c8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3043916b-aef2-46c0-b90a-f63cfec5c4e1-success-200-isvc-ac6c8-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:22:06.983693 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:06.983617 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a07e833-145e-41dc-bec3-c09231467d16-error-404-isvc-ac6c8-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-ac6c8-kube-rbac-proxy-sar-config") pod "3a07e833-145e-41dc-bec3-c09231467d16" (UID: "3a07e833-145e-41dc-bec3-c09231467d16"). InnerVolumeSpecName "error-404-isvc-ac6c8-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:22:06.985639 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:06.985606 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a07e833-145e-41dc-bec3-c09231467d16-kube-api-access-sml2h" (OuterVolumeSpecName: "kube-api-access-sml2h") pod "3a07e833-145e-41dc-bec3-c09231467d16" (UID: "3a07e833-145e-41dc-bec3-c09231467d16"). InnerVolumeSpecName "kube-api-access-sml2h". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:22:06.985977 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:06.985950 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a07e833-145e-41dc-bec3-c09231467d16-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3a07e833-145e-41dc-bec3-c09231467d16" (UID: "3a07e833-145e-41dc-bec3-c09231467d16"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:22:07.084864 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.084828 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-ac6c8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3a07e833-145e-41dc-bec3-c09231467d16-error-404-isvc-ac6c8-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:22:07.084864 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.084866 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a07e833-145e-41dc-bec3-c09231467d16-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:22:07.085054 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.084880 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sml2h\" (UniqueName: \"kubernetes.io/projected/3a07e833-145e-41dc-bec3-c09231467d16-kube-api-access-sml2h\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:22:07.241451 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.241347 2578 generic.go:358] "Generic (PLEG): container finished" podID="3043916b-aef2-46c0-b90a-f63cfec5c4e1" containerID="fd27b8134b6511bc61b98a2731afe713e6a6e4a8282c597cca4fb3e4d2d6e7aa" exitCode=0 Apr 25 00:22:07.241451 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.241433 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc" Apr 25 00:22:07.241965 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.241449 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc" event={"ID":"3043916b-aef2-46c0-b90a-f63cfec5c4e1","Type":"ContainerDied","Data":"fd27b8134b6511bc61b98a2731afe713e6a6e4a8282c597cca4fb3e4d2d6e7aa"} Apr 25 00:22:07.241965 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.241482 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc" event={"ID":"3043916b-aef2-46c0-b90a-f63cfec5c4e1","Type":"ContainerDied","Data":"c3e9a08c2d42334e5ffe763c6b5420679baac92276209f870dee0aa434f2ad54"} Apr 25 00:22:07.241965 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.241497 2578 scope.go:117] "RemoveContainer" containerID="77bf0be5beb38dbcd1196700152ace65684a8e839a8df1bb0c9f08bcea679162" Apr 25 00:22:07.242969 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.242950 2578 generic.go:358] "Generic (PLEG): container finished" podID="3a07e833-145e-41dc-bec3-c09231467d16" containerID="e0e20dadf1b35bb1666d3394eed92e4f6c1b15079ee3d33656ead05235f82722" exitCode=0 Apr 25 00:22:07.243077 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.242982 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf" event={"ID":"3a07e833-145e-41dc-bec3-c09231467d16","Type":"ContainerDied","Data":"e0e20dadf1b35bb1666d3394eed92e4f6c1b15079ee3d33656ead05235f82722"} Apr 25 00:22:07.243077 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.243006 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf" event={"ID":"3a07e833-145e-41dc-bec3-c09231467d16","Type":"ContainerDied","Data":"1f309ebb92b2027af56f6f2c3e45c35dc31c26094774c6726f4de7924b506f0a"} Apr 25 00:22:07.243077 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.243047 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf" Apr 25 00:22:07.250284 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.250256 2578 scope.go:117] "RemoveContainer" containerID="fd27b8134b6511bc61b98a2731afe713e6a6e4a8282c597cca4fb3e4d2d6e7aa" Apr 25 00:22:07.258095 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.258080 2578 scope.go:117] "RemoveContainer" containerID="77bf0be5beb38dbcd1196700152ace65684a8e839a8df1bb0c9f08bcea679162" Apr 25 00:22:07.258338 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:22:07.258322 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77bf0be5beb38dbcd1196700152ace65684a8e839a8df1bb0c9f08bcea679162\": container with ID starting with 77bf0be5beb38dbcd1196700152ace65684a8e839a8df1bb0c9f08bcea679162 not found: ID does not exist" containerID="77bf0be5beb38dbcd1196700152ace65684a8e839a8df1bb0c9f08bcea679162" Apr 25 00:22:07.258377 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.258347 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77bf0be5beb38dbcd1196700152ace65684a8e839a8df1bb0c9f08bcea679162"} err="failed to get container status \"77bf0be5beb38dbcd1196700152ace65684a8e839a8df1bb0c9f08bcea679162\": rpc error: code = NotFound desc = could not find container \"77bf0be5beb38dbcd1196700152ace65684a8e839a8df1bb0c9f08bcea679162\": container with ID starting with 77bf0be5beb38dbcd1196700152ace65684a8e839a8df1bb0c9f08bcea679162 not found: ID does not exist" Apr 25 00:22:07.258377 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.258364 2578 scope.go:117] "RemoveContainer" containerID="fd27b8134b6511bc61b98a2731afe713e6a6e4a8282c597cca4fb3e4d2d6e7aa" Apr 25 00:22:07.258586 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:22:07.258568 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd27b8134b6511bc61b98a2731afe713e6a6e4a8282c597cca4fb3e4d2d6e7aa\": container with ID starting with fd27b8134b6511bc61b98a2731afe713e6a6e4a8282c597cca4fb3e4d2d6e7aa not found: ID does not exist" containerID="fd27b8134b6511bc61b98a2731afe713e6a6e4a8282c597cca4fb3e4d2d6e7aa" Apr 25 00:22:07.258632 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.258592 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd27b8134b6511bc61b98a2731afe713e6a6e4a8282c597cca4fb3e4d2d6e7aa"} err="failed to get container status \"fd27b8134b6511bc61b98a2731afe713e6a6e4a8282c597cca4fb3e4d2d6e7aa\": rpc error: code = NotFound desc = could not find container \"fd27b8134b6511bc61b98a2731afe713e6a6e4a8282c597cca4fb3e4d2d6e7aa\": container with ID starting with fd27b8134b6511bc61b98a2731afe713e6a6e4a8282c597cca4fb3e4d2d6e7aa not found: ID does not exist" Apr 25 00:22:07.258632 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.258608 2578 scope.go:117] "RemoveContainer" containerID="30014a48c33fa8753ebbf40ae1644b3dd09bf791003add40c14ae55441331b77" Apr 25 00:22:07.265581 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.265554 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf"] Apr 25 00:22:07.265819 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.265807 2578 scope.go:117] "RemoveContainer" containerID="e0e20dadf1b35bb1666d3394eed92e4f6c1b15079ee3d33656ead05235f82722" Apr 25 00:22:07.269128 ip-10-0-129-98 kubenswrapper[2578]: I0425 
00:22:07.269107 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ac6c8-predictor-6bb86d49d9-btthf"] Apr 25 00:22:07.273123 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.273093 2578 scope.go:117] "RemoveContainer" containerID="30014a48c33fa8753ebbf40ae1644b3dd09bf791003add40c14ae55441331b77" Apr 25 00:22:07.273341 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:22:07.273324 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30014a48c33fa8753ebbf40ae1644b3dd09bf791003add40c14ae55441331b77\": container with ID starting with 30014a48c33fa8753ebbf40ae1644b3dd09bf791003add40c14ae55441331b77 not found: ID does not exist" containerID="30014a48c33fa8753ebbf40ae1644b3dd09bf791003add40c14ae55441331b77" Apr 25 00:22:07.273390 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.273348 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30014a48c33fa8753ebbf40ae1644b3dd09bf791003add40c14ae55441331b77"} err="failed to get container status \"30014a48c33fa8753ebbf40ae1644b3dd09bf791003add40c14ae55441331b77\": rpc error: code = NotFound desc = could not find container \"30014a48c33fa8753ebbf40ae1644b3dd09bf791003add40c14ae55441331b77\": container with ID starting with 30014a48c33fa8753ebbf40ae1644b3dd09bf791003add40c14ae55441331b77 not found: ID does not exist" Apr 25 00:22:07.273390 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.273366 2578 scope.go:117] "RemoveContainer" containerID="e0e20dadf1b35bb1666d3394eed92e4f6c1b15079ee3d33656ead05235f82722" Apr 25 00:22:07.273633 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:22:07.273613 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0e20dadf1b35bb1666d3394eed92e4f6c1b15079ee3d33656ead05235f82722\": container with ID starting with e0e20dadf1b35bb1666d3394eed92e4f6c1b15079ee3d33656ead05235f82722 not found: ID does not exist" containerID="e0e20dadf1b35bb1666d3394eed92e4f6c1b15079ee3d33656ead05235f82722" Apr 25 00:22:07.273675 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.273634 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0e20dadf1b35bb1666d3394eed92e4f6c1b15079ee3d33656ead05235f82722"} err="failed to get container status \"e0e20dadf1b35bb1666d3394eed92e4f6c1b15079ee3d33656ead05235f82722\": rpc error: code = NotFound desc = could not find container \"e0e20dadf1b35bb1666d3394eed92e4f6c1b15079ee3d33656ead05235f82722\": container with ID starting with e0e20dadf1b35bb1666d3394eed92e4f6c1b15079ee3d33656ead05235f82722 not found: ID does not exist" Apr 25 00:22:07.278880 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.278861 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc"] Apr 25 00:22:07.282741 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.282723 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ac6c8-predictor-864c9c7dfd-5fqlc"] Apr 25 00:22:07.918313 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.918230 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3043916b-aef2-46c0-b90a-f63cfec5c4e1" path="/var/lib/kubelet/pods/3043916b-aef2-46c0-b90a-f63cfec5c4e1/volumes" Apr 25 00:22:07.918690 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:07.918676 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="3a07e833-145e-41dc-bec3-c09231467d16" path="/var/lib/kubelet/pods/3a07e833-145e-41dc-bec3-c09231467d16/volumes" Apr 25 00:22:10.238886 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:10.238856 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" Apr 25 00:22:10.239410 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:10.239383 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" podUID="e5fb70f9-c86b-41ae-a60b-60c0574f3ccf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 25 00:22:10.737024 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:10.736984 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" podUID="0aad686e-898e-4b26-87e0-3dff2ad29a46" containerName="ensemble-graph-ac6c8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:22:11.240954 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:11.240927 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" Apr 25 00:22:11.241519 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:11.241492 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" podUID="7b73d0fa-8973-4fa6-9893-f71b996c8fe0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 25 00:22:15.737778 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:15.737739 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" podUID="0aad686e-898e-4b26-87e0-3dff2ad29a46" containerName="ensemble-graph-ac6c8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:22:15.738177 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:15.737874 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" Apr 25 00:22:19.176308 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.176276 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g"] Apr 25 00:22:19.176804 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.176591 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" podUID="c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce" containerName="sequence-graph-e1f95" containerID="cri-o://7d8c7a68ecaecb560cdcc65354abd36181876c3b9763e78d0cf2907dd572beec" gracePeriod=30 Apr 25 00:22:19.245123 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.245093 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm"] Apr 25 00:22:19.245402 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.245373 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" podUID="698b477f-a5ba-428e-bac0-96f0f0ee89fc" containerName="kserve-container" containerID="cri-o://064456d4971b2838ca5ad99cb47524ab1989ed6e41ab458e6f40433ec1d4772c" gracePeriod=30 Apr 25 00:22:19.245492 ip-10-0-129-98 kubenswrapper[2578]: 
I0425 00:22:19.245445 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" podUID="698b477f-a5ba-428e-bac0-96f0f0ee89fc" containerName="kube-rbac-proxy" containerID="cri-o://e4133c2e49c252b13a198725a5da87386fed76a5774bb38f24b3ccf3f28cceb3" gracePeriod=30 Apr 25 00:22:19.291499 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.291466 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp"] Apr 25 00:22:19.291836 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.291793 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" podUID="ccf9b311-61f4-49c6-b521-0cc24798e111" containerName="kserve-container" containerID="cri-o://46b70bf1f37b88e1084c5c2b0e9ba6491fe8a492a11f135f573ecb78b87276f2" gracePeriod=30 Apr 25 00:22:19.291934 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.291836 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" podUID="ccf9b311-61f4-49c6-b521-0cc24798e111" containerName="kube-rbac-proxy" containerID="cri-o://5814d9f2982cdd6bcf5c8d2b5c7b24bffedcbfc99cbbb1881ee2728227daccba" gracePeriod=30 Apr 25 00:22:19.308842 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.308817 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w"] Apr 25 00:22:19.309169 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.309157 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a07e833-145e-41dc-bec3-c09231467d16" containerName="kserve-container" Apr 25 00:22:19.309212 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.309171 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a07e833-145e-41dc-bec3-c09231467d16" containerName="kserve-container" Apr 25 00:22:19.309212 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.309179 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3043916b-aef2-46c0-b90a-f63cfec5c4e1" containerName="kserve-container" Apr 25 00:22:19.309212 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.309186 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3043916b-aef2-46c0-b90a-f63cfec5c4e1" containerName="kserve-container" Apr 25 00:22:19.309212 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.309193 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3043916b-aef2-46c0-b90a-f63cfec5c4e1" containerName="kube-rbac-proxy" Apr 25 00:22:19.309212 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.309200 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3043916b-aef2-46c0-b90a-f63cfec5c4e1" containerName="kube-rbac-proxy" Apr 25 00:22:19.309212 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.309207 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a07e833-145e-41dc-bec3-c09231467d16" containerName="kube-rbac-proxy" Apr 25 00:22:19.309212 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.309212 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a07e833-145e-41dc-bec3-c09231467d16" containerName="kube-rbac-proxy" Apr 25 00:22:19.309482 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.309279 2578 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="3043916b-aef2-46c0-b90a-f63cfec5c4e1" containerName="kserve-container" Apr 25 00:22:19.309482 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.309289 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="3043916b-aef2-46c0-b90a-f63cfec5c4e1" containerName="kube-rbac-proxy" Apr 25 00:22:19.309482 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.309295 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a07e833-145e-41dc-bec3-c09231467d16" containerName="kserve-container" Apr 25 00:22:19.309482 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.309304 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a07e833-145e-41dc-bec3-c09231467d16" containerName="kube-rbac-proxy" Apr 25 00:22:19.314062 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.314045 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" Apr 25 00:22:19.316369 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.316350 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-a37f5-predictor-serving-cert\"" Apr 25 00:22:19.316803 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.316789 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-a37f5-kube-rbac-proxy-sar-config\"" Apr 25 00:22:19.324079 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.324056 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w"] Apr 25 00:22:19.388056 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.388023 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e12b5b8f-d177-46cd-b4a8-07accb6972eb-proxy-tls\") pod \"success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w\" (UID: \"e12b5b8f-d177-46cd-b4a8-07accb6972eb\") " pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" Apr 25 00:22:19.388056 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.388058 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4scl6\" (UniqueName: \"kubernetes.io/projected/e12b5b8f-d177-46cd-b4a8-07accb6972eb-kube-api-access-4scl6\") pod \"success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w\" (UID: \"e12b5b8f-d177-46cd-b4a8-07accb6972eb\") " pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" Apr 25 00:22:19.388290 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.388088 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-a37f5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e12b5b8f-d177-46cd-b4a8-07accb6972eb-success-200-isvc-a37f5-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w\" (UID: \"e12b5b8f-d177-46cd-b4a8-07accb6972eb\") " pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" Apr 25 00:22:19.400477 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.400451 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm"] Apr 25 00:22:19.403977 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.403957 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" Apr 25 00:22:19.406010 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.405990 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-a37f5-predictor-serving-cert\"" Apr 25 00:22:19.406104 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.406017 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-a37f5-kube-rbac-proxy-sar-config\"" Apr 25 00:22:19.413140 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.413116 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm"] Apr 25 00:22:19.489078 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.488945 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-a37f5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cb303152-c58d-4a84-bf8f-5dd213041b6d-error-404-isvc-a37f5-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a37f5-predictor-547f4c9447-g2whm\" (UID: \"cb303152-c58d-4a84-bf8f-5dd213041b6d\") " pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" Apr 25 00:22:19.489078 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.489020 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e12b5b8f-d177-46cd-b4a8-07accb6972eb-proxy-tls\") pod \"success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w\" (UID: \"e12b5b8f-d177-46cd-b4a8-07accb6972eb\") " pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" Apr 25 00:22:19.489078 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.489050 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4scl6\" (UniqueName: \"kubernetes.io/projected/e12b5b8f-d177-46cd-b4a8-07accb6972eb-kube-api-access-4scl6\") pod \"success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w\" (UID: \"e12b5b8f-d177-46cd-b4a8-07accb6972eb\") " pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" Apr 25 00:22:19.489393 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.489092 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb303152-c58d-4a84-bf8f-5dd213041b6d-proxy-tls\") pod \"error-404-isvc-a37f5-predictor-547f4c9447-g2whm\" (UID: \"cb303152-c58d-4a84-bf8f-5dd213041b6d\") " pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" Apr 25 00:22:19.489393 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.489133 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-a37f5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e12b5b8f-d177-46cd-b4a8-07accb6972eb-success-200-isvc-a37f5-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w\" (UID: \"e12b5b8f-d177-46cd-b4a8-07accb6972eb\") " pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" Apr 25 00:22:19.489393 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.489213 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpfwc\" (UniqueName: \"kubernetes.io/projected/cb303152-c58d-4a84-bf8f-5dd213041b6d-kube-api-access-zpfwc\") pod 
\"error-404-isvc-a37f5-predictor-547f4c9447-g2whm\" (UID: \"cb303152-c58d-4a84-bf8f-5dd213041b6d\") " pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" Apr 25 00:22:19.490279 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.490236 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-a37f5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e12b5b8f-d177-46cd-b4a8-07accb6972eb-success-200-isvc-a37f5-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w\" (UID: \"e12b5b8f-d177-46cd-b4a8-07accb6972eb\") " pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" Apr 25 00:22:19.492825 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.492801 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e12b5b8f-d177-46cd-b4a8-07accb6972eb-proxy-tls\") pod \"success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w\" (UID: \"e12b5b8f-d177-46cd-b4a8-07accb6972eb\") " pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" Apr 25 00:22:19.497048 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.497026 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4scl6\" (UniqueName: \"kubernetes.io/projected/e12b5b8f-d177-46cd-b4a8-07accb6972eb-kube-api-access-4scl6\") pod \"success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w\" (UID: \"e12b5b8f-d177-46cd-b4a8-07accb6972eb\") " pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" Apr 25 00:22:19.590316 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.590264 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpfwc\" (UniqueName: \"kubernetes.io/projected/cb303152-c58d-4a84-bf8f-5dd213041b6d-kube-api-access-zpfwc\") pod \"error-404-isvc-a37f5-predictor-547f4c9447-g2whm\" (UID: \"cb303152-c58d-4a84-bf8f-5dd213041b6d\") " pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" Apr 25 00:22:19.590531 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.590359 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-a37f5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cb303152-c58d-4a84-bf8f-5dd213041b6d-error-404-isvc-a37f5-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a37f5-predictor-547f4c9447-g2whm\" (UID: \"cb303152-c58d-4a84-bf8f-5dd213041b6d\") " pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" Apr 25 00:22:19.590531 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.590443 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb303152-c58d-4a84-bf8f-5dd213041b6d-proxy-tls\") pod \"error-404-isvc-a37f5-predictor-547f4c9447-g2whm\" (UID: \"cb303152-c58d-4a84-bf8f-5dd213041b6d\") " pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" Apr 25 00:22:19.590646 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:22:19.590568 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-serving-cert: secret "error-404-isvc-a37f5-predictor-serving-cert" not found Apr 25 00:22:19.590646 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:22:19.590630 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb303152-c58d-4a84-bf8f-5dd213041b6d-proxy-tls podName:cb303152-c58d-4a84-bf8f-5dd213041b6d nodeName:}" failed. 
No retries permitted until 2026-04-25 00:22:20.090609088 +0000 UTC m=+1722.724591002 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/cb303152-c58d-4a84-bf8f-5dd213041b6d-proxy-tls") pod "error-404-isvc-a37f5-predictor-547f4c9447-g2whm" (UID: "cb303152-c58d-4a84-bf8f-5dd213041b6d") : secret "error-404-isvc-a37f5-predictor-serving-cert" not found Apr 25 00:22:19.591449 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.591392 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-a37f5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cb303152-c58d-4a84-bf8f-5dd213041b6d-error-404-isvc-a37f5-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a37f5-predictor-547f4c9447-g2whm\" (UID: \"cb303152-c58d-4a84-bf8f-5dd213041b6d\") " pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" Apr 25 00:22:19.598731 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.598709 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpfwc\" (UniqueName: \"kubernetes.io/projected/cb303152-c58d-4a84-bf8f-5dd213041b6d-kube-api-access-zpfwc\") pod \"error-404-isvc-a37f5-predictor-547f4c9447-g2whm\" (UID: \"cb303152-c58d-4a84-bf8f-5dd213041b6d\") " pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" Apr 25 00:22:19.624582 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.624560 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" Apr 25 00:22:19.757877 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:19.757851 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w"] Apr 25 00:22:19.759900 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:22:19.759873 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode12b5b8f_d177_46cd_b4a8_07accb6972eb.slice/crio-86f2ef976d72f0f298781f341831145c2da5d267690de95ca9ccffc822052ea8 WatchSource:0}: Error finding container 86f2ef976d72f0f298781f341831145c2da5d267690de95ca9ccffc822052ea8: Status 404 returned error can't find the container with id 86f2ef976d72f0f298781f341831145c2da5d267690de95ca9ccffc822052ea8 Apr 25 00:22:20.096043 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:20.095939 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb303152-c58d-4a84-bf8f-5dd213041b6d-proxy-tls\") pod \"error-404-isvc-a37f5-predictor-547f4c9447-g2whm\" (UID: \"cb303152-c58d-4a84-bf8f-5dd213041b6d\") " pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" Apr 25 00:22:20.098558 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:20.098535 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb303152-c58d-4a84-bf8f-5dd213041b6d-proxy-tls\") pod \"error-404-isvc-a37f5-predictor-547f4c9447-g2whm\" (UID: \"cb303152-c58d-4a84-bf8f-5dd213041b6d\") " pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" Apr 25 00:22:20.240044 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:20.240009 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" podUID="e5fb70f9-c86b-41ae-a60b-60c0574f3ccf" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 25 00:22:20.288612 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:20.288525 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" event={"ID":"e12b5b8f-d177-46cd-b4a8-07accb6972eb","Type":"ContainerStarted","Data":"afa171472f7efbd7d338911411a0eb3b6f2bc9b03cd4eebac4857aa0f98decae"} Apr 25 00:22:20.288612 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:20.288573 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" event={"ID":"e12b5b8f-d177-46cd-b4a8-07accb6972eb","Type":"ContainerStarted","Data":"425784d4c965f7bb1b595ef8a5ede2b547aa87740df41b4e638cbb9efaa1a254"} Apr 25 00:22:20.288612 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:20.288587 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" event={"ID":"e12b5b8f-d177-46cd-b4a8-07accb6972eb","Type":"ContainerStarted","Data":"86f2ef976d72f0f298781f341831145c2da5d267690de95ca9ccffc822052ea8"} Apr 25 00:22:20.288921 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:20.288867 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" Apr 25 00:22:20.290950 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:20.290906 2578 generic.go:358] "Generic (PLEG): container finished" podID="698b477f-a5ba-428e-bac0-96f0f0ee89fc" containerID="e4133c2e49c252b13a198725a5da87386fed76a5774bb38f24b3ccf3f28cceb3" exitCode=2 Apr 25 00:22:20.291117 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:20.290956 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" event={"ID":"698b477f-a5ba-428e-bac0-96f0f0ee89fc","Type":"ContainerDied","Data":"e4133c2e49c252b13a198725a5da87386fed76a5774bb38f24b3ccf3f28cceb3"} Apr 25 00:22:20.293229 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:20.293206 2578 generic.go:358] "Generic (PLEG): container finished" podID="ccf9b311-61f4-49c6-b521-0cc24798e111" containerID="5814d9f2982cdd6bcf5c8d2b5c7b24bffedcbfc99cbbb1881ee2728227daccba" exitCode=2 Apr 25 00:22:20.293345 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:20.293245 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" event={"ID":"ccf9b311-61f4-49c6-b521-0cc24798e111","Type":"ContainerDied","Data":"5814d9f2982cdd6bcf5c8d2b5c7b24bffedcbfc99cbbb1881ee2728227daccba"} Apr 25 00:22:20.308668 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:20.308622 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" podStartSLOduration=1.308606814 podStartE2EDuration="1.308606814s" podCreationTimestamp="2026-04-25 00:22:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:22:20.307260452 +0000 UTC m=+1722.941242429" watchObservedRunningTime="2026-04-25 00:22:20.308606814 +0000 UTC m=+1722.942588803" Apr 25 00:22:20.315717 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:20.315695 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" Apr 25 00:22:20.445473 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:20.445449 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm"] Apr 25 00:22:20.447716 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:22:20.447683 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb303152_c58d_4a84_bf8f_5dd213041b6d.slice/crio-8c1260358f15cf3e6fcab5fccb1e7b49b4bb865f7a41ffed8c0d911524de2ca3 WatchSource:0}: Error finding container 8c1260358f15cf3e6fcab5fccb1e7b49b4bb865f7a41ffed8c0d911524de2ca3: Status 404 returned error can't find the container with id 8c1260358f15cf3e6fcab5fccb1e7b49b4bb865f7a41ffed8c0d911524de2ca3 Apr 25 00:22:20.737806 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:20.737706 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" podUID="0aad686e-898e-4b26-87e0-3dff2ad29a46" containerName="ensemble-graph-ac6c8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:22:21.241520 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:21.241477 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" podUID="7b73d0fa-8973-4fa6-9893-f71b996c8fe0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 25 00:22:21.298883 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:21.298846 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" event={"ID":"cb303152-c58d-4a84-bf8f-5dd213041b6d","Type":"ContainerStarted","Data":"f29b4a85ce18b2de9c6aa661808083f14c34842da7c9317e1feebdcc61735635"} Apr 25 00:22:21.299074 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:21.298893 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" event={"ID":"cb303152-c58d-4a84-bf8f-5dd213041b6d","Type":"ContainerStarted","Data":"e3735e37c1b4a6707bdd8f9cb1adda2f2599bbc949f92bba083cace416816680"} Apr 25 00:22:21.299074 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:21.298907 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" event={"ID":"cb303152-c58d-4a84-bf8f-5dd213041b6d","Type":"ContainerStarted","Data":"8c1260358f15cf3e6fcab5fccb1e7b49b4bb865f7a41ffed8c0d911524de2ca3"} Apr 25 00:22:21.299293 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:21.299273 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" Apr 25 00:22:21.299293 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:21.299293 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" Apr 25 00:22:21.299479 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:21.299307 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" Apr 25 00:22:21.300251 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:21.300220 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" podUID="cb303152-c58d-4a84-bf8f-5dd213041b6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 25 00:22:21.300379 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:21.300336 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" podUID="e12b5b8f-d177-46cd-b4a8-07accb6972eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 25 00:22:21.315096 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:21.315051 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" podStartSLOduration=2.315037843 podStartE2EDuration="2.315037843s" podCreationTimestamp="2026-04-25 00:22:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:22:21.313380463 +0000 UTC m=+1723.947362411" watchObservedRunningTime="2026-04-25 00:22:21.315037843 +0000 UTC m=+1723.949019779" Apr 25 00:22:21.787693 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:21.787629 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" podUID="c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce" containerName="sequence-graph-e1f95" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:22:22.303552 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:22.303509 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" podUID="cb303152-c58d-4a84-bf8f-5dd213041b6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 25 00:22:22.303975 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:22.303619 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" podUID="e12b5b8f-d177-46cd-b4a8-07accb6972eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 25 00:22:22.574901 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:22.574812 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" podUID="698b477f-a5ba-428e-bac0-96f0f0ee89fc" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.39:8643/healthz\": dial tcp 10.133.0.39:8643: connect: connection refused" Apr 25 00:22:22.579546 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:22.579523 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" podUID="698b477f-a5ba-428e-bac0-96f0f0ee89fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 25 00:22:23.309654 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.309621 2578 generic.go:358] "Generic (PLEG): container finished" podID="698b477f-a5ba-428e-bac0-96f0f0ee89fc" containerID="064456d4971b2838ca5ad99cb47524ab1989ed6e41ab458e6f40433ec1d4772c" exitCode=0 Apr 25 00:22:23.309977 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.309688 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" event={"ID":"698b477f-a5ba-428e-bac0-96f0f0ee89fc","Type":"ContainerDied","Data":"064456d4971b2838ca5ad99cb47524ab1989ed6e41ab458e6f40433ec1d4772c"} Apr 25 00:22:23.309977 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.309726 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" event={"ID":"698b477f-a5ba-428e-bac0-96f0f0ee89fc","Type":"ContainerDied","Data":"03d13a91ab4b92f8e51650684cfbf032d1fc85e6ead5e776400f3982cdb48bb9"} Apr 25 00:22:23.309977 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.309739 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03d13a91ab4b92f8e51650684cfbf032d1fc85e6ead5e776400f3982cdb48bb9" Apr 25 00:22:23.311331 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.311308 2578 generic.go:358] "Generic (PLEG): container finished" podID="ccf9b311-61f4-49c6-b521-0cc24798e111" containerID="46b70bf1f37b88e1084c5c2b0e9ba6491fe8a492a11f135f573ecb78b87276f2" exitCode=0 Apr 25 00:22:23.311465 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.311367 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" event={"ID":"ccf9b311-61f4-49c6-b521-0cc24798e111","Type":"ContainerDied","Data":"46b70bf1f37b88e1084c5c2b0e9ba6491fe8a492a11f135f573ecb78b87276f2"} Apr 25 00:22:23.360526 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.360488 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" Apr 25 00:22:23.424279 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.424253 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55db6\" (UniqueName: \"kubernetes.io/projected/698b477f-a5ba-428e-bac0-96f0f0ee89fc-kube-api-access-55db6\") pod \"698b477f-a5ba-428e-bac0-96f0f0ee89fc\" (UID: \"698b477f-a5ba-428e-bac0-96f0f0ee89fc\") " Apr 25 00:22:23.424381 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.424360 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-e1f95-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/698b477f-a5ba-428e-bac0-96f0f0ee89fc-error-404-isvc-e1f95-kube-rbac-proxy-sar-config\") pod \"698b477f-a5ba-428e-bac0-96f0f0ee89fc\" (UID: \"698b477f-a5ba-428e-bac0-96f0f0ee89fc\") " Apr 25 00:22:23.424445 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.424388 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/698b477f-a5ba-428e-bac0-96f0f0ee89fc-proxy-tls\") pod \"698b477f-a5ba-428e-bac0-96f0f0ee89fc\" (UID: \"698b477f-a5ba-428e-bac0-96f0f0ee89fc\") " Apr 25 00:22:23.424712 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.424686 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/698b477f-a5ba-428e-bac0-96f0f0ee89fc-error-404-isvc-e1f95-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-e1f95-kube-rbac-proxy-sar-config") pod "698b477f-a5ba-428e-bac0-96f0f0ee89fc" (UID: "698b477f-a5ba-428e-bac0-96f0f0ee89fc"). InnerVolumeSpecName "error-404-isvc-e1f95-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:22:23.426479 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.426454 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/698b477f-a5ba-428e-bac0-96f0f0ee89fc-kube-api-access-55db6" (OuterVolumeSpecName: "kube-api-access-55db6") pod "698b477f-a5ba-428e-bac0-96f0f0ee89fc" (UID: "698b477f-a5ba-428e-bac0-96f0f0ee89fc"). InnerVolumeSpecName "kube-api-access-55db6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:22:23.426570 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.426535 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/698b477f-a5ba-428e-bac0-96f0f0ee89fc-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "698b477f-a5ba-428e-bac0-96f0f0ee89fc" (UID: "698b477f-a5ba-428e-bac0-96f0f0ee89fc"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:22:23.441097 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.441077 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" Apr 25 00:22:23.525210 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.525171 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-e1f95-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ccf9b311-61f4-49c6-b521-0cc24798e111-success-200-isvc-e1f95-kube-rbac-proxy-sar-config\") pod \"ccf9b311-61f4-49c6-b521-0cc24798e111\" (UID: \"ccf9b311-61f4-49c6-b521-0cc24798e111\") " Apr 25 00:22:23.525349 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.525232 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ccf9b311-61f4-49c6-b521-0cc24798e111-proxy-tls\") pod \"ccf9b311-61f4-49c6-b521-0cc24798e111\" (UID: \"ccf9b311-61f4-49c6-b521-0cc24798e111\") " Apr 25 00:22:23.525349 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.525270 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdm7w\" (UniqueName: \"kubernetes.io/projected/ccf9b311-61f4-49c6-b521-0cc24798e111-kube-api-access-vdm7w\") pod \"ccf9b311-61f4-49c6-b521-0cc24798e111\" (UID: \"ccf9b311-61f4-49c6-b521-0cc24798e111\") " Apr 25 00:22:23.525511 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.525400 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-e1f95-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/698b477f-a5ba-428e-bac0-96f0f0ee89fc-error-404-isvc-e1f95-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:22:23.525511 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.525440 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/698b477f-a5ba-428e-bac0-96f0f0ee89fc-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:22:23.525511 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.525452 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-55db6\" (UniqueName: \"kubernetes.io/projected/698b477f-a5ba-428e-bac0-96f0f0ee89fc-kube-api-access-55db6\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:22:23.525666 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.525552 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/ccf9b311-61f4-49c6-b521-0cc24798e111-success-200-isvc-e1f95-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-e1f95-kube-rbac-proxy-sar-config") pod "ccf9b311-61f4-49c6-b521-0cc24798e111" (UID: "ccf9b311-61f4-49c6-b521-0cc24798e111"). InnerVolumeSpecName "success-200-isvc-e1f95-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:22:23.527482 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.527460 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccf9b311-61f4-49c6-b521-0cc24798e111-kube-api-access-vdm7w" (OuterVolumeSpecName: "kube-api-access-vdm7w") pod "ccf9b311-61f4-49c6-b521-0cc24798e111" (UID: "ccf9b311-61f4-49c6-b521-0cc24798e111"). InnerVolumeSpecName "kube-api-access-vdm7w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:22:23.527568 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.527517 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccf9b311-61f4-49c6-b521-0cc24798e111-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ccf9b311-61f4-49c6-b521-0cc24798e111" (UID: "ccf9b311-61f4-49c6-b521-0cc24798e111"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:22:23.626379 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.626283 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ccf9b311-61f4-49c6-b521-0cc24798e111-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:22:23.626379 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.626319 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vdm7w\" (UniqueName: \"kubernetes.io/projected/ccf9b311-61f4-49c6-b521-0cc24798e111-kube-api-access-vdm7w\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:22:23.626379 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:23.626334 2578 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-e1f95-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ccf9b311-61f4-49c6-b521-0cc24798e111-success-200-isvc-e1f95-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:22:24.316579 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:24.316547 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" Apr 25 00:22:24.316579 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:24.316553 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp" event={"ID":"ccf9b311-61f4-49c6-b521-0cc24798e111","Type":"ContainerDied","Data":"e1f5e988e767cfbf57871c43102651d53fa03b195f112082c6e3459a5384122c"} Apr 25 00:22:24.317072 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:24.316596 2578 scope.go:117] "RemoveContainer" containerID="5814d9f2982cdd6bcf5c8d2b5c7b24bffedcbfc99cbbb1881ee2728227daccba" Apr 25 00:22:24.317072 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:24.316784 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm" Apr 25 00:22:24.325061 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:24.325037 2578 scope.go:117] "RemoveContainer" containerID="46b70bf1f37b88e1084c5c2b0e9ba6491fe8a492a11f135f573ecb78b87276f2" Apr 25 00:22:24.334879 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:24.334854 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp"] Apr 25 00:22:24.337716 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:24.337696 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e1f95-predictor-78c8c57784-68slp"] Apr 25 00:22:24.347539 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:24.347517 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm"] Apr 25 00:22:24.352719 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:24.352690 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e1f95-predictor-6cb95bc7c9-qwdsm"] Apr 25 00:22:25.737685 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:25.737646 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" podUID="0aad686e-898e-4b26-87e0-3dff2ad29a46" containerName="ensemble-graph-ac6c8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:22:25.914053 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:25.914010 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="698b477f-a5ba-428e-bac0-96f0f0ee89fc" path="/var/lib/kubelet/pods/698b477f-a5ba-428e-bac0-96f0f0ee89fc/volumes" Apr 25 00:22:25.914498 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:25.914483 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccf9b311-61f4-49c6-b521-0cc24798e111" path="/var/lib/kubelet/pods/ccf9b311-61f4-49c6-b521-0cc24798e111/volumes" Apr 25 00:22:26.787793 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:26.787734 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" podUID="c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce" containerName="sequence-graph-e1f95" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:22:27.308021 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:27.307988 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" Apr 25 00:22:27.308369 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:27.308348 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" Apr 25 00:22:27.308523 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:27.308496 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" podUID="cb303152-c58d-4a84-bf8f-5dd213041b6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 25 00:22:27.308777 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:27.308748 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" podUID="e12b5b8f-d177-46cd-b4a8-07accb6972eb" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.44:8080: connect: connection refused" Apr 25 00:22:30.239892 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:30.239856 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" podUID="e5fb70f9-c86b-41ae-a60b-60c0574f3ccf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 25 00:22:30.737063 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:30.737022 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" podUID="0aad686e-898e-4b26-87e0-3dff2ad29a46" containerName="ensemble-graph-ac6c8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:22:31.241890 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:31.241853 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" podUID="7b73d0fa-8973-4fa6-9893-f71b996c8fe0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 25 00:22:31.788377 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:31.788320 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" podUID="c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce" containerName="sequence-graph-e1f95" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:22:31.788605 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:31.788483 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" Apr 25 00:22:33.349666 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:33.349637 2578 generic.go:358] "Generic (PLEG): container finished" podID="0aad686e-898e-4b26-87e0-3dff2ad29a46" containerID="2d3d210739a247da851053276b7f8de178d966eb8c34bd99939ddab2659d5345" exitCode=0 Apr 25 00:22:33.349975 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:33.349693 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" event={"ID":"0aad686e-898e-4b26-87e0-3dff2ad29a46","Type":"ContainerDied","Data":"2d3d210739a247da851053276b7f8de178d966eb8c34bd99939ddab2659d5345"} Apr 25 00:22:33.365506 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:33.365482 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" Apr 25 00:22:33.515480 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:33.515449 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0aad686e-898e-4b26-87e0-3dff2ad29a46-proxy-tls\") pod \"0aad686e-898e-4b26-87e0-3dff2ad29a46\" (UID: \"0aad686e-898e-4b26-87e0-3dff2ad29a46\") " Apr 25 00:22:33.515642 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:33.515508 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0aad686e-898e-4b26-87e0-3dff2ad29a46-openshift-service-ca-bundle\") pod \"0aad686e-898e-4b26-87e0-3dff2ad29a46\" (UID: \"0aad686e-898e-4b26-87e0-3dff2ad29a46\") " Apr 25 00:22:33.515842 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:33.515822 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aad686e-898e-4b26-87e0-3dff2ad29a46-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "0aad686e-898e-4b26-87e0-3dff2ad29a46" (UID: "0aad686e-898e-4b26-87e0-3dff2ad29a46"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:22:33.517665 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:33.517649 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aad686e-898e-4b26-87e0-3dff2ad29a46-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0aad686e-898e-4b26-87e0-3dff2ad29a46" (UID: "0aad686e-898e-4b26-87e0-3dff2ad29a46"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:22:33.616286 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:33.616238 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0aad686e-898e-4b26-87e0-3dff2ad29a46-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:22:33.616286 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:33.616282 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0aad686e-898e-4b26-87e0-3dff2ad29a46-openshift-service-ca-bundle\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:22:34.354342 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:34.354304 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" event={"ID":"0aad686e-898e-4b26-87e0-3dff2ad29a46","Type":"ContainerDied","Data":"e8e212462f80da9ec2f1a452fa10a2f33199ba280ffa2ca33e4ee4e0a79bc6d4"} Apr 25 00:22:34.354342 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:34.354325 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9" Apr 25 00:22:34.354342 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:34.354344 2578 scope.go:117] "RemoveContainer" containerID="2d3d210739a247da851053276b7f8de178d966eb8c34bd99939ddab2659d5345" Apr 25 00:22:34.370375 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:34.370343 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9"] Apr 25 00:22:34.373507 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:34.373483 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-ac6c8-87cf995b8-z6wk9"] Apr 25 00:22:35.911621 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:35.911591 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aad686e-898e-4b26-87e0-3dff2ad29a46" path="/var/lib/kubelet/pods/0aad686e-898e-4b26-87e0-3dff2ad29a46/volumes" Apr 25 00:22:36.788515 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:36.788468 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" podUID="c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce" containerName="sequence-graph-e1f95" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:22:37.309178 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:37.309132 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" podUID="e12b5b8f-d177-46cd-b4a8-07accb6972eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 25 00:22:37.309556 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:37.309148 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" podUID="cb303152-c58d-4a84-bf8f-5dd213041b6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 25 00:22:38.013576 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:38.013542 2578 scope.go:117] "RemoveContainer" containerID="e4133c2e49c252b13a198725a5da87386fed76a5774bb38f24b3ccf3f28cceb3" Apr 25 00:22:38.021345 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:38.021325 2578 scope.go:117] "RemoveContainer" containerID="064456d4971b2838ca5ad99cb47524ab1989ed6e41ab458e6f40433ec1d4772c" Apr 25 00:22:40.239610 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:40.239569 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" podUID="e5fb70f9-c86b-41ae-a60b-60c0574f3ccf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 25 00:22:41.241777 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:41.241735 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" podUID="7b73d0fa-8973-4fa6-9893-f71b996c8fe0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 25 00:22:41.787487 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:41.787439 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" podUID="c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce" containerName="sequence-graph-e1f95" probeResult="failure" output="HTTP probe failed with statuscode: 
503" Apr 25 00:22:46.788389 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:46.788324 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" podUID="c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce" containerName="sequence-graph-e1f95" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:22:47.308709 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:47.308662 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" podUID="cb303152-c58d-4a84-bf8f-5dd213041b6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 25 00:22:47.308903 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:47.308793 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" podUID="e12b5b8f-d177-46cd-b4a8-07accb6972eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 25 00:22:49.338804 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:49.338780 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" Apr 25 00:22:49.408018 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:49.407982 2578 generic.go:358] "Generic (PLEG): container finished" podID="c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce" containerID="7d8c7a68ecaecb560cdcc65354abd36181876c3b9763e78d0cf2907dd572beec" exitCode=0 Apr 25 00:22:49.408175 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:49.408051 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" Apr 25 00:22:49.408175 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:49.408060 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" event={"ID":"c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce","Type":"ContainerDied","Data":"7d8c7a68ecaecb560cdcc65354abd36181876c3b9763e78d0cf2907dd572beec"} Apr 25 00:22:49.408175 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:49.408100 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g" event={"ID":"c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce","Type":"ContainerDied","Data":"1d37ff0bc6f95cd6404d0f862fa684ef232ad807e790d22d45df3f26bb9e1c20"} Apr 25 00:22:49.408175 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:49.408122 2578 scope.go:117] "RemoveContainer" containerID="7d8c7a68ecaecb560cdcc65354abd36181876c3b9763e78d0cf2907dd572beec" Apr 25 00:22:49.415842 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:49.415824 2578 scope.go:117] "RemoveContainer" containerID="7d8c7a68ecaecb560cdcc65354abd36181876c3b9763e78d0cf2907dd572beec" Apr 25 00:22:49.416081 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:22:49.416062 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d8c7a68ecaecb560cdcc65354abd36181876c3b9763e78d0cf2907dd572beec\": container with ID starting with 7d8c7a68ecaecb560cdcc65354abd36181876c3b9763e78d0cf2907dd572beec not found: ID does not exist" containerID="7d8c7a68ecaecb560cdcc65354abd36181876c3b9763e78d0cf2907dd572beec" Apr 25 00:22:49.416128 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:49.416090 2578 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"7d8c7a68ecaecb560cdcc65354abd36181876c3b9763e78d0cf2907dd572beec"} err="failed to get container status \"7d8c7a68ecaecb560cdcc65354abd36181876c3b9763e78d0cf2907dd572beec\": rpc error: code = NotFound desc = could not find container \"7d8c7a68ecaecb560cdcc65354abd36181876c3b9763e78d0cf2907dd572beec\": container with ID starting with 7d8c7a68ecaecb560cdcc65354abd36181876c3b9763e78d0cf2907dd572beec not found: ID does not exist" Apr 25 00:22:49.459429 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:49.459353 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce-proxy-tls\") pod \"c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce\" (UID: \"c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce\") " Apr 25 00:22:49.459527 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:49.459447 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce-openshift-service-ca-bundle\") pod \"c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce\" (UID: \"c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce\") " Apr 25 00:22:49.459771 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:49.459748 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce" (UID: "c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:22:49.461655 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:49.461632 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce" (UID: "c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:22:49.560559 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:49.560519 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:22:49.560559 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:49.560556 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce-openshift-service-ca-bundle\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:22:49.728447 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:49.728399 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g"] Apr 25 00:22:49.730605 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:49.730582 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-e1f95-6b4654b7f9-bt48g"] Apr 25 00:22:49.912302 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:49.912260 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce" path="/var/lib/kubelet/pods/c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce/volumes" Apr 25 00:22:50.239530 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:50.239486 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" podUID="e5fb70f9-c86b-41ae-a60b-60c0574f3ccf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 25 00:22:51.242564 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:51.242531 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" Apr 25 00:22:57.309196 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:57.309154 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" podUID="cb303152-c58d-4a84-bf8f-5dd213041b6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 25 00:22:57.309851 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:22:57.309158 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" podUID="e12b5b8f-d177-46cd-b4a8-07accb6972eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 25 00:23:00.240074 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:00.240047 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" Apr 25 00:23:07.308865 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:07.308820 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" podUID="cb303152-c58d-4a84-bf8f-5dd213041b6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 25 00:23:07.309522 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:07.309504 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" Apr 
25 00:23:13.456289 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.456206 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd"] Apr 25 00:23:13.456684 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.456577 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0aad686e-898e-4b26-87e0-3dff2ad29a46" containerName="ensemble-graph-ac6c8" Apr 25 00:23:13.456684 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.456589 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aad686e-898e-4b26-87e0-3dff2ad29a46" containerName="ensemble-graph-ac6c8" Apr 25 00:23:13.456684 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.456605 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ccf9b311-61f4-49c6-b521-0cc24798e111" containerName="kserve-container" Apr 25 00:23:13.456684 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.456611 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccf9b311-61f4-49c6-b521-0cc24798e111" containerName="kserve-container" Apr 25 00:23:13.456684 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.456624 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="698b477f-a5ba-428e-bac0-96f0f0ee89fc" containerName="kube-rbac-proxy" Apr 25 00:23:13.456684 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.456630 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="698b477f-a5ba-428e-bac0-96f0f0ee89fc" containerName="kube-rbac-proxy" Apr 25 00:23:13.456684 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.456638 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="698b477f-a5ba-428e-bac0-96f0f0ee89fc" containerName="kserve-container" Apr 25 00:23:13.456684 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.456643 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="698b477f-a5ba-428e-bac0-96f0f0ee89fc" containerName="kserve-container" Apr 25 00:23:13.456684 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.456648 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ccf9b311-61f4-49c6-b521-0cc24798e111" containerName="kube-rbac-proxy" Apr 25 00:23:13.456684 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.456653 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccf9b311-61f4-49c6-b521-0cc24798e111" containerName="kube-rbac-proxy" Apr 25 00:23:13.456684 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.456662 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce" containerName="sequence-graph-e1f95" Apr 25 00:23:13.456684 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.456667 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce" containerName="sequence-graph-e1f95" Apr 25 00:23:13.457051 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.456713 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="698b477f-a5ba-428e-bac0-96f0f0ee89fc" containerName="kserve-container" Apr 25 00:23:13.457051 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.456721 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ccf9b311-61f4-49c6-b521-0cc24798e111" containerName="kserve-container" Apr 25 00:23:13.457051 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.456733 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ccf9b311-61f4-49c6-b521-0cc24798e111" 
containerName="kube-rbac-proxy" Apr 25 00:23:13.457051 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.456738 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c6cb4fdc-9ffd-4291-89f5-c4ff8dc483ce" containerName="sequence-graph-e1f95" Apr 25 00:23:13.457051 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.456745 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0aad686e-898e-4b26-87e0-3dff2ad29a46" containerName="ensemble-graph-ac6c8" Apr 25 00:23:13.457051 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.456751 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="698b477f-a5ba-428e-bac0-96f0f0ee89fc" containerName="kube-rbac-proxy" Apr 25 00:23:13.459670 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.459654 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" Apr 25 00:23:13.461838 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.461814 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-7621b-serving-cert\"" Apr 25 00:23:13.461969 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.461849 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-7621b-kube-rbac-proxy-sar-config\"" Apr 25 00:23:13.468347 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.468327 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd"] Apr 25 00:23:13.561174 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.561139 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b53e4e70-55e0-4f1f-b5db-743f089b31fb-proxy-tls\") pod \"splitter-graph-7621b-7fdfb8f889-6blhd\" (UID: \"b53e4e70-55e0-4f1f-b5db-743f089b31fb\") " pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" Apr 25 00:23:13.561330 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.561210 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b53e4e70-55e0-4f1f-b5db-743f089b31fb-openshift-service-ca-bundle\") pod \"splitter-graph-7621b-7fdfb8f889-6blhd\" (UID: \"b53e4e70-55e0-4f1f-b5db-743f089b31fb\") " pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" Apr 25 00:23:13.662078 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.662042 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b53e4e70-55e0-4f1f-b5db-743f089b31fb-proxy-tls\") pod \"splitter-graph-7621b-7fdfb8f889-6blhd\" (UID: \"b53e4e70-55e0-4f1f-b5db-743f089b31fb\") " pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" Apr 25 00:23:13.662247 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.662108 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b53e4e70-55e0-4f1f-b5db-743f089b31fb-openshift-service-ca-bundle\") pod \"splitter-graph-7621b-7fdfb8f889-6blhd\" (UID: \"b53e4e70-55e0-4f1f-b5db-743f089b31fb\") " pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" Apr 25 00:23:13.662247 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:23:13.662204 2578 secret.go:189] Couldn't get secret 
kserve-ci-e2e-test/splitter-graph-7621b-serving-cert: secret "splitter-graph-7621b-serving-cert" not found Apr 25 00:23:13.662364 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:23:13.662292 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b53e4e70-55e0-4f1f-b5db-743f089b31fb-proxy-tls podName:b53e4e70-55e0-4f1f-b5db-743f089b31fb nodeName:}" failed. No retries permitted until 2026-04-25 00:23:14.162269431 +0000 UTC m=+1776.796251348 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b53e4e70-55e0-4f1f-b5db-743f089b31fb-proxy-tls") pod "splitter-graph-7621b-7fdfb8f889-6blhd" (UID: "b53e4e70-55e0-4f1f-b5db-743f089b31fb") : secret "splitter-graph-7621b-serving-cert" not found Apr 25 00:23:13.662744 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:13.662726 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b53e4e70-55e0-4f1f-b5db-743f089b31fb-openshift-service-ca-bundle\") pod \"splitter-graph-7621b-7fdfb8f889-6blhd\" (UID: \"b53e4e70-55e0-4f1f-b5db-743f089b31fb\") " pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" Apr 25 00:23:14.167063 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:14.167030 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b53e4e70-55e0-4f1f-b5db-743f089b31fb-proxy-tls\") pod \"splitter-graph-7621b-7fdfb8f889-6blhd\" (UID: \"b53e4e70-55e0-4f1f-b5db-743f089b31fb\") " pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" Apr 25 00:23:14.169426 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:14.169399 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b53e4e70-55e0-4f1f-b5db-743f089b31fb-proxy-tls\") pod \"splitter-graph-7621b-7fdfb8f889-6blhd\" (UID: \"b53e4e70-55e0-4f1f-b5db-743f089b31fb\") " pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" Apr 25 00:23:14.370566 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:14.370535 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" Apr 25 00:23:14.487766 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:14.487739 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd"] Apr 25 00:23:14.489554 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:23:14.489526 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb53e4e70_55e0_4f1f_b5db_743f089b31fb.slice/crio-a25d54d9f168db1ad6ec5d3b65150f24d4a91e51653879667195045d70e84386 WatchSource:0}: Error finding container a25d54d9f168db1ad6ec5d3b65150f24d4a91e51653879667195045d70e84386: Status 404 returned error can't find the container with id a25d54d9f168db1ad6ec5d3b65150f24d4a91e51653879667195045d70e84386 Apr 25 00:23:15.493825 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:15.493788 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" event={"ID":"b53e4e70-55e0-4f1f-b5db-743f089b31fb","Type":"ContainerStarted","Data":"9afac5dc5c2bc7f64355a4f52ba4ed82b5e29b359acba7e6e19195fdad15204b"} Apr 25 00:23:15.493825 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:15.493821 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" event={"ID":"b53e4e70-55e0-4f1f-b5db-743f089b31fb","Type":"ContainerStarted","Data":"a25d54d9f168db1ad6ec5d3b65150f24d4a91e51653879667195045d70e84386"} Apr 25 00:23:15.494321 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:15.493844 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" Apr 25 00:23:15.510842 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:15.510794 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" podStartSLOduration=2.5107774210000002 podStartE2EDuration="2.510777421s" podCreationTimestamp="2026-04-25 00:23:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:23:15.509531942 +0000 UTC m=+1778.143513890" watchObservedRunningTime="2026-04-25 00:23:15.510777421 +0000 UTC m=+1778.144759359" Apr 25 00:23:17.309159 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:17.309119 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" Apr 25 00:23:19.358745 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:19.358709 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg"] Apr 25 00:23:19.362060 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:19.362043 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" Apr 25 00:23:19.364402 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:19.364371 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-a37f5-serving-cert\"" Apr 25 00:23:19.364553 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:19.364444 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-a37f5-kube-rbac-proxy-sar-config\"" Apr 25 00:23:19.369606 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:19.369585 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg"] Apr 25 00:23:19.412636 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:19.412604 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db64d84c-ea53-4fe5-a56a-a4727181e955-proxy-tls\") pod \"switch-graph-a37f5-68dd44b8fb-97nlg\" (UID: \"db64d84c-ea53-4fe5-a56a-a4727181e955\") " pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" Apr 25 00:23:19.412804 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:19.412656 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db64d84c-ea53-4fe5-a56a-a4727181e955-openshift-service-ca-bundle\") pod \"switch-graph-a37f5-68dd44b8fb-97nlg\" (UID: \"db64d84c-ea53-4fe5-a56a-a4727181e955\") " pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" Apr 25 00:23:19.513126 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:19.513091 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db64d84c-ea53-4fe5-a56a-a4727181e955-proxy-tls\") pod \"switch-graph-a37f5-68dd44b8fb-97nlg\" (UID: \"db64d84c-ea53-4fe5-a56a-a4727181e955\") " pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" Apr 25 00:23:19.513298 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:19.513148 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db64d84c-ea53-4fe5-a56a-a4727181e955-openshift-service-ca-bundle\") pod \"switch-graph-a37f5-68dd44b8fb-97nlg\" (UID: \"db64d84c-ea53-4fe5-a56a-a4727181e955\") " pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" Apr 25 00:23:19.513826 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:19.513797 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db64d84c-ea53-4fe5-a56a-a4727181e955-openshift-service-ca-bundle\") pod \"switch-graph-a37f5-68dd44b8fb-97nlg\" (UID: \"db64d84c-ea53-4fe5-a56a-a4727181e955\") " pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" Apr 25 00:23:19.515704 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:19.515685 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db64d84c-ea53-4fe5-a56a-a4727181e955-proxy-tls\") pod \"switch-graph-a37f5-68dd44b8fb-97nlg\" (UID: \"db64d84c-ea53-4fe5-a56a-a4727181e955\") " pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" Apr 25 00:23:19.672692 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:19.672592 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" Apr 25 00:23:19.794828 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:19.794800 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg"] Apr 25 00:23:19.797507 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:23:19.797479 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb64d84c_ea53_4fe5_a56a_a4727181e955.slice/crio-bf50c99c1df2576480808e11db63b460e2485292ff038e98e3e382099ac95619 WatchSource:0}: Error finding container bf50c99c1df2576480808e11db63b460e2485292ff038e98e3e382099ac95619: Status 404 returned error can't find the container with id bf50c99c1df2576480808e11db63b460e2485292ff038e98e3e382099ac95619 Apr 25 00:23:20.508096 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:20.508064 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" event={"ID":"db64d84c-ea53-4fe5-a56a-a4727181e955","Type":"ContainerStarted","Data":"55eb0bd15f9b05294c56dd06b4c4bc5dc6eecb99a4060c85093c5a41a199ac91"} Apr 25 00:23:20.508096 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:20.508098 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" event={"ID":"db64d84c-ea53-4fe5-a56a-a4727181e955","Type":"ContainerStarted","Data":"bf50c99c1df2576480808e11db63b460e2485292ff038e98e3e382099ac95619"} Apr 25 00:23:20.508554 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:20.508225 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" Apr 25 00:23:20.524195 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:20.524147 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" podStartSLOduration=1.524132574 podStartE2EDuration="1.524132574s" podCreationTimestamp="2026-04-25 00:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:23:20.522097495 +0000 UTC m=+1783.156079468" watchObservedRunningTime="2026-04-25 00:23:20.524132574 +0000 UTC m=+1783.158114506" Apr 25 00:23:21.502267 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:21.502239 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" Apr 25 00:23:26.516861 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:26.516830 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" Apr 25 00:23:27.517320 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.517287 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd"] Apr 25 00:23:27.517782 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.517675 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" podUID="b53e4e70-55e0-4f1f-b5db-743f089b31fb" containerName="splitter-graph-7621b" containerID="cri-o://9afac5dc5c2bc7f64355a4f52ba4ed82b5e29b359acba7e6e19195fdad15204b" gracePeriod=30 Apr 25 00:23:27.615624 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.615541 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf"] Apr 25 00:23:27.615917 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.615865 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" podUID="e5fb70f9-c86b-41ae-a60b-60c0574f3ccf" containerName="kserve-container" containerID="cri-o://e354d4a29e23fcbbb847fe6cb8cf9c74d5afcc34cb00e1f6ef5bf514dc1428e3" gracePeriod=30 Apr 25 00:23:27.616075 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.615931 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" podUID="e5fb70f9-c86b-41ae-a60b-60c0574f3ccf" containerName="kube-rbac-proxy" containerID="cri-o://33922d3d86674608630d8aec3fe8b685535da0b37c0374cdc9a6dc2e3b60b4a4" gracePeriod=30 Apr 25 00:23:27.652680 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.652654 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp"] Apr 25 00:23:27.656091 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.656077 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" Apr 25 00:23:27.658034 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.658009 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-41340-predictor-serving-cert\"" Apr 25 00:23:27.658147 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.658050 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-41340-kube-rbac-proxy-sar-config\"" Apr 25 00:23:27.666881 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.666856 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp"] Apr 25 00:23:27.686099 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.686071 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt"] Apr 25 00:23:27.686384 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.686359 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" podUID="7b73d0fa-8973-4fa6-9893-f71b996c8fe0" containerName="kserve-container" containerID="cri-o://40dedf9164970811c38cedead00e978199217561127f6f0d719d4cdf932520b8" gracePeriod=30 Apr 25 00:23:27.686493 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.686387 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" podUID="7b73d0fa-8973-4fa6-9893-f71b996c8fe0" containerName="kube-rbac-proxy" containerID="cri-o://fc9277bfe0796ed395fefbf74565aa6e8a7cf328794c5a1ac24c975062e8ff0e" gracePeriod=30 Apr 25 00:23:27.751859 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.751829 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s"] Apr 25 00:23:27.755197 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.755174 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" Apr 25 00:23:27.757396 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.757378 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-41340-predictor-serving-cert\"" Apr 25 00:23:27.757515 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.757403 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-41340-kube-rbac-proxy-sar-config\"" Apr 25 00:23:27.762896 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.762871 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s"] Apr 25 00:23:27.788215 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.788187 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdw2p\" (UniqueName: \"kubernetes.io/projected/11bacc24-8aa8-4160-aa1d-6c430c2dcc36-kube-api-access-mdw2p\") pod \"success-200-isvc-41340-predictor-897fcb-v7pwp\" (UID: \"11bacc24-8aa8-4160-aa1d-6c430c2dcc36\") " pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" Apr 25 00:23:27.788343 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.788230 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-41340-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/11bacc24-8aa8-4160-aa1d-6c430c2dcc36-success-200-isvc-41340-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-41340-predictor-897fcb-v7pwp\" (UID: \"11bacc24-8aa8-4160-aa1d-6c430c2dcc36\") " pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" Apr 25 00:23:27.788384 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.788352 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11bacc24-8aa8-4160-aa1d-6c430c2dcc36-proxy-tls\") pod \"success-200-isvc-41340-predictor-897fcb-v7pwp\" (UID: \"11bacc24-8aa8-4160-aa1d-6c430c2dcc36\") " pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" Apr 25 00:23:27.896550 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.892945 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdw2p\" (UniqueName: \"kubernetes.io/projected/11bacc24-8aa8-4160-aa1d-6c430c2dcc36-kube-api-access-mdw2p\") pod \"success-200-isvc-41340-predictor-897fcb-v7pwp\" (UID: \"11bacc24-8aa8-4160-aa1d-6c430c2dcc36\") " pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" Apr 25 00:23:27.896550 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.893027 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-41340-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/11bacc24-8aa8-4160-aa1d-6c430c2dcc36-success-200-isvc-41340-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-41340-predictor-897fcb-v7pwp\" (UID: \"11bacc24-8aa8-4160-aa1d-6c430c2dcc36\") " pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" Apr 25 00:23:27.896550 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.893085 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d-proxy-tls\") pod 
\"error-404-isvc-41340-predictor-694448f7cf-n9r4s\" (UID: \"58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d\") " pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" Apr 25 00:23:27.896550 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.893170 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-41340-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d-error-404-isvc-41340-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-41340-predictor-694448f7cf-n9r4s\" (UID: \"58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d\") " pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" Apr 25 00:23:27.896550 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.893213 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krcmk\" (UniqueName: \"kubernetes.io/projected/58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d-kube-api-access-krcmk\") pod \"error-404-isvc-41340-predictor-694448f7cf-n9r4s\" (UID: \"58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d\") " pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" Apr 25 00:23:27.896550 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.893255 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11bacc24-8aa8-4160-aa1d-6c430c2dcc36-proxy-tls\") pod \"success-200-isvc-41340-predictor-897fcb-v7pwp\" (UID: \"11bacc24-8aa8-4160-aa1d-6c430c2dcc36\") " pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" Apr 25 00:23:27.896550 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.894512 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-41340-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/11bacc24-8aa8-4160-aa1d-6c430c2dcc36-success-200-isvc-41340-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-41340-predictor-897fcb-v7pwp\" (UID: \"11bacc24-8aa8-4160-aa1d-6c430c2dcc36\") " pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" Apr 25 00:23:27.896550 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.896517 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11bacc24-8aa8-4160-aa1d-6c430c2dcc36-proxy-tls\") pod \"success-200-isvc-41340-predictor-897fcb-v7pwp\" (UID: \"11bacc24-8aa8-4160-aa1d-6c430c2dcc36\") " pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" Apr 25 00:23:27.901388 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.901363 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdw2p\" (UniqueName: \"kubernetes.io/projected/11bacc24-8aa8-4160-aa1d-6c430c2dcc36-kube-api-access-mdw2p\") pod \"success-200-isvc-41340-predictor-897fcb-v7pwp\" (UID: \"11bacc24-8aa8-4160-aa1d-6c430c2dcc36\") " pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" Apr 25 00:23:27.968147 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.968115 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" Apr 25 00:23:27.994535 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.994498 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-41340-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d-error-404-isvc-41340-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-41340-predictor-694448f7cf-n9r4s\" (UID: \"58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d\") " pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" Apr 25 00:23:27.994683 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.994559 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krcmk\" (UniqueName: \"kubernetes.io/projected/58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d-kube-api-access-krcmk\") pod \"error-404-isvc-41340-predictor-694448f7cf-n9r4s\" (UID: \"58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d\") " pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" Apr 25 00:23:27.994683 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.994659 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d-proxy-tls\") pod \"error-404-isvc-41340-predictor-694448f7cf-n9r4s\" (UID: \"58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d\") " pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" Apr 25 00:23:27.995279 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.995251 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-41340-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d-error-404-isvc-41340-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-41340-predictor-694448f7cf-n9r4s\" (UID: \"58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d\") " pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" Apr 25 00:23:27.997152 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:27.997130 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d-proxy-tls\") pod \"error-404-isvc-41340-predictor-694448f7cf-n9r4s\" (UID: \"58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d\") " pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" Apr 25 00:23:28.002639 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:28.002612 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krcmk\" (UniqueName: \"kubernetes.io/projected/58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d-kube-api-access-krcmk\") pod \"error-404-isvc-41340-predictor-694448f7cf-n9r4s\" (UID: \"58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d\") " pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" Apr 25 00:23:28.066835 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:28.066803 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" Apr 25 00:23:28.097113 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:28.097084 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp"] Apr 25 00:23:28.098553 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:23:28.098526 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11bacc24_8aa8_4160_aa1d_6c430c2dcc36.slice/crio-831431f517315bf863d300ab675b9bbe998033c09722447ffbf0ea976b812fa1 WatchSource:0}: Error finding container 831431f517315bf863d300ab675b9bbe998033c09722447ffbf0ea976b812fa1: Status 404 returned error can't find the container with id 831431f517315bf863d300ab675b9bbe998033c09722447ffbf0ea976b812fa1 Apr 25 00:23:28.194831 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:28.194771 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s"] Apr 25 00:23:28.203308 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:23:28.203287 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58dff5e2_5bd9_4d6c_8b89_f07148d0fa9d.slice/crio-9f182f326c87afbf6c95187acfc5c8b0fc5968e5286f7a80a4f1087f2071f20e WatchSource:0}: Error finding container 9f182f326c87afbf6c95187acfc5c8b0fc5968e5286f7a80a4f1087f2071f20e: Status 404 returned error can't find the container with id 9f182f326c87afbf6c95187acfc5c8b0fc5968e5286f7a80a4f1087f2071f20e Apr 25 00:23:28.533862 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:28.533830 2578 generic.go:358] "Generic (PLEG): container finished" podID="7b73d0fa-8973-4fa6-9893-f71b996c8fe0" containerID="fc9277bfe0796ed395fefbf74565aa6e8a7cf328794c5a1ac24c975062e8ff0e" exitCode=2 Apr 25 00:23:28.534300 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:28.533916 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" event={"ID":"7b73d0fa-8973-4fa6-9893-f71b996c8fe0","Type":"ContainerDied","Data":"fc9277bfe0796ed395fefbf74565aa6e8a7cf328794c5a1ac24c975062e8ff0e"} Apr 25 00:23:28.535501 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:28.535483 2578 generic.go:358] "Generic (PLEG): container finished" podID="e5fb70f9-c86b-41ae-a60b-60c0574f3ccf" containerID="33922d3d86674608630d8aec3fe8b685535da0b37c0374cdc9a6dc2e3b60b4a4" exitCode=2 Apr 25 00:23:28.535602 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:28.535547 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" event={"ID":"e5fb70f9-c86b-41ae-a60b-60c0574f3ccf","Type":"ContainerDied","Data":"33922d3d86674608630d8aec3fe8b685535da0b37c0374cdc9a6dc2e3b60b4a4"} Apr 25 00:23:28.537029 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:28.536998 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" event={"ID":"58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d","Type":"ContainerStarted","Data":"1aade316bfd22c2ddc4ecf93106f3b54cdc3c1d8f1442415f6b44891be186811"} Apr 25 00:23:28.537029 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:28.537024 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" 
event={"ID":"58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d","Type":"ContainerStarted","Data":"7e98e3397a04bef096cf09d6091ba6648b279322a2c5f0c4c1601a8a6290e4a0"} Apr 25 00:23:28.537208 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:28.537037 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" event={"ID":"58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d","Type":"ContainerStarted","Data":"9f182f326c87afbf6c95187acfc5c8b0fc5968e5286f7a80a4f1087f2071f20e"} Apr 25 00:23:28.537208 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:28.537133 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" Apr 25 00:23:28.538529 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:28.538511 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" event={"ID":"11bacc24-8aa8-4160-aa1d-6c430c2dcc36","Type":"ContainerStarted","Data":"cf1997773688a8b0fc055cc38618f25c9436fc72c442a35e0738063ceb0dcd8f"} Apr 25 00:23:28.538529 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:28.538530 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" event={"ID":"11bacc24-8aa8-4160-aa1d-6c430c2dcc36","Type":"ContainerStarted","Data":"f799011376f1290d60172b7e9ff9250ec5c3f8471f6f7474a07529cc1d0db027"} Apr 25 00:23:28.538677 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:28.538539 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" event={"ID":"11bacc24-8aa8-4160-aa1d-6c430c2dcc36","Type":"ContainerStarted","Data":"831431f517315bf863d300ab675b9bbe998033c09722447ffbf0ea976b812fa1"} Apr 25 00:23:28.538677 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:28.538672 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" Apr 25 00:23:28.556030 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:28.555987 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" podStartSLOduration=1.555973896 podStartE2EDuration="1.555973896s" podCreationTimestamp="2026-04-25 00:23:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:23:28.554051938 +0000 UTC m=+1791.188033884" watchObservedRunningTime="2026-04-25 00:23:28.555973896 +0000 UTC m=+1791.189955831" Apr 25 00:23:28.583883 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:28.583836 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" podStartSLOduration=1.583822676 podStartE2EDuration="1.583822676s" podCreationTimestamp="2026-04-25 00:23:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:23:28.582129282 +0000 UTC m=+1791.216111218" watchObservedRunningTime="2026-04-25 00:23:28.583822676 +0000 UTC m=+1791.217804612" Apr 25 00:23:29.542049 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:29.542002 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" Apr 25 00:23:29.542509 ip-10-0-129-98 
kubenswrapper[2578]: I0425 00:23:29.542204 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" Apr 25 00:23:29.543188 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:29.543153 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" podUID="11bacc24-8aa8-4160-aa1d-6c430c2dcc36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 25 00:23:29.543188 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:29.543173 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" podUID="58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 25 00:23:30.234132 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:30.234092 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" podUID="e5fb70f9-c86b-41ae-a60b-60c0574f3ccf" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.42:8643/healthz\": dial tcp 10.133.0.42:8643: connect: connection refused" Apr 25 00:23:30.240140 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:30.240113 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" podUID="e5fb70f9-c86b-41ae-a60b-60c0574f3ccf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 25 00:23:30.545262 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:30.545205 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" podUID="11bacc24-8aa8-4160-aa1d-6c430c2dcc36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 25 00:23:30.545718 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:30.545229 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" podUID="58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 25 00:23:30.962885 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:30.962862 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" Apr 25 00:23:31.121957 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.121929 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzb59\" (UniqueName: \"kubernetes.io/projected/e5fb70f9-c86b-41ae-a60b-60c0574f3ccf-kube-api-access-fzb59\") pod \"e5fb70f9-c86b-41ae-a60b-60c0574f3ccf\" (UID: \"e5fb70f9-c86b-41ae-a60b-60c0574f3ccf\") " Apr 25 00:23:31.122092 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.121977 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-7621b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e5fb70f9-c86b-41ae-a60b-60c0574f3ccf-success-200-isvc-7621b-kube-rbac-proxy-sar-config\") pod \"e5fb70f9-c86b-41ae-a60b-60c0574f3ccf\" (UID: \"e5fb70f9-c86b-41ae-a60b-60c0574f3ccf\") " Apr 25 00:23:31.122092 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.122014 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5fb70f9-c86b-41ae-a60b-60c0574f3ccf-proxy-tls\") pod \"e5fb70f9-c86b-41ae-a60b-60c0574f3ccf\" (UID: \"e5fb70f9-c86b-41ae-a60b-60c0574f3ccf\") " Apr 25 00:23:31.122375 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.122353 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5fb70f9-c86b-41ae-a60b-60c0574f3ccf-success-200-isvc-7621b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-7621b-kube-rbac-proxy-sar-config") pod "e5fb70f9-c86b-41ae-a60b-60c0574f3ccf" (UID: "e5fb70f9-c86b-41ae-a60b-60c0574f3ccf"). InnerVolumeSpecName "success-200-isvc-7621b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:23:31.124293 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.124262 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5fb70f9-c86b-41ae-a60b-60c0574f3ccf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e5fb70f9-c86b-41ae-a60b-60c0574f3ccf" (UID: "e5fb70f9-c86b-41ae-a60b-60c0574f3ccf"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:23:31.124459 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.124432 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5fb70f9-c86b-41ae-a60b-60c0574f3ccf-kube-api-access-fzb59" (OuterVolumeSpecName: "kube-api-access-fzb59") pod "e5fb70f9-c86b-41ae-a60b-60c0574f3ccf" (UID: "e5fb70f9-c86b-41ae-a60b-60c0574f3ccf"). InnerVolumeSpecName "kube-api-access-fzb59". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:23:31.125608 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.125591 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" Apr 25 00:23:31.222626 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.222592 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-7621b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7b73d0fa-8973-4fa6-9893-f71b996c8fe0-error-404-isvc-7621b-kube-rbac-proxy-sar-config\") pod \"7b73d0fa-8973-4fa6-9893-f71b996c8fe0\" (UID: \"7b73d0fa-8973-4fa6-9893-f71b996c8fe0\") " Apr 25 00:23:31.222855 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.222643 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b73d0fa-8973-4fa6-9893-f71b996c8fe0-proxy-tls\") pod \"7b73d0fa-8973-4fa6-9893-f71b996c8fe0\" (UID: \"7b73d0fa-8973-4fa6-9893-f71b996c8fe0\") " Apr 25 00:23:31.222855 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.222669 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6p58\" (UniqueName: \"kubernetes.io/projected/7b73d0fa-8973-4fa6-9893-f71b996c8fe0-kube-api-access-x6p58\") pod \"7b73d0fa-8973-4fa6-9893-f71b996c8fe0\" (UID: \"7b73d0fa-8973-4fa6-9893-f71b996c8fe0\") " Apr 25 00:23:31.222993 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.222948 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fzb59\" (UniqueName: \"kubernetes.io/projected/e5fb70f9-c86b-41ae-a60b-60c0574f3ccf-kube-api-access-fzb59\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:23:31.222993 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.222961 2578 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-7621b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e5fb70f9-c86b-41ae-a60b-60c0574f3ccf-success-200-isvc-7621b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:23:31.222993 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.222971 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5fb70f9-c86b-41ae-a60b-60c0574f3ccf-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:23:31.223120 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.223013 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b73d0fa-8973-4fa6-9893-f71b996c8fe0-error-404-isvc-7621b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-7621b-kube-rbac-proxy-sar-config") pod "7b73d0fa-8973-4fa6-9893-f71b996c8fe0" (UID: "7b73d0fa-8973-4fa6-9893-f71b996c8fe0"). InnerVolumeSpecName "error-404-isvc-7621b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:23:31.224916 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.224892 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b73d0fa-8973-4fa6-9893-f71b996c8fe0-kube-api-access-x6p58" (OuterVolumeSpecName: "kube-api-access-x6p58") pod "7b73d0fa-8973-4fa6-9893-f71b996c8fe0" (UID: "7b73d0fa-8973-4fa6-9893-f71b996c8fe0"). InnerVolumeSpecName "kube-api-access-x6p58". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:23:31.225060 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.225039 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b73d0fa-8973-4fa6-9893-f71b996c8fe0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7b73d0fa-8973-4fa6-9893-f71b996c8fe0" (UID: "7b73d0fa-8973-4fa6-9893-f71b996c8fe0"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:23:31.324191 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.324107 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-7621b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7b73d0fa-8973-4fa6-9893-f71b996c8fe0-error-404-isvc-7621b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:23:31.324191 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.324138 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b73d0fa-8973-4fa6-9893-f71b996c8fe0-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:23:31.324191 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.324151 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x6p58\" (UniqueName: \"kubernetes.io/projected/7b73d0fa-8973-4fa6-9893-f71b996c8fe0-kube-api-access-x6p58\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:23:31.500134 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.500094 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" podUID="b53e4e70-55e0-4f1f-b5db-743f089b31fb" containerName="splitter-graph-7621b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:23:31.553124 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.553083 2578 generic.go:358] "Generic (PLEG): container finished" podID="e5fb70f9-c86b-41ae-a60b-60c0574f3ccf" containerID="e354d4a29e23fcbbb847fe6cb8cf9c74d5afcc34cb00e1f6ef5bf514dc1428e3" exitCode=0 Apr 25 00:23:31.553606 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.553162 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" Apr 25 00:23:31.553606 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.553171 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" event={"ID":"e5fb70f9-c86b-41ae-a60b-60c0574f3ccf","Type":"ContainerDied","Data":"e354d4a29e23fcbbb847fe6cb8cf9c74d5afcc34cb00e1f6ef5bf514dc1428e3"} Apr 25 00:23:31.553606 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.553213 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf" event={"ID":"e5fb70f9-c86b-41ae-a60b-60c0574f3ccf","Type":"ContainerDied","Data":"0524f667cfff384027aa2fd418a9fb68bbd1ffb3ed1080a820158641e78c56f1"} Apr 25 00:23:31.553606 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.553229 2578 scope.go:117] "RemoveContainer" containerID="33922d3d86674608630d8aec3fe8b685535da0b37c0374cdc9a6dc2e3b60b4a4" Apr 25 00:23:31.555170 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.555144 2578 generic.go:358] "Generic (PLEG): container finished" podID="7b73d0fa-8973-4fa6-9893-f71b996c8fe0" containerID="40dedf9164970811c38cedead00e978199217561127f6f0d719d4cdf932520b8" exitCode=0 Apr 25 00:23:31.555271 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.555191 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" event={"ID":"7b73d0fa-8973-4fa6-9893-f71b996c8fe0","Type":"ContainerDied","Data":"40dedf9164970811c38cedead00e978199217561127f6f0d719d4cdf932520b8"} Apr 25 00:23:31.555271 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.555216 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" event={"ID":"7b73d0fa-8973-4fa6-9893-f71b996c8fe0","Type":"ContainerDied","Data":"6853df002fa621eeb354658a32ea9a9629fddfd956fcc63e32a52d2ee2a70b74"} Apr 25 00:23:31.555271 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.555227 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt" Apr 25 00:23:31.561956 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.561941 2578 scope.go:117] "RemoveContainer" containerID="e354d4a29e23fcbbb847fe6cb8cf9c74d5afcc34cb00e1f6ef5bf514dc1428e3" Apr 25 00:23:31.569678 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.569660 2578 scope.go:117] "RemoveContainer" containerID="33922d3d86674608630d8aec3fe8b685535da0b37c0374cdc9a6dc2e3b60b4a4" Apr 25 00:23:31.569937 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:23:31.569920 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33922d3d86674608630d8aec3fe8b685535da0b37c0374cdc9a6dc2e3b60b4a4\": container with ID starting with 33922d3d86674608630d8aec3fe8b685535da0b37c0374cdc9a6dc2e3b60b4a4 not found: ID does not exist" containerID="33922d3d86674608630d8aec3fe8b685535da0b37c0374cdc9a6dc2e3b60b4a4" Apr 25 00:23:31.569995 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.569948 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33922d3d86674608630d8aec3fe8b685535da0b37c0374cdc9a6dc2e3b60b4a4"} err="failed to get container status \"33922d3d86674608630d8aec3fe8b685535da0b37c0374cdc9a6dc2e3b60b4a4\": rpc error: code = NotFound desc = could not find container \"33922d3d86674608630d8aec3fe8b685535da0b37c0374cdc9a6dc2e3b60b4a4\": container with ID starting with 33922d3d86674608630d8aec3fe8b685535da0b37c0374cdc9a6dc2e3b60b4a4 not found: ID does not exist" Apr 25 00:23:31.569995 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.569965 2578 scope.go:117] "RemoveContainer" containerID="e354d4a29e23fcbbb847fe6cb8cf9c74d5afcc34cb00e1f6ef5bf514dc1428e3" Apr 25 00:23:31.570173 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:23:31.570156 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e354d4a29e23fcbbb847fe6cb8cf9c74d5afcc34cb00e1f6ef5bf514dc1428e3\": container with ID starting with e354d4a29e23fcbbb847fe6cb8cf9c74d5afcc34cb00e1f6ef5bf514dc1428e3 not found: ID does not exist" containerID="e354d4a29e23fcbbb847fe6cb8cf9c74d5afcc34cb00e1f6ef5bf514dc1428e3" Apr 25 00:23:31.570209 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.570178 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e354d4a29e23fcbbb847fe6cb8cf9c74d5afcc34cb00e1f6ef5bf514dc1428e3"} err="failed to get container status \"e354d4a29e23fcbbb847fe6cb8cf9c74d5afcc34cb00e1f6ef5bf514dc1428e3\": rpc error: code = NotFound desc = could not find container \"e354d4a29e23fcbbb847fe6cb8cf9c74d5afcc34cb00e1f6ef5bf514dc1428e3\": container with ID starting with e354d4a29e23fcbbb847fe6cb8cf9c74d5afcc34cb00e1f6ef5bf514dc1428e3 not found: ID does not exist" Apr 25 00:23:31.570209 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.570192 2578 scope.go:117] "RemoveContainer" containerID="fc9277bfe0796ed395fefbf74565aa6e8a7cf328794c5a1ac24c975062e8ff0e" Apr 25 00:23:31.577198 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.577179 2578 scope.go:117] "RemoveContainer" containerID="40dedf9164970811c38cedead00e978199217561127f6f0d719d4cdf932520b8" Apr 25 00:23:31.579894 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.579871 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf"] Apr 25 00:23:31.583512 ip-10-0-129-98 kubenswrapper[2578]: I0425 
00:23:31.583486 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7621b-predictor-5f89d47788-7mplf"] Apr 25 00:23:31.586072 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.586057 2578 scope.go:117] "RemoveContainer" containerID="fc9277bfe0796ed395fefbf74565aa6e8a7cf328794c5a1ac24c975062e8ff0e" Apr 25 00:23:31.586352 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:23:31.586331 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc9277bfe0796ed395fefbf74565aa6e8a7cf328794c5a1ac24c975062e8ff0e\": container with ID starting with fc9277bfe0796ed395fefbf74565aa6e8a7cf328794c5a1ac24c975062e8ff0e not found: ID does not exist" containerID="fc9277bfe0796ed395fefbf74565aa6e8a7cf328794c5a1ac24c975062e8ff0e" Apr 25 00:23:31.586404 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.586363 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc9277bfe0796ed395fefbf74565aa6e8a7cf328794c5a1ac24c975062e8ff0e"} err="failed to get container status \"fc9277bfe0796ed395fefbf74565aa6e8a7cf328794c5a1ac24c975062e8ff0e\": rpc error: code = NotFound desc = could not find container \"fc9277bfe0796ed395fefbf74565aa6e8a7cf328794c5a1ac24c975062e8ff0e\": container with ID starting with fc9277bfe0796ed395fefbf74565aa6e8a7cf328794c5a1ac24c975062e8ff0e not found: ID does not exist" Apr 25 00:23:31.586404 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.586381 2578 scope.go:117] "RemoveContainer" containerID="40dedf9164970811c38cedead00e978199217561127f6f0d719d4cdf932520b8" Apr 25 00:23:31.586639 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:23:31.586622 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40dedf9164970811c38cedead00e978199217561127f6f0d719d4cdf932520b8\": container with ID starting with 40dedf9164970811c38cedead00e978199217561127f6f0d719d4cdf932520b8 not found: ID does not exist" containerID="40dedf9164970811c38cedead00e978199217561127f6f0d719d4cdf932520b8" Apr 25 00:23:31.586686 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.586643 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40dedf9164970811c38cedead00e978199217561127f6f0d719d4cdf932520b8"} err="failed to get container status \"40dedf9164970811c38cedead00e978199217561127f6f0d719d4cdf932520b8\": rpc error: code = NotFound desc = could not find container \"40dedf9164970811c38cedead00e978199217561127f6f0d719d4cdf932520b8\": container with ID starting with 40dedf9164970811c38cedead00e978199217561127f6f0d719d4cdf932520b8 not found: ID does not exist" Apr 25 00:23:31.592927 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.592903 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt"] Apr 25 00:23:31.597487 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.597465 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7621b-predictor-6d5bd65f5f-bjtpt"] Apr 25 00:23:31.912069 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.911993 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b73d0fa-8973-4fa6-9893-f71b996c8fe0" path="/var/lib/kubelet/pods/7b73d0fa-8973-4fa6-9893-f71b996c8fe0/volumes" Apr 25 00:23:31.912459 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:31.912445 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="e5fb70f9-c86b-41ae-a60b-60c0574f3ccf" path="/var/lib/kubelet/pods/e5fb70f9-c86b-41ae-a60b-60c0574f3ccf/volumes" Apr 25 00:23:35.550408 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:35.550378 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" Apr 25 00:23:35.550808 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:35.550756 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" Apr 25 00:23:35.550808 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:35.550789 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" podUID="58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 25 00:23:35.551145 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:35.551123 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" podUID="11bacc24-8aa8-4160-aa1d-6c430c2dcc36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 25 00:23:36.500214 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:36.500174 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" podUID="b53e4e70-55e0-4f1f-b5db-743f089b31fb" containerName="splitter-graph-7621b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:23:37.957730 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:37.957703 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/0.log" Apr 25 00:23:37.960354 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:37.960331 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/0.log" Apr 25 00:23:41.501765 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:41.501723 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" podUID="b53e4e70-55e0-4f1f-b5db-743f089b31fb" containerName="splitter-graph-7621b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:23:41.502162 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:41.501839 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" Apr 25 00:23:45.551317 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:45.551280 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" podUID="11bacc24-8aa8-4160-aa1d-6c430c2dcc36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 25 00:23:45.551774 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:45.551280 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" podUID="58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 25 00:23:46.500004 ip-10-0-129-98 
kubenswrapper[2578]: I0425 00:23:46.499967 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" podUID="b53e4e70-55e0-4f1f-b5db-743f089b31fb" containerName="splitter-graph-7621b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:23:51.500049 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:51.500010 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" podUID="b53e4e70-55e0-4f1f-b5db-743f089b31fb" containerName="splitter-graph-7621b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:23:55.551054 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:55.551008 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" podUID="58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 25 00:23:55.551449 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:55.551144 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" podUID="11bacc24-8aa8-4160-aa1d-6c430c2dcc36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 25 00:23:56.500551 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:56.500514 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" podUID="b53e4e70-55e0-4f1f-b5db-743f089b31fb" containerName="splitter-graph-7621b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:23:57.541917 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:23:57.541871 2578 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb53e4e70_55e0_4f1f_b5db_743f089b31fb.slice/crio-conmon-9afac5dc5c2bc7f64355a4f52ba4ed82b5e29b359acba7e6e19195fdad15204b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb53e4e70_55e0_4f1f_b5db_743f089b31fb.slice/crio-9afac5dc5c2bc7f64355a4f52ba4ed82b5e29b359acba7e6e19195fdad15204b.scope\": RecentStats: unable to find data in memory cache]" Apr 25 00:23:57.542306 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:23:57.541934 2578 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb53e4e70_55e0_4f1f_b5db_743f089b31fb.slice/crio-conmon-9afac5dc5c2bc7f64355a4f52ba4ed82b5e29b359acba7e6e19195fdad15204b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb53e4e70_55e0_4f1f_b5db_743f089b31fb.slice/crio-9afac5dc5c2bc7f64355a4f52ba4ed82b5e29b359acba7e6e19195fdad15204b.scope\": RecentStats: unable to find data in memory cache]" Apr 25 00:23:57.542306 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:23:57.541957 2578 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb53e4e70_55e0_4f1f_b5db_743f089b31fb.slice/crio-conmon-9afac5dc5c2bc7f64355a4f52ba4ed82b5e29b359acba7e6e19195fdad15204b.scope\": RecentStats: unable to find data in 
memory cache]" Apr 25 00:23:57.542306 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:23:57.541980 2578 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb53e4e70_55e0_4f1f_b5db_743f089b31fb.slice/crio-conmon-9afac5dc5c2bc7f64355a4f52ba4ed82b5e29b359acba7e6e19195fdad15204b.scope\": RecentStats: unable to find data in memory cache]" Apr 25 00:23:57.542306 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:23:57.541980 2578 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb53e4e70_55e0_4f1f_b5db_743f089b31fb.slice/crio-9afac5dc5c2bc7f64355a4f52ba4ed82b5e29b359acba7e6e19195fdad15204b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb53e4e70_55e0_4f1f_b5db_743f089b31fb.slice/crio-conmon-9afac5dc5c2bc7f64355a4f52ba4ed82b5e29b359acba7e6e19195fdad15204b.scope\": RecentStats: unable to find data in memory cache]" Apr 25 00:23:57.638744 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:57.638706 2578 generic.go:358] "Generic (PLEG): container finished" podID="b53e4e70-55e0-4f1f-b5db-743f089b31fb" containerID="9afac5dc5c2bc7f64355a4f52ba4ed82b5e29b359acba7e6e19195fdad15204b" exitCode=0 Apr 25 00:23:57.638892 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:57.638780 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" event={"ID":"b53e4e70-55e0-4f1f-b5db-743f089b31fb","Type":"ContainerDied","Data":"9afac5dc5c2bc7f64355a4f52ba4ed82b5e29b359acba7e6e19195fdad15204b"} Apr 25 00:23:57.681567 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:57.681538 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" Apr 25 00:23:57.748479 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:57.748446 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b53e4e70-55e0-4f1f-b5db-743f089b31fb-proxy-tls\") pod \"b53e4e70-55e0-4f1f-b5db-743f089b31fb\" (UID: \"b53e4e70-55e0-4f1f-b5db-743f089b31fb\") " Apr 25 00:23:57.748668 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:57.748507 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b53e4e70-55e0-4f1f-b5db-743f089b31fb-openshift-service-ca-bundle\") pod \"b53e4e70-55e0-4f1f-b5db-743f089b31fb\" (UID: \"b53e4e70-55e0-4f1f-b5db-743f089b31fb\") " Apr 25 00:23:57.748969 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:57.748934 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b53e4e70-55e0-4f1f-b5db-743f089b31fb-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "b53e4e70-55e0-4f1f-b5db-743f089b31fb" (UID: "b53e4e70-55e0-4f1f-b5db-743f089b31fb"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:23:57.750854 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:57.750831 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b53e4e70-55e0-4f1f-b5db-743f089b31fb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b53e4e70-55e0-4f1f-b5db-743f089b31fb" (UID: "b53e4e70-55e0-4f1f-b5db-743f089b31fb"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:23:57.849632 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:57.849542 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b53e4e70-55e0-4f1f-b5db-743f089b31fb-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:23:57.849632 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:57.849576 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b53e4e70-55e0-4f1f-b5db-743f089b31fb-openshift-service-ca-bundle\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:23:58.642881 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:58.642847 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" Apr 25 00:23:58.643327 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:58.642847 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd" event={"ID":"b53e4e70-55e0-4f1f-b5db-743f089b31fb","Type":"ContainerDied","Data":"a25d54d9f168db1ad6ec5d3b65150f24d4a91e51653879667195045d70e84386"} Apr 25 00:23:58.643327 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:58.642972 2578 scope.go:117] "RemoveContainer" containerID="9afac5dc5c2bc7f64355a4f52ba4ed82b5e29b359acba7e6e19195fdad15204b" Apr 25 00:23:58.658112 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:58.658088 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd"] Apr 25 00:23:58.663751 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:58.663729 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-7621b-7fdfb8f889-6blhd"] Apr 25 00:23:59.912475 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:23:59.912437 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b53e4e70-55e0-4f1f-b5db-743f089b31fb" path="/var/lib/kubelet/pods/b53e4e70-55e0-4f1f-b5db-743f089b31fb/volumes" Apr 25 00:24:05.550847 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:05.550813 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" podUID="58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 25 00:24:05.551248 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:05.551222 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" podUID="11bacc24-8aa8-4160-aa1d-6c430c2dcc36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 25 00:24:15.552296 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:15.552267 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" Apr 25 
00:24:15.552734 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:15.552323 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" Apr 25 00:24:27.716640 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.716603 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss"] Apr 25 00:24:27.717085 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.716969 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5fb70f9-c86b-41ae-a60b-60c0574f3ccf" containerName="kube-rbac-proxy" Apr 25 00:24:27.717085 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.716980 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5fb70f9-c86b-41ae-a60b-60c0574f3ccf" containerName="kube-rbac-proxy" Apr 25 00:24:27.717085 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.716993 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b73d0fa-8973-4fa6-9893-f71b996c8fe0" containerName="kserve-container" Apr 25 00:24:27.717085 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.716998 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b73d0fa-8973-4fa6-9893-f71b996c8fe0" containerName="kserve-container" Apr 25 00:24:27.717085 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.717006 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b53e4e70-55e0-4f1f-b5db-743f089b31fb" containerName="splitter-graph-7621b" Apr 25 00:24:27.717085 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.717011 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b53e4e70-55e0-4f1f-b5db-743f089b31fb" containerName="splitter-graph-7621b" Apr 25 00:24:27.717085 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.717018 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5fb70f9-c86b-41ae-a60b-60c0574f3ccf" containerName="kserve-container" Apr 25 00:24:27.717085 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.717024 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5fb70f9-c86b-41ae-a60b-60c0574f3ccf" containerName="kserve-container" Apr 25 00:24:27.717085 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.717061 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b73d0fa-8973-4fa6-9893-f71b996c8fe0" containerName="kube-rbac-proxy" Apr 25 00:24:27.717085 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.717068 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b73d0fa-8973-4fa6-9893-f71b996c8fe0" containerName="kube-rbac-proxy" Apr 25 00:24:27.717428 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.717124 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b73d0fa-8973-4fa6-9893-f71b996c8fe0" containerName="kube-rbac-proxy" Apr 25 00:24:27.717428 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.717133 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b73d0fa-8973-4fa6-9893-f71b996c8fe0" containerName="kserve-container" Apr 25 00:24:27.717428 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.717140 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b53e4e70-55e0-4f1f-b5db-743f089b31fb" containerName="splitter-graph-7621b" Apr 25 00:24:27.717428 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.717148 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5fb70f9-c86b-41ae-a60b-60c0574f3ccf" 
containerName="kserve-container" Apr 25 00:24:27.717428 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.717159 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5fb70f9-c86b-41ae-a60b-60c0574f3ccf" containerName="kube-rbac-proxy" Apr 25 00:24:27.720340 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.720325 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" Apr 25 00:24:27.723601 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.723580 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-41340-serving-cert\"" Apr 25 00:24:27.723727 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.723628 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-41340-kube-rbac-proxy-sar-config\"" Apr 25 00:24:27.739512 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.739488 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss"] Apr 25 00:24:27.815630 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.815590 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfff2972-8a37-40cd-a118-1945884a9ae8-openshift-service-ca-bundle\") pod \"splitter-graph-41340-79656bc944-xx8ss\" (UID: \"bfff2972-8a37-40cd-a118-1945884a9ae8\") " pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" Apr 25 00:24:27.815630 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.815631 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bfff2972-8a37-40cd-a118-1945884a9ae8-proxy-tls\") pod \"splitter-graph-41340-79656bc944-xx8ss\" (UID: \"bfff2972-8a37-40cd-a118-1945884a9ae8\") " pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" Apr 25 00:24:27.916611 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.916580 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bfff2972-8a37-40cd-a118-1945884a9ae8-proxy-tls\") pod \"splitter-graph-41340-79656bc944-xx8ss\" (UID: \"bfff2972-8a37-40cd-a118-1945884a9ae8\") " pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" Apr 25 00:24:27.916782 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.916716 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfff2972-8a37-40cd-a118-1945884a9ae8-openshift-service-ca-bundle\") pod \"splitter-graph-41340-79656bc944-xx8ss\" (UID: \"bfff2972-8a37-40cd-a118-1945884a9ae8\") " pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" Apr 25 00:24:27.916782 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:24:27.916734 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-41340-serving-cert: secret "splitter-graph-41340-serving-cert" not found Apr 25 00:24:27.916901 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:24:27.916814 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfff2972-8a37-40cd-a118-1945884a9ae8-proxy-tls podName:bfff2972-8a37-40cd-a118-1945884a9ae8 nodeName:}" failed. No retries permitted until 2026-04-25 00:24:28.416792912 +0000 UTC m=+1851.050774832 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/bfff2972-8a37-40cd-a118-1945884a9ae8-proxy-tls") pod "splitter-graph-41340-79656bc944-xx8ss" (UID: "bfff2972-8a37-40cd-a118-1945884a9ae8") : secret "splitter-graph-41340-serving-cert" not found Apr 25 00:24:27.917352 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:27.917334 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfff2972-8a37-40cd-a118-1945884a9ae8-openshift-service-ca-bundle\") pod \"splitter-graph-41340-79656bc944-xx8ss\" (UID: \"bfff2972-8a37-40cd-a118-1945884a9ae8\") " pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" Apr 25 00:24:28.422133 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:28.422095 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bfff2972-8a37-40cd-a118-1945884a9ae8-proxy-tls\") pod \"splitter-graph-41340-79656bc944-xx8ss\" (UID: \"bfff2972-8a37-40cd-a118-1945884a9ae8\") " pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" Apr 25 00:24:28.424690 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:28.424667 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bfff2972-8a37-40cd-a118-1945884a9ae8-proxy-tls\") pod \"splitter-graph-41340-79656bc944-xx8ss\" (UID: \"bfff2972-8a37-40cd-a118-1945884a9ae8\") " pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" Apr 25 00:24:28.630348 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:28.630314 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" Apr 25 00:24:28.753163 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:28.753115 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss"] Apr 25 00:24:28.755906 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:24:28.755877 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfff2972_8a37_40cd_a118_1945884a9ae8.slice/crio-af6fc3c3a85486753ffbaef13127f858585b5577411d8d3468c44612c5ff52ec WatchSource:0}: Error finding container af6fc3c3a85486753ffbaef13127f858585b5577411d8d3468c44612c5ff52ec: Status 404 returned error can't find the container with id af6fc3c3a85486753ffbaef13127f858585b5577411d8d3468c44612c5ff52ec Apr 25 00:24:29.741819 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:29.741779 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" event={"ID":"bfff2972-8a37-40cd-a118-1945884a9ae8","Type":"ContainerStarted","Data":"d456472af4c122cefe4b155c607723c1dc8f807cd7f0f26efa83fa8487ca75e8"} Apr 25 00:24:29.742013 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:29.741826 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" event={"ID":"bfff2972-8a37-40cd-a118-1945884a9ae8","Type":"ContainerStarted","Data":"af6fc3c3a85486753ffbaef13127f858585b5577411d8d3468c44612c5ff52ec"} Apr 25 00:24:29.742013 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:29.741856 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" Apr 25 00:24:29.758217 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:29.758172 
2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" podStartSLOduration=2.758156819 podStartE2EDuration="2.758156819s" podCreationTimestamp="2026-04-25 00:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:24:29.756208082 +0000 UTC m=+1852.390190018" watchObservedRunningTime="2026-04-25 00:24:29.758156819 +0000 UTC m=+1852.392138754" Apr 25 00:24:35.750788 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:24:35.750760 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" Apr 25 00:28:37.981224 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:28:37.981197 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/0.log" Apr 25 00:28:37.984466 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:28:37.984444 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/0.log" Apr 25 00:32:42.410369 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:42.410337 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss"] Apr 25 00:32:42.410868 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:42.410598 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" podUID="bfff2972-8a37-40cd-a118-1945884a9ae8" containerName="splitter-graph-41340" containerID="cri-o://d456472af4c122cefe4b155c607723c1dc8f807cd7f0f26efa83fa8487ca75e8" gracePeriod=30 Apr 25 00:32:42.495020 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:42.494980 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp"] Apr 25 00:32:42.495299 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:42.495252 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" podUID="11bacc24-8aa8-4160-aa1d-6c430c2dcc36" containerName="kserve-container" containerID="cri-o://f799011376f1290d60172b7e9ff9250ec5c3f8471f6f7474a07529cc1d0db027" gracePeriod=30 Apr 25 00:32:42.495400 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:42.495284 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" podUID="11bacc24-8aa8-4160-aa1d-6c430c2dcc36" containerName="kube-rbac-proxy" containerID="cri-o://cf1997773688a8b0fc055cc38618f25c9436fc72c442a35e0738063ceb0dcd8f" gracePeriod=30 Apr 25 00:32:42.520207 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:42.520175 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s"] Apr 25 00:32:42.520511 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:42.520481 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" podUID="58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d" containerName="kserve-container" containerID="cri-o://7e98e3397a04bef096cf09d6091ba6648b279322a2c5f0c4c1601a8a6290e4a0" gracePeriod=30 Apr 25 00:32:42.520639 ip-10-0-129-98 kubenswrapper[2578]: I0425 
00:32:42.520506 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" podUID="58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d" containerName="kube-rbac-proxy" containerID="cri-o://1aade316bfd22c2ddc4ecf93106f3b54cdc3c1d8f1442415f6b44891be186811" gracePeriod=30 Apr 25 00:32:43.263287 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:43.263251 2578 generic.go:358] "Generic (PLEG): container finished" podID="58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d" containerID="1aade316bfd22c2ddc4ecf93106f3b54cdc3c1d8f1442415f6b44891be186811" exitCode=2 Apr 25 00:32:43.263499 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:43.263320 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" event={"ID":"58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d","Type":"ContainerDied","Data":"1aade316bfd22c2ddc4ecf93106f3b54cdc3c1d8f1442415f6b44891be186811"} Apr 25 00:32:43.264843 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:43.264819 2578 generic.go:358] "Generic (PLEG): container finished" podID="11bacc24-8aa8-4160-aa1d-6c430c2dcc36" containerID="cf1997773688a8b0fc055cc38618f25c9436fc72c442a35e0738063ceb0dcd8f" exitCode=2 Apr 25 00:32:43.264948 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:43.264878 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" event={"ID":"11bacc24-8aa8-4160-aa1d-6c430c2dcc36","Type":"ContainerDied","Data":"cf1997773688a8b0fc055cc38618f25c9436fc72c442a35e0738063ceb0dcd8f"} Apr 25 00:32:45.546219 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:45.546175 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" podUID="58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.49:8643/healthz\": dial tcp 10.133.0.49:8643: connect: connection refused" Apr 25 00:32:45.546613 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:45.546179 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" podUID="11bacc24-8aa8-4160-aa1d-6c430c2dcc36" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.48:8643/healthz\": dial tcp 10.133.0.48:8643: connect: connection refused" Apr 25 00:32:45.551113 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:45.551072 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" podUID="58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused" Apr 25 00:32:45.551238 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:45.551115 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" podUID="11bacc24-8aa8-4160-aa1d-6c430c2dcc36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 25 00:32:45.749842 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:45.749795 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" podUID="bfff2972-8a37-40cd-a118-1945884a9ae8" containerName="splitter-graph-41340" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 
00:32:45.760860 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:45.760835 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" Apr 25 00:32:45.869769 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:45.869742 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-41340-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/11bacc24-8aa8-4160-aa1d-6c430c2dcc36-success-200-isvc-41340-kube-rbac-proxy-sar-config\") pod \"11bacc24-8aa8-4160-aa1d-6c430c2dcc36\" (UID: \"11bacc24-8aa8-4160-aa1d-6c430c2dcc36\") " Apr 25 00:32:45.869885 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:45.869813 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11bacc24-8aa8-4160-aa1d-6c430c2dcc36-proxy-tls\") pod \"11bacc24-8aa8-4160-aa1d-6c430c2dcc36\" (UID: \"11bacc24-8aa8-4160-aa1d-6c430c2dcc36\") " Apr 25 00:32:45.869885 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:45.869873 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdw2p\" (UniqueName: \"kubernetes.io/projected/11bacc24-8aa8-4160-aa1d-6c430c2dcc36-kube-api-access-mdw2p\") pod \"11bacc24-8aa8-4160-aa1d-6c430c2dcc36\" (UID: \"11bacc24-8aa8-4160-aa1d-6c430c2dcc36\") " Apr 25 00:32:45.870103 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:45.870072 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11bacc24-8aa8-4160-aa1d-6c430c2dcc36-success-200-isvc-41340-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-41340-kube-rbac-proxy-sar-config") pod "11bacc24-8aa8-4160-aa1d-6c430c2dcc36" (UID: "11bacc24-8aa8-4160-aa1d-6c430c2dcc36"). InnerVolumeSpecName "success-200-isvc-41340-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:32:45.871908 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:45.871883 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11bacc24-8aa8-4160-aa1d-6c430c2dcc36-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "11bacc24-8aa8-4160-aa1d-6c430c2dcc36" (UID: "11bacc24-8aa8-4160-aa1d-6c430c2dcc36"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:32:45.872033 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:45.872015 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11bacc24-8aa8-4160-aa1d-6c430c2dcc36-kube-api-access-mdw2p" (OuterVolumeSpecName: "kube-api-access-mdw2p") pod "11bacc24-8aa8-4160-aa1d-6c430c2dcc36" (UID: "11bacc24-8aa8-4160-aa1d-6c430c2dcc36"). InnerVolumeSpecName "kube-api-access-mdw2p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:32:45.873670 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:45.873653 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" Apr 25 00:32:45.970797 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:45.970747 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-41340-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d-error-404-isvc-41340-kube-rbac-proxy-sar-config\") pod \"58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d\" (UID: \"58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d\") " Apr 25 00:32:45.970996 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:45.970819 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krcmk\" (UniqueName: \"kubernetes.io/projected/58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d-kube-api-access-krcmk\") pod \"58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d\" (UID: \"58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d\") " Apr 25 00:32:45.970996 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:45.970942 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d-proxy-tls\") pod \"58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d\" (UID: \"58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d\") " Apr 25 00:32:45.971190 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:45.971174 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11bacc24-8aa8-4160-aa1d-6c430c2dcc36-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:32:45.971247 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:45.971180 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d-error-404-isvc-41340-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-41340-kube-rbac-proxy-sar-config") pod "58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d" (UID: "58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d"). InnerVolumeSpecName "error-404-isvc-41340-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:32:45.971247 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:45.971196 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mdw2p\" (UniqueName: \"kubernetes.io/projected/11bacc24-8aa8-4160-aa1d-6c430c2dcc36-kube-api-access-mdw2p\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:32:45.971337 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:45.971252 2578 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-41340-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/11bacc24-8aa8-4160-aa1d-6c430c2dcc36-success-200-isvc-41340-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:32:45.973124 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:45.973100 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d-kube-api-access-krcmk" (OuterVolumeSpecName: "kube-api-access-krcmk") pod "58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d" (UID: "58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d"). InnerVolumeSpecName "kube-api-access-krcmk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:32:45.973252 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:45.973142 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d" (UID: "58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:32:46.072663 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.072633 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:32:46.072663 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.072658 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-41340-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d-error-404-isvc-41340-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:32:46.072663 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.072669 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-krcmk\" (UniqueName: \"kubernetes.io/projected/58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d-kube-api-access-krcmk\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:32:46.275554 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.275520 2578 generic.go:358] "Generic (PLEG): container finished" podID="58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d" containerID="7e98e3397a04bef096cf09d6091ba6648b279322a2c5f0c4c1601a8a6290e4a0" exitCode=0 Apr 25 00:32:46.275712 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.275596 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" event={"ID":"58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d","Type":"ContainerDied","Data":"7e98e3397a04bef096cf09d6091ba6648b279322a2c5f0c4c1601a8a6290e4a0"} Apr 25 00:32:46.275712 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.275599 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" Apr 25 00:32:46.275712 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.275632 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s" event={"ID":"58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d","Type":"ContainerDied","Data":"9f182f326c87afbf6c95187acfc5c8b0fc5968e5286f7a80a4f1087f2071f20e"} Apr 25 00:32:46.275712 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.275653 2578 scope.go:117] "RemoveContainer" containerID="1aade316bfd22c2ddc4ecf93106f3b54cdc3c1d8f1442415f6b44891be186811" Apr 25 00:32:46.277059 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.277037 2578 generic.go:358] "Generic (PLEG): container finished" podID="11bacc24-8aa8-4160-aa1d-6c430c2dcc36" containerID="f799011376f1290d60172b7e9ff9250ec5c3f8471f6f7474a07529cc1d0db027" exitCode=0 Apr 25 00:32:46.277158 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.277098 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" event={"ID":"11bacc24-8aa8-4160-aa1d-6c430c2dcc36","Type":"ContainerDied","Data":"f799011376f1290d60172b7e9ff9250ec5c3f8471f6f7474a07529cc1d0db027"} Apr 25 00:32:46.277158 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.277119 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" event={"ID":"11bacc24-8aa8-4160-aa1d-6c430c2dcc36","Type":"ContainerDied","Data":"831431f517315bf863d300ab675b9bbe998033c09722447ffbf0ea976b812fa1"} Apr 25 00:32:46.277158 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.277123 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp" Apr 25 00:32:46.284999 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.284981 2578 scope.go:117] "RemoveContainer" containerID="7e98e3397a04bef096cf09d6091ba6648b279322a2c5f0c4c1601a8a6290e4a0" Apr 25 00:32:46.292066 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.292050 2578 scope.go:117] "RemoveContainer" containerID="1aade316bfd22c2ddc4ecf93106f3b54cdc3c1d8f1442415f6b44891be186811" Apr 25 00:32:46.292302 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:32:46.292280 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aade316bfd22c2ddc4ecf93106f3b54cdc3c1d8f1442415f6b44891be186811\": container with ID starting with 1aade316bfd22c2ddc4ecf93106f3b54cdc3c1d8f1442415f6b44891be186811 not found: ID does not exist" containerID="1aade316bfd22c2ddc4ecf93106f3b54cdc3c1d8f1442415f6b44891be186811" Apr 25 00:32:46.292354 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.292309 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aade316bfd22c2ddc4ecf93106f3b54cdc3c1d8f1442415f6b44891be186811"} err="failed to get container status \"1aade316bfd22c2ddc4ecf93106f3b54cdc3c1d8f1442415f6b44891be186811\": rpc error: code = NotFound desc = could not find container \"1aade316bfd22c2ddc4ecf93106f3b54cdc3c1d8f1442415f6b44891be186811\": container with ID starting with 1aade316bfd22c2ddc4ecf93106f3b54cdc3c1d8f1442415f6b44891be186811 not found: ID does not exist" Apr 25 00:32:46.292354 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.292326 2578 scope.go:117] "RemoveContainer" 
containerID="7e98e3397a04bef096cf09d6091ba6648b279322a2c5f0c4c1601a8a6290e4a0" Apr 25 00:32:46.292584 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:32:46.292567 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e98e3397a04bef096cf09d6091ba6648b279322a2c5f0c4c1601a8a6290e4a0\": container with ID starting with 7e98e3397a04bef096cf09d6091ba6648b279322a2c5f0c4c1601a8a6290e4a0 not found: ID does not exist" containerID="7e98e3397a04bef096cf09d6091ba6648b279322a2c5f0c4c1601a8a6290e4a0" Apr 25 00:32:46.292658 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.292589 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e98e3397a04bef096cf09d6091ba6648b279322a2c5f0c4c1601a8a6290e4a0"} err="failed to get container status \"7e98e3397a04bef096cf09d6091ba6648b279322a2c5f0c4c1601a8a6290e4a0\": rpc error: code = NotFound desc = could not find container \"7e98e3397a04bef096cf09d6091ba6648b279322a2c5f0c4c1601a8a6290e4a0\": container with ID starting with 7e98e3397a04bef096cf09d6091ba6648b279322a2c5f0c4c1601a8a6290e4a0 not found: ID does not exist" Apr 25 00:32:46.292658 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.292603 2578 scope.go:117] "RemoveContainer" containerID="cf1997773688a8b0fc055cc38618f25c9436fc72c442a35e0738063ceb0dcd8f" Apr 25 00:32:46.294976 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.294958 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp"] Apr 25 00:32:46.298246 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.298227 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-41340-predictor-897fcb-v7pwp"] Apr 25 00:32:46.302468 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.302448 2578 scope.go:117] "RemoveContainer" containerID="f799011376f1290d60172b7e9ff9250ec5c3f8471f6f7474a07529cc1d0db027" Apr 25 00:32:46.307758 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.307737 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s"] Apr 25 00:32:46.310522 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.310501 2578 scope.go:117] "RemoveContainer" containerID="cf1997773688a8b0fc055cc38618f25c9436fc72c442a35e0738063ceb0dcd8f" Apr 25 00:32:46.310888 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:32:46.310869 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf1997773688a8b0fc055cc38618f25c9436fc72c442a35e0738063ceb0dcd8f\": container with ID starting with cf1997773688a8b0fc055cc38618f25c9436fc72c442a35e0738063ceb0dcd8f not found: ID does not exist" containerID="cf1997773688a8b0fc055cc38618f25c9436fc72c442a35e0738063ceb0dcd8f" Apr 25 00:32:46.310968 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.310896 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf1997773688a8b0fc055cc38618f25c9436fc72c442a35e0738063ceb0dcd8f"} err="failed to get container status \"cf1997773688a8b0fc055cc38618f25c9436fc72c442a35e0738063ceb0dcd8f\": rpc error: code = NotFound desc = could not find container \"cf1997773688a8b0fc055cc38618f25c9436fc72c442a35e0738063ceb0dcd8f\": container with ID starting with cf1997773688a8b0fc055cc38618f25c9436fc72c442a35e0738063ceb0dcd8f not found: ID does not exist" Apr 25 00:32:46.310968 ip-10-0-129-98 kubenswrapper[2578]: I0425 
00:32:46.310914 2578 scope.go:117] "RemoveContainer" containerID="f799011376f1290d60172b7e9ff9250ec5c3f8471f6f7474a07529cc1d0db027" Apr 25 00:32:46.311169 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:32:46.311151 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f799011376f1290d60172b7e9ff9250ec5c3f8471f6f7474a07529cc1d0db027\": container with ID starting with f799011376f1290d60172b7e9ff9250ec5c3f8471f6f7474a07529cc1d0db027 not found: ID does not exist" containerID="f799011376f1290d60172b7e9ff9250ec5c3f8471f6f7474a07529cc1d0db027" Apr 25 00:32:46.311209 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.311178 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f799011376f1290d60172b7e9ff9250ec5c3f8471f6f7474a07529cc1d0db027"} err="failed to get container status \"f799011376f1290d60172b7e9ff9250ec5c3f8471f6f7474a07529cc1d0db027\": rpc error: code = NotFound desc = could not find container \"f799011376f1290d60172b7e9ff9250ec5c3f8471f6f7474a07529cc1d0db027\": container with ID starting with f799011376f1290d60172b7e9ff9250ec5c3f8471f6f7474a07529cc1d0db027 not found: ID does not exist" Apr 25 00:32:46.311838 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:46.311821 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-41340-predictor-694448f7cf-n9r4s"] Apr 25 00:32:47.912402 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:47.912358 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11bacc24-8aa8-4160-aa1d-6c430c2dcc36" path="/var/lib/kubelet/pods/11bacc24-8aa8-4160-aa1d-6c430c2dcc36/volumes" Apr 25 00:32:47.913157 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:47.913034 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d" path="/var/lib/kubelet/pods/58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d/volumes" Apr 25 00:32:50.748809 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:50.748768 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" podUID="bfff2972-8a37-40cd-a118-1945884a9ae8" containerName="splitter-graph-41340" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:32:55.748577 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:55.748524 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" podUID="bfff2972-8a37-40cd-a118-1945884a9ae8" containerName="splitter-graph-41340" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:32:55.749025 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:32:55.748683 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" Apr 25 00:33:00.748692 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:33:00.748651 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" podUID="bfff2972-8a37-40cd-a118-1945884a9ae8" containerName="splitter-graph-41340" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:33:05.748688 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:33:05.748648 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" podUID="bfff2972-8a37-40cd-a118-1945884a9ae8" containerName="splitter-graph-41340" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:33:10.748971 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:33:10.748931 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" podUID="bfff2972-8a37-40cd-a118-1945884a9ae8" containerName="splitter-graph-41340" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:33:12.552877 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:33:12.552851 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" Apr 25 00:33:12.591393 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:33:12.591363 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bfff2972-8a37-40cd-a118-1945884a9ae8-proxy-tls\") pod \"bfff2972-8a37-40cd-a118-1945884a9ae8\" (UID: \"bfff2972-8a37-40cd-a118-1945884a9ae8\") " Apr 25 00:33:12.591567 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:33:12.591401 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfff2972-8a37-40cd-a118-1945884a9ae8-openshift-service-ca-bundle\") pod \"bfff2972-8a37-40cd-a118-1945884a9ae8\" (UID: \"bfff2972-8a37-40cd-a118-1945884a9ae8\") " Apr 25 00:33:12.591803 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:33:12.591783 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfff2972-8a37-40cd-a118-1945884a9ae8-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "bfff2972-8a37-40cd-a118-1945884a9ae8" (UID: "bfff2972-8a37-40cd-a118-1945884a9ae8"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:33:12.593565 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:33:12.593544 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfff2972-8a37-40cd-a118-1945884a9ae8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bfff2972-8a37-40cd-a118-1945884a9ae8" (UID: "bfff2972-8a37-40cd-a118-1945884a9ae8"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:33:12.691967 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:33:12.691896 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bfff2972-8a37-40cd-a118-1945884a9ae8-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:33:12.691967 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:33:12.691924 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfff2972-8a37-40cd-a118-1945884a9ae8-openshift-service-ca-bundle\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:33:13.360646 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:33:13.360610 2578 generic.go:358] "Generic (PLEG): container finished" podID="bfff2972-8a37-40cd-a118-1945884a9ae8" containerID="d456472af4c122cefe4b155c607723c1dc8f807cd7f0f26efa83fa8487ca75e8" exitCode=0 Apr 25 00:33:13.360849 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:33:13.360678 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" event={"ID":"bfff2972-8a37-40cd-a118-1945884a9ae8","Type":"ContainerDied","Data":"d456472af4c122cefe4b155c607723c1dc8f807cd7f0f26efa83fa8487ca75e8"} Apr 25 00:33:13.360849 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:33:13.360700 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" event={"ID":"bfff2972-8a37-40cd-a118-1945884a9ae8","Type":"ContainerDied","Data":"af6fc3c3a85486753ffbaef13127f858585b5577411d8d3468c44612c5ff52ec"} Apr 25 00:33:13.360849 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:33:13.360714 2578 scope.go:117] "RemoveContainer" containerID="d456472af4c122cefe4b155c607723c1dc8f807cd7f0f26efa83fa8487ca75e8" Apr 25 00:33:13.360849 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:33:13.360679 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss" Apr 25 00:33:13.369569 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:33:13.369551 2578 scope.go:117] "RemoveContainer" containerID="d456472af4c122cefe4b155c607723c1dc8f807cd7f0f26efa83fa8487ca75e8" Apr 25 00:33:13.369818 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:33:13.369797 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d456472af4c122cefe4b155c607723c1dc8f807cd7f0f26efa83fa8487ca75e8\": container with ID starting with d456472af4c122cefe4b155c607723c1dc8f807cd7f0f26efa83fa8487ca75e8 not found: ID does not exist" containerID="d456472af4c122cefe4b155c607723c1dc8f807cd7f0f26efa83fa8487ca75e8" Apr 25 00:33:13.369882 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:33:13.369826 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d456472af4c122cefe4b155c607723c1dc8f807cd7f0f26efa83fa8487ca75e8"} err="failed to get container status \"d456472af4c122cefe4b155c607723c1dc8f807cd7f0f26efa83fa8487ca75e8\": rpc error: code = NotFound desc = could not find container \"d456472af4c122cefe4b155c607723c1dc8f807cd7f0f26efa83fa8487ca75e8\": container with ID starting with d456472af4c122cefe4b155c607723c1dc8f807cd7f0f26efa83fa8487ca75e8 not found: ID does not exist" Apr 25 00:33:13.381165 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:33:13.381143 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss"] Apr 25 00:33:13.384867 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:33:13.384845 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-41340-79656bc944-xx8ss"] Apr 25 00:33:13.911891 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:33:13.911856 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfff2972-8a37-40cd-a118-1945884a9ae8" path="/var/lib/kubelet/pods/bfff2972-8a37-40cd-a118-1945884a9ae8/volumes" Apr 25 00:33:38.007942 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:33:38.007911 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/0.log" Apr 25 00:33:38.011386 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:33:38.011364 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/0.log" Apr 25 00:38:38.038901 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:38:38.038817 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/0.log" Apr 25 00:38:38.042589 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:38:38.042568 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/0.log" Apr 25 00:39:38.594154 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:38.594120 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg"] Apr 25 00:39:38.594727 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:38.594374 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" podUID="db64d84c-ea53-4fe5-a56a-a4727181e955" containerName="switch-graph-a37f5" 
containerID="cri-o://55eb0bd15f9b05294c56dd06b4c4bc5dc6eecb99a4060c85093c5a41a199ac91" gracePeriod=30 Apr 25 00:39:38.709709 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:38.709674 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w"] Apr 25 00:39:38.710030 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:38.709985 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" podUID="e12b5b8f-d177-46cd-b4a8-07accb6972eb" containerName="kserve-container" containerID="cri-o://425784d4c965f7bb1b595ef8a5ede2b547aa87740df41b4e638cbb9efaa1a254" gracePeriod=30 Apr 25 00:39:38.710101 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:38.710043 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" podUID="e12b5b8f-d177-46cd-b4a8-07accb6972eb" containerName="kube-rbac-proxy" containerID="cri-o://afa171472f7efbd7d338911411a0eb3b6f2bc9b03cd4eebac4857aa0f98decae" gracePeriod=30 Apr 25 00:39:38.757327 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:38.757293 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm"] Apr 25 00:39:38.757637 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:38.757607 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" podUID="cb303152-c58d-4a84-bf8f-5dd213041b6d" containerName="kserve-container" containerID="cri-o://e3735e37c1b4a6707bdd8f9cb1adda2f2599bbc949f92bba083cace416816680" gracePeriod=30 Apr 25 00:39:38.757790 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:38.757661 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" podUID="cb303152-c58d-4a84-bf8f-5dd213041b6d" containerName="kube-rbac-proxy" containerID="cri-o://f29b4a85ce18b2de9c6aa661808083f14c34842da7c9317e1feebdcc61735635" gracePeriod=30 Apr 25 00:39:39.509804 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:39.509771 2578 generic.go:358] "Generic (PLEG): container finished" podID="e12b5b8f-d177-46cd-b4a8-07accb6972eb" containerID="afa171472f7efbd7d338911411a0eb3b6f2bc9b03cd4eebac4857aa0f98decae" exitCode=2 Apr 25 00:39:39.509994 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:39.509845 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" event={"ID":"e12b5b8f-d177-46cd-b4a8-07accb6972eb","Type":"ContainerDied","Data":"afa171472f7efbd7d338911411a0eb3b6f2bc9b03cd4eebac4857aa0f98decae"} Apr 25 00:39:39.511336 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:39.511312 2578 generic.go:358] "Generic (PLEG): container finished" podID="cb303152-c58d-4a84-bf8f-5dd213041b6d" containerID="f29b4a85ce18b2de9c6aa661808083f14c34842da7c9317e1feebdcc61735635" exitCode=2 Apr 25 00:39:39.511491 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:39.511373 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" event={"ID":"cb303152-c58d-4a84-bf8f-5dd213041b6d","Type":"ContainerDied","Data":"f29b4a85ce18b2de9c6aa661808083f14c34842da7c9317e1feebdcc61735635"} Apr 25 00:39:41.515512 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:41.515474 2578 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" podUID="db64d84c-ea53-4fe5-a56a-a4727181e955" containerName="switch-graph-a37f5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:39:42.099732 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:42.099709 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" Apr 25 00:39:42.121998 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:42.121970 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb303152-c58d-4a84-bf8f-5dd213041b6d-proxy-tls\") pod \"cb303152-c58d-4a84-bf8f-5dd213041b6d\" (UID: \"cb303152-c58d-4a84-bf8f-5dd213041b6d\") " Apr 25 00:39:42.122174 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:42.122029 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-a37f5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cb303152-c58d-4a84-bf8f-5dd213041b6d-error-404-isvc-a37f5-kube-rbac-proxy-sar-config\") pod \"cb303152-c58d-4a84-bf8f-5dd213041b6d\" (UID: \"cb303152-c58d-4a84-bf8f-5dd213041b6d\") " Apr 25 00:39:42.122174 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:42.122050 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpfwc\" (UniqueName: \"kubernetes.io/projected/cb303152-c58d-4a84-bf8f-5dd213041b6d-kube-api-access-zpfwc\") pod \"cb303152-c58d-4a84-bf8f-5dd213041b6d\" (UID: \"cb303152-c58d-4a84-bf8f-5dd213041b6d\") " Apr 25 00:39:42.122539 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:42.122493 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb303152-c58d-4a84-bf8f-5dd213041b6d-error-404-isvc-a37f5-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-a37f5-kube-rbac-proxy-sar-config") pod "cb303152-c58d-4a84-bf8f-5dd213041b6d" (UID: "cb303152-c58d-4a84-bf8f-5dd213041b6d"). InnerVolumeSpecName "error-404-isvc-a37f5-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:39:42.124357 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:42.124330 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb303152-c58d-4a84-bf8f-5dd213041b6d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cb303152-c58d-4a84-bf8f-5dd213041b6d" (UID: "cb303152-c58d-4a84-bf8f-5dd213041b6d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:39:42.124541 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:42.124393 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb303152-c58d-4a84-bf8f-5dd213041b6d-kube-api-access-zpfwc" (OuterVolumeSpecName: "kube-api-access-zpfwc") pod "cb303152-c58d-4a84-bf8f-5dd213041b6d" (UID: "cb303152-c58d-4a84-bf8f-5dd213041b6d"). InnerVolumeSpecName "kube-api-access-zpfwc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:39:42.222734 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:42.222648 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-a37f5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cb303152-c58d-4a84-bf8f-5dd213041b6d-error-404-isvc-a37f5-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:39:42.222734 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:42.222682 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zpfwc\" (UniqueName: \"kubernetes.io/projected/cb303152-c58d-4a84-bf8f-5dd213041b6d-kube-api-access-zpfwc\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:39:42.222734 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:42.222693 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb303152-c58d-4a84-bf8f-5dd213041b6d-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:39:42.304223 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:42.304181 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" podUID="e12b5b8f-d177-46cd-b4a8-07accb6972eb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.44:8643/healthz\": dial tcp 10.133.0.44:8643: connect: connection refused" Apr 25 00:39:42.521835 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:42.521798 2578 generic.go:358] "Generic (PLEG): container finished" podID="cb303152-c58d-4a84-bf8f-5dd213041b6d" containerID="e3735e37c1b4a6707bdd8f9cb1adda2f2599bbc949f92bba083cace416816680" exitCode=0 Apr 25 00:39:42.521835 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:42.521837 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" event={"ID":"cb303152-c58d-4a84-bf8f-5dd213041b6d","Type":"ContainerDied","Data":"e3735e37c1b4a6707bdd8f9cb1adda2f2599bbc949f92bba083cace416816680"} Apr 25 00:39:42.522364 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:42.521863 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" event={"ID":"cb303152-c58d-4a84-bf8f-5dd213041b6d","Type":"ContainerDied","Data":"8c1260358f15cf3e6fcab5fccb1e7b49b4bb865f7a41ffed8c0d911524de2ca3"} Apr 25 00:39:42.522364 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:42.521877 2578 scope.go:117] "RemoveContainer" containerID="f29b4a85ce18b2de9c6aa661808083f14c34842da7c9317e1feebdcc61735635" Apr 25 00:39:42.522364 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:42.521880 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm" Apr 25 00:39:42.530207 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:42.530190 2578 scope.go:117] "RemoveContainer" containerID="e3735e37c1b4a6707bdd8f9cb1adda2f2599bbc949f92bba083cace416816680" Apr 25 00:39:42.537829 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:42.537813 2578 scope.go:117] "RemoveContainer" containerID="f29b4a85ce18b2de9c6aa661808083f14c34842da7c9317e1feebdcc61735635" Apr 25 00:39:42.538072 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:39:42.538053 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f29b4a85ce18b2de9c6aa661808083f14c34842da7c9317e1feebdcc61735635\": container with ID starting with f29b4a85ce18b2de9c6aa661808083f14c34842da7c9317e1feebdcc61735635 not found: ID does not exist" containerID="f29b4a85ce18b2de9c6aa661808083f14c34842da7c9317e1feebdcc61735635" Apr 25 00:39:42.538135 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:42.538080 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f29b4a85ce18b2de9c6aa661808083f14c34842da7c9317e1feebdcc61735635"} err="failed to get container status \"f29b4a85ce18b2de9c6aa661808083f14c34842da7c9317e1feebdcc61735635\": rpc error: code = NotFound desc = could not find container \"f29b4a85ce18b2de9c6aa661808083f14c34842da7c9317e1feebdcc61735635\": container with ID starting with f29b4a85ce18b2de9c6aa661808083f14c34842da7c9317e1feebdcc61735635 not found: ID does not exist" Apr 25 00:39:42.538135 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:42.538098 2578 scope.go:117] "RemoveContainer" containerID="e3735e37c1b4a6707bdd8f9cb1adda2f2599bbc949f92bba083cace416816680" Apr 25 00:39:42.538317 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:39:42.538298 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3735e37c1b4a6707bdd8f9cb1adda2f2599bbc949f92bba083cace416816680\": container with ID starting with e3735e37c1b4a6707bdd8f9cb1adda2f2599bbc949f92bba083cace416816680 not found: ID does not exist" containerID="e3735e37c1b4a6707bdd8f9cb1adda2f2599bbc949f92bba083cace416816680" Apr 25 00:39:42.538372 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:42.538326 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3735e37c1b4a6707bdd8f9cb1adda2f2599bbc949f92bba083cace416816680"} err="failed to get container status \"e3735e37c1b4a6707bdd8f9cb1adda2f2599bbc949f92bba083cace416816680\": rpc error: code = NotFound desc = could not find container \"e3735e37c1b4a6707bdd8f9cb1adda2f2599bbc949f92bba083cace416816680\": container with ID starting with e3735e37c1b4a6707bdd8f9cb1adda2f2599bbc949f92bba083cace416816680 not found: ID does not exist" Apr 25 00:39:42.542442 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:42.542403 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm"] Apr 25 00:39:42.546106 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:42.546087 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a37f5-predictor-547f4c9447-g2whm"] Apr 25 00:39:43.912197 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:43.912161 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb303152-c58d-4a84-bf8f-5dd213041b6d" 
path="/var/lib/kubelet/pods/cb303152-c58d-4a84-bf8f-5dd213041b6d/volumes" Apr 25 00:39:45.532707 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:45.532678 2578 generic.go:358] "Generic (PLEG): container finished" podID="e12b5b8f-d177-46cd-b4a8-07accb6972eb" containerID="425784d4c965f7bb1b595ef8a5ede2b547aa87740df41b4e638cbb9efaa1a254" exitCode=0 Apr 25 00:39:45.533042 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:45.532756 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" event={"ID":"e12b5b8f-d177-46cd-b4a8-07accb6972eb","Type":"ContainerDied","Data":"425784d4c965f7bb1b595ef8a5ede2b547aa87740df41b4e638cbb9efaa1a254"} Apr 25 00:39:45.651281 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:45.651252 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" Apr 25 00:39:45.748904 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:45.748813 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-a37f5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e12b5b8f-d177-46cd-b4a8-07accb6972eb-success-200-isvc-a37f5-kube-rbac-proxy-sar-config\") pod \"e12b5b8f-d177-46cd-b4a8-07accb6972eb\" (UID: \"e12b5b8f-d177-46cd-b4a8-07accb6972eb\") " Apr 25 00:39:45.748904 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:45.748888 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e12b5b8f-d177-46cd-b4a8-07accb6972eb-proxy-tls\") pod \"e12b5b8f-d177-46cd-b4a8-07accb6972eb\" (UID: \"e12b5b8f-d177-46cd-b4a8-07accb6972eb\") " Apr 25 00:39:45.749095 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:45.748907 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4scl6\" (UniqueName: \"kubernetes.io/projected/e12b5b8f-d177-46cd-b4a8-07accb6972eb-kube-api-access-4scl6\") pod \"e12b5b8f-d177-46cd-b4a8-07accb6972eb\" (UID: \"e12b5b8f-d177-46cd-b4a8-07accb6972eb\") " Apr 25 00:39:45.749205 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:45.749182 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e12b5b8f-d177-46cd-b4a8-07accb6972eb-success-200-isvc-a37f5-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-a37f5-kube-rbac-proxy-sar-config") pod "e12b5b8f-d177-46cd-b4a8-07accb6972eb" (UID: "e12b5b8f-d177-46cd-b4a8-07accb6972eb"). InnerVolumeSpecName "success-200-isvc-a37f5-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:39:45.751162 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:45.751138 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e12b5b8f-d177-46cd-b4a8-07accb6972eb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e12b5b8f-d177-46cd-b4a8-07accb6972eb" (UID: "e12b5b8f-d177-46cd-b4a8-07accb6972eb"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:39:45.751251 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:45.751169 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e12b5b8f-d177-46cd-b4a8-07accb6972eb-kube-api-access-4scl6" (OuterVolumeSpecName: "kube-api-access-4scl6") pod "e12b5b8f-d177-46cd-b4a8-07accb6972eb" (UID: "e12b5b8f-d177-46cd-b4a8-07accb6972eb"). 
InnerVolumeSpecName "kube-api-access-4scl6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:39:45.849652 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:45.849619 2578 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-a37f5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e12b5b8f-d177-46cd-b4a8-07accb6972eb-success-200-isvc-a37f5-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:39:45.849652 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:45.849647 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e12b5b8f-d177-46cd-b4a8-07accb6972eb-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:39:45.849652 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:45.849658 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4scl6\" (UniqueName: \"kubernetes.io/projected/e12b5b8f-d177-46cd-b4a8-07accb6972eb-kube-api-access-4scl6\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:39:46.515392 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:46.515351 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" podUID="db64d84c-ea53-4fe5-a56a-a4727181e955" containerName="switch-graph-a37f5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:39:46.537179 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:46.537143 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" event={"ID":"e12b5b8f-d177-46cd-b4a8-07accb6972eb","Type":"ContainerDied","Data":"86f2ef976d72f0f298781f341831145c2da5d267690de95ca9ccffc822052ea8"} Apr 25 00:39:46.537609 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:46.537197 2578 scope.go:117] "RemoveContainer" containerID="afa171472f7efbd7d338911411a0eb3b6f2bc9b03cd4eebac4857aa0f98decae" Apr 25 00:39:46.537609 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:46.537207 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w" Apr 25 00:39:46.545216 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:46.545196 2578 scope.go:117] "RemoveContainer" containerID="425784d4c965f7bb1b595ef8a5ede2b547aa87740df41b4e638cbb9efaa1a254" Apr 25 00:39:46.553128 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:46.553109 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w"] Apr 25 00:39:46.558738 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:46.558714 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a37f5-predictor-68dfd7bf6d-p6m6w"] Apr 25 00:39:47.911747 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:47.911715 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e12b5b8f-d177-46cd-b4a8-07accb6972eb" path="/var/lib/kubelet/pods/e12b5b8f-d177-46cd-b4a8-07accb6972eb/volumes" Apr 25 00:39:51.515769 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:51.515682 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" podUID="db64d84c-ea53-4fe5-a56a-a4727181e955" containerName="switch-graph-a37f5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:39:51.516132 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:51.515825 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" Apr 25 00:39:53.488181 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:53.488153 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a37f5-68dd44b8fb-97nlg_db64d84c-ea53-4fe5-a56a-a4727181e955/switch-graph-a37f5/0.log" Apr 25 00:39:54.284800 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:54.284774 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a37f5-68dd44b8fb-97nlg_db64d84c-ea53-4fe5-a56a-a4727181e955/switch-graph-a37f5/0.log" Apr 25 00:39:55.065947 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:55.065917 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a37f5-68dd44b8fb-97nlg_db64d84c-ea53-4fe5-a56a-a4727181e955/switch-graph-a37f5/0.log" Apr 25 00:39:55.820870 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:55.820840 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a37f5-68dd44b8fb-97nlg_db64d84c-ea53-4fe5-a56a-a4727181e955/switch-graph-a37f5/0.log" Apr 25 00:39:56.515353 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:56.515310 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" podUID="db64d84c-ea53-4fe5-a56a-a4727181e955" containerName="switch-graph-a37f5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:39:56.593565 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:56.593529 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a37f5-68dd44b8fb-97nlg_db64d84c-ea53-4fe5-a56a-a4727181e955/switch-graph-a37f5/0.log" Apr 25 00:39:57.340761 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:57.340728 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a37f5-68dd44b8fb-97nlg_db64d84c-ea53-4fe5-a56a-a4727181e955/switch-graph-a37f5/0.log" Apr 25 00:39:58.093396 ip-10-0-129-98 kubenswrapper[2578]: I0425 
00:39:58.093357 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a37f5-68dd44b8fb-97nlg_db64d84c-ea53-4fe5-a56a-a4727181e955/switch-graph-a37f5/0.log" Apr 25 00:39:58.846089 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:58.846058 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a37f5-68dd44b8fb-97nlg_db64d84c-ea53-4fe5-a56a-a4727181e955/switch-graph-a37f5/0.log" Apr 25 00:39:59.595741 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:39:59.595705 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a37f5-68dd44b8fb-97nlg_db64d84c-ea53-4fe5-a56a-a4727181e955/switch-graph-a37f5/0.log" Apr 25 00:40:00.358174 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:00.358147 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a37f5-68dd44b8fb-97nlg_db64d84c-ea53-4fe5-a56a-a4727181e955/switch-graph-a37f5/0.log" Apr 25 00:40:01.121520 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:01.121489 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a37f5-68dd44b8fb-97nlg_db64d84c-ea53-4fe5-a56a-a4727181e955/switch-graph-a37f5/0.log" Apr 25 00:40:01.515432 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:01.515382 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" podUID="db64d84c-ea53-4fe5-a56a-a4727181e955" containerName="switch-graph-a37f5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:40:01.934522 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:01.934430 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a37f5-68dd44b8fb-97nlg_db64d84c-ea53-4fe5-a56a-a4727181e955/switch-graph-a37f5/0.log" Apr 25 00:40:04.489170 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489134 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6sp6k/must-gather-q4x8v"] Apr 25 00:40:04.489555 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489460 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e12b5b8f-d177-46cd-b4a8-07accb6972eb" containerName="kserve-container" Apr 25 00:40:04.489555 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489476 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e12b5b8f-d177-46cd-b4a8-07accb6972eb" containerName="kserve-container" Apr 25 00:40:04.489555 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489490 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e12b5b8f-d177-46cd-b4a8-07accb6972eb" containerName="kube-rbac-proxy" Apr 25 00:40:04.489555 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489496 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e12b5b8f-d177-46cd-b4a8-07accb6972eb" containerName="kube-rbac-proxy" Apr 25 00:40:04.489555 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489505 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d" containerName="kserve-container" Apr 25 00:40:04.489555 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489512 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d" containerName="kserve-container" Apr 25 00:40:04.489555 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489520 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="cb303152-c58d-4a84-bf8f-5dd213041b6d" containerName="kserve-container" Apr 25 00:40:04.489555 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489525 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb303152-c58d-4a84-bf8f-5dd213041b6d" containerName="kserve-container" Apr 25 00:40:04.489555 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489531 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11bacc24-8aa8-4160-aa1d-6c430c2dcc36" containerName="kube-rbac-proxy" Apr 25 00:40:04.489555 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489538 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="11bacc24-8aa8-4160-aa1d-6c430c2dcc36" containerName="kube-rbac-proxy" Apr 25 00:40:04.489555 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489548 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bfff2972-8a37-40cd-a118-1945884a9ae8" containerName="splitter-graph-41340" Apr 25 00:40:04.489555 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489556 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfff2972-8a37-40cd-a118-1945884a9ae8" containerName="splitter-graph-41340" Apr 25 00:40:04.489977 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489565 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb303152-c58d-4a84-bf8f-5dd213041b6d" containerName="kube-rbac-proxy" Apr 25 00:40:04.489977 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489572 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb303152-c58d-4a84-bf8f-5dd213041b6d" containerName="kube-rbac-proxy" Apr 25 00:40:04.489977 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489585 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d" containerName="kube-rbac-proxy" Apr 25 00:40:04.489977 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489591 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d" containerName="kube-rbac-proxy" Apr 25 00:40:04.489977 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489598 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11bacc24-8aa8-4160-aa1d-6c430c2dcc36" containerName="kserve-container" Apr 25 00:40:04.489977 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489603 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="11bacc24-8aa8-4160-aa1d-6c430c2dcc36" containerName="kserve-container" Apr 25 00:40:04.489977 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489649 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb303152-c58d-4a84-bf8f-5dd213041b6d" containerName="kserve-container" Apr 25 00:40:04.489977 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489659 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d" containerName="kserve-container" Apr 25 00:40:04.489977 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489665 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="58dff5e2-5bd9-4d6c-8b89-f07148d0fa9d" containerName="kube-rbac-proxy" Apr 25 00:40:04.489977 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489689 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="11bacc24-8aa8-4160-aa1d-6c430c2dcc36" containerName="kserve-container" Apr 25 00:40:04.489977 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489696 2578 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="e12b5b8f-d177-46cd-b4a8-07accb6972eb" containerName="kserve-container" Apr 25 00:40:04.489977 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489701 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e12b5b8f-d177-46cd-b4a8-07accb6972eb" containerName="kube-rbac-proxy" Apr 25 00:40:04.489977 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489708 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="11bacc24-8aa8-4160-aa1d-6c430c2dcc36" containerName="kube-rbac-proxy" Apr 25 00:40:04.489977 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489714 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb303152-c58d-4a84-bf8f-5dd213041b6d" containerName="kube-rbac-proxy" Apr 25 00:40:04.489977 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.489720 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="bfff2972-8a37-40cd-a118-1945884a9ae8" containerName="splitter-graph-41340" Apr 25 00:40:04.493810 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.493793 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6sp6k/must-gather-q4x8v" Apr 25 00:40:04.496161 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.496141 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6sp6k\"/\"openshift-service-ca.crt\"" Apr 25 00:40:04.496321 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.496303 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6sp6k\"/\"kube-root-ca.crt\"" Apr 25 00:40:04.496431 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.496307 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6sp6k\"/\"default-dockercfg-r2q97\"" Apr 25 00:40:04.501434 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.500213 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6sp6k/must-gather-q4x8v"] Apr 25 00:40:04.608095 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.608067 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f46d44dc-08bb-4e08-87a6-5de97d536c81-must-gather-output\") pod \"must-gather-q4x8v\" (UID: \"f46d44dc-08bb-4e08-87a6-5de97d536c81\") " pod="openshift-must-gather-6sp6k/must-gather-q4x8v" Apr 25 00:40:04.608282 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.608106 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf96t\" (UniqueName: \"kubernetes.io/projected/f46d44dc-08bb-4e08-87a6-5de97d536c81-kube-api-access-zf96t\") pod \"must-gather-q4x8v\" (UID: \"f46d44dc-08bb-4e08-87a6-5de97d536c81\") " pod="openshift-must-gather-6sp6k/must-gather-q4x8v" Apr 25 00:40:04.709546 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.709512 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f46d44dc-08bb-4e08-87a6-5de97d536c81-must-gather-output\") pod \"must-gather-q4x8v\" (UID: \"f46d44dc-08bb-4e08-87a6-5de97d536c81\") " pod="openshift-must-gather-6sp6k/must-gather-q4x8v" Apr 25 00:40:04.709733 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.709554 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zf96t\" (UniqueName: 
\"kubernetes.io/projected/f46d44dc-08bb-4e08-87a6-5de97d536c81-kube-api-access-zf96t\") pod \"must-gather-q4x8v\" (UID: \"f46d44dc-08bb-4e08-87a6-5de97d536c81\") " pod="openshift-must-gather-6sp6k/must-gather-q4x8v" Apr 25 00:40:04.709926 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.709902 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f46d44dc-08bb-4e08-87a6-5de97d536c81-must-gather-output\") pod \"must-gather-q4x8v\" (UID: \"f46d44dc-08bb-4e08-87a6-5de97d536c81\") " pod="openshift-must-gather-6sp6k/must-gather-q4x8v" Apr 25 00:40:04.717682 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.717656 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf96t\" (UniqueName: \"kubernetes.io/projected/f46d44dc-08bb-4e08-87a6-5de97d536c81-kube-api-access-zf96t\") pod \"must-gather-q4x8v\" (UID: \"f46d44dc-08bb-4e08-87a6-5de97d536c81\") " pod="openshift-must-gather-6sp6k/must-gather-q4x8v" Apr 25 00:40:04.819157 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.819122 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6sp6k/must-gather-q4x8v" Apr 25 00:40:04.942540 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.942511 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6sp6k/must-gather-q4x8v"] Apr 25 00:40:04.945222 ip-10-0-129-98 kubenswrapper[2578]: W0425 00:40:04.945196 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf46d44dc_08bb_4e08_87a6_5de97d536c81.slice/crio-1b33b8ecce3882c4f2d49165e2918be479dff856d639591ef975ff35554abf3d WatchSource:0}: Error finding container 1b33b8ecce3882c4f2d49165e2918be479dff856d639591ef975ff35554abf3d: Status 404 returned error can't find the container with id 1b33b8ecce3882c4f2d49165e2918be479dff856d639591ef975ff35554abf3d Apr 25 00:40:04.946953 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:04.946937 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:40:05.598867 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:05.598821 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sp6k/must-gather-q4x8v" event={"ID":"f46d44dc-08bb-4e08-87a6-5de97d536c81","Type":"ContainerStarted","Data":"1b33b8ecce3882c4f2d49165e2918be479dff856d639591ef975ff35554abf3d"} Apr 25 00:40:06.516280 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:06.516231 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" podUID="db64d84c-ea53-4fe5-a56a-a4727181e955" containerName="switch-graph-a37f5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:40:06.604576 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:06.604543 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sp6k/must-gather-q4x8v" event={"ID":"f46d44dc-08bb-4e08-87a6-5de97d536c81","Type":"ContainerStarted","Data":"993c8be5aed814ef48635d6aae9f0d5027a4c858b7dfc306676b454104659082"} Apr 25 00:40:06.604576 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:06.604582 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sp6k/must-gather-q4x8v" event={"ID":"f46d44dc-08bb-4e08-87a6-5de97d536c81","Type":"ContainerStarted","Data":"3c295630e9d37a17d758debb51e5ad55a6b2cf986b4f50e1f697f2a79318686e"} Apr 25 
00:40:06.621953 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:06.621891 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6sp6k/must-gather-q4x8v" podStartSLOduration=1.6783379360000001 podStartE2EDuration="2.621874301s" podCreationTimestamp="2026-04-25 00:40:04 +0000 UTC" firstStartedPulling="2026-04-25 00:40:04.947065756 +0000 UTC m=+2787.581047670" lastFinishedPulling="2026-04-25 00:40:05.890602121 +0000 UTC m=+2788.524584035" observedRunningTime="2026-04-25 00:40:06.61976412 +0000 UTC m=+2789.253746058" watchObservedRunningTime="2026-04-25 00:40:06.621874301 +0000 UTC m=+2789.255856238" Apr 25 00:40:07.277541 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:07.277509 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-26g8h_532cdfc7-fd38-495f-b85d-70daea2998a1/global-pull-secret-syncer/0.log" Apr 25 00:40:07.452302 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:07.452258 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-dwll4_35b60d0c-8fbf-4b7b-a0e6-31a46f1a96d2/konnectivity-agent/0.log" Apr 25 00:40:07.513489 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:07.513456 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-98.ec2.internal_87dc53c55f73620bf5df44e2826c141e/haproxy/0.log" Apr 25 00:40:09.406956 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:09.406608 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" Apr 25 00:40:09.456637 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:09.455684 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db64d84c-ea53-4fe5-a56a-a4727181e955-proxy-tls\") pod \"db64d84c-ea53-4fe5-a56a-a4727181e955\" (UID: \"db64d84c-ea53-4fe5-a56a-a4727181e955\") " Apr 25 00:40:09.456637 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:09.455768 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db64d84c-ea53-4fe5-a56a-a4727181e955-openshift-service-ca-bundle\") pod \"db64d84c-ea53-4fe5-a56a-a4727181e955\" (UID: \"db64d84c-ea53-4fe5-a56a-a4727181e955\") " Apr 25 00:40:09.456637 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:09.456309 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db64d84c-ea53-4fe5-a56a-a4727181e955-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "db64d84c-ea53-4fe5-a56a-a4727181e955" (UID: "db64d84c-ea53-4fe5-a56a-a4727181e955"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:40:09.467603 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:09.467555 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db64d84c-ea53-4fe5-a56a-a4727181e955-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "db64d84c-ea53-4fe5-a56a-a4727181e955" (UID: "db64d84c-ea53-4fe5-a56a-a4727181e955"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:40:09.557958 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:09.557892 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db64d84c-ea53-4fe5-a56a-a4727181e955-proxy-tls\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:40:09.557958 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:09.557930 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db64d84c-ea53-4fe5-a56a-a4727181e955-openshift-service-ca-bundle\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 25 00:40:09.626531 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:09.622571 2578 generic.go:358] "Generic (PLEG): container finished" podID="db64d84c-ea53-4fe5-a56a-a4727181e955" containerID="55eb0bd15f9b05294c56dd06b4c4bc5dc6eecb99a4060c85093c5a41a199ac91" exitCode=0 Apr 25 00:40:09.626531 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:09.622644 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" event={"ID":"db64d84c-ea53-4fe5-a56a-a4727181e955","Type":"ContainerDied","Data":"55eb0bd15f9b05294c56dd06b4c4bc5dc6eecb99a4060c85093c5a41a199ac91"} Apr 25 00:40:09.626531 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:09.622675 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" event={"ID":"db64d84c-ea53-4fe5-a56a-a4727181e955","Type":"ContainerDied","Data":"bf50c99c1df2576480808e11db63b460e2485292ff038e98e3e382099ac95619"} Apr 25 00:40:09.626531 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:09.622697 2578 scope.go:117] "RemoveContainer" containerID="55eb0bd15f9b05294c56dd06b4c4bc5dc6eecb99a4060c85093c5a41a199ac91" Apr 25 00:40:09.626531 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:09.622861 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg" Apr 25 00:40:09.656893 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:09.656775 2578 scope.go:117] "RemoveContainer" containerID="55eb0bd15f9b05294c56dd06b4c4bc5dc6eecb99a4060c85093c5a41a199ac91" Apr 25 00:40:09.660164 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:09.660103 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg"] Apr 25 00:40:09.660835 ip-10-0-129-98 kubenswrapper[2578]: E0425 00:40:09.660742 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55eb0bd15f9b05294c56dd06b4c4bc5dc6eecb99a4060c85093c5a41a199ac91\": container with ID starting with 55eb0bd15f9b05294c56dd06b4c4bc5dc6eecb99a4060c85093c5a41a199ac91 not found: ID does not exist" containerID="55eb0bd15f9b05294c56dd06b4c4bc5dc6eecb99a4060c85093c5a41a199ac91" Apr 25 00:40:09.660835 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:09.660786 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55eb0bd15f9b05294c56dd06b4c4bc5dc6eecb99a4060c85093c5a41a199ac91"} err="failed to get container status \"55eb0bd15f9b05294c56dd06b4c4bc5dc6eecb99a4060c85093c5a41a199ac91\": rpc error: code = NotFound desc = could not find container \"55eb0bd15f9b05294c56dd06b4c4bc5dc6eecb99a4060c85093c5a41a199ac91\": container with ID starting with 55eb0bd15f9b05294c56dd06b4c4bc5dc6eecb99a4060c85093c5a41a199ac91 not found: ID does not exist" Apr 25 00:40:09.667086 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:09.666586 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-a37f5-68dd44b8fb-97nlg"] Apr 25 00:40:09.915782 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:09.915697 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db64d84c-ea53-4fe5-a56a-a4727181e955" path="/var/lib/kubelet/pods/db64d84c-ea53-4fe5-a56a-a4727181e955/volumes" Apr 25 00:40:11.018345 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:11.018267 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5ec9f71c-d45c-4be2-9915-8a57dfeb094d/alertmanager/0.log" Apr 25 00:40:11.044058 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:11.044031 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5ec9f71c-d45c-4be2-9915-8a57dfeb094d/config-reloader/0.log" Apr 25 00:40:11.066746 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:11.066709 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5ec9f71c-d45c-4be2-9915-8a57dfeb094d/kube-rbac-proxy-web/0.log" Apr 25 00:40:11.096283 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:11.096238 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5ec9f71c-d45c-4be2-9915-8a57dfeb094d/kube-rbac-proxy/0.log" Apr 25 00:40:11.122604 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:11.122569 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5ec9f71c-d45c-4be2-9915-8a57dfeb094d/kube-rbac-proxy-metric/0.log" Apr 25 00:40:11.148645 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:11.148614 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5ec9f71c-d45c-4be2-9915-8a57dfeb094d/prom-label-proxy/0.log" Apr 25 00:40:11.179680 
ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:11.179648 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5ec9f71c-d45c-4be2-9915-8a57dfeb094d/init-config-reloader/0.log" Apr 25 00:40:11.323257 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:11.323167 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5fb87d9599-nll7r_ffe4cebf-4220-4ce3-bbbb-19bf7016f72a/metrics-server/0.log" Apr 25 00:40:11.393253 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:11.393220 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4ps8n_e360c868-68fa-4bd9-864f-093fce4cb0c5/node-exporter/0.log" Apr 25 00:40:11.415300 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:11.415264 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4ps8n_e360c868-68fa-4bd9-864f-093fce4cb0c5/kube-rbac-proxy/0.log" Apr 25 00:40:11.440095 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:11.440065 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4ps8n_e360c868-68fa-4bd9-864f-093fce4cb0c5/init-textfile/0.log" Apr 25 00:40:11.722848 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:11.722751 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_85c5a684-d475-4029-ad11-b6e97d35d195/prometheus/0.log" Apr 25 00:40:11.743258 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:11.743225 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_85c5a684-d475-4029-ad11-b6e97d35d195/config-reloader/0.log" Apr 25 00:40:11.770278 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:11.770236 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_85c5a684-d475-4029-ad11-b6e97d35d195/thanos-sidecar/0.log" Apr 25 00:40:11.796518 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:11.796485 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_85c5a684-d475-4029-ad11-b6e97d35d195/kube-rbac-proxy-web/0.log" Apr 25 00:40:11.819117 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:11.819070 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_85c5a684-d475-4029-ad11-b6e97d35d195/kube-rbac-proxy/0.log" Apr 25 00:40:11.842704 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:11.842673 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_85c5a684-d475-4029-ad11-b6e97d35d195/kube-rbac-proxy-thanos/0.log" Apr 25 00:40:11.869969 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:11.869938 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_85c5a684-d475-4029-ad11-b6e97d35d195/init-config-reloader/0.log" Apr 25 00:40:11.984657 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:11.984578 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-8d467dc86-ltlcr_55e30091-910e-4fab-9cad-4ef17aa7f6f6/telemeter-client/0.log" Apr 25 00:40:12.041090 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:12.041059 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-8d467dc86-ltlcr_55e30091-910e-4fab-9cad-4ef17aa7f6f6/reload/0.log" Apr 25 00:40:12.063841 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:12.063808 2578 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-monitoring_telemeter-client-8d467dc86-ltlcr_55e30091-910e-4fab-9cad-4ef17aa7f6f6/kube-rbac-proxy/0.log" Apr 25 00:40:14.147015 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:14.146949 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6467d98968-v8qkd_6e56e78d-5aae-468e-a90b-b0792d315656/console/0.log" Apr 25 00:40:14.688825 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:14.688789 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d"] Apr 25 00:40:14.689125 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:14.689112 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db64d84c-ea53-4fe5-a56a-a4727181e955" containerName="switch-graph-a37f5" Apr 25 00:40:14.689170 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:14.689127 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="db64d84c-ea53-4fe5-a56a-a4727181e955" containerName="switch-graph-a37f5" Apr 25 00:40:14.689206 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:14.689175 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="db64d84c-ea53-4fe5-a56a-a4727181e955" containerName="switch-graph-a37f5" Apr 25 00:40:14.693447 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:14.693403 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d" Apr 25 00:40:14.700800 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:14.700776 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d"] Apr 25 00:40:14.809013 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:14.808984 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/af7f477e-9eb5-47f0-ae6c-369d25b94442-sys\") pod \"perf-node-gather-daemonset-8fw5d\" (UID: \"af7f477e-9eb5-47f0-ae6c-369d25b94442\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d" Apr 25 00:40:14.809181 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:14.809030 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/af7f477e-9eb5-47f0-ae6c-369d25b94442-proc\") pod \"perf-node-gather-daemonset-8fw5d\" (UID: \"af7f477e-9eb5-47f0-ae6c-369d25b94442\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d" Apr 25 00:40:14.809181 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:14.809069 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/af7f477e-9eb5-47f0-ae6c-369d25b94442-podres\") pod \"perf-node-gather-daemonset-8fw5d\" (UID: \"af7f477e-9eb5-47f0-ae6c-369d25b94442\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d" Apr 25 00:40:14.809181 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:14.809088 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksqcm\" (UniqueName: \"kubernetes.io/projected/af7f477e-9eb5-47f0-ae6c-369d25b94442-kube-api-access-ksqcm\") pod \"perf-node-gather-daemonset-8fw5d\" (UID: \"af7f477e-9eb5-47f0-ae6c-369d25b94442\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d" Apr 25 00:40:14.809181 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:14.809120 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/af7f477e-9eb5-47f0-ae6c-369d25b94442-lib-modules\") pod \"perf-node-gather-daemonset-8fw5d\" (UID: \"af7f477e-9eb5-47f0-ae6c-369d25b94442\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d" Apr 25 00:40:14.909708 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:14.909674 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ksqcm\" (UniqueName: \"kubernetes.io/projected/af7f477e-9eb5-47f0-ae6c-369d25b94442-kube-api-access-ksqcm\") pod \"perf-node-gather-daemonset-8fw5d\" (UID: \"af7f477e-9eb5-47f0-ae6c-369d25b94442\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d" Apr 25 00:40:14.909900 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:14.909724 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/af7f477e-9eb5-47f0-ae6c-369d25b94442-lib-modules\") pod \"perf-node-gather-daemonset-8fw5d\" (UID: \"af7f477e-9eb5-47f0-ae6c-369d25b94442\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d" Apr 25 00:40:14.909900 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:14.909773 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/af7f477e-9eb5-47f0-ae6c-369d25b94442-sys\") pod \"perf-node-gather-daemonset-8fw5d\" (UID: \"af7f477e-9eb5-47f0-ae6c-369d25b94442\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d" Apr 25 00:40:14.909900 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:14.909809 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/af7f477e-9eb5-47f0-ae6c-369d25b94442-proc\") pod \"perf-node-gather-daemonset-8fw5d\" (UID: \"af7f477e-9eb5-47f0-ae6c-369d25b94442\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d" Apr 25 00:40:14.909900 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:14.909831 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/af7f477e-9eb5-47f0-ae6c-369d25b94442-podres\") pod \"perf-node-gather-daemonset-8fw5d\" (UID: \"af7f477e-9eb5-47f0-ae6c-369d25b94442\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d" Apr 25 00:40:14.910067 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:14.909898 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/af7f477e-9eb5-47f0-ae6c-369d25b94442-sys\") pod \"perf-node-gather-daemonset-8fw5d\" (UID: \"af7f477e-9eb5-47f0-ae6c-369d25b94442\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d" Apr 25 00:40:14.910067 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:14.909931 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/af7f477e-9eb5-47f0-ae6c-369d25b94442-lib-modules\") pod \"perf-node-gather-daemonset-8fw5d\" (UID: \"af7f477e-9eb5-47f0-ae6c-369d25b94442\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d" Apr 25 00:40:14.910067 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:14.909898 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/af7f477e-9eb5-47f0-ae6c-369d25b94442-proc\") pod 
\"perf-node-gather-daemonset-8fw5d\" (UID: \"af7f477e-9eb5-47f0-ae6c-369d25b94442\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d" Apr 25 00:40:14.910067 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:14.909937 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/af7f477e-9eb5-47f0-ae6c-369d25b94442-podres\") pod \"perf-node-gather-daemonset-8fw5d\" (UID: \"af7f477e-9eb5-47f0-ae6c-369d25b94442\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d" Apr 25 00:40:14.919104 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:14.919076 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksqcm\" (UniqueName: \"kubernetes.io/projected/af7f477e-9eb5-47f0-ae6c-369d25b94442-kube-api-access-ksqcm\") pod \"perf-node-gather-daemonset-8fw5d\" (UID: \"af7f477e-9eb5-47f0-ae6c-369d25b94442\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d" Apr 25 00:40:15.005201 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:15.005114 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d" Apr 25 00:40:15.153855 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:15.153746 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d"] Apr 25 00:40:15.262086 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:15.262056 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-l2htg_9d910288-c4b3-4b19-9188-f9ded54fd92f/dns/0.log" Apr 25 00:40:15.282092 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:15.282068 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-l2htg_9d910288-c4b3-4b19-9188-f9ded54fd92f/kube-rbac-proxy/0.log" Apr 25 00:40:15.395824 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:15.395792 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-s2pf5_b50de4c3-3440-4c81-81ac-23466ec3f726/dns-node-resolver/0.log" Apr 25 00:40:15.648989 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:15.648902 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d" event={"ID":"af7f477e-9eb5-47f0-ae6c-369d25b94442","Type":"ContainerStarted","Data":"d9030a8c912e6448a9f1afb457f816c6ad72e676dc013452909f687d762e4bd3"} Apr 25 00:40:15.649219 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:15.649198 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d" event={"ID":"af7f477e-9eb5-47f0-ae6c-369d25b94442","Type":"ContainerStarted","Data":"00a6d50780b82150f1492098d5eae2004183ed4c63194f8d6909374192e15396"} Apr 25 00:40:15.649382 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:15.649359 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d" Apr 25 00:40:15.665206 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:15.665161 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d" podStartSLOduration=1.665147173 podStartE2EDuration="1.665147173s" podCreationTimestamp="2026-04-25 00:40:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 
00:40:15.662883642 +0000 UTC m=+2798.296865577" watchObservedRunningTime="2026-04-25 00:40:15.665147173 +0000 UTC m=+2798.299129110" Apr 25 00:40:15.802244 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:15.802189 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-946b4db85-hh7bl_899f41b7-00d1-44b2-bcba-0541dea9fcb3/registry/0.log" Apr 25 00:40:15.870738 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:15.870709 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vdjkm_220c5498-d45f-48c2-a25e-01ac23225100/node-ca/0.log" Apr 25 00:40:16.925190 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:16.925150 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-z6qcm_ed05dbf9-ea8c-41d5-ac86-6efec6560e64/serve-healthcheck-canary/0.log" Apr 25 00:40:17.364753 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:17.364717 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rrznm_bcc13d05-91af-4a72-98c9-e7de706cfb3c/kube-rbac-proxy/0.log" Apr 25 00:40:17.387207 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:17.387177 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rrznm_bcc13d05-91af-4a72-98c9-e7de706cfb3c/exporter/0.log" Apr 25 00:40:17.411955 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:17.411910 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rrznm_bcc13d05-91af-4a72-98c9-e7de706cfb3c/extractor/0.log" Apr 25 00:40:19.370946 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:19.370905 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-lxncq_056ba31c-acb9-4e89-99bb-bf2a09c5d4fc/manager/0.log" Apr 25 00:40:19.650915 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:19.650837 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-sk5sp_ec7dfc0e-4b8f-4573-9e35-638b6bb8681e/s3-init/0.log" Apr 25 00:40:21.662845 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:21.662817 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-8fw5d" Apr 25 00:40:24.849090 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:24.849059 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7zs6q_d89d33b9-52c1-474f-a5b8-221754ae1cc6/kube-multus-additional-cni-plugins/0.log" Apr 25 00:40:24.870869 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:24.870846 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7zs6q_d89d33b9-52c1-474f-a5b8-221754ae1cc6/egress-router-binary-copy/0.log" Apr 25 00:40:24.891094 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:24.891053 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7zs6q_d89d33b9-52c1-474f-a5b8-221754ae1cc6/cni-plugins/0.log" Apr 25 00:40:24.911012 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:24.910989 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7zs6q_d89d33b9-52c1-474f-a5b8-221754ae1cc6/bond-cni-plugin/0.log" Apr 25 00:40:24.930813 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:24.930789 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7zs6q_d89d33b9-52c1-474f-a5b8-221754ae1cc6/routeoverride-cni/0.log" Apr 25 00:40:24.950542 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:24.950519 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7zs6q_d89d33b9-52c1-474f-a5b8-221754ae1cc6/whereabouts-cni-bincopy/0.log" Apr 25 00:40:24.970201 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:24.970175 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7zs6q_d89d33b9-52c1-474f-a5b8-221754ae1cc6/whereabouts-cni/0.log" Apr 25 00:40:25.325342 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:25.325311 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ktchn_8566f65b-b13b-4b52-8b4d-8dcbd70b502a/kube-multus/0.log" Apr 25 00:40:25.419634 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:25.419603 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rhtrz_d3fe756c-b2b5-42bc-8234-bd6d59e5dd29/network-metrics-daemon/0.log" Apr 25 00:40:25.436899 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:25.436865 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rhtrz_d3fe756c-b2b5-42bc-8234-bd6d59e5dd29/kube-rbac-proxy/0.log" Apr 25 00:40:26.550762 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:26.550691 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-controller/0.log" Apr 25 00:40:26.568982 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:26.568923 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/0.log" Apr 25 00:40:26.599871 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:26.599829 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovn-acl-logging/1.log" Apr 25 00:40:26.618576 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:26.618550 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/kube-rbac-proxy-node/0.log" Apr 25 00:40:26.638807 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:26.638783 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/kube-rbac-proxy-ovn-metrics/0.log" Apr 25 00:40:26.658238 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:26.658212 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/northd/0.log" Apr 25 00:40:26.678549 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:26.678524 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/nbdb/0.log" Apr 25 00:40:26.698970 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:26.698945 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/sbdb/0.log" Apr 25 00:40:26.881837 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:26.881751 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t4f4q_e0939cda-0079-43e5-b1be-4f8099b11f56/ovnkube-controller/0.log" Apr 25 00:40:28.131345 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:28.131313 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-w2qd9_ac28ac1b-bb45-4d9f-a544-3fa1e7fd33f1/network-check-target-container/0.log" Apr 25 00:40:29.096929 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:29.096904 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-kngps_31f651c0-8e2e-4e85-b153-94f4291085b1/iptables-alerter/0.log" Apr 25 00:40:29.693300 ip-10-0-129-98 kubenswrapper[2578]: I0425 00:40:29.693262 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-tmhfx_edeca547-37b0-442b-95dc-712808101f9a/tuned/0.log"