Apr 24 21:27:33.699821 ip-10-0-132-124 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:27:34.163101 ip-10-0-132-124 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:34.163101 ip-10-0-132-124 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:27:34.163101 ip-10-0-132-124 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:34.163101 ip-10-0-132-124 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:27:34.163101 ip-10-0-132-124 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
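The deprecation notices above say these flags belong in the file passed via --config. A minimal sketch of what that migration might look like in a KubeletConfiguration (kubelet.config.k8s.io/v1beta1) file; the field names are from the upstream API, but the values here are illustrative, taken from the flag values echoed later in this log, not from this node's actual config:

```yaml
# Hypothetical /etc/kubernetes/kubelet.conf fragment replacing the
# deprecated command-line flags flagged at startup.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (example reservation values)
systemReserved:
  cpu: 500m
  memory: 1Gi
```

There is no config-file equivalent of --minimum-container-ttl-duration; as the log notes, eviction thresholds (evictionHard / evictionSoft) are the replacement.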
Apr 24 21:27:34.165945 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.165800    2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:27:34.171640 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171618    2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:34.171640 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171637    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:34.171640 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171643    2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:34.171640 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171648    2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:34.171881 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171652    2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:34.171881 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171657    2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:34.171881 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171660    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:34.171881 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171665    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:34.171881 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171669    2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:34.171881 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171672    2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:34.171881 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171676    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:34.171881 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171682    2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:34.171881 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171687    2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:34.171881 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171693    2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:34.171881 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171699    2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:34.171881 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171704    2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:34.171881 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171707    2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:34.171881 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171712    2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:34.171881 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171716    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:34.171881 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171720    2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:34.171881 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171725    2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:34.171881 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171729    2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:34.171881 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171733    2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:34.172690 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171737    2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:34.172690 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171740    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:34.172690 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171744    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:34.172690 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171748    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:34.172690 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171752    2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:34.172690 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171756    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:34.172690 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171759    2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:34.172690 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171764    2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:34.172690 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171768    2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:34.172690 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171773    2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:34.172690 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171777    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:34.172690 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171780    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:34.172690 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171786    2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:34.172690 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171790    2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:34.172690 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171795    2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:34.172690 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171799    2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:34.172690 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171803    2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:34.172690 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171807    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:34.172690 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171811    2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:34.172690 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171815    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:34.173678 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171819    2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:34.173678 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171823    2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:34.173678 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171827    2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:34.173678 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171831    2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:34.173678 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171835    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:34.173678 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171839    2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:34.173678 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171843    2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:34.173678 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171847    2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:34.173678 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171851    2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:34.173678 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171855    2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:34.173678 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171859    2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:34.173678 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171863    2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:34.173678 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171867    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:34.173678 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171871    2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:34.173678 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171874    2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:34.173678 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171879    2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:34.173678 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171882    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:34.173678 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171887    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:34.173678 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171891    2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:34.173678 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171895    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:34.174502 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171899    2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:34.174502 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171902    2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:34.174502 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171906    2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:34.174502 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171911    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:34.174502 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171915    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:34.174502 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171919    2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:34.174502 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171923    2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:34.174502 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171928    2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:34.174502 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171932    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:34.174502 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171938    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:34.174502 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171942    2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:34.174502 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171946    2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:34.174502 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171950    2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:34.174502 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171954    2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:34.174502 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171958    2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:34.174502 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171969    2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:34.174502 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171973    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:34.174502 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171977    2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:34.174502 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171981    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:34.174502 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171985    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:34.175000 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171989    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:34.175000 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171993    2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:34.175000 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.171997    2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:34.175000 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172635    2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:34.175000 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172645    2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:34.175000 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172650    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:34.175000 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172656    2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:34.175000 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172660    2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:34.175000 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172665    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:34.175000 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172670    2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:34.175000 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172674    2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:34.175000 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172678    2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:34.175000 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172682    2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:34.175000 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172686    2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:34.175000 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172690    2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:34.175000 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172694    2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:34.175000 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172699    2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:34.175000 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172703    2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:34.175000 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172706    2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:34.175000 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172710    2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:34.175611 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172715    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:34.175611 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172719    2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:34.175611 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172723    2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:34.175611 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172727    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:34.175611 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172732    2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:34.175611 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172736    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:34.175611 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172740    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:34.175611 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172745    2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:34.175611 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172749    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:34.175611 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172753    2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:34.175611 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172757    2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:34.175611 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172761    2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:34.175611 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172765    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:34.175611 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172769    2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:34.175611 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172774    2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:34.175611 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172778    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:34.175611 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172782    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:34.175611 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172786    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:34.175611 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172790    2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:34.175611 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172794    2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:34.176289 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172797    2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:34.176289 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172802    2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:34.176289 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172806    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:34.176289 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172810    2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:34.176289 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172813    2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:34.176289 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172824    2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:34.176289 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172829    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:34.176289 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172834    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:34.176289 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172839    2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:34.176289 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172843    2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:34.176289 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172849    2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:34.176289 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172854    2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:34.176289 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172859    2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:34.176289 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172864    2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:34.176289 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172868    2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:34.176289 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172873    2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:34.176289 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172878    2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:34.176289 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172886    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:34.176289 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172891    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:34.176825 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172895    2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:34.176825 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172901    2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:34.176825 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172905    2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:34.176825 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172910    2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:34.176825 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172914    2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:34.176825 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172918    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:34.176825 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172922    2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:34.176825 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172926    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:34.176825 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172930    2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:34.176825 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172934    2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:34.176825 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172939    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:34.176825 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172943    2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:34.176825 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172947    2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:34.176825 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172951    2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:34.176825 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172955    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:34.176825 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172958    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:34.176825 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172962    2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:34.176825 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172966    2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:34.176825 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172969    2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:34.176825 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172973    2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:34.177308 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172978    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:34.177308 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172988    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:34.177308 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172992    2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:34.177308 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172996    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:34.177308 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.172999    2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:34.177308 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.173003    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:34.177308 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.173007    2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:34.177308 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.173011    2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:34.177308 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.173015    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:34.177308 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.173019    2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:34.177308 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173119    2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:27:34.177308 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173129    2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:27:34.177308 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173137    2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:27:34.177308 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173144    2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:27:34.177308 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173152    2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:27:34.177308 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173157    2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:27:34.177308 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173174    2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:27:34.177308 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173180    2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:27:34.177308 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173185    2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:27:34.177308 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173190    2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:27:34.177308 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173194    2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173199    2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173205    2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173209    2578 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173214    2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173218    2578 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173222    2578 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173226    2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173231    2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173238    2578 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173242    2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173247    2578 flags.go:64] FLAG: --config-dir=""
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173252    2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173259    2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173266    2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173272    2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173277    2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173282    2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173287    2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173291    2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173296    2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173301    2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173305    2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173311    2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173316    2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:27:34.177919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173320    2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173325    2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173331    2578 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173335    2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173342    2578 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173347    2578 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173351    2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173356    2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173361    2578 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173366    2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173371    2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173376    2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173381    2578 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173385    2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173390    2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173395    2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173400    2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173404    2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173408    2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173413    2578 flags.go:64] FLAG: --feature-gates=""
Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173421    2578 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173426    2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173431 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173436 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173441 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:27:34.178512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173446 2578 flags.go:64] FLAG: --help="false" Apr 24 21:27:34.179121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173451 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-132-124.ec2.internal" Apr 24 21:27:34.179121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173456 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:27:34.179121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173461 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:27:34.179121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173466 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:27:34.179121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173471 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:27:34.179121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173476 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:27:34.179121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173481 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:27:34.179121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173486 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:27:34.179121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173490 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:27:34.179121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173496 
2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:27:34.179121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173501 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:27:34.179121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173506 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:27:34.179121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173511 2578 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:27:34.179121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173515 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:27:34.179121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173519 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:27:34.179121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173524 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:27:34.179121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173528 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:27:34.179121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173533 2578 flags.go:64] FLAG: --lock-file="" Apr 24 21:27:34.179121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173536 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:27:34.179121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173558 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:27:34.179121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173563 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:27:34.179121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173571 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:27:34.179121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173574 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:27:34.179693 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173579 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:27:34.179693 
ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173583 2578 flags.go:64] FLAG: --logging-format="text" Apr 24 21:27:34.179693 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173587 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:27:34.179693 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173593 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:27:34.179693 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173597 2578 flags.go:64] FLAG: --manifest-url="" Apr 24 21:27:34.179693 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173602 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:27:34.179693 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173608 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:27:34.179693 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173612 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:27:34.179693 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173618 2578 flags.go:64] FLAG: --max-pods="110" Apr 24 21:27:34.179693 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173622 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:27:34.179693 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173626 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:27:34.179693 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173630 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:27:34.179693 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173634 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:27:34.179693 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173639 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:27:34.179693 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173644 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:27:34.179693 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173648 2578 
flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:27:34.179693 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173659 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:27:34.179693 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173663 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:27:34.179693 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173668 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:27:34.179693 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173674 2578 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:27:34.179693 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173679 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:27:34.179693 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173686 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:27:34.179693 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173690 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:27:34.179693 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173695 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173699 2578 flags.go:64] FLAG: --port="10250" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173703 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173708 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-05e6ec5c790963389" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173712 2578 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173717 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: 
I0424 21:27:34.173721 2578 flags.go:64] FLAG: --register-node="true" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173725 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173730 2578 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173736 2578 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173741 2578 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173746 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173751 2578 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173758 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173763 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173768 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173773 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173777 2578 flags.go:64] FLAG: --runonce="false" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173781 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173786 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173791 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:27:34.180261 ip-10-0-132-124 
kubenswrapper[2578]: I0424 21:27:34.173795 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173800 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173804 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173809 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173814 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:27:34.180261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173818 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:27:34.180942 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173824 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:27:34.180942 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173829 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:27:34.180942 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173834 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:27:34.180942 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173839 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:27:34.180942 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173843 2578 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:27:34.180942 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173848 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:27:34.180942 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173857 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:27:34.180942 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173861 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:27:34.180942 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173866 2578 
flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:27:34.180942 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173873 2578 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:27:34.180942 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173878 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:27:34.180942 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173882 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:27:34.180942 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173886 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:27:34.180942 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173891 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:27:34.180942 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173895 2578 flags.go:64] FLAG: --v="2" Apr 24 21:27:34.180942 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173902 2578 flags.go:64] FLAG: --version="false" Apr 24 21:27:34.180942 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173908 2578 flags.go:64] FLAG: --vmodule="" Apr 24 21:27:34.180942 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173914 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:27:34.180942 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.173920 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:27:34.180942 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174066 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:27:34.180942 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174072 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:27:34.180942 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174078 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:27:34.180942 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174082 2578 feature_gate.go:328] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Apr 24 21:27:34.180942 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174086 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:27:34.181501 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174091 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:34.181501 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174095 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:27:34.181501 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174099 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:27:34.181501 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174103 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:27:34.181501 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174107 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:27:34.181501 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174111 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:27:34.181501 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174115 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:27:34.181501 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174120 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:27:34.181501 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174126 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:27:34.181501 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174130 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:27:34.181501 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174134 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:27:34.181501 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174139 2578 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:27:34.181501 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174144 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:27:34.181501 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174149 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:27:34.181501 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174153 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:27:34.181501 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174157 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:27:34.181501 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174161 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:27:34.181501 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174165 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:27:34.181501 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174169 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:27:34.181501 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174173 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:27:34.182034 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174177 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:27:34.182034 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174182 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:27:34.182034 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174186 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:27:34.182034 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174189 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:27:34.182034 ip-10-0-132-124 
kubenswrapper[2578]: W0424 21:27:34.174193 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:27:34.182034 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174197 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:27:34.182034 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174204 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:27:34.182034 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174208 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:27:34.182034 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174212 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:27:34.182034 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174216 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:27:34.182034 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174220 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:27:34.182034 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174224 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:27:34.182034 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174227 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:27:34.182034 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174231 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:27:34.182034 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174235 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:27:34.182034 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174239 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:27:34.182034 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174243 2578 feature_gate.go:328] 
unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:27:34.182034 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174248 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:34.182034 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174252 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:27:34.182034 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174256 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:27:34.182511 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174262 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:27:34.182511 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174266 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:27:34.182511 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174270 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:27:34.182511 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174273 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:27:34.182511 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174278 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:27:34.182511 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174282 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:27:34.182511 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174286 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:27:34.182511 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174290 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:27:34.182511 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174294 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:27:34.182511 ip-10-0-132-124 
kubenswrapper[2578]: W0424 21:27:34.174298 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:27:34.182511 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174302 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:27:34.182511 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174306 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:27:34.182511 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174310 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:27:34.182511 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174314 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:27:34.182511 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174318 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:27:34.182511 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174322 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:27:34.182511 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174326 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:27:34.182511 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174330 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:27:34.182511 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174335 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:27:34.182984 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174339 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:27:34.182984 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174343 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:27:34.182984 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174347 2578 feature_gate.go:328] 
unrecognized feature gate: NutanixMultiSubnets Apr 24 21:27:34.182984 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174351 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:27:34.182984 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174355 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:27:34.182984 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174360 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:27:34.182984 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174364 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:27:34.182984 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174368 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:27:34.182984 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174372 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:27:34.182984 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174376 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:27:34.182984 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174380 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:27:34.182984 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174384 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:27:34.182984 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174391 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:27:34.182984 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174399 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:27:34.182984 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174405 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:34.182984 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174410 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:34.182984 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174414 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:34.182984 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174418 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:34.182984 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174423 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:34.183440 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174428 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:34.183440 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174432 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:34.183440 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.174436 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:34.183440 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.175299 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:27:34.183440 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.181651 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 21:27:34.183440 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.181667 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 21:27:34.183440 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181711 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:34.183440 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181715 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:34.183440 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181718 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:34.183440 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181721 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:34.183440 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181724 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:34.183440 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181726 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:34.183440 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181729 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:34.183440 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181731 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:34.183440 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181734 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:34.183440 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181736 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:34.183869 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181739 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:34.183869 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181742 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:34.183869 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181745 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:34.183869 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181747 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:34.183869 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181749 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:34.183869 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181753 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:34.183869 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181757 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:34.183869 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181760 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:34.183869 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181763 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:34.183869 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181766 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:34.183869 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181769 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:34.183869 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181772 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:34.183869 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181774 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:34.183869 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181777 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:34.183869 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181779 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:34.183869 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181782 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:34.183869 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181784 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:34.183869 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181787 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:34.183869 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181789 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:34.183869 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181791 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:34.184338 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181794 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:34.184338 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181798 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:34.184338 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181800 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:34.184338 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181803 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:34.184338 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181806 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:34.184338 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181808 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:34.184338 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181810 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:34.184338 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181813 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:34.184338 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181815 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:34.184338 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181818 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:34.184338 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181820 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:34.184338 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181823 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:34.184338 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181825 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:34.184338 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181828 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:34.184338 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181830 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:34.184338 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181832 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:34.184338 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181835 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:34.184338 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181837 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:34.184338 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181840 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:34.184338 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181842 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:34.184828 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181845 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:34.184828 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181849 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:34.184828 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181851 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:34.184828 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181854 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:34.184828 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181856 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:34.184828 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181859 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:34.184828 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181861 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:34.184828 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181863 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:34.184828 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181866 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:34.184828 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181868 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:34.184828 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181871 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:34.184828 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181873 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:34.184828 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181875 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:34.184828 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181879 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:34.184828 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181883 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:34.184828 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181885 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:34.184828 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181888 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:34.184828 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181890 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:34.184828 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181892 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:34.185288 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181895 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:34.185288 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181897 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:34.185288 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181900 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:34.185288 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181902 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:34.185288 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181905 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:34.185288 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181907 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:34.185288 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181910 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:34.185288 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181912 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:34.185288 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181914 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:34.185288 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181917 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:34.185288 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181919 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:34.185288 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181922 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:34.185288 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181924 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:34.185288 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181926 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:34.185288 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181929 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:34.185288 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181931 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:34.185288 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.181934 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:34.185720 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.181939 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:27:34.185720 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182027 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:34.185720 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182031 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:34.185720 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182035 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:34.185720 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182039 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:34.185720 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182042 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:34.185720 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182045 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:34.185720 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182048 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:34.185720 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182051 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:34.185720 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182053 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:34.185720 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182056 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:34.185720 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182059 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:34.185720 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182061 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:34.185720 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182064 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:34.185720 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182066 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:34.186085 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182069 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:34.186085 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182071 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:34.186085 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182073 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:34.186085 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182076 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:34.186085 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182078 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:34.186085 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182081 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:34.186085 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182083 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:34.186085 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182086 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:34.186085 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182088 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:34.186085 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182090 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:34.186085 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182093 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:34.186085 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182095 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:34.186085 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182098 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:34.186085 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182100 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:34.186085 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182103 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:34.186085 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182105 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:34.186085 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182108 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:34.186085 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182110 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:34.186085 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182112 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:34.186085 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182115 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:34.186561 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182117 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:34.186561 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182119 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:34.186561 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182122 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:34.186561 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182125 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:34.186561 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182127 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:34.186561 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182130 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:34.186561 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182132 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:34.186561 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182135 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:34.186561 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182137 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:34.186561 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182141 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:34.186561 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182143 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:34.186561 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182145 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:34.186561 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182148 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:34.186561 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182150 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:34.186561 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182153 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:34.186561 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182157 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:34.186561 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182159 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:34.186561 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182161 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:34.186561 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182164 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:34.187007 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182166 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:34.187007 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182168 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:34.187007 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182171 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:34.187007 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182173 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:34.187007 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182175 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:34.187007 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182177 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:34.187007 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182180 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:34.187007 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182182 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:34.187007 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182184 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:34.187007 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182187 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:34.187007 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182189 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:34.187007 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182192 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:34.187007 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182194 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:34.187007 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182196 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:34.187007 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182199 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:34.187007 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182201 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:34.187007 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182203 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:34.187007 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182206 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:34.187007 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182208 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:34.187007 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182210 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:34.187472 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182213 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:34.187472 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182215 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:34.187472 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182218 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:34.187472 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182220 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:34.187472 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182223 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:34.187472 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182225 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:34.187472 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182227 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:34.187472 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182229 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:34.187472 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182232 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:34.187472 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182234 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:34.187472 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182236 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:34.187472 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182239 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:34.187472 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:34.182241 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:34.187472 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.182246 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:27:34.187472 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.182975 2578 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 21:27:34.187844 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.187475 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 21:27:34.188432 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.188421 2578 server.go:1019] "Starting client certificate rotation"
Apr 24 21:27:34.188531 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.188516 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:27:34.188580 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.188568 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:27:34.215012 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.214994 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:27:34.220353 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.220041 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:27:34.234138 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.234113 2578 log.go:25] "Validated CRI v1 runtime API"
Apr 24 21:27:34.240256 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.240242 2578 log.go:25] "Validated CRI v1 image API"
Apr 24 21:27:34.242297 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.242281 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 21:27:34.245583 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.245562 2578 fs.go:135] Filesystem UUIDs: map[349e64bb-0ed4-4f01-98c4-95cf05520e8b:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 85de2c82-bbaa-4711-945d-92bd01c8b859:/dev/nvme0n1p4]
Apr 24 21:27:34.245656 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.245581 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:27:34.249080 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.249059 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:27:34.250617 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.250488 2578 manager.go:217] Machine: {Timestamp:2026-04-24 21:27:34.249204115 +0000 UTC m=+0.424938782 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:2499998 MemoryCapacity:32812158976 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec222178b80628926927689659059012 SystemUUID:ec222178-b806-2892-6927-689659059012 BootID:60479f51-96ff-41fe-8fd9-8e899a599475 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406081536 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406077440 Type:vfs Inodes:4005390 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:51:1e:7f:43:d5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:51:1e:7f:43:d5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:b2:fc:13:d7:f3:85 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812158976 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:27:34.250617 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.250606 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 21:27:34.250775 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.250707 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:27:34.252206 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.252179 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:27:34.252364 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.252209 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-124.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessTh
an","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 21:27:34.252438 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.252381 2578 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 21:27:34.252438 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.252394 2578 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 21:27:34.252438 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.252417 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:27:34.254099 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.254086 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:27:34.255957 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.255943 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:27:34.256092 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.256081 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 21:27:34.258557 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.258528 2578 kubelet.go:491] "Attempting to sync node with API server" Apr 24 21:27:34.258613 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.258604 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 21:27:34.258672 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.258627 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 21:27:34.258672 ip-10-0-132-124 kubenswrapper[2578]: I0424 
21:27:34.258640 2578 kubelet.go:397] "Adding apiserver pod source" Apr 24 21:27:34.258672 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.258652 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 21:27:34.259711 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.259697 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:27:34.259784 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.259720 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:27:34.263061 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.263038 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 21:27:34.264414 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.264399 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 21:27:34.265265 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.265251 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 21:27:34.265333 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.265271 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 21:27:34.265333 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.265280 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 21:27:34.265333 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.265288 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 21:27:34.265333 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.265296 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 21:27:34.265333 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.265305 2578 plugins.go:616] "Loaded volume 
plugin" pluginName="kubernetes.io/secret" Apr 24 21:27:34.265333 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.265313 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 21:27:34.265333 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.265321 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:27:34.265333 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.265331 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:27:34.265582 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.265343 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:27:34.265582 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.265358 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 21:27:34.265582 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.265575 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:27:34.266398 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.266388 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:27:34.266449 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.266400 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:27:34.269804 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.269789 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:27:34.269877 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.269831 2578 server.go:1295] "Started kubelet" Apr 24 21:27:34.274101 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.274075 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-124.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:27:34.274321 ip-10-0-132-124 systemd[1]: 
Started Kubernetes Kubelet. Apr 24 21:27:34.274520 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:34.274341 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:27:34.274631 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:34.274606 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-124.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:27:34.274949 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.274801 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 21:27:34.275000 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.269950 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:27:34.275051 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.275021 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:27:34.276166 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.276146 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:27:34.278065 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.278047 2578 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:27:34.280790 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.280773 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6rrtn" Apr 24 21:27:34.282191 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:34.281172 2578 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-124.ec2.internal.18a968293b28151c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-124.ec2.internal,UID:ip-10-0-132-124.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-124.ec2.internal,},FirstTimestamp:2026-04-24 21:27:34.269801756 +0000 UTC m=+0.445536423,LastTimestamp:2026-04-24 21:27:34.269801756 +0000 UTC m=+0.445536423,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-124.ec2.internal,}" Apr 24 21:27:34.282709 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.282692 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 21:27:34.282709 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.282701 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:27:34.283430 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.283415 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:27:34.283505 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.283433 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:27:34.283505 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.283448 2578 factory.go:55] Registering systemd factory Apr 24 21:27:34.283505 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.283464 2578 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:27:34.283764 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.283507 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:27:34.283764 
ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.283562 2578 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:27:34.283764 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.283573 2578 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:27:34.283764 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.283654 2578 factory.go:153] Registering CRI-O factory Apr 24 21:27:34.283764 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.283666 2578 factory.go:223] Registration of the crio container factory successfully Apr 24 21:27:34.283764 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.283711 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:27:34.283764 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.283730 2578 factory.go:103] Registering Raw factory Apr 24 21:27:34.283764 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.283745 2578 manager.go:1196] Started watching for new ooms in manager Apr 24 21:27:34.284062 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:34.283804 2578 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 21:27:34.284920 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.284879 2578 manager.go:319] Starting recovery of all containers Apr 24 21:27:34.285237 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:34.285126 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-124.ec2.internal\" not found" Apr 24 21:27:34.289185 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:34.289157 2578 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-132-124.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 21:27:34.289276 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:34.289198 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 21:27:34.290190 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.290168 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6rrtn" Apr 24 21:27:34.295204 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.295187 2578 manager.go:324] Recovery completed Apr 24 21:27:34.296748 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:34.296717 2578 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 24 21:27:34.299724 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.299652 2578 
kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:34.305447 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.305433 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-124.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:34.305500 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.305463 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-124.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:34.305500 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.305476 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-124.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:34.305950 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.305936 2578 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 21:27:34.305950 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.305950 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:27:34.306043 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.305964 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:27:34.309330 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.309319 2578 policy_none.go:49] "None policy: Start" Apr 24 21:27:34.309369 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.309334 2578 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:27:34.309369 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.309344 2578 state_mem.go:35] "Initializing new in-memory state store" Apr 24 21:27:34.349131 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.349116 2578 manager.go:341] "Starting Device Plugin manager" Apr 24 21:27:34.373821 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:34.349154 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:27:34.373821 ip-10-0-132-124 
kubenswrapper[2578]: I0424 21:27:34.349166 2578 server.go:85] "Starting device plugin registration server" Apr 24 21:27:34.373821 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.349391 2578 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:27:34.373821 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.349403 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:27:34.373821 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.349506 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 21:27:34.373821 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.349610 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:27:34.373821 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.349620 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 21:27:34.373821 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:34.350093 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 21:27:34.373821 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:34.350131 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-124.ec2.internal\" not found" Apr 24 21:27:34.415659 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.415602 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:27:34.416890 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.416876 2578 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 24 21:27:34.417014 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.416904 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:27:34.417014 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.416925 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 21:27:34.417014 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.416934 2578 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:27:34.417014 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:34.417006 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:27:34.420276 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.420257 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:34.450349 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.450327 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:34.451131 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.451102 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-124.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:34.451131 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.451132 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-124.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:34.451235 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.451141 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-124.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:34.451235 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.451163 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-124.ec2.internal" Apr 24 21:27:34.463451 ip-10-0-132-124 kubenswrapper[2578]: I0424 
21:27:34.463433 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-124.ec2.internal" Apr 24 21:27:34.463500 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:34.463454 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-124.ec2.internal\": node \"ip-10-0-132-124.ec2.internal\" not found" Apr 24 21:27:34.494275 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:34.494256 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-124.ec2.internal\" not found" Apr 24 21:27:34.517679 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.517659 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-124.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-124.ec2.internal"] Apr 24 21:27:34.517730 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.517722 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:34.519735 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.519720 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-124.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:34.519796 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.519750 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-124.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:34.519796 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.519760 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-124.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:34.521783 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.521772 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:34.521957 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.521945 2578 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-124.ec2.internal" Apr 24 21:27:34.521990 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.521971 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:34.522450 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.522429 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-124.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:34.522450 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.522444 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-124.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:34.522535 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.522457 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-124.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:34.522535 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.522465 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-124.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:34.522535 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.522467 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-124.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:34.522641 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.522478 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-124.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:34.524570 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.524558 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-124.ec2.internal" Apr 24 21:27:34.524623 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.524579 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:34.525715 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.525689 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-124.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:34.525715 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.525711 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-124.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:34.525812 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.525726 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-124.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:34.547074 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:34.547049 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-124.ec2.internal\" not found" node="ip-10-0-132-124.ec2.internal" Apr 24 21:27:34.550993 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:34.550977 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-124.ec2.internal\" not found" node="ip-10-0-132-124.ec2.internal" Apr 24 21:27:34.594522 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:34.594505 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-124.ec2.internal\" not found" Apr 24 21:27:34.684534 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.684492 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1bd7c9a210b9e7fcb345143388e6cb26-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-132-124.ec2.internal\" (UID: \"1bd7c9a210b9e7fcb345143388e6cb26\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-124.ec2.internal" Apr 24 21:27:34.684534 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.684516 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1bd7c9a210b9e7fcb345143388e6cb26-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-124.ec2.internal\" (UID: \"1bd7c9a210b9e7fcb345143388e6cb26\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-124.ec2.internal" Apr 24 21:27:34.684534 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.684532 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/455f5e57cbc63caaa90072de2a2bd596-config\") pod \"kube-apiserver-proxy-ip-10-0-132-124.ec2.internal\" (UID: \"455f5e57cbc63caaa90072de2a2bd596\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-124.ec2.internal" Apr 24 21:27:34.695577 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:34.695561 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-124.ec2.internal\" not found" Apr 24 21:27:34.785219 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.785193 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1bd7c9a210b9e7fcb345143388e6cb26-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-124.ec2.internal\" (UID: \"1bd7c9a210b9e7fcb345143388e6cb26\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-124.ec2.internal" Apr 24 21:27:34.785219 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.785223 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/1bd7c9a210b9e7fcb345143388e6cb26-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-124.ec2.internal\" (UID: \"1bd7c9a210b9e7fcb345143388e6cb26\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-124.ec2.internal" Apr 24 21:27:34.785320 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.785241 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/455f5e57cbc63caaa90072de2a2bd596-config\") pod \"kube-apiserver-proxy-ip-10-0-132-124.ec2.internal\" (UID: \"455f5e57cbc63caaa90072de2a2bd596\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-124.ec2.internal" Apr 24 21:27:34.785320 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.785289 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1bd7c9a210b9e7fcb345143388e6cb26-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-124.ec2.internal\" (UID: \"1bd7c9a210b9e7fcb345143388e6cb26\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-124.ec2.internal" Apr 24 21:27:34.785320 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.785297 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1bd7c9a210b9e7fcb345143388e6cb26-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-124.ec2.internal\" (UID: \"1bd7c9a210b9e7fcb345143388e6cb26\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-124.ec2.internal" Apr 24 21:27:34.785404 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.785289 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/455f5e57cbc63caaa90072de2a2bd596-config\") pod \"kube-apiserver-proxy-ip-10-0-132-124.ec2.internal\" (UID: \"455f5e57cbc63caaa90072de2a2bd596\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-132-124.ec2.internal" Apr 24 21:27:34.796314 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:34.796296 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-124.ec2.internal\" not found" Apr 24 21:27:34.849463 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.849448 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-124.ec2.internal" Apr 24 21:27:34.853743 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:34.853725 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-124.ec2.internal" Apr 24 21:27:34.896367 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:34.896342 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-124.ec2.internal\" not found" Apr 24 21:27:34.996856 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:34.996794 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-124.ec2.internal\" not found" Apr 24 21:27:35.097309 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:35.097285 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-124.ec2.internal\" not found" Apr 24 21:27:35.188806 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:35.188783 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 21:27:35.189400 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:35.188937 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:27:35.197937 
ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:35.197917 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-124.ec2.internal\" not found" Apr 24 21:27:35.283101 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:35.282945 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 21:27:35.291458 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:35.291431 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:22:34 +0000 UTC" deadline="2028-01-22 09:49:21.144081114 +0000 UTC" Apr 24 21:27:35.291458 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:35.291456 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15300h21m45.852627554s" Apr 24 21:27:35.298711 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:35.298686 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-124.ec2.internal\" not found" Apr 24 21:27:35.313338 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:35.313319 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:27:35.361639 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:35.361610 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-4q762" Apr 24 21:27:35.364607 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:35.364585 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bd7c9a210b9e7fcb345143388e6cb26.slice/crio-3e2d3a78f29d1292b18461c7fcfece46912694301b92ff199bfcab20092294ee WatchSource:0}: Error finding container 
3e2d3a78f29d1292b18461c7fcfece46912694301b92ff199bfcab20092294ee: Status 404 returned error can't find the container with id 3e2d3a78f29d1292b18461c7fcfece46912694301b92ff199bfcab20092294ee Apr 24 21:27:35.365113 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:35.365093 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod455f5e57cbc63caaa90072de2a2bd596.slice/crio-bdcb9e300d85cb201f334a8d1e75ac5276b3edc0f545db228e522de28731649c WatchSource:0}: Error finding container bdcb9e300d85cb201f334a8d1e75ac5276b3edc0f545db228e522de28731649c: Status 404 returned error can't find the container with id bdcb9e300d85cb201f334a8d1e75ac5276b3edc0f545db228e522de28731649c Apr 24 21:27:35.368507 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:35.368489 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:27:35.375555 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:35.375523 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-4q762" Apr 24 21:27:35.394945 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:35.394927 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:35.419731 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:35.419676 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-124.ec2.internal" event={"ID":"455f5e57cbc63caaa90072de2a2bd596","Type":"ContainerStarted","Data":"bdcb9e300d85cb201f334a8d1e75ac5276b3edc0f545db228e522de28731649c"} Apr 24 21:27:35.420491 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:35.420473 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-124.ec2.internal" 
event={"ID":"1bd7c9a210b9e7fcb345143388e6cb26","Type":"ContainerStarted","Data":"3e2d3a78f29d1292b18461c7fcfece46912694301b92ff199bfcab20092294ee"} Apr 24 21:27:35.483691 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:35.483672 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-124.ec2.internal" Apr 24 21:27:35.498532 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:35.498511 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:27:35.500682 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:35.500669 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-124.ec2.internal" Apr 24 21:27:35.511773 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:35.511758 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:27:35.658697 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:35.658599 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:35.789967 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:35.789928 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:36.097351 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.097278 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:36.260222 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.260196 2578 apiserver.go:52] "Watching apiserver" Apr 24 21:27:36.267322 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.267264 2578 reflector.go:430] "Caches populated" type="*v1.Pod" 
reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 21:27:36.269073 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.269046 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-csj8d","kube-system/kube-apiserver-proxy-ip-10-0-132-124.ec2.internal","openshift-cluster-node-tuning-operator/tuned-nr5k8","openshift-image-registry/node-ca-5t5rz","openshift-multus/multus-additional-cni-plugins-fztlb","openshift-multus/network-metrics-daemon-6wxzd","openshift-network-diagnostics/network-check-target-rkwhf","openshift-ovn-kubernetes/ovnkube-node-4256n","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d","openshift-dns/node-resolver-48sm2","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-124.ec2.internal","openshift-multus/multus-lshqp","openshift-network-operator/iptables-alerter-fdh56"] Apr 24 21:27:36.273889 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.273862 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf" Apr 24 21:27:36.274002 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:36.273950 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkwhf" podUID="c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e" Apr 24 21:27:36.276070 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.276047 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nr5k8" Apr 24 21:27:36.278230 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.278210 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-m2s9z\"" Apr 24 21:27:36.278326 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.278228 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fztlb" Apr 24 21:27:36.278479 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.278461 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 21:27:36.279125 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.279103 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:27:36.280183 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.280163 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 21:27:36.280618 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.280601 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 21:27:36.280706 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.280678 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-5t5rz" Apr 24 21:27:36.281012 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.280994 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 21:27:36.281186 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.281003 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-l82zb\"" Apr 24 21:27:36.281375 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.281359 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 21:27:36.281449 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.281434 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 21:27:36.282825 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.282806 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd" Apr 24 21:27:36.282895 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:36.282877 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wxzd" podUID="b536a581-6c7c-4e7e-9fb3-6223e4ab90f0" Apr 24 21:27:36.282964 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.282950 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-csj8d" Apr 24 21:27:36.283025 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.283007 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 21:27:36.283155 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.283139 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 21:27:36.283351 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.283337 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 21:27:36.283414 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.283378 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-gm5vw\"" Apr 24 21:27:36.285234 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.285211 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.285480 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.285460 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-6c2sv\"" Apr 24 21:27:36.285760 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.285743 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 21:27:36.285891 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.285875 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 21:27:36.287348 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.287328 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d" Apr 24 21:27:36.287446 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.287420 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 21:27:36.287446 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.287431 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 21:27:36.287740 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.287722 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 21:27:36.287823 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.287777 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 21:27:36.287823 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.287812 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 21:27:36.287968 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.287952 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-j52cm\"" Apr 24 21:27:36.288368 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.288338 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 21:27:36.289782 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.289764 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-48sm2" Apr 24 21:27:36.290823 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.290802 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 21:27:36.290910 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.290840 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 21:27:36.291301 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.291188 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-74bs9\"" Apr 24 21:27:36.291301 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.291207 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 21:27:36.292516 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.292496 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 21:27:36.292623 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.292566 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.292696 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.292675 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 21:27:36.292875 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.292855 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8knfc\"" Apr 24 21:27:36.294172 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294149 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c64a924-1f49-45bd-870b-9fb356e61e75-host\") pod \"node-ca-5t5rz\" (UID: \"9c64a924-1f49-45bd-870b-9fb356e61e75\") " pod="openshift-image-registry/node-ca-5t5rz" Apr 24 21:27:36.294277 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294188 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/05330bcd-753c-4b2c-add6-ad37ce95d4d1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fztlb\" (UID: \"05330bcd-753c-4b2c-add6-ad37ce95d4d1\") " pod="openshift-multus/multus-additional-cni-plugins-fztlb" Apr 24 21:27:36.294277 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294215 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-host-kubelet\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.294277 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294254 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-host-slash\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.294429 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294293 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-var-lib-openvswitch\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.294429 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294338 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-run-openvswitch\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.294429 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294365 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-etc-modprobe-d\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8" Apr 24 21:27:36.294429 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294412 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-sys\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8" Apr 24 21:27:36.294595 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294444 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/05330bcd-753c-4b2c-add6-ad37ce95d4d1-cnibin\") pod \"multus-additional-cni-plugins-fztlb\" (UID: \"05330bcd-753c-4b2c-add6-ad37ce95d4d1\") " pod="openshift-multus/multus-additional-cni-plugins-fztlb" Apr 24 21:27:36.294595 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294463 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-log-socket\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.294595 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294480 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ecf62cd7-5041-4b2d-8eff-453431841db5-env-overrides\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.294595 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294500 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlp7d\" (UniqueName: \"kubernetes.io/projected/007dd22d-9512-495a-ad7f-d8424286a304-kube-api-access-rlp7d\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8" Apr 24 21:27:36.294595 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294522 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdm8r\" (UniqueName: \"kubernetes.io/projected/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-kube-api-access-rdm8r\") pod \"network-metrics-daemon-6wxzd\" (UID: \"b536a581-6c7c-4e7e-9fb3-6223e4ab90f0\") " 
pod="openshift-multus/network-metrics-daemon-6wxzd" Apr 24 21:27:36.294595 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294570 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-host-run-netns\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.294595 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294593 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-host-run-ovn-kubernetes\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.294851 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294614 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ecf62cd7-5041-4b2d-8eff-453431841db5-ovnkube-config\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.294851 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294628 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnr2z\" (UniqueName: \"kubernetes.io/projected/ecf62cd7-5041-4b2d-8eff-453431841db5-kube-api-access-tnr2z\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.294851 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294648 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-etc-sysconfig\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8" Apr 24 21:27:36.294851 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294664 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-lib-modules\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8" Apr 24 21:27:36.294851 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294709 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 21:27:36.294851 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294716 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/007dd22d-9512-495a-ad7f-d8424286a304-etc-tuned\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8" Apr 24 21:27:36.294851 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294755 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/83486ef0-fe96-4f97-a0e5-bec233422715-konnectivity-ca\") pod \"konnectivity-agent-csj8d\" (UID: \"83486ef0-fe96-4f97-a0e5-bec233422715\") " pod="kube-system/konnectivity-agent-csj8d" Apr 24 21:27:36.294851 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294781 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/05330bcd-753c-4b2c-add6-ad37ce95d4d1-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-fztlb\" (UID: \"05330bcd-753c-4b2c-add6-ad37ce95d4d1\") " pod="openshift-multus/multus-additional-cni-plugins-fztlb" Apr 24 21:27:36.294851 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294791 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-vmqdm\"" Apr 24 21:27:36.294851 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294807 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/05330bcd-753c-4b2c-add6-ad37ce95d4d1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fztlb\" (UID: \"05330bcd-753c-4b2c-add6-ad37ce95d4d1\") " pod="openshift-multus/multus-additional-cni-plugins-fztlb" Apr 24 21:27:36.294851 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294830 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-etc-openvswitch\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.295342 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294853 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-host-cni-bin\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.295342 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294903 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-etc-sysctl-conf\") pod \"tuned-nr5k8\" 
(UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8" Apr 24 21:27:36.295342 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294925 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9c64a924-1f49-45bd-870b-9fb356e61e75-serviceca\") pod \"node-ca-5t5rz\" (UID: \"9c64a924-1f49-45bd-870b-9fb356e61e75\") " pod="openshift-image-registry/node-ca-5t5rz" Apr 24 21:27:36.295342 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294952 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm4r5\" (UniqueName: \"kubernetes.io/projected/9c64a924-1f49-45bd-870b-9fb356e61e75-kube-api-access-cm4r5\") pod \"node-ca-5t5rz\" (UID: \"9c64a924-1f49-45bd-870b-9fb356e61e75\") " pod="openshift-image-registry/node-ca-5t5rz" Apr 24 21:27:36.295342 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.294988 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfjt2\" (UniqueName: \"kubernetes.io/projected/05330bcd-753c-4b2c-add6-ad37ce95d4d1-kube-api-access-zfjt2\") pod \"multus-additional-cni-plugins-fztlb\" (UID: \"05330bcd-753c-4b2c-add6-ad37ce95d4d1\") " pod="openshift-multus/multus-additional-cni-plugins-fztlb" Apr 24 21:27:36.295342 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.295012 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-node-log\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.295342 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.295028 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" 
(UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-run\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.295342 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.295047 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/007dd22d-9512-495a-ad7f-d8424286a304-tmp\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.295342 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.295065 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-metrics-certs\") pod \"network-metrics-daemon-6wxzd\" (UID: \"b536a581-6c7c-4e7e-9fb3-6223e4ab90f0\") " pod="openshift-multus/network-metrics-daemon-6wxzd"
Apr 24 21:27:36.295342 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.295087 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/05330bcd-753c-4b2c-add6-ad37ce95d4d1-system-cni-dir\") pod \"multus-additional-cni-plugins-fztlb\" (UID: \"05330bcd-753c-4b2c-add6-ad37ce95d4d1\") " pod="openshift-multus/multus-additional-cni-plugins-fztlb"
Apr 24 21:27:36.295342 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.295109 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ecf62cd7-5041-4b2d-8eff-453431841db5-ovn-node-metrics-cert\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.295342 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.295132 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ecf62cd7-5041-4b2d-8eff-453431841db5-ovnkube-script-lib\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.295342 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.295154 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-etc-kubernetes\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.295342 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.295174 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/83486ef0-fe96-4f97-a0e5-bec233422715-agent-certs\") pod \"konnectivity-agent-csj8d\" (UID: \"83486ef0-fe96-4f97-a0e5-bec233422715\") " pod="kube-system/konnectivity-agent-csj8d"
Apr 24 21:27:36.295342 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.295188 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vbt4\" (UniqueName: \"kubernetes.io/projected/c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e-kube-api-access-2vbt4\") pod \"network-check-target-rkwhf\" (UID: \"c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e\") " pod="openshift-network-diagnostics/network-check-target-rkwhf"
Apr 24 21:27:36.295342 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.295202 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/05330bcd-753c-4b2c-add6-ad37ce95d4d1-os-release\") pod \"multus-additional-cni-plugins-fztlb\" (UID: \"05330bcd-753c-4b2c-add6-ad37ce95d4d1\") " pod="openshift-multus/multus-additional-cni-plugins-fztlb"
Apr 24 21:27:36.295969 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.295238 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-systemd-units\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.295969 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.295261 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-run-ovn\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.295969 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.295299 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-host-cni-netd\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.295969 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.295321 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-etc-systemd\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.295969 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.295342 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-host\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.295969 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.295362 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/05330bcd-753c-4b2c-add6-ad37ce95d4d1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fztlb\" (UID: \"05330bcd-753c-4b2c-add6-ad37ce95d4d1\") " pod="openshift-multus/multus-additional-cni-plugins-fztlb"
Apr 24 21:27:36.295969 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.295376 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-run-systemd\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.295969 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.295391 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.295969 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.295459 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-etc-sysctl-d\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.295969 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.295482 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-var-lib-kubelet\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.295969 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.295609 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fdh56"
Apr 24 21:27:36.297906 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.297889 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 21:27:36.297995 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.297936 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 21:27:36.297995 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.297982 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-44d8l\""
Apr 24 21:27:36.298124 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.298108 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:27:36.376333 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.376249 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:22:35 +0000 UTC" deadline="2027-10-29 09:16:52.523681202 +0000 UTC"
Apr 24 21:27:36.376333 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.376284 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13259h49m16.147400714s"
Apr 24 21:27:36.385172 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.385149 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 21:27:36.395720 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.395694 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-host-run-netns\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.395830 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.395736 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-host-run-ovn-kubernetes\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.395830 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.395761 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ecf62cd7-5041-4b2d-8eff-453431841db5-ovnkube-config\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.395830 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.395794 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-host-run-netns\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.395990 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.395831 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-host-run-ovn-kubernetes\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.395990 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.395890 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-system-cni-dir\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp"
Apr 24 21:27:36.395990 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.395921 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-cni-binary-copy\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp"
Apr 24 21:27:36.395990 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.395946 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-host-run-multus-certs\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp"
Apr 24 21:27:36.395990 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.395974 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-etc-sysconfig\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.396207 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.395997 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-lib-modules\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.396207 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396019 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/007dd22d-9512-495a-ad7f-d8424286a304-etc-tuned\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.396207 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396042 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/05330bcd-753c-4b2c-add6-ad37ce95d4d1-cni-binary-copy\") pod \"multus-additional-cni-plugins-fztlb\" (UID: \"05330bcd-753c-4b2c-add6-ad37ce95d4d1\") " pod="openshift-multus/multus-additional-cni-plugins-fztlb"
Apr 24 21:27:36.396207 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396042 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-etc-sysconfig\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.396207 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396068 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/05330bcd-753c-4b2c-add6-ad37ce95d4d1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fztlb\" (UID: \"05330bcd-753c-4b2c-add6-ad37ce95d4d1\") " pod="openshift-multus/multus-additional-cni-plugins-fztlb"
Apr 24 21:27:36.396207 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396096 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-host-var-lib-cni-multus\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp"
Apr 24 21:27:36.396207 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396140 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-multus-conf-dir\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp"
Apr 24 21:27:36.396207 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396162 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6183aed2-60ab-4cae-8455-c797d1e3ebf6-host-slash\") pod \"iptables-alerter-fdh56\" (UID: \"6183aed2-60ab-4cae-8455-c797d1e3ebf6\") " pod="openshift-network-operator/iptables-alerter-fdh56"
Apr 24 21:27:36.396207 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396169 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-lib-modules\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.396207 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396188 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-etc-sysctl-conf\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.396675 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396223 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cm4r5\" (UniqueName: \"kubernetes.io/projected/9c64a924-1f49-45bd-870b-9fb356e61e75-kube-api-access-cm4r5\") pod \"node-ca-5t5rz\" (UID: \"9c64a924-1f49-45bd-870b-9fb356e61e75\") " pod="openshift-image-registry/node-ca-5t5rz"
Apr 24 21:27:36.396675 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396251 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zfjt2\" (UniqueName: \"kubernetes.io/projected/05330bcd-753c-4b2c-add6-ad37ce95d4d1-kube-api-access-zfjt2\") pod \"multus-additional-cni-plugins-fztlb\" (UID: \"05330bcd-753c-4b2c-add6-ad37ce95d4d1\") " pod="openshift-multus/multus-additional-cni-plugins-fztlb"
Apr 24 21:27:36.396675 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396280 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-node-log\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.396675 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396307 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0f52ff53-2325-461a-9bb4-dde9a76323fb-tmp-dir\") pod \"node-resolver-48sm2\" (UID: \"0f52ff53-2325-461a-9bb4-dde9a76323fb\") " pod="openshift-dns/node-resolver-48sm2"
Apr 24 21:27:36.396675 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396321 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-etc-sysctl-conf\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.396675 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396330 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcglk\" (UniqueName: \"kubernetes.io/projected/6183aed2-60ab-4cae-8455-c797d1e3ebf6-kube-api-access-kcglk\") pod \"iptables-alerter-fdh56\" (UID: \"6183aed2-60ab-4cae-8455-c797d1e3ebf6\") " pod="openshift-network-operator/iptables-alerter-fdh56"
Apr 24 21:27:36.396675 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396364 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-metrics-certs\") pod \"network-metrics-daemon-6wxzd\" (UID: \"b536a581-6c7c-4e7e-9fb3-6223e4ab90f0\") " pod="openshift-multus/network-metrics-daemon-6wxzd"
Apr 24 21:27:36.396675 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396355 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 21:27:36.396675 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:36.396472 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:36.396675 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:36.396539 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-metrics-certs podName:b536a581-6c7c-4e7e-9fb3-6223e4ab90f0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:36.896519821 +0000 UTC m=+3.072254488 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-metrics-certs") pod "network-metrics-daemon-6wxzd" (UID: "b536a581-6c7c-4e7e-9fb3-6223e4ab90f0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:36.396675 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396590 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c2c4820b-6b31-4f7b-89f3-aae65e915513-registration-dir\") pod \"aws-ebs-csi-driver-node-8ps7d\" (UID: \"c2c4820b-6b31-4f7b-89f3-aae65e915513\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d"
Apr 24 21:27:36.396675 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396618 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-os-release\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp"
Apr 24 21:27:36.396675 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396646 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/05330bcd-753c-4b2c-add6-ad37ce95d4d1-cni-binary-copy\") pod \"multus-additional-cni-plugins-fztlb\" (UID: \"05330bcd-753c-4b2c-add6-ad37ce95d4d1\") " pod="openshift-multus/multus-additional-cni-plugins-fztlb"
Apr 24 21:27:36.396675 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396642 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-host-var-lib-cni-bin\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp"
Apr 24 21:27:36.397309 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396694 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-etc-kubernetes\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.397309 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396727 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/83486ef0-fe96-4f97-a0e5-bec233422715-agent-certs\") pod \"konnectivity-agent-csj8d\" (UID: \"83486ef0-fe96-4f97-a0e5-bec233422715\") " pod="kube-system/konnectivity-agent-csj8d"
Apr 24 21:27:36.397309 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396747 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-systemd-units\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.397309 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396769 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-etc-kubernetes\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.397309 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396378 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-node-log\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.397309 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396769 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-etc-systemd\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.397309 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396813 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-host\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.397309 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396819 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-systemd-units\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.397309 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396849 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.397309 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396874 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-host\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.397309 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396881 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-multus-cni-dir\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp"
Apr 24 21:27:36.397309 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396831 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-etc-systemd\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.397309 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396889 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.397309 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396907 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-var-lib-kubelet\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.397309 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396931 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c64a924-1f49-45bd-870b-9fb356e61e75-host\") pod \"node-ca-5t5rz\" (UID: \"9c64a924-1f49-45bd-870b-9fb356e61e75\") " pod="openshift-image-registry/node-ca-5t5rz"
Apr 24 21:27:36.397309 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396955 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-host-kubelet\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.397309 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.396979 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-host-slash\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.398056 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397002 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-var-lib-openvswitch\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.398056 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397011 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c64a924-1f49-45bd-870b-9fb356e61e75-host\") pod \"node-ca-5t5rz\" (UID: \"9c64a924-1f49-45bd-870b-9fb356e61e75\") " pod="openshift-image-registry/node-ca-5t5rz"
Apr 24 21:27:36.398056 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397020 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-etc-modprobe-d\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.398056 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397048 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/05330bcd-753c-4b2c-add6-ad37ce95d4d1-cnibin\") pod \"multus-additional-cni-plugins-fztlb\" (UID: \"05330bcd-753c-4b2c-add6-ad37ce95d4d1\") " pod="openshift-multus/multus-additional-cni-plugins-fztlb"
Apr 24 21:27:36.398056 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397088 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-var-lib-kubelet\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.398056 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397090 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-etc-modprobe-d\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.398056 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397078 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c2c4820b-6b31-4f7b-89f3-aae65e915513-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8ps7d\" (UID: \"c2c4820b-6b31-4f7b-89f3-aae65e915513\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d"
Apr 24 21:27:36.398056 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397130 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0f52ff53-2325-461a-9bb4-dde9a76323fb-hosts-file\") pod \"node-resolver-48sm2\" (UID: \"0f52ff53-2325-461a-9bb4-dde9a76323fb\") " pod="openshift-dns/node-resolver-48sm2"
Apr 24 21:27:36.398056 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397134 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-host-slash\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.398056 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397134 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-host-kubelet\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.398056 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397134 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-var-lib-openvswitch\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.398056 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397046 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ecf62cd7-5041-4b2d-8eff-453431841db5-ovnkube-config\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.398056 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397152 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/05330bcd-753c-4b2c-add6-ad37ce95d4d1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fztlb\" (UID: \"05330bcd-753c-4b2c-add6-ad37ce95d4d1\") " pod="openshift-multus/multus-additional-cni-plugins-fztlb"
Apr 24 21:27:36.398056 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397155 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-host-run-k8s-cni-cncf-io\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp"
Apr 24 21:27:36.398056 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397183 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/05330bcd-753c-4b2c-add6-ad37ce95d4d1-cnibin\") pod \"multus-additional-cni-plugins-fztlb\" (UID: \"05330bcd-753c-4b2c-add6-ad37ce95d4d1\") " pod="openshift-multus/multus-additional-cni-plugins-fztlb"
Apr 24 21:27:36.398056 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397194 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rlp7d\" (UniqueName: \"kubernetes.io/projected/007dd22d-9512-495a-ad7f-d8424286a304-kube-api-access-rlp7d\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8"
Apr 24 21:27:36.398056 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397218 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdm8r\" (UniqueName: \"kubernetes.io/projected/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-kube-api-access-rdm8r\") pod \"network-metrics-daemon-6wxzd\" (UID: \"b536a581-6c7c-4e7e-9fb3-6223e4ab90f0\") " pod="openshift-multus/network-metrics-daemon-6wxzd"
Apr 24 21:27:36.398847 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397241 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnr2z\" (UniqueName: \"kubernetes.io/projected/ecf62cd7-5041-4b2d-8eff-453431841db5-kube-api-access-tnr2z\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.398847 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397263 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c2c4820b-6b31-4f7b-89f3-aae65e915513-etc-selinux\") pod \"aws-ebs-csi-driver-node-8ps7d\" (UID: \"c2c4820b-6b31-4f7b-89f3-aae65e915513\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d"
Apr 24 21:27:36.398847 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397285 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqk5q\" (UniqueName: \"kubernetes.io/projected/c2c4820b-6b31-4f7b-89f3-aae65e915513-kube-api-access-cqk5q\") pod \"aws-ebs-csi-driver-node-8ps7d\" (UID: \"c2c4820b-6b31-4f7b-89f3-aae65e915513\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d"
Apr 24 21:27:36.398847 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397307 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/83486ef0-fe96-4f97-a0e5-bec233422715-konnectivity-ca\") pod \"konnectivity-agent-csj8d\" (UID: \"83486ef0-fe96-4f97-a0e5-bec233422715\") " pod="kube-system/konnectivity-agent-csj8d"
Apr 24 21:27:36.398847 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397329 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-etc-openvswitch\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.398847 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397349 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-host-cni-bin\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:27:36.398847 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397382 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pvxq\" (UniqueName: \"kubernetes.io/projected/0f52ff53-2325-461a-9bb4-dde9a76323fb-kube-api-access-4pvxq\") pod \"node-resolver-48sm2\" (UID: \"0f52ff53-2325-461a-9bb4-dde9a76323fb\") " pod="openshift-dns/node-resolver-48sm2"
Apr 24 21:27:36.398847 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397403 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6183aed2-60ab-4cae-8455-c797d1e3ebf6-iptables-alerter-script\") pod \"iptables-alerter-fdh56\" (UID: \"6183aed2-60ab-4cae-8455-c797d1e3ebf6\") " pod="openshift-network-operator/iptables-alerter-fdh56"
Apr 24 21:27:36.398847 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397425 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9c64a924-1f49-45bd-870b-9fb356e61e75-serviceca\") pod \"node-ca-5t5rz\" (UID: \"9c64a924-1f49-45bd-870b-9fb356e61e75\") " pod="openshift-image-registry/node-ca-5t5rz"
Apr 24 21:27:36.398847 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397446 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-run\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") "
pod="openshift-cluster-node-tuning-operator/tuned-nr5k8" Apr 24 21:27:36.398847 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397447 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-host-cni-bin\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.398847 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397467 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/007dd22d-9512-495a-ad7f-d8424286a304-tmp\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8" Apr 24 21:27:36.398847 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397489 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/05330bcd-753c-4b2c-add6-ad37ce95d4d1-system-cni-dir\") pod \"multus-additional-cni-plugins-fztlb\" (UID: \"05330bcd-753c-4b2c-add6-ad37ce95d4d1\") " pod="openshift-multus/multus-additional-cni-plugins-fztlb" Apr 24 21:27:36.398847 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397494 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-etc-openvswitch\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.398847 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397572 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-run\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " 
pod="openshift-cluster-node-tuning-operator/tuned-nr5k8" Apr 24 21:27:36.398847 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397575 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/05330bcd-753c-4b2c-add6-ad37ce95d4d1-system-cni-dir\") pod \"multus-additional-cni-plugins-fztlb\" (UID: \"05330bcd-753c-4b2c-add6-ad37ce95d4d1\") " pod="openshift-multus/multus-additional-cni-plugins-fztlb" Apr 24 21:27:36.398847 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397646 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ecf62cd7-5041-4b2d-8eff-453431841db5-ovn-node-metrics-cert\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.399530 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397678 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ecf62cd7-5041-4b2d-8eff-453431841db5-ovnkube-script-lib\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.399530 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397705 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-cnibin\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.399530 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397731 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-hostroot\") pod \"multus-lshqp\" (UID: 
\"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.399530 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397780 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-multus-daemon-config\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.399530 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397807 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vbt4\" (UniqueName: \"kubernetes.io/projected/c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e-kube-api-access-2vbt4\") pod \"network-check-target-rkwhf\" (UID: \"c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e\") " pod="openshift-network-diagnostics/network-check-target-rkwhf" Apr 24 21:27:36.399530 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397834 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/05330bcd-753c-4b2c-add6-ad37ce95d4d1-os-release\") pod \"multus-additional-cni-plugins-fztlb\" (UID: \"05330bcd-753c-4b2c-add6-ad37ce95d4d1\") " pod="openshift-multus/multus-additional-cni-plugins-fztlb" Apr 24 21:27:36.399530 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397859 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-run-ovn\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.399530 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397885 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-host-cni-netd\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.399530 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397888 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/83486ef0-fe96-4f97-a0e5-bec233422715-konnectivity-ca\") pod \"konnectivity-agent-csj8d\" (UID: \"83486ef0-fe96-4f97-a0e5-bec233422715\") " pod="kube-system/konnectivity-agent-csj8d" Apr 24 21:27:36.399530 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397911 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c2c4820b-6b31-4f7b-89f3-aae65e915513-socket-dir\") pod \"aws-ebs-csi-driver-node-8ps7d\" (UID: \"c2c4820b-6b31-4f7b-89f3-aae65e915513\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d" Apr 24 21:27:36.399530 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397926 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9c64a924-1f49-45bd-870b-9fb356e61e75-serviceca\") pod \"node-ca-5t5rz\" (UID: \"9c64a924-1f49-45bd-870b-9fb356e61e75\") " pod="openshift-image-registry/node-ca-5t5rz" Apr 24 21:27:36.399530 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397939 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/05330bcd-753c-4b2c-add6-ad37ce95d4d1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fztlb\" (UID: \"05330bcd-753c-4b2c-add6-ad37ce95d4d1\") " pod="openshift-multus/multus-additional-cni-plugins-fztlb" Apr 24 21:27:36.399530 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.397975 2578 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-run-ovn\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.399530 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.398003 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-run-systemd\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.399530 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.398030 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c2c4820b-6b31-4f7b-89f3-aae65e915513-sys-fs\") pod \"aws-ebs-csi-driver-node-8ps7d\" (UID: \"c2c4820b-6b31-4f7b-89f3-aae65e915513\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d" Apr 24 21:27:36.399530 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.398055 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2mzz\" (UniqueName: \"kubernetes.io/projected/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-kube-api-access-m2mzz\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.399530 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.398082 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-etc-sysctl-d\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8" Apr 24 21:27:36.400082 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.398135 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-run-systemd\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.400082 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.398135 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/05330bcd-753c-4b2c-add6-ad37ce95d4d1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fztlb\" (UID: \"05330bcd-753c-4b2c-add6-ad37ce95d4d1\") " pod="openshift-multus/multus-additional-cni-plugins-fztlb" Apr 24 21:27:36.400082 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.398169 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-run-openvswitch\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.400082 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.398182 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-host-cni-netd\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.400082 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.398194 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ecf62cd7-5041-4b2d-8eff-453431841db5-env-overrides\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.400082 ip-10-0-132-124 kubenswrapper[2578]: I0424 
21:27:36.398239 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/05330bcd-753c-4b2c-add6-ad37ce95d4d1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fztlb\" (UID: \"05330bcd-753c-4b2c-add6-ad37ce95d4d1\") " pod="openshift-multus/multus-additional-cni-plugins-fztlb" Apr 24 21:27:36.400082 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.398261 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/05330bcd-753c-4b2c-add6-ad37ce95d4d1-os-release\") pod \"multus-additional-cni-plugins-fztlb\" (UID: \"05330bcd-753c-4b2c-add6-ad37ce95d4d1\") " pod="openshift-multus/multus-additional-cni-plugins-fztlb" Apr 24 21:27:36.400082 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.398295 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-run-openvswitch\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.400082 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.398321 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c2c4820b-6b31-4f7b-89f3-aae65e915513-device-dir\") pod \"aws-ebs-csi-driver-node-8ps7d\" (UID: \"c2c4820b-6b31-4f7b-89f3-aae65e915513\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d" Apr 24 21:27:36.400082 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.398345 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-multus-socket-dir-parent\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " 
pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.400082 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.398349 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-etc-sysctl-d\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8" Apr 24 21:27:36.400082 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.398371 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-host-run-netns\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.400082 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.398397 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-sys\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8" Apr 24 21:27:36.400082 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.398423 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-log-socket\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.400082 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.398460 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ecf62cd7-5041-4b2d-8eff-453431841db5-log-socket\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 
21:27:36.400082 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.398493 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-host-var-lib-kubelet\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.400082 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.398505 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/007dd22d-9512-495a-ad7f-d8424286a304-sys\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8" Apr 24 21:27:36.400636 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.398538 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-etc-kubernetes\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.400636 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.398645 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ecf62cd7-5041-4b2d-8eff-453431841db5-ovnkube-script-lib\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.400636 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.398820 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ecf62cd7-5041-4b2d-8eff-453431841db5-env-overrides\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 
21:27:36.400636 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.399038 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/05330bcd-753c-4b2c-add6-ad37ce95d4d1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fztlb\" (UID: \"05330bcd-753c-4b2c-add6-ad37ce95d4d1\") " pod="openshift-multus/multus-additional-cni-plugins-fztlb" Apr 24 21:27:36.400636 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.400068 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/007dd22d-9512-495a-ad7f-d8424286a304-tmp\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8" Apr 24 21:27:36.400636 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.400149 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/007dd22d-9512-495a-ad7f-d8424286a304-etc-tuned\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8" Apr 24 21:27:36.400636 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.400286 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/83486ef0-fe96-4f97-a0e5-bec233422715-agent-certs\") pod \"konnectivity-agent-csj8d\" (UID: \"83486ef0-fe96-4f97-a0e5-bec233422715\") " pod="kube-system/konnectivity-agent-csj8d" Apr 24 21:27:36.400959 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.400730 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ecf62cd7-5041-4b2d-8eff-453431841db5-ovn-node-metrics-cert\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.404658 
ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:36.404630 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:36.404658 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:36.404655 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:36.404791 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:36.404668 2578 projected.go:194] Error preparing data for projected volume kube-api-access-2vbt4 for pod openshift-network-diagnostics/network-check-target-rkwhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:36.404791 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:36.404736 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e-kube-api-access-2vbt4 podName:c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e nodeName:}" failed. No retries permitted until 2026-04-24 21:27:36.904718816 +0000 UTC m=+3.080453491 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-2vbt4" (UniqueName: "kubernetes.io/projected/c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e-kube-api-access-2vbt4") pod "network-check-target-rkwhf" (UID: "c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:36.406928 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.406910 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm4r5\" (UniqueName: \"kubernetes.io/projected/9c64a924-1f49-45bd-870b-9fb356e61e75-kube-api-access-cm4r5\") pod \"node-ca-5t5rz\" (UID: \"9c64a924-1f49-45bd-870b-9fb356e61e75\") " pod="openshift-image-registry/node-ca-5t5rz" Apr 24 21:27:36.407003 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.406968 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfjt2\" (UniqueName: \"kubernetes.io/projected/05330bcd-753c-4b2c-add6-ad37ce95d4d1-kube-api-access-zfjt2\") pod \"multus-additional-cni-plugins-fztlb\" (UID: \"05330bcd-753c-4b2c-add6-ad37ce95d4d1\") " pod="openshift-multus/multus-additional-cni-plugins-fztlb" Apr 24 21:27:36.407170 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.407152 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlp7d\" (UniqueName: \"kubernetes.io/projected/007dd22d-9512-495a-ad7f-d8424286a304-kube-api-access-rlp7d\") pod \"tuned-nr5k8\" (UID: \"007dd22d-9512-495a-ad7f-d8424286a304\") " pod="openshift-cluster-node-tuning-operator/tuned-nr5k8" Apr 24 21:27:36.407395 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.407379 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdm8r\" (UniqueName: \"kubernetes.io/projected/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-kube-api-access-rdm8r\") pod \"network-metrics-daemon-6wxzd\" (UID: 
\"b536a581-6c7c-4e7e-9fb3-6223e4ab90f0\") " pod="openshift-multus/network-metrics-daemon-6wxzd" Apr 24 21:27:36.407686 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.407669 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnr2z\" (UniqueName: \"kubernetes.io/projected/ecf62cd7-5041-4b2d-8eff-453431841db5-kube-api-access-tnr2z\") pod \"ovnkube-node-4256n\" (UID: \"ecf62cd7-5041-4b2d-8eff-453431841db5\") " pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.498972 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.498935 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-multus-cni-dir\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.499128 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.498981 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c2c4820b-6b31-4f7b-89f3-aae65e915513-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8ps7d\" (UID: \"c2c4820b-6b31-4f7b-89f3-aae65e915513\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d" Apr 24 21:27:36.499128 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499003 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0f52ff53-2325-461a-9bb4-dde9a76323fb-hosts-file\") pod \"node-resolver-48sm2\" (UID: \"0f52ff53-2325-461a-9bb4-dde9a76323fb\") " pod="openshift-dns/node-resolver-48sm2" Apr 24 21:27:36.499128 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499026 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-host-run-k8s-cni-cncf-io\") pod 
\"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.499128 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499050 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c2c4820b-6b31-4f7b-89f3-aae65e915513-etc-selinux\") pod \"aws-ebs-csi-driver-node-8ps7d\" (UID: \"c2c4820b-6b31-4f7b-89f3-aae65e915513\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d" Apr 24 21:27:36.499128 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499071 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqk5q\" (UniqueName: \"kubernetes.io/projected/c2c4820b-6b31-4f7b-89f3-aae65e915513-kube-api-access-cqk5q\") pod \"aws-ebs-csi-driver-node-8ps7d\" (UID: \"c2c4820b-6b31-4f7b-89f3-aae65e915513\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d" Apr 24 21:27:36.499128 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499068 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-multus-cni-dir\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.499128 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499094 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pvxq\" (UniqueName: \"kubernetes.io/projected/0f52ff53-2325-461a-9bb4-dde9a76323fb-kube-api-access-4pvxq\") pod \"node-resolver-48sm2\" (UID: \"0f52ff53-2325-461a-9bb4-dde9a76323fb\") " pod="openshift-dns/node-resolver-48sm2" Apr 24 21:27:36.499128 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499115 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/6183aed2-60ab-4cae-8455-c797d1e3ebf6-iptables-alerter-script\") pod \"iptables-alerter-fdh56\" (UID: \"6183aed2-60ab-4cae-8455-c797d1e3ebf6\") " pod="openshift-network-operator/iptables-alerter-fdh56" Apr 24 21:27:36.499484 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499143 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c2c4820b-6b31-4f7b-89f3-aae65e915513-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8ps7d\" (UID: \"c2c4820b-6b31-4f7b-89f3-aae65e915513\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d" Apr 24 21:27:36.499484 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499182 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0f52ff53-2325-461a-9bb4-dde9a76323fb-hosts-file\") pod \"node-resolver-48sm2\" (UID: \"0f52ff53-2325-461a-9bb4-dde9a76323fb\") " pod="openshift-dns/node-resolver-48sm2" Apr 24 21:27:36.499484 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499144 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-cnibin\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.499484 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499197 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c2c4820b-6b31-4f7b-89f3-aae65e915513-etc-selinux\") pod \"aws-ebs-csi-driver-node-8ps7d\" (UID: \"c2c4820b-6b31-4f7b-89f3-aae65e915513\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d" Apr 24 21:27:36.499484 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499216 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-hostroot\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.499484 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499237 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-multus-daemon-config\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.499484 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499247 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-cnibin\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.499484 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499281 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c2c4820b-6b31-4f7b-89f3-aae65e915513-socket-dir\") pod \"aws-ebs-csi-driver-node-8ps7d\" (UID: \"c2c4820b-6b31-4f7b-89f3-aae65e915513\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d" Apr 24 21:27:36.499484 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499293 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-hostroot\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.499484 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499308 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c2c4820b-6b31-4f7b-89f3-aae65e915513-sys-fs\") pod 
\"aws-ebs-csi-driver-node-8ps7d\" (UID: \"c2c4820b-6b31-4f7b-89f3-aae65e915513\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d" Apr 24 21:27:36.499484 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499333 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2mzz\" (UniqueName: \"kubernetes.io/projected/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-kube-api-access-m2mzz\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.499484 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499361 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c2c4820b-6b31-4f7b-89f3-aae65e915513-device-dir\") pod \"aws-ebs-csi-driver-node-8ps7d\" (UID: \"c2c4820b-6b31-4f7b-89f3-aae65e915513\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d" Apr 24 21:27:36.499484 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499386 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-multus-socket-dir-parent\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.499484 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499409 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-host-run-netns\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.499484 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499437 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-host-var-lib-kubelet\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.499484 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499462 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-etc-kubernetes\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.499484 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499490 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-system-cni-dir\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.500255 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499511 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-cni-binary-copy\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.500255 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499538 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-host-run-multus-certs\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.500255 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499608 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-host-var-lib-cni-multus\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.500255 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499633 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-multus-conf-dir\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.500255 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499658 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6183aed2-60ab-4cae-8455-c797d1e3ebf6-host-slash\") pod \"iptables-alerter-fdh56\" (UID: \"6183aed2-60ab-4cae-8455-c797d1e3ebf6\") " pod="openshift-network-operator/iptables-alerter-fdh56" Apr 24 21:27:36.500255 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499685 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0f52ff53-2325-461a-9bb4-dde9a76323fb-tmp-dir\") pod \"node-resolver-48sm2\" (UID: \"0f52ff53-2325-461a-9bb4-dde9a76323fb\") " pod="openshift-dns/node-resolver-48sm2" Apr 24 21:27:36.500255 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499709 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcglk\" (UniqueName: \"kubernetes.io/projected/6183aed2-60ab-4cae-8455-c797d1e3ebf6-kube-api-access-kcglk\") pod \"iptables-alerter-fdh56\" (UID: \"6183aed2-60ab-4cae-8455-c797d1e3ebf6\") " pod="openshift-network-operator/iptables-alerter-fdh56" Apr 24 21:27:36.500255 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499748 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/c2c4820b-6b31-4f7b-89f3-aae65e915513-registration-dir\") pod \"aws-ebs-csi-driver-node-8ps7d\" (UID: \"c2c4820b-6b31-4f7b-89f3-aae65e915513\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d" Apr 24 21:27:36.500255 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499772 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-os-release\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.500255 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499782 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6183aed2-60ab-4cae-8455-c797d1e3ebf6-iptables-alerter-script\") pod \"iptables-alerter-fdh56\" (UID: \"6183aed2-60ab-4cae-8455-c797d1e3ebf6\") " pod="openshift-network-operator/iptables-alerter-fdh56" Apr 24 21:27:36.500255 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499117 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-host-run-k8s-cni-cncf-io\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.500255 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499843 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-host-var-lib-cni-bin\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.500255 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.499796 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-host-var-lib-cni-bin\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.500884 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.500271 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-system-cni-dir\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.500884 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.500295 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-multus-daemon-config\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.500884 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.500449 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c2c4820b-6b31-4f7b-89f3-aae65e915513-socket-dir\") pod \"aws-ebs-csi-driver-node-8ps7d\" (UID: \"c2c4820b-6b31-4f7b-89f3-aae65e915513\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d" Apr 24 21:27:36.500884 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.500490 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c2c4820b-6b31-4f7b-89f3-aae65e915513-sys-fs\") pod \"aws-ebs-csi-driver-node-8ps7d\" (UID: \"c2c4820b-6b31-4f7b-89f3-aae65e915513\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d" Apr 24 21:27:36.500884 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.500629 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-host-run-netns\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.500884 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.500657 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-multus-socket-dir-parent\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.500884 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.500657 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-host-var-lib-kubelet\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.500884 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.500694 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-etc-kubernetes\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.500884 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.500700 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-host-var-lib-cni-multus\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.500884 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.500705 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-multus-conf-dir\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.500884 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.500730 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-cni-binary-copy\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.500884 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.500742 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-host-run-multus-certs\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.500884 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.500748 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6183aed2-60ab-4cae-8455-c797d1e3ebf6-host-slash\") pod \"iptables-alerter-fdh56\" (UID: \"6183aed2-60ab-4cae-8455-c797d1e3ebf6\") " pod="openshift-network-operator/iptables-alerter-fdh56" Apr 24 21:27:36.500884 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.500672 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c2c4820b-6b31-4f7b-89f3-aae65e915513-device-dir\") pod \"aws-ebs-csi-driver-node-8ps7d\" (UID: \"c2c4820b-6b31-4f7b-89f3-aae65e915513\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d" Apr 24 21:27:36.500884 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.500765 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/c2c4820b-6b31-4f7b-89f3-aae65e915513-registration-dir\") pod \"aws-ebs-csi-driver-node-8ps7d\" (UID: \"c2c4820b-6b31-4f7b-89f3-aae65e915513\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d" Apr 24 21:27:36.500884 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.500802 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-os-release\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.501349 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.500931 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0f52ff53-2325-461a-9bb4-dde9a76323fb-tmp-dir\") pod \"node-resolver-48sm2\" (UID: \"0f52ff53-2325-461a-9bb4-dde9a76323fb\") " pod="openshift-dns/node-resolver-48sm2" Apr 24 21:27:36.515254 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.515221 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqk5q\" (UniqueName: \"kubernetes.io/projected/c2c4820b-6b31-4f7b-89f3-aae65e915513-kube-api-access-cqk5q\") pod \"aws-ebs-csi-driver-node-8ps7d\" (UID: \"c2c4820b-6b31-4f7b-89f3-aae65e915513\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d" Apr 24 21:27:36.518723 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.518691 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2mzz\" (UniqueName: \"kubernetes.io/projected/fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd-kube-api-access-m2mzz\") pod \"multus-lshqp\" (UID: \"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd\") " pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.520125 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.520099 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pvxq\" (UniqueName: 
\"kubernetes.io/projected/0f52ff53-2325-461a-9bb4-dde9a76323fb-kube-api-access-4pvxq\") pod \"node-resolver-48sm2\" (UID: \"0f52ff53-2325-461a-9bb4-dde9a76323fb\") " pod="openshift-dns/node-resolver-48sm2" Apr 24 21:27:36.520217 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.520173 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcglk\" (UniqueName: \"kubernetes.io/projected/6183aed2-60ab-4cae-8455-c797d1e3ebf6-kube-api-access-kcglk\") pod \"iptables-alerter-fdh56\" (UID: \"6183aed2-60ab-4cae-8455-c797d1e3ebf6\") " pod="openshift-network-operator/iptables-alerter-fdh56" Apr 24 21:27:36.588235 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.588205 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nr5k8" Apr 24 21:27:36.594942 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.594919 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fztlb" Apr 24 21:27:36.602681 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.602664 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5t5rz" Apr 24 21:27:36.608239 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.608222 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-csj8d" Apr 24 21:27:36.614739 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.614722 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:27:36.621889 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.621875 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d" Apr 24 21:27:36.627290 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.627245 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-48sm2" Apr 24 21:27:36.635744 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.635727 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lshqp" Apr 24 21:27:36.641256 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.641238 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fdh56" Apr 24 21:27:36.901983 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:36.901899 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-metrics-certs\") pod \"network-metrics-daemon-6wxzd\" (UID: \"b536a581-6c7c-4e7e-9fb3-6223e4ab90f0\") " pod="openshift-multus/network-metrics-daemon-6wxzd" Apr 24 21:27:36.902134 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:36.902039 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:36.902134 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:36.902108 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-metrics-certs podName:b536a581-6c7c-4e7e-9fb3-6223e4ab90f0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:37.902092851 +0000 UTC m=+4.077827522 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-metrics-certs") pod "network-metrics-daemon-6wxzd" (UID: "b536a581-6c7c-4e7e-9fb3-6223e4ab90f0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:36.984782 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:36.984754 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83486ef0_fe96_4f97_a0e5_bec233422715.slice/crio-02342c16ad408105a73d13e3e21a622bbf2c90512b536904036304329253ca36 WatchSource:0}: Error finding container 02342c16ad408105a73d13e3e21a622bbf2c90512b536904036304329253ca36: Status 404 returned error can't find the container with id 02342c16ad408105a73d13e3e21a622bbf2c90512b536904036304329253ca36 Apr 24 21:27:36.986632 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:36.986609 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod007dd22d_9512_495a_ad7f_d8424286a304.slice/crio-e89ed06b2f3d67ed0daddf14bbbbd6c2308f8ffd277f5f4036ccb450e6904cd5 WatchSource:0}: Error finding container e89ed06b2f3d67ed0daddf14bbbbd6c2308f8ffd277f5f4036ccb450e6904cd5: Status 404 returned error can't find the container with id e89ed06b2f3d67ed0daddf14bbbbd6c2308f8ffd277f5f4036ccb450e6904cd5 Apr 24 21:27:36.988881 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:36.988806 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c64a924_1f49_45bd_870b_9fb356e61e75.slice/crio-269a1962d97bafff3dace9bf542b78a88ce16e20121c6d7efae933a58763005a WatchSource:0}: Error finding container 269a1962d97bafff3dace9bf542b78a88ce16e20121c6d7efae933a58763005a: Status 404 returned error can't find the container with id 269a1962d97bafff3dace9bf542b78a88ce16e20121c6d7efae933a58763005a Apr 24 21:27:36.990666 
ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:36.990648 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05330bcd_753c_4b2c_add6_ad37ce95d4d1.slice/crio-ab38ecda4fdb070d2c9699ec3be6328be84b7dcd9a81c3545323aed8f75388f0 WatchSource:0}: Error finding container ab38ecda4fdb070d2c9699ec3be6328be84b7dcd9a81c3545323aed8f75388f0: Status 404 returned error can't find the container with id ab38ecda4fdb070d2c9699ec3be6328be84b7dcd9a81c3545323aed8f75388f0 Apr 24 21:27:36.991444 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:36.991421 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd8fb7f1_8db0_4a33_b951_6a4739d1a1cd.slice/crio-9dc56abd7bea50d9b4274222572d2df4d2f7530d5f49a974c1d3b3eb71332c80 WatchSource:0}: Error finding container 9dc56abd7bea50d9b4274222572d2df4d2f7530d5f49a974c1d3b3eb71332c80: Status 404 returned error can't find the container with id 9dc56abd7bea50d9b4274222572d2df4d2f7530d5f49a974c1d3b3eb71332c80 Apr 24 21:27:36.993757 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:36.993741 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2c4820b_6b31_4f7b_89f3_aae65e915513.slice/crio-600de5c3cc3268bbf4c3b1137e98fe9c033fcf106524e4701214a98a3b63b459 WatchSource:0}: Error finding container 600de5c3cc3268bbf4c3b1137e98fe9c033fcf106524e4701214a98a3b63b459: Status 404 returned error can't find the container with id 600de5c3cc3268bbf4c3b1137e98fe9c033fcf106524e4701214a98a3b63b459 Apr 24 21:27:36.995315 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:36.994600 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6183aed2_60ab_4cae_8455_c797d1e3ebf6.slice/crio-35335029e42a13385555071a159341cfcc557228ea3e31bb247c1a0b917a3153 WatchSource:0}: 
Error finding container 35335029e42a13385555071a159341cfcc557228ea3e31bb247c1a0b917a3153: Status 404 returned error can't find the container with id 35335029e42a13385555071a159341cfcc557228ea3e31bb247c1a0b917a3153 Apr 24 21:27:36.995315 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:36.994783 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecf62cd7_5041_4b2d_8eff_453431841db5.slice/crio-e9aa1bc9cf834f1e5b3b079b5e3cdf25d3d197afd37b3b41f58c8ca17a58a323 WatchSource:0}: Error finding container e9aa1bc9cf834f1e5b3b079b5e3cdf25d3d197afd37b3b41f58c8ca17a58a323: Status 404 returned error can't find the container with id e9aa1bc9cf834f1e5b3b079b5e3cdf25d3d197afd37b3b41f58c8ca17a58a323 Apr 24 21:27:36.995886 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:27:36.995861 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f52ff53_2325_461a_9bb4_dde9a76323fb.slice/crio-a93d76800da900c09ad6701b6d02cfec6cd02a097da4fcb77395a91836cf292a WatchSource:0}: Error finding container a93d76800da900c09ad6701b6d02cfec6cd02a097da4fcb77395a91836cf292a: Status 404 returned error can't find the container with id a93d76800da900c09ad6701b6d02cfec6cd02a097da4fcb77395a91836cf292a Apr 24 21:27:37.002426 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:37.002407 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vbt4\" (UniqueName: \"kubernetes.io/projected/c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e-kube-api-access-2vbt4\") pod \"network-check-target-rkwhf\" (UID: \"c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e\") " pod="openshift-network-diagnostics/network-check-target-rkwhf" Apr 24 21:27:37.002605 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:37.002512 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:37.002605 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:37.002525 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:37.002605 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:37.002533 2578 projected.go:194] Error preparing data for projected volume kube-api-access-2vbt4 for pod openshift-network-diagnostics/network-check-target-rkwhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:37.002605 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:37.002597 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e-kube-api-access-2vbt4 podName:c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e nodeName:}" failed. No retries permitted until 2026-04-24 21:27:38.002584326 +0000 UTC m=+4.178318977 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-2vbt4" (UniqueName: "kubernetes.io/projected/c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e-kube-api-access-2vbt4") pod "network-check-target-rkwhf" (UID: "c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:37.377781 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:37.377656 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:22:35 +0000 UTC" deadline="2027-12-07 08:04:11.584259759 +0000 UTC"
Apr 24 21:27:37.377781 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:37.377698 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14194h36m34.206566003s"
Apr 24 21:27:37.426663 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:37.426624 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-48sm2" event={"ID":"0f52ff53-2325-461a-9bb4-dde9a76323fb","Type":"ContainerStarted","Data":"a93d76800da900c09ad6701b6d02cfec6cd02a097da4fcb77395a91836cf292a"}
Apr 24 21:27:37.431024 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:37.430997 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ecf62cd7-5041-4b2d-8eff-453431841db5","Type":"ContainerStarted","Data":"e9aa1bc9cf834f1e5b3b079b5e3cdf25d3d197afd37b3b41f58c8ca17a58a323"}
Apr 24 21:27:37.433280 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:37.433253 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d" event={"ID":"c2c4820b-6b31-4f7b-89f3-aae65e915513","Type":"ContainerStarted","Data":"600de5c3cc3268bbf4c3b1137e98fe9c033fcf106524e4701214a98a3b63b459"}
Apr 24 21:27:37.438467 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:37.438441 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fztlb" event={"ID":"05330bcd-753c-4b2c-add6-ad37ce95d4d1","Type":"ContainerStarted","Data":"ab38ecda4fdb070d2c9699ec3be6328be84b7dcd9a81c3545323aed8f75388f0"}
Apr 24 21:27:37.440195 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:37.440172 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nr5k8" event={"ID":"007dd22d-9512-495a-ad7f-d8424286a304","Type":"ContainerStarted","Data":"e89ed06b2f3d67ed0daddf14bbbbd6c2308f8ffd277f5f4036ccb450e6904cd5"}
Apr 24 21:27:37.454119 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:37.454065 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-124.ec2.internal" event={"ID":"455f5e57cbc63caaa90072de2a2bd596","Type":"ContainerStarted","Data":"0c9e92cc046669656b83e885a319db7ac0a12c1b89c6ef58c64d2449da82a6ba"}
Apr 24 21:27:37.468416 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:37.468389 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fdh56" event={"ID":"6183aed2-60ab-4cae-8455-c797d1e3ebf6","Type":"ContainerStarted","Data":"35335029e42a13385555071a159341cfcc557228ea3e31bb247c1a0b917a3153"}
Apr 24 21:27:37.477113 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:37.477088 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lshqp" event={"ID":"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd","Type":"ContainerStarted","Data":"9dc56abd7bea50d9b4274222572d2df4d2f7530d5f49a974c1d3b3eb71332c80"}
Apr 24 21:27:37.482919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:37.482892 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5t5rz" event={"ID":"9c64a924-1f49-45bd-870b-9fb356e61e75","Type":"ContainerStarted","Data":"269a1962d97bafff3dace9bf542b78a88ce16e20121c6d7efae933a58763005a"}
Apr 24 21:27:37.484056 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:37.484034 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-csj8d" event={"ID":"83486ef0-fe96-4f97-a0e5-bec233422715","Type":"ContainerStarted","Data":"02342c16ad408105a73d13e3e21a622bbf2c90512b536904036304329253ca36"}
Apr 24 21:27:37.909722 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:37.909524 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-metrics-certs\") pod \"network-metrics-daemon-6wxzd\" (UID: \"b536a581-6c7c-4e7e-9fb3-6223e4ab90f0\") " pod="openshift-multus/network-metrics-daemon-6wxzd"
Apr 24 21:27:37.909884 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:37.909672 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:37.909884 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:37.909838 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-metrics-certs podName:b536a581-6c7c-4e7e-9fb3-6223e4ab90f0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:39.909820032 +0000 UTC m=+6.085554703 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-metrics-certs") pod "network-metrics-daemon-6wxzd" (UID: "b536a581-6c7c-4e7e-9fb3-6223e4ab90f0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:38.010574 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:38.010502 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vbt4\" (UniqueName: \"kubernetes.io/projected/c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e-kube-api-access-2vbt4\") pod \"network-check-target-rkwhf\" (UID: \"c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e\") " pod="openshift-network-diagnostics/network-check-target-rkwhf"
Apr 24 21:27:38.010741 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:38.010723 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:38.010804 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:38.010750 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:38.010804 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:38.010762 2578 projected.go:194] Error preparing data for projected volume kube-api-access-2vbt4 for pod openshift-network-diagnostics/network-check-target-rkwhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:38.010908 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:38.010812 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e-kube-api-access-2vbt4 podName:c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e nodeName:}" failed. No retries permitted until 2026-04-24 21:27:40.010794533 +0000 UTC m=+6.186529187 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-2vbt4" (UniqueName: "kubernetes.io/projected/c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e-kube-api-access-2vbt4") pod "network-check-target-rkwhf" (UID: "c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:38.417485 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:38.417456 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf"
Apr 24 21:27:38.417917 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:38.417602 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkwhf" podUID="c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e"
Apr 24 21:27:38.423693 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:38.423644 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd"
Apr 24 21:27:38.423981 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:38.423825 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wxzd" podUID="b536a581-6c7c-4e7e-9fb3-6223e4ab90f0"
Apr 24 21:27:38.518083 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:38.517988 2578 generic.go:358] "Generic (PLEG): container finished" podID="1bd7c9a210b9e7fcb345143388e6cb26" containerID="857e9f7eafc98a4c086adf4a4b89c9164f846483251a03de5dc4a1c937bb93d8" exitCode=0
Apr 24 21:27:38.518633 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:38.518606 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-124.ec2.internal" event={"ID":"1bd7c9a210b9e7fcb345143388e6cb26","Type":"ContainerDied","Data":"857e9f7eafc98a4c086adf4a4b89c9164f846483251a03de5dc4a1c937bb93d8"}
Apr 24 21:27:38.544889 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:38.544606 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-124.ec2.internal" podStartSLOduration=3.544589912 podStartE2EDuration="3.544589912s" podCreationTimestamp="2026-04-24 21:27:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:37.471668234 +0000 UTC m=+3.647402910" watchObservedRunningTime="2026-04-24 21:27:38.544589912 +0000 UTC m=+4.720324586"
Apr 24 21:27:39.524072 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:39.523835 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-124.ec2.internal" event={"ID":"1bd7c9a210b9e7fcb345143388e6cb26","Type":"ContainerStarted","Data":"c2d8599874f03f4bb1358ccff7e8bd7045e978adae02e0294c0bc62ce2068b4b"}
Apr 24 21:27:39.550946 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:39.550896 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-124.ec2.internal" podStartSLOduration=4.550879256 podStartE2EDuration="4.550879256s" podCreationTimestamp="2026-04-24 21:27:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:39.550799548 +0000 UTC m=+5.726534219" watchObservedRunningTime="2026-04-24 21:27:39.550879256 +0000 UTC m=+5.726613930"
Apr 24 21:27:39.925864 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:39.925790 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-metrics-certs\") pod \"network-metrics-daemon-6wxzd\" (UID: \"b536a581-6c7c-4e7e-9fb3-6223e4ab90f0\") " pod="openshift-multus/network-metrics-daemon-6wxzd"
Apr 24 21:27:39.926014 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:39.925910 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:39.926014 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:39.925961 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-metrics-certs podName:b536a581-6c7c-4e7e-9fb3-6223e4ab90f0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:43.925942434 +0000 UTC m=+10.101677089 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-metrics-certs") pod "network-metrics-daemon-6wxzd" (UID: "b536a581-6c7c-4e7e-9fb3-6223e4ab90f0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:40.026279 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:40.026242 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vbt4\" (UniqueName: \"kubernetes.io/projected/c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e-kube-api-access-2vbt4\") pod \"network-check-target-rkwhf\" (UID: \"c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e\") " pod="openshift-network-diagnostics/network-check-target-rkwhf"
Apr 24 21:27:40.026431 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:40.026395 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:40.026431 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:40.026415 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:40.026431 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:40.026425 2578 projected.go:194] Error preparing data for projected volume kube-api-access-2vbt4 for pod openshift-network-diagnostics/network-check-target-rkwhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:40.026603 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:40.026472 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e-kube-api-access-2vbt4 podName:c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e nodeName:}" failed. No retries permitted until 2026-04-24 21:27:44.026459063 +0000 UTC m=+10.202193718 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-2vbt4" (UniqueName: "kubernetes.io/projected/c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e-kube-api-access-2vbt4") pod "network-check-target-rkwhf" (UID: "c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:40.420635 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:40.419905 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd"
Apr 24 21:27:40.420635 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:40.420037 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wxzd" podUID="b536a581-6c7c-4e7e-9fb3-6223e4ab90f0"
Apr 24 21:27:40.420635 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:40.420419 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf"
Apr 24 21:27:40.420635 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:40.420499 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkwhf" podUID="c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e"
Apr 24 21:27:42.420795 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:42.420766 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd"
Apr 24 21:27:42.421163 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:42.420901 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wxzd" podUID="b536a581-6c7c-4e7e-9fb3-6223e4ab90f0"
Apr 24 21:27:42.421163 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:42.420963 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf"
Apr 24 21:27:42.421163 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:42.421008 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkwhf" podUID="c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e"
Apr 24 21:27:43.959034 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:43.958996 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-metrics-certs\") pod \"network-metrics-daemon-6wxzd\" (UID: \"b536a581-6c7c-4e7e-9fb3-6223e4ab90f0\") " pod="openshift-multus/network-metrics-daemon-6wxzd"
Apr 24 21:27:43.959481 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:43.959171 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:43.959481 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:43.959246 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-metrics-certs podName:b536a581-6c7c-4e7e-9fb3-6223e4ab90f0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:51.959225834 +0000 UTC m=+18.134960505 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-metrics-certs") pod "network-metrics-daemon-6wxzd" (UID: "b536a581-6c7c-4e7e-9fb3-6223e4ab90f0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:44.060405 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:44.059815 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vbt4\" (UniqueName: \"kubernetes.io/projected/c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e-kube-api-access-2vbt4\") pod \"network-check-target-rkwhf\" (UID: \"c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e\") " pod="openshift-network-diagnostics/network-check-target-rkwhf"
Apr 24 21:27:44.060405 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:44.060010 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:44.060405 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:44.060029 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:44.060405 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:44.060041 2578 projected.go:194] Error preparing data for projected volume kube-api-access-2vbt4 for pod openshift-network-diagnostics/network-check-target-rkwhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:44.060405 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:44.060095 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e-kube-api-access-2vbt4 podName:c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e nodeName:}" failed. No retries permitted until 2026-04-24 21:27:52.060076971 +0000 UTC m=+18.235811645 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-2vbt4" (UniqueName: "kubernetes.io/projected/c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e-kube-api-access-2vbt4") pod "network-check-target-rkwhf" (UID: "c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:44.418985 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:44.418910 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf"
Apr 24 21:27:44.419141 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:44.419016 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkwhf" podUID="c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e"
Apr 24 21:27:44.419402 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:44.419385 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd"
Apr 24 21:27:44.419520 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:44.419495 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wxzd" podUID="b536a581-6c7c-4e7e-9fb3-6223e4ab90f0"
Apr 24 21:27:46.420472 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:46.420399 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf"
Apr 24 21:27:46.420903 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:46.420398 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd"
Apr 24 21:27:46.420903 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:46.420511 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkwhf" podUID="c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e"
Apr 24 21:27:46.420903 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:46.420638 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wxzd" podUID="b536a581-6c7c-4e7e-9fb3-6223e4ab90f0"
Apr 24 21:27:48.417353 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:48.417323 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf"
Apr 24 21:27:48.417851 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:48.417334 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd"
Apr 24 21:27:48.417851 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:48.417428 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkwhf" podUID="c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e"
Apr 24 21:27:48.417851 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:48.417524 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wxzd" podUID="b536a581-6c7c-4e7e-9fb3-6223e4ab90f0"
Apr 24 21:27:50.420615 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:50.420579 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf"
Apr 24 21:27:50.421078 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:50.420579 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd"
Apr 24 21:27:50.421078 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:50.420691 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkwhf" podUID="c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e"
Apr 24 21:27:50.421078 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:50.420811 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wxzd" podUID="b536a581-6c7c-4e7e-9fb3-6223e4ab90f0"
Apr 24 21:27:52.019208 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:52.019171 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-metrics-certs\") pod \"network-metrics-daemon-6wxzd\" (UID: \"b536a581-6c7c-4e7e-9fb3-6223e4ab90f0\") " pod="openshift-multus/network-metrics-daemon-6wxzd"
Apr 24 21:27:52.019653 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:52.019287 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:52.019653 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:52.019343 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-metrics-certs podName:b536a581-6c7c-4e7e-9fb3-6223e4ab90f0 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.019329429 +0000 UTC m=+34.195064080 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-metrics-certs") pod "network-metrics-daemon-6wxzd" (UID: "b536a581-6c7c-4e7e-9fb3-6223e4ab90f0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:52.120335 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:52.120295 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vbt4\" (UniqueName: \"kubernetes.io/projected/c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e-kube-api-access-2vbt4\") pod \"network-check-target-rkwhf\" (UID: \"c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e\") " pod="openshift-network-diagnostics/network-check-target-rkwhf"
Apr 24 21:27:52.120497 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:52.120478 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:52.120578 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:52.120555 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:52.120578 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:52.120571 2578 projected.go:194] Error preparing data for projected volume kube-api-access-2vbt4 for pod openshift-network-diagnostics/network-check-target-rkwhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:52.120736 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:52.120688 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e-kube-api-access-2vbt4 podName:c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.120618866 +0000 UTC m=+34.296353524 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-2vbt4" (UniqueName: "kubernetes.io/projected/c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e-kube-api-access-2vbt4") pod "network-check-target-rkwhf" (UID: "c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:52.417154 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:52.417079 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf"
Apr 24 21:27:52.417389 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:52.417250 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd"
Apr 24 21:27:52.417389 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:52.417339 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wxzd" podUID="b536a581-6c7c-4e7e-9fb3-6223e4ab90f0"
Apr 24 21:27:52.417389 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:52.417238 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkwhf" podUID="c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e"
Apr 24 21:27:54.418748 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:54.418714 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf"
Apr 24 21:27:54.419399 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:54.418754 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd"
Apr 24 21:27:54.419399 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:54.418803 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkwhf" podUID="c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e"
Apr 24 21:27:54.419399 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:54.418857 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wxzd" podUID="b536a581-6c7c-4e7e-9fb3-6223e4ab90f0"
Apr 24 21:27:54.550480 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:54.550447 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lshqp" event={"ID":"fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd","Type":"ContainerStarted","Data":"8ee399232ecf46da680123a4f8b20ce2596c1afda99ae6fadd0d97efa53f462d"}
Apr 24 21:27:54.551905 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:54.551878 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5t5rz" event={"ID":"9c64a924-1f49-45bd-870b-9fb356e61e75","Type":"ContainerStarted","Data":"c804c23ab655e8f3b1096f6f67ec0782af52718ea47d1ab1c3ab7010386f19ae"}
Apr 24 21:27:54.553147 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:54.553127 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-csj8d" event={"ID":"83486ef0-fe96-4f97-a0e5-bec233422715","Type":"ContainerStarted","Data":"6622aa164e448171887be3bd37b87c2700d8a8daefe5ac7daf50ece8de9520f9"}
Apr 24 21:27:54.554604 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:54.554580 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-48sm2" event={"ID":"0f52ff53-2325-461a-9bb4-dde9a76323fb","Type":"ContainerStarted","Data":"a3e9d4c6c0a1e8eb785f14699a0ae36a9722217bdce68b7cc6624261c4b7f215"}
Apr 24 21:27:54.556816 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:54.556795 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ecf62cd7-5041-4b2d-8eff-453431841db5","Type":"ContainerStarted","Data":"3a884680a299d280abac681da8c47b66fd9bfeb6f08d83314ac478816365ac85"}
Apr 24 21:27:54.556816 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:54.556831 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ecf62cd7-5041-4b2d-8eff-453431841db5","Type":"ContainerStarted","Data":"a54c313ba3a4f08208db04ef128140f46a37ffd456e47284a020d08fedc3b324"}
Apr 24 21:27:54.556968 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:54.556845 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ecf62cd7-5041-4b2d-8eff-453431841db5","Type":"ContainerStarted","Data":"96b9d2fc47cba9b3169745e7951cb66e1714beb4677d96e7f7090bc00344af5a"}
Apr 24 21:27:54.558019 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:54.557997 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d" event={"ID":"c2c4820b-6b31-4f7b-89f3-aae65e915513","Type":"ContainerStarted","Data":"f4009e0cf82fd92bd4608b55f8e7c43d90c587e57bf6db2d9ccef9c1b0ff6141"}
Apr 24 21:27:54.559259 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:54.559238 2578 generic.go:358] "Generic (PLEG): container finished" podID="05330bcd-753c-4b2c-add6-ad37ce95d4d1" containerID="1d3b0811910790d2d603a1fec28d3377457ac785b59f3467044e2ec23d451ec1" exitCode=0
Apr 24 21:27:54.559349 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:54.559311 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fztlb" event={"ID":"05330bcd-753c-4b2c-add6-ad37ce95d4d1","Type":"ContainerDied","Data":"1d3b0811910790d2d603a1fec28d3377457ac785b59f3467044e2ec23d451ec1"}
Apr 24 21:27:54.560785 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:54.560767 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nr5k8" event={"ID":"007dd22d-9512-495a-ad7f-d8424286a304","Type":"ContainerStarted","Data":"688ad584db551cf5c8c86fdbc9844b9c897e1983ac7f9c2027064b077e335f9b"}
Apr 24 21:27:54.571757 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:54.571712 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lshqp" podStartSLOduration=3.814374024 podStartE2EDuration="20.571697772s" podCreationTimestamp="2026-04-24 21:27:34 +0000 UTC" firstStartedPulling="2026-04-24 21:27:36.993264758 +0000 UTC m=+3.168999409" lastFinishedPulling="2026-04-24 21:27:53.750588489 +0000 UTC m=+19.926323157" observedRunningTime="2026-04-24 21:27:54.571054293 +0000 UTC m=+20.746788966" watchObservedRunningTime="2026-04-24 21:27:54.571697772 +0000 UTC m=+20.747432446"
Apr 24 21:27:54.591259 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:54.591222 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-csj8d" podStartSLOduration=4.026635159 podStartE2EDuration="20.591213524s" podCreationTimestamp="2026-04-24 21:27:34 +0000 UTC" firstStartedPulling="2026-04-24 21:27:36.98844135 +0000 UTC m=+3.164176002" lastFinishedPulling="2026-04-24 21:27:53.553019713 +0000 UTC m=+19.728754367" observedRunningTime="2026-04-24 21:27:54.590853249 +0000 UTC m=+20.766587922" watchObservedRunningTime="2026-04-24 21:27:54.591213524 +0000 UTC m=+20.766948195"
Apr 24 21:27:54.617575 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:54.617528 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-nr5k8" podStartSLOduration=3.882640596 podStartE2EDuration="20.617516296s" podCreationTimestamp="2026-04-24 21:27:34 +0000 UTC" firstStartedPulling="2026-04-24 21:27:36.989181106 +0000 UTC m=+3.164915765" lastFinishedPulling="2026-04-24 21:27:53.724056801 +0000 UTC m=+19.899791465" observedRunningTime="2026-04-24 21:27:54.615782126 +0000 UTC m=+20.791516799" watchObservedRunningTime="2026-04-24 21:27:54.617516296 +0000 UTC m=+20.793250963"
Apr 24 21:27:54.666004 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:54.665768 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5t5rz" podStartSLOduration=4.090791917 podStartE2EDuration="20.665752041s" podCreationTimestamp="2026-04-24 21:27:34 +0000 UTC" firstStartedPulling="2026-04-24 21:27:36.990163647 +0000 UTC m=+3.165898307" lastFinishedPulling="2026-04-24 21:27:53.565123767 +0000 UTC m=+19.740858431" observedRunningTime="2026-04-24 21:27:54.664773949 +0000 UTC m=+20.840508621" watchObservedRunningTime="2026-04-24 21:27:54.665752041 +0000 UTC m=+20.841486713"
Apr 24 21:27:54.696230 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:54.696184 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-48sm2" podStartSLOduration=3.9430601579999998 podStartE2EDuration="20.696171734s" podCreationTimestamp="2026-04-24 21:27:34 +0000 UTC" firstStartedPulling="2026-04-24 21:27:36.997475742 +0000 UTC m=+3.173210406" lastFinishedPulling="2026-04-24 21:27:53.750587327 +0000 UTC m=+19.926321982" observedRunningTime="2026-04-24 21:27:54.695489644 +0000 UTC m=+20.871224316" watchObservedRunningTime="2026-04-24 21:27:54.696171734 +0000 UTC m=+20.871906405"
Apr 24 21:27:55.260453 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:55.260422 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 21:27:55.360757 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:55.360596 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:27:55.260448791Z","UUID":"48d536e4-7bd5-4445-8d2b-667f9c59b2bd","Handler":null,"Name":"","Endpoint":""}
Apr 24 21:27:55.362502 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:55.362484 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 21:27:55.362502 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:55.362508 2578
csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 21:27:55.565628 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:55.565592 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ecf62cd7-5041-4b2d-8eff-453431841db5","Type":"ContainerStarted","Data":"642512af38e9ca484ed17ab53c27bb1f8021cae2d56fc8fc7da695ba8dce06cc"} Apr 24 21:27:55.565628 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:55.565630 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ecf62cd7-5041-4b2d-8eff-453431841db5","Type":"ContainerStarted","Data":"b47b3f32bcde398e018231cef28a216412a10864a35f34e51ed079592a2854b3"} Apr 24 21:27:55.566201 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:55.565643 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ecf62cd7-5041-4b2d-8eff-453431841db5","Type":"ContainerStarted","Data":"b571d6b2d66ad0170aad6b55e34e208d76cd79881032728f839b67e1e1bcb0e8"} Apr 24 21:27:55.567512 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:55.567482 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d" event={"ID":"c2c4820b-6b31-4f7b-89f3-aae65e915513","Type":"ContainerStarted","Data":"4d31be1c47e6d3258f40d51efb73718b928954f86652b22f2c50d91de909bd10"} Apr 24 21:27:55.569036 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:55.568967 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fdh56" event={"ID":"6183aed2-60ab-4cae-8455-c797d1e3ebf6","Type":"ContainerStarted","Data":"98bc4d180aa1e9524f322c0f82168836f978d02450fc9005744447915543fbce"} Apr 24 21:27:55.589480 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:55.589436 2578 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-network-operator/iptables-alerter-fdh56" podStartSLOduration=4.86260571 podStartE2EDuration="21.589425624s" podCreationTimestamp="2026-04-24 21:27:34 +0000 UTC" firstStartedPulling="2026-04-24 21:27:36.997213228 +0000 UTC m=+3.172947894" lastFinishedPulling="2026-04-24 21:27:53.724033148 +0000 UTC m=+19.899767808" observedRunningTime="2026-04-24 21:27:55.588844365 +0000 UTC m=+21.764579036" watchObservedRunningTime="2026-04-24 21:27:55.589425624 +0000 UTC m=+21.765160296" Apr 24 21:27:56.341604 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:56.341574 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-48sm2_0f52ff53-2325-461a-9bb4-dde9a76323fb/dns-node-resolver/0.log" Apr 24 21:27:56.417329 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:56.417252 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf" Apr 24 21:27:56.417470 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:56.417373 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkwhf" podUID="c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e" Apr 24 21:27:56.417470 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:56.417416 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd" Apr 24 21:27:56.417599 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:56.417483 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wxzd" podUID="b536a581-6c7c-4e7e-9fb3-6223e4ab90f0" Apr 24 21:27:56.572965 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:56.572925 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d" event={"ID":"c2c4820b-6b31-4f7b-89f3-aae65e915513","Type":"ContainerStarted","Data":"bd37c90490bd137e86a123d080e74eb0fb111f905436a6deadb162fd8d4c00f2"} Apr 24 21:27:56.598756 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:56.598714 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8ps7d" podStartSLOduration=3.454363023 podStartE2EDuration="22.5986967s" podCreationTimestamp="2026-04-24 21:27:34 +0000 UTC" firstStartedPulling="2026-04-24 21:27:36.995515425 +0000 UTC m=+3.171250077" lastFinishedPulling="2026-04-24 21:27:56.1398491 +0000 UTC m=+22.315583754" observedRunningTime="2026-04-24 21:27:56.59813462 +0000 UTC m=+22.773869314" watchObservedRunningTime="2026-04-24 21:27:56.5986967 +0000 UTC m=+22.774431371" Apr 24 21:27:57.321142 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:57.321114 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5t5rz_9c64a924-1f49-45bd-870b-9fb356e61e75/node-ca/0.log" Apr 24 21:27:57.734106 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:57.734076 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-csj8d" 
Apr 24 21:27:57.734798 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:57.734781 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-csj8d" Apr 24 21:27:58.417669 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:58.417596 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd" Apr 24 21:27:58.417669 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:58.417639 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf" Apr 24 21:27:58.417867 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:58.417733 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wxzd" podUID="b536a581-6c7c-4e7e-9fb3-6223e4ab90f0" Apr 24 21:27:58.417918 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:27:58.417874 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rkwhf" podUID="c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e" Apr 24 21:27:58.575958 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:58.575938 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-csj8d" Apr 24 21:27:58.576635 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:58.576616 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-csj8d" Apr 24 21:27:59.579934 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:59.579747 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ecf62cd7-5041-4b2d-8eff-453431841db5","Type":"ContainerStarted","Data":"ee776210d58a116be3293594a078cb8ca8a36978c304545e27b3da7c9f76835a"} Apr 24 21:27:59.581152 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:59.581126 2578 generic.go:358] "Generic (PLEG): container finished" podID="05330bcd-753c-4b2c-add6-ad37ce95d4d1" containerID="a3d0798d5f03838806b376ba60e0a579e191fb49e9909f819ec66b3cc31c1443" exitCode=0 Apr 24 21:27:59.581260 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:27:59.581196 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fztlb" event={"ID":"05330bcd-753c-4b2c-add6-ad37ce95d4d1","Type":"ContainerDied","Data":"a3d0798d5f03838806b376ba60e0a579e191fb49e9909f819ec66b3cc31c1443"} Apr 24 21:28:00.417221 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:00.417197 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf" Apr 24 21:28:00.417313 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:00.417205 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd" Apr 24 21:28:00.417313 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:00.417299 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkwhf" podUID="c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e" Apr 24 21:28:00.417388 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:00.417374 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wxzd" podUID="b536a581-6c7c-4e7e-9fb3-6223e4ab90f0" Apr 24 21:28:00.583921 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:00.583864 2578 generic.go:358] "Generic (PLEG): container finished" podID="05330bcd-753c-4b2c-add6-ad37ce95d4d1" containerID="6dd5688edb78b697e156006f40191b19939a13169e7c018a962f359c49ecfd6d" exitCode=0 Apr 24 21:28:00.584195 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:00.583951 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fztlb" event={"ID":"05330bcd-753c-4b2c-add6-ad37ce95d4d1","Type":"ContainerDied","Data":"6dd5688edb78b697e156006f40191b19939a13169e7c018a962f359c49ecfd6d"} Apr 24 21:28:01.587004 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:01.586843 2578 generic.go:358] "Generic (PLEG): container finished" podID="05330bcd-753c-4b2c-add6-ad37ce95d4d1" containerID="3982ce3b2bf5239e8daf40c9d17ab48ca5577c825672f4e0925003eaf4f3b6fc" exitCode=0 Apr 24 21:28:01.587406 ip-10-0-132-124 
kubenswrapper[2578]: I0424 21:28:01.586922 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fztlb" event={"ID":"05330bcd-753c-4b2c-add6-ad37ce95d4d1","Type":"ContainerDied","Data":"3982ce3b2bf5239e8daf40c9d17ab48ca5577c825672f4e0925003eaf4f3b6fc"} Apr 24 21:28:01.590249 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:01.590229 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" event={"ID":"ecf62cd7-5041-4b2d-8eff-453431841db5","Type":"ContainerStarted","Data":"044366d6ce543561a3e34fe0ebd2d1d12ec2b7e4c8e03344d8adb222a83725e6"} Apr 24 21:28:01.590579 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:01.590558 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:28:01.590662 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:01.590591 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:28:01.590662 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:01.590604 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:28:01.604383 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:01.604358 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:28:01.604626 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:01.604585 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" Apr 24 21:28:01.644858 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:01.644811 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4256n" podStartSLOduration=10.415545822 podStartE2EDuration="27.644800735s" podCreationTimestamp="2026-04-24 
21:27:34 +0000 UTC" firstStartedPulling="2026-04-24 21:27:36.99697387 +0000 UTC m=+3.172708527" lastFinishedPulling="2026-04-24 21:27:54.226228775 +0000 UTC m=+20.401963440" observedRunningTime="2026-04-24 21:28:01.644450928 +0000 UTC m=+27.820185807" watchObservedRunningTime="2026-04-24 21:28:01.644800735 +0000 UTC m=+27.820535407" Apr 24 21:28:02.417907 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:02.417876 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf" Apr 24 21:28:02.418081 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:02.417977 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkwhf" podUID="c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e" Apr 24 21:28:02.418081 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:02.418034 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd" Apr 24 21:28:02.418203 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:02.418153 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wxzd" podUID="b536a581-6c7c-4e7e-9fb3-6223e4ab90f0" Apr 24 21:28:04.419341 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:04.419313 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd" Apr 24 21:28:04.419748 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:04.419434 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wxzd" podUID="b536a581-6c7c-4e7e-9fb3-6223e4ab90f0" Apr 24 21:28:04.419748 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:04.419513 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf" Apr 24 21:28:04.419748 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:04.419613 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkwhf" podUID="c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e" Apr 24 21:28:06.417631 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:06.417598 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf" Apr 24 21:28:06.418142 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:06.417614 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd" Apr 24 21:28:06.418142 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:06.417722 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkwhf" podUID="c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e" Apr 24 21:28:06.418142 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:06.417851 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wxzd" podUID="b536a581-6c7c-4e7e-9fb3-6223e4ab90f0" Apr 24 21:28:07.602238 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:07.602202 2578 generic.go:358] "Generic (PLEG): container finished" podID="05330bcd-753c-4b2c-add6-ad37ce95d4d1" containerID="886b053a58a744230ff78ea0a185d7d04e52be71441e439f3e59c4a64667aba2" exitCode=0 Apr 24 21:28:07.602692 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:07.602283 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fztlb" event={"ID":"05330bcd-753c-4b2c-add6-ad37ce95d4d1","Type":"ContainerDied","Data":"886b053a58a744230ff78ea0a185d7d04e52be71441e439f3e59c4a64667aba2"} Apr 24 21:28:08.037392 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:08.037363 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-metrics-certs\") pod \"network-metrics-daemon-6wxzd\" (UID: 
\"b536a581-6c7c-4e7e-9fb3-6223e4ab90f0\") " pod="openshift-multus/network-metrics-daemon-6wxzd" Apr 24 21:28:08.037514 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:08.037500 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:08.037587 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:08.037572 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-metrics-certs podName:b536a581-6c7c-4e7e-9fb3-6223e4ab90f0 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:40.037557906 +0000 UTC m=+66.213292560 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-metrics-certs") pod "network-metrics-daemon-6wxzd" (UID: "b536a581-6c7c-4e7e-9fb3-6223e4ab90f0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:08.138121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:08.138099 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vbt4\" (UniqueName: \"kubernetes.io/projected/c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e-kube-api-access-2vbt4\") pod \"network-check-target-rkwhf\" (UID: \"c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e\") " pod="openshift-network-diagnostics/network-check-target-rkwhf" Apr 24 21:28:08.138204 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:08.138195 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:28:08.138239 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:08.138206 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:28:08.138239 
ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:08.138214 2578 projected.go:194] Error preparing data for projected volume kube-api-access-2vbt4 for pod openshift-network-diagnostics/network-check-target-rkwhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:08.138301 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:08.138250 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e-kube-api-access-2vbt4 podName:c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:40.138240028 +0000 UTC m=+66.313974684 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-2vbt4" (UniqueName: "kubernetes.io/projected/c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e-kube-api-access-2vbt4") pod "network-check-target-rkwhf" (UID: "c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:08.417498 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:08.417442 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf" Apr 24 21:28:08.417498 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:08.417455 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd" Apr 24 21:28:08.417675 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:08.417564 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rkwhf" podUID="c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e" Apr 24 21:28:08.417675 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:08.417661 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wxzd" podUID="b536a581-6c7c-4e7e-9fb3-6223e4ab90f0" Apr 24 21:28:08.606711 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:08.606679 2578 generic.go:358] "Generic (PLEG): container finished" podID="05330bcd-753c-4b2c-add6-ad37ce95d4d1" containerID="8d79dbf5d5fce2dbc9274a02ef68d0c41da8877b87ed063e0f498233732c88fa" exitCode=0 Apr 24 21:28:08.606997 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:08.606716 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fztlb" event={"ID":"05330bcd-753c-4b2c-add6-ad37ce95d4d1","Type":"ContainerDied","Data":"8d79dbf5d5fce2dbc9274a02ef68d0c41da8877b87ed063e0f498233732c88fa"} Apr 24 21:28:09.612357 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:09.612328 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fztlb" event={"ID":"05330bcd-753c-4b2c-add6-ad37ce95d4d1","Type":"ContainerStarted","Data":"5ac212a71af724d6362d8e8e33c82aea4aa6d3a2dc613474ed48bb2669ede534"} Apr 24 21:28:09.640736 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:09.640696 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fztlb" podStartSLOduration=5.838729975 podStartE2EDuration="35.640686646s" podCreationTimestamp="2026-04-24 21:27:34 +0000 UTC" firstStartedPulling="2026-04-24 21:27:36.992286985 +0000 UTC m=+3.168021636" lastFinishedPulling="2026-04-24 
21:28:06.794243641 +0000 UTC m=+32.969978307" observedRunningTime="2026-04-24 21:28:09.640478061 +0000 UTC m=+35.816212733" watchObservedRunningTime="2026-04-24 21:28:09.640686646 +0000 UTC m=+35.816421319"
Apr 24 21:28:10.417167 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:10.417131 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf"
Apr 24 21:28:10.417167 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:10.417159 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd"
Apr 24 21:28:10.417357 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:10.417223 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkwhf" podUID="c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e"
Apr 24 21:28:10.417399 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:10.417356 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wxzd" podUID="b536a581-6c7c-4e7e-9fb3-6223e4ab90f0"
Apr 24 21:28:12.417687 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:12.417656 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd"
Apr 24 21:28:12.418140 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:12.417656 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf"
Apr 24 21:28:12.418140 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:12.417765 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wxzd" podUID="b536a581-6c7c-4e7e-9fb3-6223e4ab90f0"
Apr 24 21:28:12.418140 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:12.417818 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkwhf" podUID="c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e"
Apr 24 21:28:14.418022 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:14.417996 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf"
Apr 24 21:28:14.418339 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:14.418076 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkwhf" podUID="c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e"
Apr 24 21:28:14.418339 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:14.418153 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd"
Apr 24 21:28:14.418339 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:14.418244 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wxzd" podUID="b536a581-6c7c-4e7e-9fb3-6223e4ab90f0"
Apr 24 21:28:16.418261 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:16.418109 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd"
Apr 24 21:28:16.418716 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:16.418174 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf"
Apr 24 21:28:16.418716 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:16.418369 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wxzd" podUID="b536a581-6c7c-4e7e-9fb3-6223e4ab90f0"
Apr 24 21:28:16.418716 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:16.418411 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkwhf" podUID="c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e"
Apr 24 21:28:16.621869 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:16.621840 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6wxzd"]
Apr 24 21:28:16.623493 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:16.623472 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd"
Apr 24 21:28:16.623679 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:16.623655 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wxzd" podUID="b536a581-6c7c-4e7e-9fb3-6223e4ab90f0"
Apr 24 21:28:16.624400 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:16.624062 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rkwhf"]
Apr 24 21:28:16.624400 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:16.624148 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf"
Apr 24 21:28:16.624400 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:16.624243 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkwhf" podUID="c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e"
Apr 24 21:28:18.417912 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:18.417884 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd"
Apr 24 21:28:18.417912 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:18.417911 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf"
Apr 24 21:28:18.418336 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:18.417994 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wxzd" podUID="b536a581-6c7c-4e7e-9fb3-6223e4ab90f0"
Apr 24 21:28:18.418336 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:18.418075 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkwhf" podUID="c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e"
Apr 24 21:28:20.417849 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.417814 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd"
Apr 24 21:28:20.418268 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.417859 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf"
Apr 24 21:28:20.418268 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:20.417949 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wxzd" podUID="b536a581-6c7c-4e7e-9fb3-6223e4ab90f0"
Apr 24 21:28:20.418268 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:20.418052 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkwhf" podUID="c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e"
Apr 24 21:28:20.639039 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.639016 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-124.ec2.internal" event="NodeReady"
Apr 24 21:28:20.639139 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.639103 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 21:28:20.685952 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.685901 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rrzpv"]
Apr 24 21:28:20.702709 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.702688 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-48v2j"]
Apr 24 21:28:20.702876 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.702860 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rrzpv"
Apr 24 21:28:20.705587 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.705565 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 24 21:28:20.706047 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.706031 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wtdzs\""
Apr 24 21:28:20.706144 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.706126 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 24 21:28:20.716404 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.716379 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-48v2j"]
Apr 24 21:28:20.716469 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.716409 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rrzpv"]
Apr 24 21:28:20.716469 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.716424 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-wbfvk"]
Apr 24 21:28:20.716537 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.716507 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-48v2j"
Apr 24 21:28:20.719037 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.719014 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 24 21:28:20.719122 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.719044 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 24 21:28:20.719122 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.719047 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gftjb\""
Apr 24 21:28:20.719122 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.719016 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 24 21:28:20.740317 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.740298 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wbfvk"]
Apr 24 21:28:20.740413 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.740392 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wbfvk"
Apr 24 21:28:20.742489 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.742473 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 21:28:20.742708 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.742691 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 21:28:20.742816 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.742768 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 21:28:20.742816 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.742782 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 21:28:20.742918 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.742875 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-6qqhw\""
Apr 24 21:28:20.837590 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.837567 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb2a7484-3606-4f41-8444-8efbab81200b-metrics-tls\") pod \"dns-default-rrzpv\" (UID: \"cb2a7484-3606-4f41-8444-8efbab81200b\") " pod="openshift-dns/dns-default-rrzpv"
Apr 24 21:28:20.837590 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.837590 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e329951c-d495-4bf9-8751-384a26c4a2ce-cert\") pod \"ingress-canary-48v2j\" (UID: \"e329951c-d495-4bf9-8751-384a26c4a2ce\") " pod="openshift-ingress-canary/ingress-canary-48v2j"
Apr 24 21:28:20.837715 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.837616 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqqhr\" (UniqueName: \"kubernetes.io/projected/e329951c-d495-4bf9-8751-384a26c4a2ce-kube-api-access-fqqhr\") pod \"ingress-canary-48v2j\" (UID: \"e329951c-d495-4bf9-8751-384a26c4a2ce\") " pod="openshift-ingress-canary/ingress-canary-48v2j"
Apr 24 21:28:20.837715 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.837672 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bdd64acb-d899-4f75-b460-f0b05adbbbab-data-volume\") pod \"insights-runtime-extractor-wbfvk\" (UID: \"bdd64acb-d899-4f75-b460-f0b05adbbbab\") " pod="openshift-insights/insights-runtime-extractor-wbfvk"
Apr 24 21:28:20.837776 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.837722 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb2a7484-3606-4f41-8444-8efbab81200b-config-volume\") pod \"dns-default-rrzpv\" (UID: \"cb2a7484-3606-4f41-8444-8efbab81200b\") " pod="openshift-dns/dns-default-rrzpv"
Apr 24 21:28:20.837776 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.837743 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bdd64acb-d899-4f75-b460-f0b05adbbbab-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wbfvk\" (UID: \"bdd64acb-d899-4f75-b460-f0b05adbbbab\") " pod="openshift-insights/insights-runtime-extractor-wbfvk"
Apr 24 21:28:20.837776 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.837763 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bdd64acb-d899-4f75-b460-f0b05adbbbab-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wbfvk\" (UID: \"bdd64acb-d899-4f75-b460-f0b05adbbbab\") " pod="openshift-insights/insights-runtime-extractor-wbfvk"
Apr 24 21:28:20.837860 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.837789 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bdd64acb-d899-4f75-b460-f0b05adbbbab-crio-socket\") pod \"insights-runtime-extractor-wbfvk\" (UID: \"bdd64acb-d899-4f75-b460-f0b05adbbbab\") " pod="openshift-insights/insights-runtime-extractor-wbfvk"
Apr 24 21:28:20.837860 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.837842 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cb2a7484-3606-4f41-8444-8efbab81200b-tmp-dir\") pod \"dns-default-rrzpv\" (UID: \"cb2a7484-3606-4f41-8444-8efbab81200b\") " pod="openshift-dns/dns-default-rrzpv"
Apr 24 21:28:20.837917 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.837900 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpp9q\" (UniqueName: \"kubernetes.io/projected/cb2a7484-3606-4f41-8444-8efbab81200b-kube-api-access-vpp9q\") pod \"dns-default-rrzpv\" (UID: \"cb2a7484-3606-4f41-8444-8efbab81200b\") " pod="openshift-dns/dns-default-rrzpv"
Apr 24 21:28:20.837972 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.837918 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8cvw\" (UniqueName: \"kubernetes.io/projected/bdd64acb-d899-4f75-b460-f0b05adbbbab-kube-api-access-k8cvw\") pod \"insights-runtime-extractor-wbfvk\" (UID: \"bdd64acb-d899-4f75-b460-f0b05adbbbab\") " pod="openshift-insights/insights-runtime-extractor-wbfvk"
Apr 24 21:28:20.938417 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.938362 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bdd64acb-d899-4f75-b460-f0b05adbbbab-data-volume\") pod \"insights-runtime-extractor-wbfvk\" (UID: \"bdd64acb-d899-4f75-b460-f0b05adbbbab\") " pod="openshift-insights/insights-runtime-extractor-wbfvk"
Apr 24 21:28:20.938417 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.938392 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb2a7484-3606-4f41-8444-8efbab81200b-config-volume\") pod \"dns-default-rrzpv\" (UID: \"cb2a7484-3606-4f41-8444-8efbab81200b\") " pod="openshift-dns/dns-default-rrzpv"
Apr 24 21:28:20.938417 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.938407 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bdd64acb-d899-4f75-b460-f0b05adbbbab-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wbfvk\" (UID: \"bdd64acb-d899-4f75-b460-f0b05adbbbab\") " pod="openshift-insights/insights-runtime-extractor-wbfvk"
Apr 24 21:28:20.938584 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.938429 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bdd64acb-d899-4f75-b460-f0b05adbbbab-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wbfvk\" (UID: \"bdd64acb-d899-4f75-b460-f0b05adbbbab\") " pod="openshift-insights/insights-runtime-extractor-wbfvk"
Apr 24 21:28:20.938584 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.938456 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bdd64acb-d899-4f75-b460-f0b05adbbbab-crio-socket\") pod \"insights-runtime-extractor-wbfvk\" (UID: \"bdd64acb-d899-4f75-b460-f0b05adbbbab\") " pod="openshift-insights/insights-runtime-extractor-wbfvk"
Apr 24 21:28:20.938584 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.938483 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cb2a7484-3606-4f41-8444-8efbab81200b-tmp-dir\") pod \"dns-default-rrzpv\" (UID: \"cb2a7484-3606-4f41-8444-8efbab81200b\") " pod="openshift-dns/dns-default-rrzpv"
Apr 24 21:28:20.938584 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.938531 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpp9q\" (UniqueName: \"kubernetes.io/projected/cb2a7484-3606-4f41-8444-8efbab81200b-kube-api-access-vpp9q\") pod \"dns-default-rrzpv\" (UID: \"cb2a7484-3606-4f41-8444-8efbab81200b\") " pod="openshift-dns/dns-default-rrzpv"
Apr 24 21:28:20.938584 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.938578 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8cvw\" (UniqueName: \"kubernetes.io/projected/bdd64acb-d899-4f75-b460-f0b05adbbbab-kube-api-access-k8cvw\") pod \"insights-runtime-extractor-wbfvk\" (UID: \"bdd64acb-d899-4f75-b460-f0b05adbbbab\") " pod="openshift-insights/insights-runtime-extractor-wbfvk"
Apr 24 21:28:20.938750 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.938608 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb2a7484-3606-4f41-8444-8efbab81200b-metrics-tls\") pod \"dns-default-rrzpv\" (UID: \"cb2a7484-3606-4f41-8444-8efbab81200b\") " pod="openshift-dns/dns-default-rrzpv"
Apr 24 21:28:20.938750 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.938630 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e329951c-d495-4bf9-8751-384a26c4a2ce-cert\") pod \"ingress-canary-48v2j\" (UID: \"e329951c-d495-4bf9-8751-384a26c4a2ce\") " pod="openshift-ingress-canary/ingress-canary-48v2j"
Apr 24 21:28:20.938750 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.938671 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fqqhr\" (UniqueName: \"kubernetes.io/projected/e329951c-d495-4bf9-8751-384a26c4a2ce-kube-api-access-fqqhr\") pod \"ingress-canary-48v2j\" (UID: \"e329951c-d495-4bf9-8751-384a26c4a2ce\") " pod="openshift-ingress-canary/ingress-canary-48v2j"
Apr 24 21:28:20.938750 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.938711 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bdd64acb-d899-4f75-b460-f0b05adbbbab-crio-socket\") pod \"insights-runtime-extractor-wbfvk\" (UID: \"bdd64acb-d899-4f75-b460-f0b05adbbbab\") " pod="openshift-insights/insights-runtime-extractor-wbfvk"
Apr 24 21:28:20.938750 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.938748 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bdd64acb-d899-4f75-b460-f0b05adbbbab-data-volume\") pod \"insights-runtime-extractor-wbfvk\" (UID: \"bdd64acb-d899-4f75-b460-f0b05adbbbab\") " pod="openshift-insights/insights-runtime-extractor-wbfvk"
Apr 24 21:28:20.939059 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.939020 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cb2a7484-3606-4f41-8444-8efbab81200b-tmp-dir\") pod \"dns-default-rrzpv\" (UID: \"cb2a7484-3606-4f41-8444-8efbab81200b\") " pod="openshift-dns/dns-default-rrzpv"
Apr 24 21:28:20.939059 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.939029 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb2a7484-3606-4f41-8444-8efbab81200b-config-volume\") pod \"dns-default-rrzpv\" (UID: \"cb2a7484-3606-4f41-8444-8efbab81200b\") " pod="openshift-dns/dns-default-rrzpv"
Apr 24 21:28:20.939310 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.939288 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bdd64acb-d899-4f75-b460-f0b05adbbbab-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wbfvk\" (UID: \"bdd64acb-d899-4f75-b460-f0b05adbbbab\") " pod="openshift-insights/insights-runtime-extractor-wbfvk"
Apr 24 21:28:20.942407 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.942379 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb2a7484-3606-4f41-8444-8efbab81200b-metrics-tls\") pod \"dns-default-rrzpv\" (UID: \"cb2a7484-3606-4f41-8444-8efbab81200b\") " pod="openshift-dns/dns-default-rrzpv"
Apr 24 21:28:20.942521 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.942389 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bdd64acb-d899-4f75-b460-f0b05adbbbab-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wbfvk\" (UID: \"bdd64acb-d899-4f75-b460-f0b05adbbbab\") " pod="openshift-insights/insights-runtime-extractor-wbfvk"
Apr 24 21:28:20.942612 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.942585 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e329951c-d495-4bf9-8751-384a26c4a2ce-cert\") pod \"ingress-canary-48v2j\" (UID: \"e329951c-d495-4bf9-8751-384a26c4a2ce\") " pod="openshift-ingress-canary/ingress-canary-48v2j"
Apr 24 21:28:20.946781 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.946754 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8cvw\" (UniqueName: \"kubernetes.io/projected/bdd64acb-d899-4f75-b460-f0b05adbbbab-kube-api-access-k8cvw\") pod \"insights-runtime-extractor-wbfvk\" (UID: \"bdd64acb-d899-4f75-b460-f0b05adbbbab\") " pod="openshift-insights/insights-runtime-extractor-wbfvk"
Apr 24 21:28:20.947656 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.947638 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpp9q\" (UniqueName: \"kubernetes.io/projected/cb2a7484-3606-4f41-8444-8efbab81200b-kube-api-access-vpp9q\") pod \"dns-default-rrzpv\" (UID: \"cb2a7484-3606-4f41-8444-8efbab81200b\") " pod="openshift-dns/dns-default-rrzpv"
Apr 24 21:28:20.947928 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:20.947912 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqqhr\" (UniqueName: \"kubernetes.io/projected/e329951c-d495-4bf9-8751-384a26c4a2ce-kube-api-access-fqqhr\") pod \"ingress-canary-48v2j\" (UID: \"e329951c-d495-4bf9-8751-384a26c4a2ce\") " pod="openshift-ingress-canary/ingress-canary-48v2j"
Apr 24 21:28:21.014653 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.014633 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rrzpv"
Apr 24 21:28:21.024294 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.024257 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-48v2j"
Apr 24 21:28:21.048134 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.048098 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wbfvk"
Apr 24 21:28:21.196275 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.196209 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-48v2j"]
Apr 24 21:28:21.200240 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:28:21.200206 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode329951c_d495_4bf9_8751_384a26c4a2ce.slice/crio-996ab8e6b85712d92b1be0c81d91a96d6b0bcdbffc6477779ac5a8b33d0affe4 WatchSource:0}: Error finding container 996ab8e6b85712d92b1be0c81d91a96d6b0bcdbffc6477779ac5a8b33d0affe4: Status 404 returned error can't find the container with id 996ab8e6b85712d92b1be0c81d91a96d6b0bcdbffc6477779ac5a8b33d0affe4
Apr 24 21:28:21.207067 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.207043 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rrzpv"]
Apr 24 21:28:21.207678 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.207658 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wbfvk"]
Apr 24 21:28:21.210290 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:28:21.210267 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb2a7484_3606_4f41_8444_8efbab81200b.slice/crio-f3794276496affa161ef9d4c5b1d3b921e63d53234cf03478537b2673c3ee1e3 WatchSource:0}: Error finding container f3794276496affa161ef9d4c5b1d3b921e63d53234cf03478537b2673c3ee1e3: Status 404 returned error can't find the container with id f3794276496affa161ef9d4c5b1d3b921e63d53234cf03478537b2673c3ee1e3
Apr 24 21:28:21.210914 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:28:21.210895 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdd64acb_d899_4f75_b460_f0b05adbbbab.slice/crio-5efb84a255f6734a8ce9e69771af1454df9075fa2e74f95da3b6404d2f98a4c6 WatchSource:0}: Error finding container 5efb84a255f6734a8ce9e69771af1454df9075fa2e74f95da3b6404d2f98a4c6: Status 404 returned error can't find the container with id 5efb84a255f6734a8ce9e69771af1454df9075fa2e74f95da3b6404d2f98a4c6
Apr 24 21:28:21.634303 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.634124 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-48v2j" event={"ID":"e329951c-d495-4bf9-8751-384a26c4a2ce","Type":"ContainerStarted","Data":"996ab8e6b85712d92b1be0c81d91a96d6b0bcdbffc6477779ac5a8b33d0affe4"}
Apr 24 21:28:21.635812 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.635786 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wbfvk" event={"ID":"bdd64acb-d899-4f75-b460-f0b05adbbbab","Type":"ContainerStarted","Data":"32c65406c9c8a320799557201cbd7f4e1fc5482c6553ca52f69d8183a95bdb49"}
Apr 24 21:28:21.635948 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.635815 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wbfvk" event={"ID":"bdd64acb-d899-4f75-b460-f0b05adbbbab","Type":"ContainerStarted","Data":"5efb84a255f6734a8ce9e69771af1454df9075fa2e74f95da3b6404d2f98a4c6"}
Apr 24 21:28:21.636851 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.636821 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rrzpv" event={"ID":"cb2a7484-3606-4f41-8444-8efbab81200b","Type":"ContainerStarted","Data":"f3794276496affa161ef9d4c5b1d3b921e63d53234cf03478537b2673c3ee1e3"}
Apr 24 21:28:21.668342 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.668315 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c956f48c6-vl47q"]
Apr 24 21:28:21.687435 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.687160 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c956f48c6-vl47q"]
Apr 24 21:28:21.687435 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.687284 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c956f48c6-vl47q"
Apr 24 21:28:21.692480 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.692460 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 24 21:28:21.693202 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.693180 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 24 21:28:21.693305 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.693292 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 24 21:28:21.693433 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.693416 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 24 21:28:21.693612 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.693590 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-7kxgc\""
Apr 24 21:28:21.693681 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.693640 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 24 21:28:21.693681 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.693665 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 24 21:28:21.693681 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.693639 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 24 21:28:21.845864 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.845830 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-console-oauth-config\") pod \"console-6c956f48c6-vl47q\" (UID: \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\") " pod="openshift-console/console-6c956f48c6-vl47q"
Apr 24 21:28:21.846073 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.845892 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-522kq\" (UniqueName: \"kubernetes.io/projected/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-kube-api-access-522kq\") pod \"console-6c956f48c6-vl47q\" (UID: \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\") " pod="openshift-console/console-6c956f48c6-vl47q"
Apr 24 21:28:21.846073 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.845936 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-console-config\") pod \"console-6c956f48c6-vl47q\" (UID: \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\") " pod="openshift-console/console-6c956f48c6-vl47q"
Apr 24 21:28:21.846073 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.845964 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-service-ca\") pod \"console-6c956f48c6-vl47q\" (UID: \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\") " pod="openshift-console/console-6c956f48c6-vl47q"
Apr 24 21:28:21.846073 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.845990 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-oauth-serving-cert\") pod \"console-6c956f48c6-vl47q\" (UID: \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\") " pod="openshift-console/console-6c956f48c6-vl47q"
Apr 24 21:28:21.846073 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.846024 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-console-serving-cert\") pod \"console-6c956f48c6-vl47q\" (UID: \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\") " pod="openshift-console/console-6c956f48c6-vl47q"
Apr 24 21:28:21.947251 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.947178 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-console-oauth-config\") pod \"console-6c956f48c6-vl47q\" (UID: \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\") " pod="openshift-console/console-6c956f48c6-vl47q"
Apr 24 21:28:21.947251 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.947238 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-522kq\" (UniqueName: \"kubernetes.io/projected/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-kube-api-access-522kq\") pod \"console-6c956f48c6-vl47q\" (UID: \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\") " pod="openshift-console/console-6c956f48c6-vl47q"
Apr 24 21:28:21.947435 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.947281 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-console-config\") pod \"console-6c956f48c6-vl47q\" (UID: \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\") " pod="openshift-console/console-6c956f48c6-vl47q"
Apr 24 21:28:21.947435 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.947305 2578
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-service-ca\") pod \"console-6c956f48c6-vl47q\" (UID: \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\") " pod="openshift-console/console-6c956f48c6-vl47q" Apr 24 21:28:21.947435 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.947324 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-oauth-serving-cert\") pod \"console-6c956f48c6-vl47q\" (UID: \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\") " pod="openshift-console/console-6c956f48c6-vl47q" Apr 24 21:28:21.947435 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.947353 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-console-serving-cert\") pod \"console-6c956f48c6-vl47q\" (UID: \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\") " pod="openshift-console/console-6c956f48c6-vl47q" Apr 24 21:28:21.948736 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.948712 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-console-config\") pod \"console-6c956f48c6-vl47q\" (UID: \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\") " pod="openshift-console/console-6c956f48c6-vl47q" Apr 24 21:28:21.949283 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.949260 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-service-ca\") pod \"console-6c956f48c6-vl47q\" (UID: \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\") " pod="openshift-console/console-6c956f48c6-vl47q" Apr 24 21:28:21.949610 ip-10-0-132-124 kubenswrapper[2578]: 
I0424 21:28:21.949586 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-oauth-serving-cert\") pod \"console-6c956f48c6-vl47q\" (UID: \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\") " pod="openshift-console/console-6c956f48c6-vl47q" Apr 24 21:28:21.951895 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.951874 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-console-oauth-config\") pod \"console-6c956f48c6-vl47q\" (UID: \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\") " pod="openshift-console/console-6c956f48c6-vl47q" Apr 24 21:28:21.952115 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.952092 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-console-serving-cert\") pod \"console-6c956f48c6-vl47q\" (UID: \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\") " pod="openshift-console/console-6c956f48c6-vl47q" Apr 24 21:28:21.967248 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:21.967222 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-522kq\" (UniqueName: \"kubernetes.io/projected/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-kube-api-access-522kq\") pod \"console-6c956f48c6-vl47q\" (UID: \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\") " pod="openshift-console/console-6c956f48c6-vl47q" Apr 24 21:28:22.000490 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:22.000467 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c956f48c6-vl47q" Apr 24 21:28:22.147668 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:22.147460 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c956f48c6-vl47q"] Apr 24 21:28:22.203309 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:28:22.203237 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa9bbfec_3195_4cf7_a22e_0120bed1f2a4.slice/crio-8713326e687169723415faacffd010a5db9375eca5d2e02c9b0bd211dee0d798 WatchSource:0}: Error finding container 8713326e687169723415faacffd010a5db9375eca5d2e02c9b0bd211dee0d798: Status 404 returned error can't find the container with id 8713326e687169723415faacffd010a5db9375eca5d2e02c9b0bd211dee0d798 Apr 24 21:28:22.418023 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:22.417988 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf" Apr 24 21:28:22.418269 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:22.418027 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd" Apr 24 21:28:22.422576 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:22.422066 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:28:22.422576 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:22.422126 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:28:22.422576 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:22.422263 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kgq96\"" Apr 24 21:28:22.422576 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:22.422523 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-cbm9w\"" Apr 24 21:28:22.423025 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:22.423004 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:28:22.640041 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:22.640009 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c956f48c6-vl47q" event={"ID":"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4","Type":"ContainerStarted","Data":"8713326e687169723415faacffd010a5db9375eca5d2e02c9b0bd211dee0d798"} Apr 24 21:28:24.647928 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:24.647654 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-48v2j" event={"ID":"e329951c-d495-4bf9-8751-384a26c4a2ce","Type":"ContainerStarted","Data":"9ac0838480354f1fa4e4ea5377c3550a01ff6dc0b8156d076b9670aa0c193148"} Apr 24 21:28:24.651424 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:24.651393 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-wbfvk" event={"ID":"bdd64acb-d899-4f75-b460-f0b05adbbbab","Type":"ContainerStarted","Data":"4d9b509f0d19465a309a7fed9568b4cf37b23596959e83bcd549103ae1920527"} Apr 24 21:28:24.659568 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:24.656812 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rrzpv" event={"ID":"cb2a7484-3606-4f41-8444-8efbab81200b","Type":"ContainerStarted","Data":"ae2a26b8f4ee1b829fe5c93465ccf66188af8536a08fe2096d7198381a9f8976"} Apr 24 21:28:24.659568 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:24.656843 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rrzpv" event={"ID":"cb2a7484-3606-4f41-8444-8efbab81200b","Type":"ContainerStarted","Data":"3a8b30ace0fb29dce36c1a708ab19ea2d5eae7a26e5c1a6f9ef2e4ebf9bc476b"} Apr 24 21:28:24.659568 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:24.656925 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-rrzpv" Apr 24 21:28:24.667275 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:24.667224 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-48v2j" podStartSLOduration=2.111292399 podStartE2EDuration="4.667207786s" podCreationTimestamp="2026-04-24 21:28:20 +0000 UTC" firstStartedPulling="2026-04-24 21:28:21.202049665 +0000 UTC m=+47.377784317" lastFinishedPulling="2026-04-24 21:28:23.757965052 +0000 UTC m=+49.933699704" observedRunningTime="2026-04-24 21:28:24.665851405 +0000 UTC m=+50.841586087" watchObservedRunningTime="2026-04-24 21:28:24.667207786 +0000 UTC m=+50.842942462" Apr 24 21:28:24.681940 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:24.681902 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rrzpv" podStartSLOduration=2.141041483 podStartE2EDuration="4.681876889s" podCreationTimestamp="2026-04-24 
21:28:20 +0000 UTC" firstStartedPulling="2026-04-24 21:28:21.211891516 +0000 UTC m=+47.387626168" lastFinishedPulling="2026-04-24 21:28:23.752726909 +0000 UTC m=+49.928461574" observedRunningTime="2026-04-24 21:28:24.681372309 +0000 UTC m=+50.857106984" watchObservedRunningTime="2026-04-24 21:28:24.681876889 +0000 UTC m=+50.857611552" Apr 24 21:28:26.662872 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:26.662836 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wbfvk" event={"ID":"bdd64acb-d899-4f75-b460-f0b05adbbbab","Type":"ContainerStarted","Data":"eec311c5126a67d7011ed49c2e5f3bd06b48ca52e0204a1658d6606f5c401d94"} Apr 24 21:28:26.664054 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:26.664031 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c956f48c6-vl47q" event={"ID":"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4","Type":"ContainerStarted","Data":"17f87aea5d42c4635970ea71a8fa8b07f2cbbaa17289e3e0787bfb856b6a5ea0"} Apr 24 21:28:26.680652 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:26.680616 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-wbfvk" podStartSLOduration=1.526915577 podStartE2EDuration="6.680602361s" podCreationTimestamp="2026-04-24 21:28:20 +0000 UTC" firstStartedPulling="2026-04-24 21:28:21.336591879 +0000 UTC m=+47.512326530" lastFinishedPulling="2026-04-24 21:28:26.490278662 +0000 UTC m=+52.666013314" observedRunningTime="2026-04-24 21:28:26.680003879 +0000 UTC m=+52.855738553" watchObservedRunningTime="2026-04-24 21:28:26.680602361 +0000 UTC m=+52.856337028" Apr 24 21:28:26.697798 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:26.697768 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c956f48c6-vl47q" podStartSLOduration=1.69594592 podStartE2EDuration="5.697757729s" podCreationTimestamp="2026-04-24 21:28:21 +0000 UTC" 
firstStartedPulling="2026-04-24 21:28:22.20552166 +0000 UTC m=+48.381256313" lastFinishedPulling="2026-04-24 21:28:26.207333471 +0000 UTC m=+52.383068122" observedRunningTime="2026-04-24 21:28:26.697726512 +0000 UTC m=+52.873461184" watchObservedRunningTime="2026-04-24 21:28:26.697757729 +0000 UTC m=+52.873492401" Apr 24 21:28:28.429952 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.429920 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-wp9pw"] Apr 24 21:28:28.450322 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.450297 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-wp9pw"] Apr 24 21:28:28.450322 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.450321 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-pqztz"] Apr 24 21:28:28.450530 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.450441 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wp9pw" Apr 24 21:28:28.453028 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.453003 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 24 21:28:28.453140 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.453016 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 21:28:28.453140 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.453070 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 24 21:28:28.453320 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.453300 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-gbktl\"" Apr 24 21:28:28.453413 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.453372 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 21:28:28.453833 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.453816 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 21:28:28.477640 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.477617 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-dhqs2"] Apr 24 21:28:28.477768 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.477755 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz" Apr 24 21:28:28.479994 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.479968 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-ws248\"" Apr 24 21:28:28.480325 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.480312 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 24 21:28:28.480660 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.480645 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 24 21:28:28.481362 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.481349 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 24 21:28:28.492735 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.492711 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/95be5fef-54c5-493f-8d81-418b407f5be9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-pqztz\" (UID: \"95be5fef-54c5-493f-8d81-418b407f5be9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz" Apr 24 21:28:28.492840 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.492762 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/95be5fef-54c5-493f-8d81-418b407f5be9-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-pqztz\" (UID: \"95be5fef-54c5-493f-8d81-418b407f5be9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz" Apr 24 21:28:28.492840 ip-10-0-132-124 kubenswrapper[2578]: 
I0424 21:28:28.492788 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mghw\" (UniqueName: \"kubernetes.io/projected/95be5fef-54c5-493f-8d81-418b407f5be9-kube-api-access-9mghw\") pod \"kube-state-metrics-69db897b98-pqztz\" (UID: \"95be5fef-54c5-493f-8d81-418b407f5be9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz" Apr 24 21:28:28.492933 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.492839 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b3482684-9774-41e9-b6e1-2fc96e50331a-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-wp9pw\" (UID: \"b3482684-9774-41e9-b6e1-2fc96e50331a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wp9pw" Apr 24 21:28:28.492933 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.492879 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/95be5fef-54c5-493f-8d81-418b407f5be9-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-pqztz\" (UID: \"95be5fef-54c5-493f-8d81-418b407f5be9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz" Apr 24 21:28:28.492933 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.492919 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b3482684-9774-41e9-b6e1-2fc96e50331a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-wp9pw\" (UID: \"b3482684-9774-41e9-b6e1-2fc96e50331a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wp9pw" Apr 24 21:28:28.493052 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.492952 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3482684-9774-41e9-b6e1-2fc96e50331a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-wp9pw\" (UID: \"b3482684-9774-41e9-b6e1-2fc96e50331a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wp9pw" Apr 24 21:28:28.493052 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.493006 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdpdx\" (UniqueName: \"kubernetes.io/projected/b3482684-9774-41e9-b6e1-2fc96e50331a-kube-api-access-kdpdx\") pod \"openshift-state-metrics-9d44df66c-wp9pw\" (UID: \"b3482684-9774-41e9-b6e1-2fc96e50331a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wp9pw" Apr 24 21:28:28.493052 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.493041 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/95be5fef-54c5-493f-8d81-418b407f5be9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-pqztz\" (UID: \"95be5fef-54c5-493f-8d81-418b407f5be9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz" Apr 24 21:28:28.493160 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.493081 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/95be5fef-54c5-493f-8d81-418b407f5be9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-pqztz\" (UID: \"95be5fef-54c5-493f-8d81-418b407f5be9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz" Apr 24 21:28:28.496975 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.496959 2578 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-pqztz"] Apr 24 21:28:28.497062 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.497053 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dhqs2" Apr 24 21:28:28.499182 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.499167 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 21:28:28.499475 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.499457 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-kvvhf\"" Apr 24 21:28:28.499564 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.499478 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 21:28:28.499627 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.499579 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 21:28:28.593347 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.593327 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/96626301-1303-4330-95d7-03f32a1420c6-root\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2" Apr 24 21:28:28.593436 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.593363 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/95be5fef-54c5-493f-8d81-418b407f5be9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-pqztz\" (UID: \"95be5fef-54c5-493f-8d81-418b407f5be9\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz" Apr 24 21:28:28.593436 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.593384 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/95be5fef-54c5-493f-8d81-418b407f5be9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-pqztz\" (UID: \"95be5fef-54c5-493f-8d81-418b407f5be9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz" Apr 24 21:28:28.593436 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.593422 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/96626301-1303-4330-95d7-03f32a1420c6-metrics-client-ca\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2" Apr 24 21:28:28.593573 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.593454 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/96626301-1303-4330-95d7-03f32a1420c6-node-exporter-accelerators-collector-config\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2" Apr 24 21:28:28.593573 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.593492 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/96626301-1303-4330-95d7-03f32a1420c6-node-exporter-tls\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2" Apr 24 21:28:28.593573 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.593513 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3482684-9774-41e9-b6e1-2fc96e50331a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-wp9pw\" (UID: \"b3482684-9774-41e9-b6e1-2fc96e50331a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wp9pw" Apr 24 21:28:28.593573 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:28.593524 2578 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 24 21:28:28.593573 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.593563 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/96626301-1303-4330-95d7-03f32a1420c6-node-exporter-wtmp\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2" Apr 24 21:28:28.593762 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.593584 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/96626301-1303-4330-95d7-03f32a1420c6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2" Apr 24 21:28:28.593762 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:28.593610 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95be5fef-54c5-493f-8d81-418b407f5be9-kube-state-metrics-tls podName:95be5fef-54c5-493f-8d81-418b407f5be9 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:29.093591661 +0000 UTC m=+55.269326331 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/95be5fef-54c5-493f-8d81-418b407f5be9-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-pqztz" (UID: "95be5fef-54c5-493f-8d81-418b407f5be9") : secret "kube-state-metrics-tls" not found Apr 24 21:28:28.593762 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.593639 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/95be5fef-54c5-493f-8d81-418b407f5be9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-pqztz\" (UID: \"95be5fef-54c5-493f-8d81-418b407f5be9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz" Apr 24 21:28:28.593762 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:28.593665 2578 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 24 21:28:28.593762 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.593674 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/96626301-1303-4330-95d7-03f32a1420c6-node-exporter-textfile\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2" Apr 24 21:28:28.593762 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:28.593703 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3482684-9774-41e9-b6e1-2fc96e50331a-openshift-state-metrics-tls podName:b3482684-9774-41e9-b6e1-2fc96e50331a nodeName:}" failed. No retries permitted until 2026-04-24 21:28:29.09368915 +0000 UTC m=+55.269423801 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/b3482684-9774-41e9-b6e1-2fc96e50331a-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-wp9pw" (UID: "b3482684-9774-41e9-b6e1-2fc96e50331a") : secret "openshift-state-metrics-tls" not found
Apr 24 21:28:28.593762 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.593721 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phc78\" (UniqueName: \"kubernetes.io/projected/96626301-1303-4330-95d7-03f32a1420c6-kube-api-access-phc78\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2"
Apr 24 21:28:28.593762 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.593744 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/95be5fef-54c5-493f-8d81-418b407f5be9-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-pqztz\" (UID: \"95be5fef-54c5-493f-8d81-418b407f5be9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz"
Apr 24 21:28:28.593762 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.593762 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/95be5fef-54c5-493f-8d81-418b407f5be9-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-pqztz\" (UID: \"95be5fef-54c5-493f-8d81-418b407f5be9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz"
Apr 24 21:28:28.594187 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.593780 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mghw\" (UniqueName: \"kubernetes.io/projected/95be5fef-54c5-493f-8d81-418b407f5be9-kube-api-access-9mghw\") pod \"kube-state-metrics-69db897b98-pqztz\" (UID: \"95be5fef-54c5-493f-8d81-418b407f5be9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz"
Apr 24 21:28:28.594187 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.593804 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b3482684-9774-41e9-b6e1-2fc96e50331a-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-wp9pw\" (UID: \"b3482684-9774-41e9-b6e1-2fc96e50331a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wp9pw"
Apr 24 21:28:28.594187 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.593820 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/96626301-1303-4330-95d7-03f32a1420c6-sys\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2"
Apr 24 21:28:28.594187 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.593841 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b3482684-9774-41e9-b6e1-2fc96e50331a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-wp9pw\" (UID: \"b3482684-9774-41e9-b6e1-2fc96e50331a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wp9pw"
Apr 24 21:28:28.594187 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.593866 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdpdx\" (UniqueName: \"kubernetes.io/projected/b3482684-9774-41e9-b6e1-2fc96e50331a-kube-api-access-kdpdx\") pod \"openshift-state-metrics-9d44df66c-wp9pw\" (UID: \"b3482684-9774-41e9-b6e1-2fc96e50331a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wp9pw"
Apr 24 21:28:28.594187 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.594063 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/95be5fef-54c5-493f-8d81-418b407f5be9-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-pqztz\" (UID: \"95be5fef-54c5-493f-8d81-418b407f5be9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz"
Apr 24 21:28:28.594477 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.594313 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/95be5fef-54c5-493f-8d81-418b407f5be9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-pqztz\" (UID: \"95be5fef-54c5-493f-8d81-418b407f5be9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz"
Apr 24 21:28:28.594477 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.594451 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/95be5fef-54c5-493f-8d81-418b407f5be9-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-pqztz\" (UID: \"95be5fef-54c5-493f-8d81-418b407f5be9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz"
Apr 24 21:28:28.594630 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.594611 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b3482684-9774-41e9-b6e1-2fc96e50331a-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-wp9pw\" (UID: \"b3482684-9774-41e9-b6e1-2fc96e50331a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wp9pw"
Apr 24 21:28:28.597230 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.597215 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/95be5fef-54c5-493f-8d81-418b407f5be9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-pqztz\" (UID: \"95be5fef-54c5-493f-8d81-418b407f5be9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz"
Apr 24 21:28:28.597270 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.597247 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b3482684-9774-41e9-b6e1-2fc96e50331a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-wp9pw\" (UID: \"b3482684-9774-41e9-b6e1-2fc96e50331a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wp9pw"
Apr 24 21:28:28.602803 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.602779 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mghw\" (UniqueName: \"kubernetes.io/projected/95be5fef-54c5-493f-8d81-418b407f5be9-kube-api-access-9mghw\") pod \"kube-state-metrics-69db897b98-pqztz\" (UID: \"95be5fef-54c5-493f-8d81-418b407f5be9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz"
Apr 24 21:28:28.606611 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.606592 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdpdx\" (UniqueName: \"kubernetes.io/projected/b3482684-9774-41e9-b6e1-2fc96e50331a-kube-api-access-kdpdx\") pod \"openshift-state-metrics-9d44df66c-wp9pw\" (UID: \"b3482684-9774-41e9-b6e1-2fc96e50331a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wp9pw"
Apr 24 21:28:28.694187 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.694118 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/96626301-1303-4330-95d7-03f32a1420c6-metrics-client-ca\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2"
Apr 24 21:28:28.694187 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.694151 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/96626301-1303-4330-95d7-03f32a1420c6-node-exporter-accelerators-collector-config\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2"
Apr 24 21:28:28.694187 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.694179 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/96626301-1303-4330-95d7-03f32a1420c6-node-exporter-tls\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2"
Apr 24 21:28:28.694389 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.694217 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/96626301-1303-4330-95d7-03f32a1420c6-node-exporter-wtmp\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2"
Apr 24 21:28:28.694389 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.694233 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/96626301-1303-4330-95d7-03f32a1420c6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2"
Apr 24 21:28:28.694389 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.694260 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/96626301-1303-4330-95d7-03f32a1420c6-node-exporter-textfile\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2"
Apr 24 21:28:28.694389 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.694288 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-phc78\" (UniqueName: \"kubernetes.io/projected/96626301-1303-4330-95d7-03f32a1420c6-kube-api-access-phc78\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2"
Apr 24 21:28:28.694389 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.694322 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/96626301-1303-4330-95d7-03f32a1420c6-sys\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2"
Apr 24 21:28:28.694389 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:28.694351 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 24 21:28:28.694389 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.694389 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/96626301-1303-4330-95d7-03f32a1420c6-root\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2"
Apr 24 21:28:28.694673 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:28.694416 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96626301-1303-4330-95d7-03f32a1420c6-node-exporter-tls podName:96626301-1303-4330-95d7-03f32a1420c6 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:29.194399309 +0000 UTC m=+55.370133965 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/96626301-1303-4330-95d7-03f32a1420c6-node-exporter-tls") pod "node-exporter-dhqs2" (UID: "96626301-1303-4330-95d7-03f32a1420c6") : secret "node-exporter-tls" not found
Apr 24 21:28:28.694673 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.694355 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/96626301-1303-4330-95d7-03f32a1420c6-root\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2"
Apr 24 21:28:28.694673 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.694429 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/96626301-1303-4330-95d7-03f32a1420c6-sys\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2"
Apr 24 21:28:28.694673 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.694433 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/96626301-1303-4330-95d7-03f32a1420c6-node-exporter-wtmp\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2"
Apr 24 21:28:28.705676 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.705651 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/96626301-1303-4330-95d7-03f32a1420c6-node-exporter-textfile\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2"
Apr 24 21:28:28.705839 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.705812 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/96626301-1303-4330-95d7-03f32a1420c6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2"
Apr 24 21:28:28.705938 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.705923 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/96626301-1303-4330-95d7-03f32a1420c6-node-exporter-accelerators-collector-config\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2"
Apr 24 21:28:28.705980 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.705940 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/96626301-1303-4330-95d7-03f32a1420c6-metrics-client-ca\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2"
Apr 24 21:28:28.707484 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:28.707468 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phc78\" (UniqueName: \"kubernetes.io/projected/96626301-1303-4330-95d7-03f32a1420c6-kube-api-access-phc78\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2"
Apr 24 21:28:29.097589 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.097565 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/95be5fef-54c5-493f-8d81-418b407f5be9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-pqztz\" (UID: \"95be5fef-54c5-493f-8d81-418b407f5be9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz"
Apr 24 21:28:29.097734 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.097610 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3482684-9774-41e9-b6e1-2fc96e50331a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-wp9pw\" (UID: \"b3482684-9774-41e9-b6e1-2fc96e50331a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wp9pw"
Apr 24 21:28:29.097734 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:29.097728 2578 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 24 21:28:29.097841 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:29.097776 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3482684-9774-41e9-b6e1-2fc96e50331a-openshift-state-metrics-tls podName:b3482684-9774-41e9-b6e1-2fc96e50331a nodeName:}" failed. No retries permitted until 2026-04-24 21:28:30.097762989 +0000 UTC m=+56.273497640 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/b3482684-9774-41e9-b6e1-2fc96e50331a-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-wp9pw" (UID: "b3482684-9774-41e9-b6e1-2fc96e50331a") : secret "openshift-state-metrics-tls" not found
Apr 24 21:28:29.099834 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.099813 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/95be5fef-54c5-493f-8d81-418b407f5be9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-pqztz\" (UID: \"95be5fef-54c5-493f-8d81-418b407f5be9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz"
Apr 24 21:28:29.198898 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.198877 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/96626301-1303-4330-95d7-03f32a1420c6-node-exporter-tls\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2"
Apr 24 21:28:29.200940 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.200923 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/96626301-1303-4330-95d7-03f32a1420c6-node-exporter-tls\") pod \"node-exporter-dhqs2\" (UID: \"96626301-1303-4330-95d7-03f32a1420c6\") " pod="openshift-monitoring/node-exporter-dhqs2"
Apr 24 21:28:29.385664 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.385610 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz"
Apr 24 21:28:29.405462 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.405440 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dhqs2"
Apr 24 21:28:29.412466 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:28:29.412441 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96626301_1303_4330_95d7_03f32a1420c6.slice/crio-5d1010b0a2ea2e4d9f03d533dae2939ce772d931c2f62cccd17564bacff8bbe2 WatchSource:0}: Error finding container 5d1010b0a2ea2e4d9f03d533dae2939ce772d931c2f62cccd17564bacff8bbe2: Status 404 returned error can't find the container with id 5d1010b0a2ea2e4d9f03d533dae2939ce772d931c2f62cccd17564bacff8bbe2
Apr 24 21:28:29.510747 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.510725 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-pqztz"]
Apr 24 21:28:29.512916 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:28:29.512888 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95be5fef_54c5_493f_8d81_418b407f5be9.slice/crio-c964b6f39e080c517b485713e4edf0b94df446c84e0540ed0dad876bee2a0237 WatchSource:0}: Error finding container c964b6f39e080c517b485713e4edf0b94df446c84e0540ed0dad876bee2a0237: Status 404 returned error can't find the container with id c964b6f39e080c517b485713e4edf0b94df446c84e0540ed0dad876bee2a0237
Apr 24 21:28:29.675478 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.675404 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz" event={"ID":"95be5fef-54c5-493f-8d81-418b407f5be9","Type":"ContainerStarted","Data":"c964b6f39e080c517b485713e4edf0b94df446c84e0540ed0dad876bee2a0237"}
Apr 24 21:28:29.676422 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.676389 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dhqs2" event={"ID":"96626301-1303-4330-95d7-03f32a1420c6","Type":"ContainerStarted","Data":"5d1010b0a2ea2e4d9f03d533dae2939ce772d931c2f62cccd17564bacff8bbe2"}
Apr 24 21:28:29.686205 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.686182 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:28:29.690933 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.690918 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.694146 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.694126 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 24 21:28:29.694146 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.694137 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 24 21:28:29.694278 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.694126 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 24 21:28:29.694278 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.694134 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 24 21:28:29.694478 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.694460 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 24 21:28:29.694478 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.694480 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 24 21:28:29.694605 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.694529 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 24 21:28:29.694605 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.694483 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 24 21:28:29.694605 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.694530 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-xg49n\""
Apr 24 21:28:29.694735 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.694466 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 24 21:28:29.703679 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.703660 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8f2739b7-bd60-4645-80e1-15fbf600cd25-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.703772 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.703692 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8f2739b7-bd60-4645-80e1-15fbf600cd25-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.703772 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.703711 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8f2739b7-bd60-4645-80e1-15fbf600cd25-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.703772 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.703756 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6bdj\" (UniqueName: \"kubernetes.io/projected/8f2739b7-bd60-4645-80e1-15fbf600cd25-kube-api-access-q6bdj\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.703895 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.703818 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8f2739b7-bd60-4645-80e1-15fbf600cd25-config-volume\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.703895 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.703835 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8f2739b7-bd60-4645-80e1-15fbf600cd25-web-config\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.703895 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.703857 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8f2739b7-bd60-4645-80e1-15fbf600cd25-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.704115 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.703906 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8f2739b7-bd60-4645-80e1-15fbf600cd25-config-out\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.704115 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.703932 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8f2739b7-bd60-4645-80e1-15fbf600cd25-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.704115 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.703949 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f2739b7-bd60-4645-80e1-15fbf600cd25-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.704115 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.703979 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8f2739b7-bd60-4645-80e1-15fbf600cd25-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.704115 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.704054 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8f2739b7-bd60-4645-80e1-15fbf600cd25-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.704367 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.704129 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8f2739b7-bd60-4645-80e1-15fbf600cd25-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.705950 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.705931 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:28:29.805156 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.805117 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8f2739b7-bd60-4645-80e1-15fbf600cd25-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.805319 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.805169 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8f2739b7-bd60-4645-80e1-15fbf600cd25-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.805319 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.805205 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8f2739b7-bd60-4645-80e1-15fbf600cd25-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.805319 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.805227 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q6bdj\" (UniqueName: \"kubernetes.io/projected/8f2739b7-bd60-4645-80e1-15fbf600cd25-kube-api-access-q6bdj\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.805319 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.805258 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8f2739b7-bd60-4645-80e1-15fbf600cd25-config-volume\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.805319 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.805279 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8f2739b7-bd60-4645-80e1-15fbf600cd25-web-config\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.805319 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.805302 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8f2739b7-bd60-4645-80e1-15fbf600cd25-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.805607 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.805333 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8f2739b7-bd60-4645-80e1-15fbf600cd25-config-out\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.805607 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.805360 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8f2739b7-bd60-4645-80e1-15fbf600cd25-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.805607 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.805386 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f2739b7-bd60-4645-80e1-15fbf600cd25-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.805607 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.805434 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8f2739b7-bd60-4645-80e1-15fbf600cd25-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.805607 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.805476 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8f2739b7-bd60-4645-80e1-15fbf600cd25-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.805839 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.805600 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8f2739b7-bd60-4645-80e1-15fbf600cd25-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.806198 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:29.806176 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8f2739b7-bd60-4645-80e1-15fbf600cd25-alertmanager-trusted-ca-bundle podName:8f2739b7-bd60-4645-80e1-15fbf600cd25 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:30.306157298 +0000 UTC m=+56.481891956 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/8f2739b7-bd60-4645-80e1-15fbf600cd25-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "8f2739b7-bd60-4645-80e1-15fbf600cd25") : configmap references non-existent config key: ca-bundle.crt
Apr 24 21:28:29.806460 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.806439 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8f2739b7-bd60-4645-80e1-15fbf600cd25-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.807136 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.807064 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8f2739b7-bd60-4645-80e1-15fbf600cd25-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.808713 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.808689 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8f2739b7-bd60-4645-80e1-15fbf600cd25-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.809520 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.809272 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8f2739b7-bd60-4645-80e1-15fbf600cd25-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.809520 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.809383 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8f2739b7-bd60-4645-80e1-15fbf600cd25-config-out\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.809520 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.809451 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8f2739b7-bd60-4645-80e1-15fbf600cd25-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.809520 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.809478 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8f2739b7-bd60-4645-80e1-15fbf600cd25-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.810139 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.810118 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8f2739b7-bd60-4645-80e1-15fbf600cd25-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.810973 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.810950 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8f2739b7-bd60-4645-80e1-15fbf600cd25-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.811067 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.810974 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8f2739b7-bd60-4645-80e1-15fbf600cd25-config-volume\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.811664 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.811644 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8f2739b7-bd60-4645-80e1-15fbf600cd25-web-config\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:29.815466 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:29.815449 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6bdj\" (UniqueName: \"kubernetes.io/projected/8f2739b7-bd60-4645-80e1-15fbf600cd25-kube-api-access-q6bdj\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:30.108284 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.108253 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/b3482684-9774-41e9-b6e1-2fc96e50331a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-wp9pw\" (UID: \"b3482684-9774-41e9-b6e1-2fc96e50331a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wp9pw" Apr 24 21:28:30.111043 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.111008 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3482684-9774-41e9-b6e1-2fc96e50331a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-wp9pw\" (UID: \"b3482684-9774-41e9-b6e1-2fc96e50331a\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wp9pw" Apr 24 21:28:30.269813 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.269773 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wp9pw" Apr 24 21:28:30.310113 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.310090 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f2739b7-bd60-4645-80e1-15fbf600cd25-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:30.311095 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.311071 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f2739b7-bd60-4645-80e1-15fbf600cd25-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8f2739b7-bd60-4645-80e1-15fbf600cd25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:30.585485 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.585459 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/thanos-querier-6d7ffb5569-nnk78"] Apr 24 21:28:30.591339 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.591322 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.593671 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.593647 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 24 21:28:30.593815 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.593782 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 24 21:28:30.593883 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.593649 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 24 21:28:30.593883 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.593853 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-5n6u8vcjm7btd\"" Apr 24 21:28:30.593981 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.593783 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 24 21:28:30.594055 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.594038 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-tp8zb\"" Apr 24 21:28:30.594113 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.594071 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 24 21:28:30.599439 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.599422 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:30.602761 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.602716 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6d7ffb5569-nnk78"] Apr 24 21:28:30.612534 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.612515 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8814da05-08f7-4703-81e5-78d626a5bcd6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6d7ffb5569-nnk78\" (UID: \"8814da05-08f7-4703-81e5-78d626a5bcd6\") " pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.612653 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.612580 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8814da05-08f7-4703-81e5-78d626a5bcd6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6d7ffb5569-nnk78\" (UID: \"8814da05-08f7-4703-81e5-78d626a5bcd6\") " pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.612653 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.612646 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8814da05-08f7-4703-81e5-78d626a5bcd6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6d7ffb5569-nnk78\" (UID: \"8814da05-08f7-4703-81e5-78d626a5bcd6\") " pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.612749 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.612675 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/8814da05-08f7-4703-81e5-78d626a5bcd6-secret-grpc-tls\") pod \"thanos-querier-6d7ffb5569-nnk78\" (UID: \"8814da05-08f7-4703-81e5-78d626a5bcd6\") " pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.612749 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.612716 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8814da05-08f7-4703-81e5-78d626a5bcd6-secret-thanos-querier-tls\") pod \"thanos-querier-6d7ffb5569-nnk78\" (UID: \"8814da05-08f7-4703-81e5-78d626a5bcd6\") " pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.612827 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.612754 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8814da05-08f7-4703-81e5-78d626a5bcd6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6d7ffb5569-nnk78\" (UID: \"8814da05-08f7-4703-81e5-78d626a5bcd6\") " pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.612827 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.612772 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8814da05-08f7-4703-81e5-78d626a5bcd6-metrics-client-ca\") pod \"thanos-querier-6d7ffb5569-nnk78\" (UID: \"8814da05-08f7-4703-81e5-78d626a5bcd6\") " pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.612827 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.612796 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9hqs\" (UniqueName: \"kubernetes.io/projected/8814da05-08f7-4703-81e5-78d626a5bcd6-kube-api-access-k9hqs\") pod \"thanos-querier-6d7ffb5569-nnk78\" (UID: 
\"8814da05-08f7-4703-81e5-78d626a5bcd6\") " pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.638009 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.637977 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-wp9pw"] Apr 24 21:28:30.642140 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:28:30.642116 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3482684_9774_41e9_b6e1_2fc96e50331a.slice/crio-8ce4aa2969a59c1c5b3b09586c656b835feccc8dc13fda8ce431eea9362d7d8c WatchSource:0}: Error finding container 8ce4aa2969a59c1c5b3b09586c656b835feccc8dc13fda8ce431eea9362d7d8c: Status 404 returned error can't find the container with id 8ce4aa2969a59c1c5b3b09586c656b835feccc8dc13fda8ce431eea9362d7d8c Apr 24 21:28:30.682033 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.682003 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz" event={"ID":"95be5fef-54c5-493f-8d81-418b407f5be9","Type":"ContainerStarted","Data":"0b509e9924eaa909222ddcc298f24eba2d5f046ed466db538517a200a46ec581"} Apr 24 21:28:30.687498 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.687465 2578 generic.go:358] "Generic (PLEG): container finished" podID="96626301-1303-4330-95d7-03f32a1420c6" containerID="06f30c69e6d5bc71445f211049f2a4a2fb69095296dd6f6c083fa0d52ead9d86" exitCode=0 Apr 24 21:28:30.687607 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.687567 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dhqs2" event={"ID":"96626301-1303-4330-95d7-03f32a1420c6","Type":"ContainerDied","Data":"06f30c69e6d5bc71445f211049f2a4a2fb69095296dd6f6c083fa0d52ead9d86"} Apr 24 21:28:30.693029 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.692534 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wp9pw" event={"ID":"b3482684-9774-41e9-b6e1-2fc96e50331a","Type":"ContainerStarted","Data":"8ce4aa2969a59c1c5b3b09586c656b835feccc8dc13fda8ce431eea9362d7d8c"} Apr 24 21:28:30.714381 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.713464 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8814da05-08f7-4703-81e5-78d626a5bcd6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6d7ffb5569-nnk78\" (UID: \"8814da05-08f7-4703-81e5-78d626a5bcd6\") " pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.714381 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.714300 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8814da05-08f7-4703-81e5-78d626a5bcd6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6d7ffb5569-nnk78\" (UID: \"8814da05-08f7-4703-81e5-78d626a5bcd6\") " pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.717848 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.717316 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8814da05-08f7-4703-81e5-78d626a5bcd6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6d7ffb5569-nnk78\" (UID: \"8814da05-08f7-4703-81e5-78d626a5bcd6\") " pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.717848 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.717364 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8814da05-08f7-4703-81e5-78d626a5bcd6-secret-grpc-tls\") pod \"thanos-querier-6d7ffb5569-nnk78\" (UID: 
\"8814da05-08f7-4703-81e5-78d626a5bcd6\") " pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.717848 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.717406 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8814da05-08f7-4703-81e5-78d626a5bcd6-secret-thanos-querier-tls\") pod \"thanos-querier-6d7ffb5569-nnk78\" (UID: \"8814da05-08f7-4703-81e5-78d626a5bcd6\") " pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.717848 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.717494 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8814da05-08f7-4703-81e5-78d626a5bcd6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6d7ffb5569-nnk78\" (UID: \"8814da05-08f7-4703-81e5-78d626a5bcd6\") " pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.719985 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.718222 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8814da05-08f7-4703-81e5-78d626a5bcd6-metrics-client-ca\") pod \"thanos-querier-6d7ffb5569-nnk78\" (UID: \"8814da05-08f7-4703-81e5-78d626a5bcd6\") " pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.719985 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.718273 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9hqs\" (UniqueName: \"kubernetes.io/projected/8814da05-08f7-4703-81e5-78d626a5bcd6-kube-api-access-k9hqs\") pod \"thanos-querier-6d7ffb5569-nnk78\" (UID: \"8814da05-08f7-4703-81e5-78d626a5bcd6\") " pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.719985 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.719307 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8814da05-08f7-4703-81e5-78d626a5bcd6-metrics-client-ca\") pod \"thanos-querier-6d7ffb5569-nnk78\" (UID: \"8814da05-08f7-4703-81e5-78d626a5bcd6\") " pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.722972 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.722949 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8814da05-08f7-4703-81e5-78d626a5bcd6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6d7ffb5569-nnk78\" (UID: \"8814da05-08f7-4703-81e5-78d626a5bcd6\") " pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.723434 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.723393 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8814da05-08f7-4703-81e5-78d626a5bcd6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6d7ffb5569-nnk78\" (UID: \"8814da05-08f7-4703-81e5-78d626a5bcd6\") " pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.723700 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.723661 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8814da05-08f7-4703-81e5-78d626a5bcd6-secret-grpc-tls\") pod \"thanos-querier-6d7ffb5569-nnk78\" (UID: \"8814da05-08f7-4703-81e5-78d626a5bcd6\") " pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.727459 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.727434 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8814da05-08f7-4703-81e5-78d626a5bcd6-secret-thanos-querier-tls\") pod \"thanos-querier-6d7ffb5569-nnk78\" (UID: 
\"8814da05-08f7-4703-81e5-78d626a5bcd6\") " pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.729977 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.729327 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8814da05-08f7-4703-81e5-78d626a5bcd6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6d7ffb5569-nnk78\" (UID: \"8814da05-08f7-4703-81e5-78d626a5bcd6\") " pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.729977 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.729839 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9hqs\" (UniqueName: \"kubernetes.io/projected/8814da05-08f7-4703-81e5-78d626a5bcd6-kube-api-access-k9hqs\") pod \"thanos-querier-6d7ffb5569-nnk78\" (UID: \"8814da05-08f7-4703-81e5-78d626a5bcd6\") " pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.729977 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.729904 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8814da05-08f7-4703-81e5-78d626a5bcd6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6d7ffb5569-nnk78\" (UID: \"8814da05-08f7-4703-81e5-78d626a5bcd6\") " pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:30.751046 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.751023 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:28:30.767681 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:28:30.767598 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f2739b7_bd60_4645_80e1_15fbf600cd25.slice/crio-700e826eeba603aa53181b0b4d43a1537e317b19d025b8852a30c04d72aac025 WatchSource:0}: 
Error finding container 700e826eeba603aa53181b0b4d43a1537e317b19d025b8852a30c04d72aac025: Status 404 returned error can't find the container with id 700e826eeba603aa53181b0b4d43a1537e317b19d025b8852a30c04d72aac025 Apr 24 21:28:30.939726 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:30.939692 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:31.063492 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:31.063406 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6d7ffb5569-nnk78"] Apr 24 21:28:31.066723 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:28:31.066696 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8814da05_08f7_4703_81e5_78d626a5bcd6.slice/crio-be572014fd614804d3e2880cc61836c7f265c2e60d02c0a537a68e82de578e2d WatchSource:0}: Error finding container be572014fd614804d3e2880cc61836c7f265c2e60d02c0a537a68e82de578e2d: Status 404 returned error can't find the container with id be572014fd614804d3e2880cc61836c7f265c2e60d02c0a537a68e82de578e2d Apr 24 21:28:31.697644 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:31.697590 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8f2739b7-bd60-4645-80e1-15fbf600cd25","Type":"ContainerStarted","Data":"700e826eeba603aa53181b0b4d43a1537e317b19d025b8852a30c04d72aac025"} Apr 24 21:28:31.698912 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:31.698875 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" event={"ID":"8814da05-08f7-4703-81e5-78d626a5bcd6","Type":"ContainerStarted","Data":"be572014fd614804d3e2880cc61836c7f265c2e60d02c0a537a68e82de578e2d"} Apr 24 21:28:31.701335 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:31.701311 2578 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/node-exporter-dhqs2" event={"ID":"96626301-1303-4330-95d7-03f32a1420c6","Type":"ContainerStarted","Data":"b33921081f53a0d103480cd520fa9d8442eb879c155961faff90f4b11d9cf96e"} Apr 24 21:28:31.701444 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:31.701338 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dhqs2" event={"ID":"96626301-1303-4330-95d7-03f32a1420c6","Type":"ContainerStarted","Data":"b0f4cfc08d5bc274a0e8371147b8b89a7bfcae8c55755fe1a518a0e047ce0658"} Apr 24 21:28:31.703200 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:31.703180 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wp9pw" event={"ID":"b3482684-9774-41e9-b6e1-2fc96e50331a","Type":"ContainerStarted","Data":"01f24cfaad919c8118308c635627759b2ad7dc557acbbc76576259baffb402da"} Apr 24 21:28:31.703299 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:31.703209 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wp9pw" event={"ID":"b3482684-9774-41e9-b6e1-2fc96e50331a","Type":"ContainerStarted","Data":"fcf59d50ece8049ec6b459ced6d8bf98f1b2bce12ee14a6cfc07d3378e6533aa"} Apr 24 21:28:31.705403 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:31.705357 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz" event={"ID":"95be5fef-54c5-493f-8d81-418b407f5be9","Type":"ContainerStarted","Data":"e50963f817e3fa64de96d2a7fbe5f7782feedf07b92ad19a65ae681c48f04814"} Apr 24 21:28:31.705403 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:31.705384 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz" event={"ID":"95be5fef-54c5-493f-8d81-418b407f5be9","Type":"ContainerStarted","Data":"9701aa9ee6fb42bdd904b030463e33b24d4a713e76780e645a645ca2415729f5"} Apr 24 21:28:31.725104 ip-10-0-132-124 
kubenswrapper[2578]: I0424 21:28:31.725044 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-dhqs2" podStartSLOduration=3.0413683 podStartE2EDuration="3.725029075s" podCreationTimestamp="2026-04-24 21:28:28 +0000 UTC" firstStartedPulling="2026-04-24 21:28:29.414284073 +0000 UTC m=+55.590018728" lastFinishedPulling="2026-04-24 21:28:30.097944838 +0000 UTC m=+56.273679503" observedRunningTime="2026-04-24 21:28:31.723039847 +0000 UTC m=+57.898774520" watchObservedRunningTime="2026-04-24 21:28:31.725029075 +0000 UTC m=+57.900763752" Apr 24 21:28:31.744064 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:31.744021 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-pqztz" podStartSLOduration=2.7041266840000002 podStartE2EDuration="3.744007555s" podCreationTimestamp="2026-04-24 21:28:28 +0000 UTC" firstStartedPulling="2026-04-24 21:28:29.514673245 +0000 UTC m=+55.690407897" lastFinishedPulling="2026-04-24 21:28:30.554554102 +0000 UTC m=+56.730288768" observedRunningTime="2026-04-24 21:28:31.743621905 +0000 UTC m=+57.919356596" watchObservedRunningTime="2026-04-24 21:28:31.744007555 +0000 UTC m=+57.919742228" Apr 24 21:28:32.001153 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.000838 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c956f48c6-vl47q" Apr 24 21:28:32.001153 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.000878 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c956f48c6-vl47q" Apr 24 21:28:32.007404 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.007380 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c956f48c6-vl47q" Apr 24 21:28:32.709034 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.709012 2578 generic.go:358] "Generic 
(PLEG): container finished" podID="8f2739b7-bd60-4645-80e1-15fbf600cd25" containerID="931ba986179d5f455afea0adc3f51892c064b7b4234afb20b219a11c1d641eb8" exitCode=0 Apr 24 21:28:32.709302 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.709074 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8f2739b7-bd60-4645-80e1-15fbf600cd25","Type":"ContainerDied","Data":"931ba986179d5f455afea0adc3f51892c064b7b4234afb20b219a11c1d641eb8"} Apr 24 21:28:32.710597 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.710568 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" event={"ID":"8814da05-08f7-4703-81e5-78d626a5bcd6","Type":"ContainerStarted","Data":"d846951d7dad4d174cbe5a4bf1e1195d320886fbeaf8bb625338b2e34895a6a7"} Apr 24 21:28:32.712359 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.712338 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wp9pw" event={"ID":"b3482684-9774-41e9-b6e1-2fc96e50331a","Type":"ContainerStarted","Data":"8df81c42a53778ce8e76b40de4d4aaba82b91631a577639579d192826fc216f8"} Apr 24 21:28:32.717099 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.717082 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c956f48c6-vl47q" Apr 24 21:28:32.757225 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.757104 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wp9pw" podStartSLOduration=3.651660625 podStartE2EDuration="4.757085687s" podCreationTimestamp="2026-04-24 21:28:28 +0000 UTC" firstStartedPulling="2026-04-24 21:28:30.840974 +0000 UTC m=+57.016708658" lastFinishedPulling="2026-04-24 21:28:31.946399069 +0000 UTC m=+58.122133720" observedRunningTime="2026-04-24 21:28:32.756512979 +0000 UTC m=+58.932247651" 
watchObservedRunningTime="2026-04-24 21:28:32.757085687 +0000 UTC m=+58.932820360"
Apr 24 21:28:32.762794 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.762772 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-fb766d8dc-4j2hh"]
Apr 24 21:28:32.766284 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.766213 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh"
Apr 24 21:28:32.768800 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.768766 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 24 21:28:32.768945 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.768929 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-fo56b00rb3c8j\""
Apr 24 21:28:32.769024 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.769002 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 24 21:28:32.769093 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.768933 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 24 21:28:32.769093 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.768932 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-nklt9\""
Apr 24 21:28:32.769418 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.769400 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 24 21:28:32.775047 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.775029 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-fb766d8dc-4j2hh"]
Apr 24 21:28:32.837411 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.837380 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dabc3649-986c-417d-8a62-0996e4d2bc1c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-fb766d8dc-4j2hh\" (UID: \"dabc3649-986c-417d-8a62-0996e4d2bc1c\") " pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh"
Apr 24 21:28:32.837485 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.837442 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8nv8\" (UniqueName: \"kubernetes.io/projected/dabc3649-986c-417d-8a62-0996e4d2bc1c-kube-api-access-p8nv8\") pod \"metrics-server-fb766d8dc-4j2hh\" (UID: \"dabc3649-986c-417d-8a62-0996e4d2bc1c\") " pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh"
Apr 24 21:28:32.837485 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.837475 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/dabc3649-986c-417d-8a62-0996e4d2bc1c-secret-metrics-server-tls\") pod \"metrics-server-fb766d8dc-4j2hh\" (UID: \"dabc3649-986c-417d-8a62-0996e4d2bc1c\") " pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh"
Apr 24 21:28:32.837657 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.837515 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dabc3649-986c-417d-8a62-0996e4d2bc1c-client-ca-bundle\") pod \"metrics-server-fb766d8dc-4j2hh\" (UID: \"dabc3649-986c-417d-8a62-0996e4d2bc1c\") " pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh"
Apr 24 21:28:32.837657 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.837564 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/dabc3649-986c-417d-8a62-0996e4d2bc1c-audit-log\") pod \"metrics-server-fb766d8dc-4j2hh\" (UID: \"dabc3649-986c-417d-8a62-0996e4d2bc1c\") " pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh"
Apr 24 21:28:32.837724 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.837653 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/dabc3649-986c-417d-8a62-0996e4d2bc1c-metrics-server-audit-profiles\") pod \"metrics-server-fb766d8dc-4j2hh\" (UID: \"dabc3649-986c-417d-8a62-0996e4d2bc1c\") " pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh"
Apr 24 21:28:32.837890 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.837874 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/dabc3649-986c-417d-8a62-0996e4d2bc1c-secret-metrics-server-client-certs\") pod \"metrics-server-fb766d8dc-4j2hh\" (UID: \"dabc3649-986c-417d-8a62-0996e4d2bc1c\") " pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh"
Apr 24 21:28:32.939152 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.939078 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8nv8\" (UniqueName: \"kubernetes.io/projected/dabc3649-986c-417d-8a62-0996e4d2bc1c-kube-api-access-p8nv8\") pod \"metrics-server-fb766d8dc-4j2hh\" (UID: \"dabc3649-986c-417d-8a62-0996e4d2bc1c\") " pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh"
Apr 24 21:28:32.939152 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.939116 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/dabc3649-986c-417d-8a62-0996e4d2bc1c-secret-metrics-server-tls\") pod \"metrics-server-fb766d8dc-4j2hh\" (UID: \"dabc3649-986c-417d-8a62-0996e4d2bc1c\") " pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh"
Apr 24 21:28:32.939282 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.939243 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dabc3649-986c-417d-8a62-0996e4d2bc1c-client-ca-bundle\") pod \"metrics-server-fb766d8dc-4j2hh\" (UID: \"dabc3649-986c-417d-8a62-0996e4d2bc1c\") " pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh"
Apr 24 21:28:32.939312 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.939278 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/dabc3649-986c-417d-8a62-0996e4d2bc1c-audit-log\") pod \"metrics-server-fb766d8dc-4j2hh\" (UID: \"dabc3649-986c-417d-8a62-0996e4d2bc1c\") " pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh"
Apr 24 21:28:32.939348 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.939324 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/dabc3649-986c-417d-8a62-0996e4d2bc1c-metrics-server-audit-profiles\") pod \"metrics-server-fb766d8dc-4j2hh\" (UID: \"dabc3649-986c-417d-8a62-0996e4d2bc1c\") " pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh"
Apr 24 21:28:32.939388 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.939369 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/dabc3649-986c-417d-8a62-0996e4d2bc1c-secret-metrics-server-client-certs\") pod \"metrics-server-fb766d8dc-4j2hh\" (UID: \"dabc3649-986c-417d-8a62-0996e4d2bc1c\") " pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh"
Apr 24 21:28:32.939447 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.939438 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dabc3649-986c-417d-8a62-0996e4d2bc1c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-fb766d8dc-4j2hh\" (UID: \"dabc3649-986c-417d-8a62-0996e4d2bc1c\") " pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh"
Apr 24 21:28:32.939729 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.939703 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/dabc3649-986c-417d-8a62-0996e4d2bc1c-audit-log\") pod \"metrics-server-fb766d8dc-4j2hh\" (UID: \"dabc3649-986c-417d-8a62-0996e4d2bc1c\") " pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh"
Apr 24 21:28:32.940071 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.940049 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dabc3649-986c-417d-8a62-0996e4d2bc1c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-fb766d8dc-4j2hh\" (UID: \"dabc3649-986c-417d-8a62-0996e4d2bc1c\") " pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh"
Apr 24 21:28:32.940583 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.940532 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/dabc3649-986c-417d-8a62-0996e4d2bc1c-metrics-server-audit-profiles\") pod \"metrics-server-fb766d8dc-4j2hh\" (UID: \"dabc3649-986c-417d-8a62-0996e4d2bc1c\") " pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh"
Apr 24 21:28:32.941625 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.941596 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/dabc3649-986c-417d-8a62-0996e4d2bc1c-secret-metrics-server-tls\") pod \"metrics-server-fb766d8dc-4j2hh\" (UID: \"dabc3649-986c-417d-8a62-0996e4d2bc1c\") " pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh"
Apr 24 21:28:32.941735 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.941715 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dabc3649-986c-417d-8a62-0996e4d2bc1c-client-ca-bundle\") pod \"metrics-server-fb766d8dc-4j2hh\" (UID: \"dabc3649-986c-417d-8a62-0996e4d2bc1c\") " pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh"
Apr 24 21:28:32.941833 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.941817 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/dabc3649-986c-417d-8a62-0996e4d2bc1c-secret-metrics-server-client-certs\") pod \"metrics-server-fb766d8dc-4j2hh\" (UID: \"dabc3649-986c-417d-8a62-0996e4d2bc1c\") " pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh"
Apr 24 21:28:32.947170 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:32.947150 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8nv8\" (UniqueName: \"kubernetes.io/projected/dabc3649-986c-417d-8a62-0996e4d2bc1c-kube-api-access-p8nv8\") pod \"metrics-server-fb766d8dc-4j2hh\" (UID: \"dabc3649-986c-417d-8a62-0996e4d2bc1c\") " pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh"
Apr 24 21:28:33.076194 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.076168 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh"
Apr 24 21:28:33.209653 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.209562 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-fb766d8dc-4j2hh"]
Apr 24 21:28:33.214108 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:28:33.214084 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddabc3649_986c_417d_8a62_0996e4d2bc1c.slice/crio-6d9de21dc233091f4aa1afb11ed0427bf0869b9c480285c7723a92fc73a30dc6 WatchSource:0}: Error finding container 6d9de21dc233091f4aa1afb11ed0427bf0869b9c480285c7723a92fc73a30dc6: Status 404 returned error can't find the container with id 6d9de21dc233091f4aa1afb11ed0427bf0869b9c480285c7723a92fc73a30dc6
Apr 24 21:28:33.320777 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.319647 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-qqfdb"]
Apr 24 21:28:33.323145 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.323118 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qqfdb"
Apr 24 21:28:33.325626 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.325449 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 24 21:28:33.325738 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.325703 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-fxbjp\""
Apr 24 21:28:33.331350 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.331329 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-qqfdb"]
Apr 24 21:28:33.343430 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.343406 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/747794d3-8a8d-4ce6-8607-23994becf49d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-qqfdb\" (UID: \"747794d3-8a8d-4ce6-8607-23994becf49d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qqfdb"
Apr 24 21:28:33.444075 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.444045 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/747794d3-8a8d-4ce6-8607-23994becf49d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-qqfdb\" (UID: \"747794d3-8a8d-4ce6-8607-23994becf49d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qqfdb"
Apr 24 21:28:33.444237 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:33.444213 2578 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 24 21:28:33.444306 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:28:33.444292 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/747794d3-8a8d-4ce6-8607-23994becf49d-monitoring-plugin-cert podName:747794d3-8a8d-4ce6-8607-23994becf49d nodeName:}" failed. No retries permitted until 2026-04-24 21:28:33.944272039 +0000 UTC m=+60.120006694 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/747794d3-8a8d-4ce6-8607-23994becf49d-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-qqfdb" (UID: "747794d3-8a8d-4ce6-8607-23994becf49d") : secret "monitoring-plugin-cert" not found
Apr 24 21:28:33.460112 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.460005 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6f455c8f5-q626q"]
Apr 24 21:28:33.463348 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.463326 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f455c8f5-q626q"
Apr 24 21:28:33.471116 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.471009 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 24 21:28:33.482924 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.482905 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f455c8f5-q626q"]
Apr 24 21:28:33.545453 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.545421 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5kqz\" (UniqueName: \"kubernetes.io/projected/d5544c7a-a55a-4f52-807c-78716f3d81d6-kube-api-access-c5kqz\") pod \"console-6f455c8f5-q626q\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " pod="openshift-console/console-6f455c8f5-q626q"
Apr 24 21:28:33.545644 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.545478 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d5544c7a-a55a-4f52-807c-78716f3d81d6-oauth-serving-cert\") pod \"console-6f455c8f5-q626q\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " pod="openshift-console/console-6f455c8f5-q626q"
Apr 24 21:28:33.545644 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.545573 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5544c7a-a55a-4f52-807c-78716f3d81d6-console-serving-cert\") pod \"console-6f455c8f5-q626q\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " pod="openshift-console/console-6f455c8f5-q626q"
Apr 24 21:28:33.545644 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.545635 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5544c7a-a55a-4f52-807c-78716f3d81d6-console-oauth-config\") pod \"console-6f455c8f5-q626q\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " pod="openshift-console/console-6f455c8f5-q626q"
Apr 24 21:28:33.545808 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.545665 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5544c7a-a55a-4f52-807c-78716f3d81d6-console-config\") pod \"console-6f455c8f5-q626q\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " pod="openshift-console/console-6f455c8f5-q626q"
Apr 24 21:28:33.545808 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.545682 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5544c7a-a55a-4f52-807c-78716f3d81d6-service-ca\") pod \"console-6f455c8f5-q626q\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " pod="openshift-console/console-6f455c8f5-q626q"
Apr 24 21:28:33.545808 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.545702 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5544c7a-a55a-4f52-807c-78716f3d81d6-trusted-ca-bundle\") pod \"console-6f455c8f5-q626q\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " pod="openshift-console/console-6f455c8f5-q626q"
Apr 24 21:28:33.605450 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.605426 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4256n"
Apr 24 21:28:33.646834 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.646802 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5544c7a-a55a-4f52-807c-78716f3d81d6-console-serving-cert\") pod \"console-6f455c8f5-q626q\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " pod="openshift-console/console-6f455c8f5-q626q"
Apr 24 21:28:33.647007 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.646888 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5544c7a-a55a-4f52-807c-78716f3d81d6-console-oauth-config\") pod \"console-6f455c8f5-q626q\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " pod="openshift-console/console-6f455c8f5-q626q"
Apr 24 21:28:33.647007 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.646909 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5544c7a-a55a-4f52-807c-78716f3d81d6-console-config\") pod \"console-6f455c8f5-q626q\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " pod="openshift-console/console-6f455c8f5-q626q"
Apr 24 21:28:33.647007 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.646930 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5544c7a-a55a-4f52-807c-78716f3d81d6-service-ca\") pod \"console-6f455c8f5-q626q\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " pod="openshift-console/console-6f455c8f5-q626q"
Apr 24 21:28:33.647007 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.646976 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5544c7a-a55a-4f52-807c-78716f3d81d6-trusted-ca-bundle\") pod \"console-6f455c8f5-q626q\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " pod="openshift-console/console-6f455c8f5-q626q"
Apr 24 21:28:33.647213 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.647041 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5kqz\" (UniqueName: \"kubernetes.io/projected/d5544c7a-a55a-4f52-807c-78716f3d81d6-kube-api-access-c5kqz\") pod \"console-6f455c8f5-q626q\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " pod="openshift-console/console-6f455c8f5-q626q"
Apr 24 21:28:33.647213 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.647067 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d5544c7a-a55a-4f52-807c-78716f3d81d6-oauth-serving-cert\") pod \"console-6f455c8f5-q626q\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " pod="openshift-console/console-6f455c8f5-q626q"
Apr 24 21:28:33.647718 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.647691 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d5544c7a-a55a-4f52-807c-78716f3d81d6-oauth-serving-cert\") pod \"console-6f455c8f5-q626q\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " pod="openshift-console/console-6f455c8f5-q626q"
Apr 24 21:28:33.648296 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.648271 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5544c7a-a55a-4f52-807c-78716f3d81d6-trusted-ca-bundle\") pod \"console-6f455c8f5-q626q\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " pod="openshift-console/console-6f455c8f5-q626q"
Apr 24 21:28:33.648392 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.648307 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5544c7a-a55a-4f52-807c-78716f3d81d6-console-config\") pod \"console-6f455c8f5-q626q\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " pod="openshift-console/console-6f455c8f5-q626q"
Apr 24 21:28:33.648817 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.648791 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5544c7a-a55a-4f52-807c-78716f3d81d6-service-ca\") pod \"console-6f455c8f5-q626q\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " pod="openshift-console/console-6f455c8f5-q626q"
Apr 24 21:28:33.650575 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.650532 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5544c7a-a55a-4f52-807c-78716f3d81d6-console-serving-cert\") pod \"console-6f455c8f5-q626q\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " pod="openshift-console/console-6f455c8f5-q626q"
Apr 24 21:28:33.650802 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.650784 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5544c7a-a55a-4f52-807c-78716f3d81d6-console-oauth-config\") pod \"console-6f455c8f5-q626q\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " pod="openshift-console/console-6f455c8f5-q626q"
Apr 24 21:28:33.657242 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.657220 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"]
Apr 24 21:28:33.661212 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.661192 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5kqz\" (UniqueName: \"kubernetes.io/projected/d5544c7a-a55a-4f52-807c-78716f3d81d6-kube-api-access-c5kqz\") pod \"console-6f455c8f5-q626q\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " pod="openshift-console/console-6f455c8f5-q626q"
Apr 24 21:28:33.662198 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.662182 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.664510 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.664482 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 24 21:28:33.664629 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.664613 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-6bvdt\""
Apr 24 21:28:33.664694 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.664661 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 24 21:28:33.664694 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.664612 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 24 21:28:33.664826 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.664811 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 24 21:28:33.665016 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.665002 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 24 21:28:33.670905 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.670886 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 24 21:28:33.673738 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.673718 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"]
Apr 24 21:28:33.721118 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.720535 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" event={"ID":"8814da05-08f7-4703-81e5-78d626a5bcd6","Type":"ContainerStarted","Data":"cde1f7d1dd89beee879f04f90a5f031259d157221193719a2b8e6c8ed9f55751"}
Apr 24 21:28:33.721118 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.721079 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" event={"ID":"8814da05-08f7-4703-81e5-78d626a5bcd6","Type":"ContainerStarted","Data":"81617592dac2afd720df804bd986621201118421d927d8621a714db7b7b3f7ad"}
Apr 24 21:28:33.726740 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.726699 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh" event={"ID":"dabc3649-986c-417d-8a62-0996e4d2bc1c","Type":"ContainerStarted","Data":"6d9de21dc233091f4aa1afb11ed0427bf0869b9c480285c7723a92fc73a30dc6"}
Apr 24 21:28:33.749844 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.748320 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e3fdaddf-786f-413e-83cb-3bab2578a8a5-metrics-client-ca\") pod \"telemeter-client-5d6b7fc5bf-9lxwp\" (UID: \"e3fdaddf-786f-413e-83cb-3bab2578a8a5\") " pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.749844 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.748370 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/e3fdaddf-786f-413e-83cb-3bab2578a8a5-telemeter-client-tls\") pod \"telemeter-client-5d6b7fc5bf-9lxwp\" (UID: \"e3fdaddf-786f-413e-83cb-3bab2578a8a5\") " pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.749844 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.748402 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/e3fdaddf-786f-413e-83cb-3bab2578a8a5-secret-telemeter-client\") pod \"telemeter-client-5d6b7fc5bf-9lxwp\" (UID: \"e3fdaddf-786f-413e-83cb-3bab2578a8a5\") " pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.749844 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.748446 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fnr9\" (UniqueName: \"kubernetes.io/projected/e3fdaddf-786f-413e-83cb-3bab2578a8a5-kube-api-access-4fnr9\") pod \"telemeter-client-5d6b7fc5bf-9lxwp\" (UID: \"e3fdaddf-786f-413e-83cb-3bab2578a8a5\") " pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.749844 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.748510 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/e3fdaddf-786f-413e-83cb-3bab2578a8a5-federate-client-tls\") pod \"telemeter-client-5d6b7fc5bf-9lxwp\" (UID: \"e3fdaddf-786f-413e-83cb-3bab2578a8a5\") " pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.749844 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.748535 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3fdaddf-786f-413e-83cb-3bab2578a8a5-serving-certs-ca-bundle\") pod \"telemeter-client-5d6b7fc5bf-9lxwp\" (UID: \"e3fdaddf-786f-413e-83cb-3bab2578a8a5\") " pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.749844 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.748591 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3fdaddf-786f-413e-83cb-3bab2578a8a5-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5d6b7fc5bf-9lxwp\" (UID: \"e3fdaddf-786f-413e-83cb-3bab2578a8a5\") " pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.749844 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.748657 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e3fdaddf-786f-413e-83cb-3bab2578a8a5-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5d6b7fc5bf-9lxwp\" (UID: \"e3fdaddf-786f-413e-83cb-3bab2578a8a5\") " pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.774939 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.774904 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f455c8f5-q626q"
Apr 24 21:28:33.849866 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.849830 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/e3fdaddf-786f-413e-83cb-3bab2578a8a5-telemeter-client-tls\") pod \"telemeter-client-5d6b7fc5bf-9lxwp\" (UID: \"e3fdaddf-786f-413e-83cb-3bab2578a8a5\") " pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.850049 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.849880 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/e3fdaddf-786f-413e-83cb-3bab2578a8a5-secret-telemeter-client\") pod \"telemeter-client-5d6b7fc5bf-9lxwp\" (UID: \"e3fdaddf-786f-413e-83cb-3bab2578a8a5\") " pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.850049 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.849921 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4fnr9\" (UniqueName: \"kubernetes.io/projected/e3fdaddf-786f-413e-83cb-3bab2578a8a5-kube-api-access-4fnr9\") pod \"telemeter-client-5d6b7fc5bf-9lxwp\" (UID: \"e3fdaddf-786f-413e-83cb-3bab2578a8a5\") " pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.850263 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.850236 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/e3fdaddf-786f-413e-83cb-3bab2578a8a5-federate-client-tls\") pod \"telemeter-client-5d6b7fc5bf-9lxwp\" (UID: \"e3fdaddf-786f-413e-83cb-3bab2578a8a5\") " pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.850328 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.850302 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3fdaddf-786f-413e-83cb-3bab2578a8a5-serving-certs-ca-bundle\") pod \"telemeter-client-5d6b7fc5bf-9lxwp\" (UID: \"e3fdaddf-786f-413e-83cb-3bab2578a8a5\") " pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.850381 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.850340 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3fdaddf-786f-413e-83cb-3bab2578a8a5-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5d6b7fc5bf-9lxwp\" (UID: \"e3fdaddf-786f-413e-83cb-3bab2578a8a5\") " pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.850433 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.850399 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e3fdaddf-786f-413e-83cb-3bab2578a8a5-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5d6b7fc5bf-9lxwp\" (UID: \"e3fdaddf-786f-413e-83cb-3bab2578a8a5\") " pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.852004 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.850534 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e3fdaddf-786f-413e-83cb-3bab2578a8a5-metrics-client-ca\") pod \"telemeter-client-5d6b7fc5bf-9lxwp\" (UID: \"e3fdaddf-786f-413e-83cb-3bab2578a8a5\") " pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.852004 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.851244 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e3fdaddf-786f-413e-83cb-3bab2578a8a5-metrics-client-ca\") pod \"telemeter-client-5d6b7fc5bf-9lxwp\" (UID: \"e3fdaddf-786f-413e-83cb-3bab2578a8a5\") " pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.852004 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.851461 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3fdaddf-786f-413e-83cb-3bab2578a8a5-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5d6b7fc5bf-9lxwp\" (UID: \"e3fdaddf-786f-413e-83cb-3bab2578a8a5\") " pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.852004 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.851970 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3fdaddf-786f-413e-83cb-3bab2578a8a5-serving-certs-ca-bundle\") pod \"telemeter-client-5d6b7fc5bf-9lxwp\" (UID: \"e3fdaddf-786f-413e-83cb-3bab2578a8a5\") " pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.853108 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.853087 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/e3fdaddf-786f-413e-83cb-3bab2578a8a5-telemeter-client-tls\") pod \"telemeter-client-5d6b7fc5bf-9lxwp\" (UID: \"e3fdaddf-786f-413e-83cb-3bab2578a8a5\") " pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.853204 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.853192 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/e3fdaddf-786f-413e-83cb-3bab2578a8a5-federate-client-tls\") pod \"telemeter-client-5d6b7fc5bf-9lxwp\" (UID: \"e3fdaddf-786f-413e-83cb-3bab2578a8a5\") " pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.853503 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.853483 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/e3fdaddf-786f-413e-83cb-3bab2578a8a5-secret-telemeter-client\") pod \"telemeter-client-5d6b7fc5bf-9lxwp\" (UID: \"e3fdaddf-786f-413e-83cb-3bab2578a8a5\") " pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.853618 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.853520 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e3fdaddf-786f-413e-83cb-3bab2578a8a5-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5d6b7fc5bf-9lxwp\" (UID: \"e3fdaddf-786f-413e-83cb-3bab2578a8a5\") " pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.858915 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.858897 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fnr9\" (UniqueName: \"kubernetes.io/projected/e3fdaddf-786f-413e-83cb-3bab2578a8a5-kube-api-access-4fnr9\") pod \"telemeter-client-5d6b7fc5bf-9lxwp\" (UID: \"e3fdaddf-786f-413e-83cb-3bab2578a8a5\") " pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"
Apr 24 21:28:33.951516 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.951493 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/747794d3-8a8d-4ce6-8607-23994becf49d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-qqfdb\" (UID: \"747794d3-8a8d-4ce6-8607-23994becf49d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qqfdb"
Apr 24 21:28:33.953809 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.953784 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/747794d3-8a8d-4ce6-8607-23994becf49d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-qqfdb\" (UID: 
\"747794d3-8a8d-4ce6-8607-23994becf49d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qqfdb" Apr 24 21:28:33.979735 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:33.978286 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp" Apr 24 21:28:34.007102 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.007062 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f455c8f5-q626q"] Apr 24 21:28:34.017953 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:28:34.017700 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5544c7a_a55a_4f52_807c_78716f3d81d6.slice/crio-fc568b962b32433c36c252463795dd39043c37ba6d2f586f830bf623f1c5527d WatchSource:0}: Error finding container fc568b962b32433c36c252463795dd39043c37ba6d2f586f830bf623f1c5527d: Status 404 returned error can't find the container with id fc568b962b32433c36c252463795dd39043c37ba6d2f586f830bf623f1c5527d Apr 24 21:28:34.143296 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.143267 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp"] Apr 24 21:28:34.147976 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:28:34.147947 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3fdaddf_786f_413e_83cb_3bab2578a8a5.slice/crio-0f8e6d9f6304309e0956a6fa5d895f9580816aa2d2299c11731b0ec8ea5d0610 WatchSource:0}: Error finding container 0f8e6d9f6304309e0956a6fa5d895f9580816aa2d2299c11731b0ec8ea5d0610: Status 404 returned error can't find the container with id 0f8e6d9f6304309e0956a6fa5d895f9580816aa2d2299c11731b0ec8ea5d0610 Apr 24 21:28:34.235142 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.235119 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qqfdb" Apr 24 21:28:34.408168 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.408144 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-qqfdb"] Apr 24 21:28:34.414917 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:28:34.414888 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod747794d3_8a8d_4ce6_8607_23994becf49d.slice/crio-dc9ab32e2e5e1a7c87e09cb6df0f5f71b646ad3f77d76486a48363a051478957 WatchSource:0}: Error finding container dc9ab32e2e5e1a7c87e09cb6df0f5f71b646ad3f77d76486a48363a051478957: Status 404 returned error can't find the container with id dc9ab32e2e5e1a7c87e09cb6df0f5f71b646ad3f77d76486a48363a051478957 Apr 24 21:28:34.731698 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.731659 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp" event={"ID":"e3fdaddf-786f-413e-83cb-3bab2578a8a5","Type":"ContainerStarted","Data":"0f8e6d9f6304309e0956a6fa5d895f9580816aa2d2299c11731b0ec8ea5d0610"} Apr 24 21:28:34.735424 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.735282 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8f2739b7-bd60-4645-80e1-15fbf600cd25","Type":"ContainerStarted","Data":"09435358a8139ea34b3c4cf7292d3da24e4608dc5b66442669639e7855c9a456"} Apr 24 21:28:34.735424 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.735314 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8f2739b7-bd60-4645-80e1-15fbf600cd25","Type":"ContainerStarted","Data":"9c1140d6783d29dce6b1230173198c89f8b8663128ad6b99dcb279b048968891"} Apr 24 21:28:34.735424 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.735329 2578 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8f2739b7-bd60-4645-80e1-15fbf600cd25","Type":"ContainerStarted","Data":"444cd522b42c58cd44d79b68931be43081e420e1352b314b4f13f3f885992537"} Apr 24 21:28:34.735424 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.735340 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8f2739b7-bd60-4645-80e1-15fbf600cd25","Type":"ContainerStarted","Data":"6b214e9e591c93b7cb6d03800b7b93b69ff3cd6582d62b97fbab28733c341468"} Apr 24 21:28:34.735424 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.735351 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8f2739b7-bd60-4645-80e1-15fbf600cd25","Type":"ContainerStarted","Data":"f832877faeed8c24bd161a06800c87d88bc376f040d02cb8e1d1cbed67c7a38c"} Apr 24 21:28:34.735424 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.735400 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8f2739b7-bd60-4645-80e1-15fbf600cd25","Type":"ContainerStarted","Data":"50e3b59731db041891cee36c3c2c43dab2f326fca263f14430257eae0d9b388a"} Apr 24 21:28:34.739190 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.739130 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" event={"ID":"8814da05-08f7-4703-81e5-78d626a5bcd6","Type":"ContainerStarted","Data":"490bc4f37deece2bf613cf72e1637ac5fbfa43f18642806f64696418ec003f5f"} Apr 24 21:28:34.739190 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.739160 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" event={"ID":"8814da05-08f7-4703-81e5-78d626a5bcd6","Type":"ContainerStarted","Data":"023c9c083e42e7547b77dea5c741ca371d3e1a02642b267698705f52992fab2f"} Apr 24 21:28:34.739190 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.739173 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" event={"ID":"8814da05-08f7-4703-81e5-78d626a5bcd6","Type":"ContainerStarted","Data":"dcd45760baae5cbf6b718217ed823d3bd1f54aa96d1321253d511ba35d4355b7"} Apr 24 21:28:34.739496 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.739461 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:34.740678 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.740636 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qqfdb" event={"ID":"747794d3-8a8d-4ce6-8607-23994becf49d","Type":"ContainerStarted","Data":"dc9ab32e2e5e1a7c87e09cb6df0f5f71b646ad3f77d76486a48363a051478957"} Apr 24 21:28:34.742058 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.742012 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f455c8f5-q626q" event={"ID":"d5544c7a-a55a-4f52-807c-78716f3d81d6","Type":"ContainerStarted","Data":"374a860b8bf4d2871c62b11d45464c4300f504a78ffbe1b92ecfd964aa74b5a3"} Apr 24 21:28:34.742058 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.742037 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f455c8f5-q626q" event={"ID":"d5544c7a-a55a-4f52-807c-78716f3d81d6","Type":"ContainerStarted","Data":"fc568b962b32433c36c252463795dd39043c37ba6d2f586f830bf623f1c5527d"} Apr 24 21:28:34.764220 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.763611 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.60789025 podStartE2EDuration="5.763594581s" podCreationTimestamp="2026-04-24 21:28:29 +0000 UTC" firstStartedPulling="2026-04-24 21:28:30.774399856 +0000 UTC m=+56.950134509" lastFinishedPulling="2026-04-24 21:28:33.930104189 +0000 UTC m=+60.105838840" 
observedRunningTime="2026-04-24 21:28:34.761639897 +0000 UTC m=+60.937374583" watchObservedRunningTime="2026-04-24 21:28:34.763594581 +0000 UTC m=+60.939329253" Apr 24 21:28:34.780197 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.780158 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6f455c8f5-q626q" podStartSLOduration=1.7801461459999999 podStartE2EDuration="1.780146146s" podCreationTimestamp="2026-04-24 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:34.779148315 +0000 UTC m=+60.954882988" watchObservedRunningTime="2026-04-24 21:28:34.780146146 +0000 UTC m=+60.955880818" Apr 24 21:28:34.808470 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.808414 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" podStartSLOduration=1.99363725 podStartE2EDuration="4.808401306s" podCreationTimestamp="2026-04-24 21:28:30 +0000 UTC" firstStartedPulling="2026-04-24 21:28:31.068869841 +0000 UTC m=+57.244604497" lastFinishedPulling="2026-04-24 21:28:33.883633886 +0000 UTC m=+60.059368553" observedRunningTime="2026-04-24 21:28:34.80704199 +0000 UTC m=+60.982776664" watchObservedRunningTime="2026-04-24 21:28:34.808401306 +0000 UTC m=+60.984135981" Apr 24 21:28:34.835268 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.835242 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:28:34.840559 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.840495 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:34.842918 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.842884 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 21:28:34.843151 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.843127 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 21:28:34.843411 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.843379 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 21:28:34.843503 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.843459 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 21:28:34.843635 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.843615 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 21:28:34.843826 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.843803 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 21:28:34.843963 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.843945 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 21:28:34.844068 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.844027 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-2jmnntlggdq05\"" Apr 24 21:28:34.844156 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.844142 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 21:28:34.844362 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.844218 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-zmdxt\"" Apr 24 21:28:34.844362 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.844327 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 21:28:34.844362 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.844352 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 21:28:34.845857 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.844747 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 21:28:34.864318 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.864291 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:28:34.865358 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.865187 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 21:28:34.964499 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.964461 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:34.964666 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.964528 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:34.964666 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.964587 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-config\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:34.964666 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.964613 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:34.964666 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.964637 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:34.964666 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.964661 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:34.964905 ip-10-0-132-124 
kubenswrapper[2578]: I0424 21:28:34.964682 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:34.964905 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.964709 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:34.964905 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.964733 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-config-out\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:34.964905 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.964772 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:34.964905 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.964819 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-secret-prometheus-k8s-thanos-sidecar-tls\") 
pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:34.964905 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.964862 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-web-config\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:34.964905 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.964887 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:34.965167 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.964919 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:34.965167 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.964948 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:34.965167 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.964977 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:34.965167 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.965004 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:34.965167 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:34.965060 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85t2v\" (UniqueName: \"kubernetes.io/projected/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-kube-api-access-85t2v\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.066059 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.065975 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-config\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.066059 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.066023 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.066059 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.066053 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.066259 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.066077 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.066259 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.066092 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.066259 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.066114 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.066259 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.066133 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-config-out\") pod \"prometheus-k8s-0\" (UID: 
\"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.066259 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.066164 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.066259 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.066186 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.066259 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.066225 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-web-config\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.066259 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.066250 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.066582 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.066282 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.066582 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.066310 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.066582 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.066338 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.066582 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.066366 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.066582 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.066390 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85t2v\" (UniqueName: \"kubernetes.io/projected/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-kube-api-access-85t2v\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.066582 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.066420 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.066582 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.066476 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.067907 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.067878 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.070069 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.070041 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.070179 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.070156 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.072323 ip-10-0-132-124 kubenswrapper[2578]: 
I0424 21:28:35.070916 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.072323 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.071365 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.072323 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.071432 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.072323 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.071811 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.072323 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.072052 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.072641 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.072528 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.072702 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.072686 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-config\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.073497 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.073324 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.073497 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.073456 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.073738 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.073717 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: 
\"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.073804 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.073760 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.075613 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.075571 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.075613 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.075602 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-config-out\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.076129 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.076092 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-web-config\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.076915 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.076858 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85t2v\" (UniqueName: \"kubernetes.io/projected/5be176d2-7d94-46ec-82df-9f25aaa1ffd4-kube-api-access-85t2v\") pod \"prometheus-k8s-0\" (UID: \"5be176d2-7d94-46ec-82df-9f25aaa1ffd4\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.167793 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.167761 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:35.666777 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.666748 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rrzpv" Apr 24 21:28:35.748577 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.746771 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh" event={"ID":"dabc3649-986c-417d-8a62-0996e4d2bc1c","Type":"ContainerStarted","Data":"6acdaa54c6ff38d66ddf2bdcfa7b7632e00e89b4d7821d9bde0d8b7e6ff570c0"} Apr 24 21:28:35.764419 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:35.764378 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh" podStartSLOduration=1.8347336300000001 podStartE2EDuration="3.764361385s" podCreationTimestamp="2026-04-24 21:28:32 +0000 UTC" firstStartedPulling="2026-04-24 21:28:33.216849861 +0000 UTC m=+59.392584527" lastFinishedPulling="2026-04-24 21:28:35.146477617 +0000 UTC m=+61.322212282" observedRunningTime="2026-04-24 21:28:35.763457083 +0000 UTC m=+61.939191760" watchObservedRunningTime="2026-04-24 21:28:35.764361385 +0000 UTC m=+61.940096059" Apr 24 21:28:36.182489 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:36.182466 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:28:36.183474 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:28:36.183452 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5be176d2_7d94_46ec_82df_9f25aaa1ffd4.slice/crio-a31ec65d4d4efc531f0c3d570cda092ef53222c1adbcaddfcca2b590a5fb2afd WatchSource:0}: Error finding container 
a31ec65d4d4efc531f0c3d570cda092ef53222c1adbcaddfcca2b590a5fb2afd: Status 404 returned error can't find the container with id a31ec65d4d4efc531f0c3d570cda092ef53222c1adbcaddfcca2b590a5fb2afd Apr 24 21:28:36.751653 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:36.751608 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp" event={"ID":"e3fdaddf-786f-413e-83cb-3bab2578a8a5","Type":"ContainerStarted","Data":"06117f3218d2a5c974c4c5297d4a6c5ceef3cb6f8abc23fe0e68a98d4be36809"} Apr 24 21:28:36.752051 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:36.751660 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp" event={"ID":"e3fdaddf-786f-413e-83cb-3bab2578a8a5","Type":"ContainerStarted","Data":"97feadf2544fb25d229385a657bc28f04bd1ffda663613643e1a0b9c47235158"} Apr 24 21:28:36.752051 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:36.751675 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp" event={"ID":"e3fdaddf-786f-413e-83cb-3bab2578a8a5","Type":"ContainerStarted","Data":"61da395f330a87573171f10bc945da6ba09180d56d46ce4d61c713a88b9ecd7d"} Apr 24 21:28:36.752991 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:36.752968 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qqfdb" event={"ID":"747794d3-8a8d-4ce6-8607-23994becf49d","Type":"ContainerStarted","Data":"d77903c4186b19f55c063ccc7e9aeb55379de92371ed4ef130bb2fc230e8bd72"} Apr 24 21:28:36.753196 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:36.753182 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qqfdb" Apr 24 21:28:36.754435 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:36.754408 2578 generic.go:358] "Generic (PLEG): container finished" podID="5be176d2-7d94-46ec-82df-9f25aaa1ffd4" 
containerID="9cde245d06be45a37a4802d5ef2da40e015a240698f7aa8c5edeab587c4f2c46" exitCode=0 Apr 24 21:28:36.754559 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:36.754503 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5be176d2-7d94-46ec-82df-9f25aaa1ffd4","Type":"ContainerDied","Data":"9cde245d06be45a37a4802d5ef2da40e015a240698f7aa8c5edeab587c4f2c46"} Apr 24 21:28:36.754559 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:36.754536 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5be176d2-7d94-46ec-82df-9f25aaa1ffd4","Type":"ContainerStarted","Data":"a31ec65d4d4efc531f0c3d570cda092ef53222c1adbcaddfcca2b590a5fb2afd"} Apr 24 21:28:36.758275 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:36.758259 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qqfdb" Apr 24 21:28:36.776046 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:36.775997 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5d6b7fc5bf-9lxwp" podStartSLOduration=1.876188606 podStartE2EDuration="3.77598337s" podCreationTimestamp="2026-04-24 21:28:33 +0000 UTC" firstStartedPulling="2026-04-24 21:28:34.150449843 +0000 UTC m=+60.326184495" lastFinishedPulling="2026-04-24 21:28:36.050244608 +0000 UTC m=+62.225979259" observedRunningTime="2026-04-24 21:28:36.774440839 +0000 UTC m=+62.950175512" watchObservedRunningTime="2026-04-24 21:28:36.77598337 +0000 UTC m=+62.951718043" Apr 24 21:28:36.821190 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:36.821147 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qqfdb" podStartSLOduration=2.187135167 podStartE2EDuration="3.821137654s" podCreationTimestamp="2026-04-24 21:28:33 +0000 UTC" firstStartedPulling="2026-04-24 
21:28:34.417357061 +0000 UTC m=+60.593091715" lastFinishedPulling="2026-04-24 21:28:36.05135954 +0000 UTC m=+62.227094202" observedRunningTime="2026-04-24 21:28:36.82061031 +0000 UTC m=+62.996344982" watchObservedRunningTime="2026-04-24 21:28:36.821137654 +0000 UTC m=+62.996872326" Apr 24 21:28:39.767811 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:39.767779 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5be176d2-7d94-46ec-82df-9f25aaa1ffd4","Type":"ContainerStarted","Data":"80f6a455159bc19136e9962f147f55377a1725739b1192dbc904e4ce4d0716ee"} Apr 24 21:28:39.767811 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:39.767813 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5be176d2-7d94-46ec-82df-9f25aaa1ffd4","Type":"ContainerStarted","Data":"54699c2019e5a474f1ad7ad9bac9fb282bb2cbc7db9d97ee69b4102e0df5aab6"} Apr 24 21:28:39.768223 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:39.767822 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5be176d2-7d94-46ec-82df-9f25aaa1ffd4","Type":"ContainerStarted","Data":"4267c604b485740607302d99773e273d394b1e61ff094990382422b8bdaa4aed"} Apr 24 21:28:39.768223 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:39.767830 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5be176d2-7d94-46ec-82df-9f25aaa1ffd4","Type":"ContainerStarted","Data":"aa3a0b18c2398bb44f925a3df198eeab8ef8a517751bbb2169aa39eb7aecc0e9"} Apr 24 21:28:39.768223 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:39.767839 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5be176d2-7d94-46ec-82df-9f25aaa1ffd4","Type":"ContainerStarted","Data":"70ae55231bc8734fde3f999994ef31cf23ff4f59eff5988c7bd196c115e0a271"} Apr 24 21:28:39.768223 ip-10-0-132-124 
kubenswrapper[2578]: I0424 21:28:39.767847 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5be176d2-7d94-46ec-82df-9f25aaa1ffd4","Type":"ContainerStarted","Data":"359abbc1a928532ba33849a925919d70c56fee690088aa70c284cc64dc772762"} Apr 24 21:28:39.800757 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:39.800715 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.507975041 podStartE2EDuration="5.800701473s" podCreationTimestamp="2026-04-24 21:28:34 +0000 UTC" firstStartedPulling="2026-04-24 21:28:36.755740295 +0000 UTC m=+62.931474946" lastFinishedPulling="2026-04-24 21:28:39.048466727 +0000 UTC m=+65.224201378" observedRunningTime="2026-04-24 21:28:39.797757284 +0000 UTC m=+65.973491956" watchObservedRunningTime="2026-04-24 21:28:39.800701473 +0000 UTC m=+65.976436146" Apr 24 21:28:40.118924 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:40.118859 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-metrics-certs\") pod \"network-metrics-daemon-6wxzd\" (UID: \"b536a581-6c7c-4e7e-9fb3-6223e4ab90f0\") " pod="openshift-multus/network-metrics-daemon-6wxzd" Apr 24 21:28:40.121377 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:40.121363 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:28:40.131643 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:40.131625 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b536a581-6c7c-4e7e-9fb3-6223e4ab90f0-metrics-certs\") pod \"network-metrics-daemon-6wxzd\" (UID: \"b536a581-6c7c-4e7e-9fb3-6223e4ab90f0\") " pod="openshift-multus/network-metrics-daemon-6wxzd" Apr 24 21:28:40.142027 ip-10-0-132-124 
kubenswrapper[2578]: I0424 21:28:40.142009 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-cbm9w\"" Apr 24 21:28:40.149919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:40.149906 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wxzd" Apr 24 21:28:40.168114 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:40.168094 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:40.219878 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:40.219844 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vbt4\" (UniqueName: \"kubernetes.io/projected/c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e-kube-api-access-2vbt4\") pod \"network-check-target-rkwhf\" (UID: \"c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e\") " pod="openshift-network-diagnostics/network-check-target-rkwhf" Apr 24 21:28:40.226764 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:40.226745 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:28:40.233291 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:40.233091 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:28:40.243829 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:40.243810 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vbt4\" (UniqueName: \"kubernetes.io/projected/c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e-kube-api-access-2vbt4\") pod \"network-check-target-rkwhf\" (UID: \"c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e\") " pod="openshift-network-diagnostics/network-check-target-rkwhf" Apr 24 21:28:40.277238 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:40.277215 2578 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6wxzd"] Apr 24 21:28:40.291236 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:28:40.291201 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb536a581_6c7c_4e7e_9fb3_6223e4ab90f0.slice/crio-e0ce56e32d85e8fbabc5ade4a0addbcaad5eb2a97a962d3e0e194faaa7f9cf2e WatchSource:0}: Error finding container e0ce56e32d85e8fbabc5ade4a0addbcaad5eb2a97a962d3e0e194faaa7f9cf2e: Status 404 returned error can't find the container with id e0ce56e32d85e8fbabc5ade4a0addbcaad5eb2a97a962d3e0e194faaa7f9cf2e Apr 24 21:28:40.435511 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:40.435445 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kgq96\"" Apr 24 21:28:40.443429 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:40.443409 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkwhf" Apr 24 21:28:40.555637 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:40.555616 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rkwhf"] Apr 24 21:28:40.557982 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:28:40.557950 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2ebfd07_c3ce_4ce2_b482_4596f9db1c1e.slice/crio-3e180912abaf818cd8e6226b9311a350f8ea7c8fa6d656c8cba8d3e9ac6fbe36 WatchSource:0}: Error finding container 3e180912abaf818cd8e6226b9311a350f8ea7c8fa6d656c8cba8d3e9ac6fbe36: Status 404 returned error can't find the container with id 3e180912abaf818cd8e6226b9311a350f8ea7c8fa6d656c8cba8d3e9ac6fbe36 Apr 24 21:28:40.753958 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:40.753934 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6d7ffb5569-nnk78" Apr 24 21:28:40.773232 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:40.773198 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rkwhf" event={"ID":"c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e","Type":"ContainerStarted","Data":"3e180912abaf818cd8e6226b9311a350f8ea7c8fa6d656c8cba8d3e9ac6fbe36"} Apr 24 21:28:40.774437 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:40.774413 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6wxzd" event={"ID":"b536a581-6c7c-4e7e-9fb3-6223e4ab90f0","Type":"ContainerStarted","Data":"e0ce56e32d85e8fbabc5ade4a0addbcaad5eb2a97a962d3e0e194faaa7f9cf2e"} Apr 24 21:28:41.294894 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.294854 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f455c8f5-q626q"] Apr 24 21:28:41.341503 ip-10-0-132-124 
kubenswrapper[2578]: I0424 21:28:41.341472 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-77479bc4bc-jjmmr"] Apr 24 21:28:41.367981 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.367950 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77479bc4bc-jjmmr"] Apr 24 21:28:41.368129 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.368072 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:41.432757 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.432722 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f07cdd42-fef3-4026-999a-8f3d50c31305-console-config\") pod \"console-77479bc4bc-jjmmr\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:41.432944 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.432771 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f07cdd42-fef3-4026-999a-8f3d50c31305-console-oauth-config\") pod \"console-77479bc4bc-jjmmr\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:41.432944 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.432818 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqzc5\" (UniqueName: \"kubernetes.io/projected/f07cdd42-fef3-4026-999a-8f3d50c31305-kube-api-access-bqzc5\") pod \"console-77479bc4bc-jjmmr\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:41.432944 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.432907 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f07cdd42-fef3-4026-999a-8f3d50c31305-trusted-ca-bundle\") pod \"console-77479bc4bc-jjmmr\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:41.433129 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.432949 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f07cdd42-fef3-4026-999a-8f3d50c31305-oauth-serving-cert\") pod \"console-77479bc4bc-jjmmr\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:41.433129 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.432977 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f07cdd42-fef3-4026-999a-8f3d50c31305-service-ca\") pod \"console-77479bc4bc-jjmmr\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:41.433129 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.433035 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f07cdd42-fef3-4026-999a-8f3d50c31305-console-serving-cert\") pod \"console-77479bc4bc-jjmmr\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:41.534182 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.534144 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f07cdd42-fef3-4026-999a-8f3d50c31305-trusted-ca-bundle\") pod \"console-77479bc4bc-jjmmr\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " 
pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:41.534182 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.534192 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f07cdd42-fef3-4026-999a-8f3d50c31305-oauth-serving-cert\") pod \"console-77479bc4bc-jjmmr\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:41.534396 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.534213 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f07cdd42-fef3-4026-999a-8f3d50c31305-service-ca\") pod \"console-77479bc4bc-jjmmr\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:41.534396 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.534234 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f07cdd42-fef3-4026-999a-8f3d50c31305-console-serving-cert\") pod \"console-77479bc4bc-jjmmr\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:41.534396 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.534284 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f07cdd42-fef3-4026-999a-8f3d50c31305-console-config\") pod \"console-77479bc4bc-jjmmr\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:41.534396 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.534346 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f07cdd42-fef3-4026-999a-8f3d50c31305-console-oauth-config\") pod 
\"console-77479bc4bc-jjmmr\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:41.534396 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.534380 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bqzc5\" (UniqueName: \"kubernetes.io/projected/f07cdd42-fef3-4026-999a-8f3d50c31305-kube-api-access-bqzc5\") pod \"console-77479bc4bc-jjmmr\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:41.535202 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.535173 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f07cdd42-fef3-4026-999a-8f3d50c31305-oauth-serving-cert\") pod \"console-77479bc4bc-jjmmr\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:41.535668 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.535651 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f07cdd42-fef3-4026-999a-8f3d50c31305-console-config\") pod \"console-77479bc4bc-jjmmr\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:41.538773 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.538740 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f07cdd42-fef3-4026-999a-8f3d50c31305-trusted-ca-bundle\") pod \"console-77479bc4bc-jjmmr\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:41.538909 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.538884 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/f07cdd42-fef3-4026-999a-8f3d50c31305-service-ca\") pod \"console-77479bc4bc-jjmmr\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:41.539134 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.539112 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f07cdd42-fef3-4026-999a-8f3d50c31305-console-oauth-config\") pod \"console-77479bc4bc-jjmmr\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:41.545110 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.545048 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f07cdd42-fef3-4026-999a-8f3d50c31305-console-serving-cert\") pod \"console-77479bc4bc-jjmmr\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:41.548179 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.548155 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqzc5\" (UniqueName: \"kubernetes.io/projected/f07cdd42-fef3-4026-999a-8f3d50c31305-kube-api-access-bqzc5\") pod \"console-77479bc4bc-jjmmr\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:41.681345 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:41.681306 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:42.044160 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:42.044135 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77479bc4bc-jjmmr"] Apr 24 21:28:42.047604 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:28:42.047576 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf07cdd42_fef3_4026_999a_8f3d50c31305.slice/crio-c251caed76100adf21a4b22911179e7fd48211d4576ed57c05c44f3717368e6c WatchSource:0}: Error finding container c251caed76100adf21a4b22911179e7fd48211d4576ed57c05c44f3717368e6c: Status 404 returned error can't find the container with id c251caed76100adf21a4b22911179e7fd48211d4576ed57c05c44f3717368e6c Apr 24 21:28:42.783592 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:42.783554 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6wxzd" event={"ID":"b536a581-6c7c-4e7e-9fb3-6223e4ab90f0","Type":"ContainerStarted","Data":"c2bf55d4b523b8d17816b4d025a451d44bc2363995b2106ba06a515912cdfbc4"} Apr 24 21:28:42.783783 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:42.783599 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6wxzd" event={"ID":"b536a581-6c7c-4e7e-9fb3-6223e4ab90f0","Type":"ContainerStarted","Data":"95d628dfab24a5657af74a201e0f646e3ba53c122dd74b6094c31e61ccfbc150"} Apr 24 21:28:42.785163 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:42.785133 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77479bc4bc-jjmmr" event={"ID":"f07cdd42-fef3-4026-999a-8f3d50c31305","Type":"ContainerStarted","Data":"1c389d27dc8cbe47abbb7f52a0db3b6d5ff4d33142f3634377e45e007b3e8dff"} Apr 24 21:28:42.785291 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:42.785169 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-77479bc4bc-jjmmr" event={"ID":"f07cdd42-fef3-4026-999a-8f3d50c31305","Type":"ContainerStarted","Data":"c251caed76100adf21a4b22911179e7fd48211d4576ed57c05c44f3717368e6c"} Apr 24 21:28:42.805001 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:42.804910 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6wxzd" podStartSLOduration=67.201753974 podStartE2EDuration="1m8.804892432s" podCreationTimestamp="2026-04-24 21:27:34 +0000 UTC" firstStartedPulling="2026-04-24 21:28:40.29306276 +0000 UTC m=+66.468797416" lastFinishedPulling="2026-04-24 21:28:41.896201209 +0000 UTC m=+68.071935874" observedRunningTime="2026-04-24 21:28:42.803740155 +0000 UTC m=+68.979474830" watchObservedRunningTime="2026-04-24 21:28:42.804892432 +0000 UTC m=+68.980627108" Apr 24 21:28:42.824746 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:42.824698 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-77479bc4bc-jjmmr" podStartSLOduration=1.824683303 podStartE2EDuration="1.824683303s" podCreationTimestamp="2026-04-24 21:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:42.822785569 +0000 UTC m=+68.998520255" watchObservedRunningTime="2026-04-24 21:28:42.824683303 +0000 UTC m=+69.000417975" Apr 24 21:28:43.776026 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:43.775987 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6f455c8f5-q626q" Apr 24 21:28:43.792530 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:43.792493 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rkwhf" event={"ID":"c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e","Type":"ContainerStarted","Data":"d83f632be2412cbcc82d712453a3b2aca020c3593fffdd64c6b806d2ac70f504"} Apr 
24 21:28:43.810824 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:43.810782 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-rkwhf" podStartSLOduration=66.927723485 podStartE2EDuration="1m9.810769395s" podCreationTimestamp="2026-04-24 21:27:34 +0000 UTC" firstStartedPulling="2026-04-24 21:28:40.560086974 +0000 UTC m=+66.735821628" lastFinishedPulling="2026-04-24 21:28:43.443132887 +0000 UTC m=+69.618867538" observedRunningTime="2026-04-24 21:28:43.809755826 +0000 UTC m=+69.985490500" watchObservedRunningTime="2026-04-24 21:28:43.810769395 +0000 UTC m=+69.986504067" Apr 24 21:28:44.795775 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:44.795735 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-rkwhf" Apr 24 21:28:51.681699 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:51.681559 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:51.681699 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:51.681611 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:51.686400 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:51.686380 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:51.821567 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:51.821527 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:28:51.878583 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:51.878536 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c956f48c6-vl47q"] Apr 24 21:28:53.077103 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:53.077072 2578 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh" Apr 24 21:28:53.077103 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:53.077110 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh" Apr 24 21:28:59.904778 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:28:59.904753 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-48v2j_e329951c-d495-4bf9-8751-384a26c4a2ce/serve-healthcheck-canary/0.log" Apr 24 21:29:06.319001 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.318960 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6f455c8f5-q626q" podUID="d5544c7a-a55a-4f52-807c-78716f3d81d6" containerName="console" containerID="cri-o://374a860b8bf4d2871c62b11d45464c4300f504a78ffbe1b92ecfd964aa74b5a3" gracePeriod=15 Apr 24 21:29:06.562602 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.562581 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f455c8f5-q626q_d5544c7a-a55a-4f52-807c-78716f3d81d6/console/0.log" Apr 24 21:29:06.562731 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.562672 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f455c8f5-q626q" Apr 24 21:29:06.637419 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.637338 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5544c7a-a55a-4f52-807c-78716f3d81d6-trusted-ca-bundle\") pod \"d5544c7a-a55a-4f52-807c-78716f3d81d6\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " Apr 24 21:29:06.637419 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.637389 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d5544c7a-a55a-4f52-807c-78716f3d81d6-oauth-serving-cert\") pod \"d5544c7a-a55a-4f52-807c-78716f3d81d6\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " Apr 24 21:29:06.637609 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.637422 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5544c7a-a55a-4f52-807c-78716f3d81d6-console-config\") pod \"d5544c7a-a55a-4f52-807c-78716f3d81d6\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " Apr 24 21:29:06.637609 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.637467 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5544c7a-a55a-4f52-807c-78716f3d81d6-service-ca\") pod \"d5544c7a-a55a-4f52-807c-78716f3d81d6\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " Apr 24 21:29:06.637609 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.637511 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5544c7a-a55a-4f52-807c-78716f3d81d6-console-serving-cert\") pod \"d5544c7a-a55a-4f52-807c-78716f3d81d6\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " Apr 24 21:29:06.637609 ip-10-0-132-124 
kubenswrapper[2578]: I0424 21:29:06.637535 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5544c7a-a55a-4f52-807c-78716f3d81d6-console-oauth-config\") pod \"d5544c7a-a55a-4f52-807c-78716f3d81d6\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " Apr 24 21:29:06.637609 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.637590 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5kqz\" (UniqueName: \"kubernetes.io/projected/d5544c7a-a55a-4f52-807c-78716f3d81d6-kube-api-access-c5kqz\") pod \"d5544c7a-a55a-4f52-807c-78716f3d81d6\" (UID: \"d5544c7a-a55a-4f52-807c-78716f3d81d6\") " Apr 24 21:29:06.637848 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.637771 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5544c7a-a55a-4f52-807c-78716f3d81d6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d5544c7a-a55a-4f52-807c-78716f3d81d6" (UID: "d5544c7a-a55a-4f52-807c-78716f3d81d6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:06.637905 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.637867 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5544c7a-a55a-4f52-807c-78716f3d81d6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d5544c7a-a55a-4f52-807c-78716f3d81d6" (UID: "d5544c7a-a55a-4f52-807c-78716f3d81d6"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:06.637956 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.637921 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5544c7a-a55a-4f52-807c-78716f3d81d6-service-ca" (OuterVolumeSpecName: "service-ca") pod "d5544c7a-a55a-4f52-807c-78716f3d81d6" (UID: "d5544c7a-a55a-4f52-807c-78716f3d81d6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:06.637956 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.637933 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5544c7a-a55a-4f52-807c-78716f3d81d6-console-config" (OuterVolumeSpecName: "console-config") pod "d5544c7a-a55a-4f52-807c-78716f3d81d6" (UID: "d5544c7a-a55a-4f52-807c-78716f3d81d6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:06.637956 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.637949 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5544c7a-a55a-4f52-807c-78716f3d81d6-trusted-ca-bundle\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:29:06.638063 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.637969 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d5544c7a-a55a-4f52-807c-78716f3d81d6-oauth-serving-cert\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:29:06.639894 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.639867 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5544c7a-a55a-4f52-807c-78716f3d81d6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d5544c7a-a55a-4f52-807c-78716f3d81d6" (UID: "d5544c7a-a55a-4f52-807c-78716f3d81d6"). 
InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:06.639894 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.639880 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5544c7a-a55a-4f52-807c-78716f3d81d6-kube-api-access-c5kqz" (OuterVolumeSpecName: "kube-api-access-c5kqz") pod "d5544c7a-a55a-4f52-807c-78716f3d81d6" (UID: "d5544c7a-a55a-4f52-807c-78716f3d81d6"). InnerVolumeSpecName "kube-api-access-c5kqz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:29:06.640005 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.639914 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5544c7a-a55a-4f52-807c-78716f3d81d6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d5544c7a-a55a-4f52-807c-78716f3d81d6" (UID: "d5544c7a-a55a-4f52-807c-78716f3d81d6"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:06.739228 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.739201 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5544c7a-a55a-4f52-807c-78716f3d81d6-console-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:29:06.739228 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.739225 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5544c7a-a55a-4f52-807c-78716f3d81d6-service-ca\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:29:06.739363 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.739238 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5544c7a-a55a-4f52-807c-78716f3d81d6-console-serving-cert\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 
21:29:06.739363 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.739252 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5544c7a-a55a-4f52-807c-78716f3d81d6-console-oauth-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:29:06.739363 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.739264 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c5kqz\" (UniqueName: \"kubernetes.io/projected/d5544c7a-a55a-4f52-807c-78716f3d81d6-kube-api-access-c5kqz\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:29:06.864211 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.864188 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f455c8f5-q626q_d5544c7a-a55a-4f52-807c-78716f3d81d6/console/0.log" Apr 24 21:29:06.864321 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.864222 2578 generic.go:358] "Generic (PLEG): container finished" podID="d5544c7a-a55a-4f52-807c-78716f3d81d6" containerID="374a860b8bf4d2871c62b11d45464c4300f504a78ffbe1b92ecfd964aa74b5a3" exitCode=2 Apr 24 21:29:06.864321 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.864276 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f455c8f5-q626q" event={"ID":"d5544c7a-a55a-4f52-807c-78716f3d81d6","Type":"ContainerDied","Data":"374a860b8bf4d2871c62b11d45464c4300f504a78ffbe1b92ecfd964aa74b5a3"} Apr 24 21:29:06.864321 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.864296 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f455c8f5-q626q" Apr 24 21:29:06.864321 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.864312 2578 scope.go:117] "RemoveContainer" containerID="374a860b8bf4d2871c62b11d45464c4300f504a78ffbe1b92ecfd964aa74b5a3" Apr 24 21:29:06.864510 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.864303 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f455c8f5-q626q" event={"ID":"d5544c7a-a55a-4f52-807c-78716f3d81d6","Type":"ContainerDied","Data":"fc568b962b32433c36c252463795dd39043c37ba6d2f586f830bf623f1c5527d"} Apr 24 21:29:06.872733 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.872717 2578 scope.go:117] "RemoveContainer" containerID="374a860b8bf4d2871c62b11d45464c4300f504a78ffbe1b92ecfd964aa74b5a3" Apr 24 21:29:06.872998 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:29:06.872978 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"374a860b8bf4d2871c62b11d45464c4300f504a78ffbe1b92ecfd964aa74b5a3\": container with ID starting with 374a860b8bf4d2871c62b11d45464c4300f504a78ffbe1b92ecfd964aa74b5a3 not found: ID does not exist" containerID="374a860b8bf4d2871c62b11d45464c4300f504a78ffbe1b92ecfd964aa74b5a3" Apr 24 21:29:06.873062 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.873006 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"374a860b8bf4d2871c62b11d45464c4300f504a78ffbe1b92ecfd964aa74b5a3"} err="failed to get container status \"374a860b8bf4d2871c62b11d45464c4300f504a78ffbe1b92ecfd964aa74b5a3\": rpc error: code = NotFound desc = could not find container \"374a860b8bf4d2871c62b11d45464c4300f504a78ffbe1b92ecfd964aa74b5a3\": container with ID starting with 374a860b8bf4d2871c62b11d45464c4300f504a78ffbe1b92ecfd964aa74b5a3 not found: ID does not exist" Apr 24 21:29:06.903639 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.903587 2578 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f455c8f5-q626q"] Apr 24 21:29:06.914392 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:06.914368 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6f455c8f5-q626q"] Apr 24 21:29:08.421604 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:08.421536 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5544c7a-a55a-4f52-807c-78716f3d81d6" path="/var/lib/kubelet/pods/d5544c7a-a55a-4f52-807c-78716f3d81d6/volumes" Apr 24 21:29:13.082574 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:13.082526 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh" Apr 24 21:29:13.088591 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:13.088524 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-fb766d8dc-4j2hh" Apr 24 21:29:15.800440 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:15.800412 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-rkwhf" Apr 24 21:29:16.897804 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:16.897770 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6c956f48c6-vl47q" podUID="aa9bbfec-3195-4cf7-a22e-0120bed1f2a4" containerName="console" containerID="cri-o://17f87aea5d42c4635970ea71a8fa8b07f2cbbaa17289e3e0787bfb856b6a5ea0" gracePeriod=15 Apr 24 21:29:17.198830 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.198810 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c956f48c6-vl47q_aa9bbfec-3195-4cf7-a22e-0120bed1f2a4/console/0.log" Apr 24 21:29:17.198955 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.198874 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c956f48c6-vl47q" Apr 24 21:29:17.234277 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.234253 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-console-config\") pod \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\" (UID: \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\") " Apr 24 21:29:17.234401 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.234336 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-522kq\" (UniqueName: \"kubernetes.io/projected/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-kube-api-access-522kq\") pod \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\" (UID: \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\") " Apr 24 21:29:17.234401 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.234371 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-console-oauth-config\") pod \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\" (UID: \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\") " Apr 24 21:29:17.234401 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.234386 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-console-serving-cert\") pod \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\" (UID: \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\") " Apr 24 21:29:17.234575 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.234412 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-service-ca\") pod \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\" (UID: \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\") " Apr 24 21:29:17.234575 
ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.234447 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-oauth-serving-cert\") pod \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\" (UID: \"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4\") " Apr 24 21:29:17.234701 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.234666 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-console-config" (OuterVolumeSpecName: "console-config") pod "aa9bbfec-3195-4cf7-a22e-0120bed1f2a4" (UID: "aa9bbfec-3195-4cf7-a22e-0120bed1f2a4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:17.234951 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.234925 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "aa9bbfec-3195-4cf7-a22e-0120bed1f2a4" (UID: "aa9bbfec-3195-4cf7-a22e-0120bed1f2a4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:17.235065 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.235042 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-service-ca" (OuterVolumeSpecName: "service-ca") pod "aa9bbfec-3195-4cf7-a22e-0120bed1f2a4" (UID: "aa9bbfec-3195-4cf7-a22e-0120bed1f2a4"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:17.236998 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.236968 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "aa9bbfec-3195-4cf7-a22e-0120bed1f2a4" (UID: "aa9bbfec-3195-4cf7-a22e-0120bed1f2a4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:17.237104 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.237074 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "aa9bbfec-3195-4cf7-a22e-0120bed1f2a4" (UID: "aa9bbfec-3195-4cf7-a22e-0120bed1f2a4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:17.237666 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.237626 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-kube-api-access-522kq" (OuterVolumeSpecName: "kube-api-access-522kq") pod "aa9bbfec-3195-4cf7-a22e-0120bed1f2a4" (UID: "aa9bbfec-3195-4cf7-a22e-0120bed1f2a4"). InnerVolumeSpecName "kube-api-access-522kq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:29:17.335359 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.335332 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-console-oauth-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:29:17.335359 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.335353 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-console-serving-cert\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:29:17.335359 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.335363 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-service-ca\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:29:17.335599 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.335372 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-oauth-serving-cert\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:29:17.335599 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.335380 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-console-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:29:17.335599 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.335388 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-522kq\" (UniqueName: \"kubernetes.io/projected/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4-kube-api-access-522kq\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:29:17.903581 ip-10-0-132-124 
kubenswrapper[2578]: I0424 21:29:17.903558 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c956f48c6-vl47q_aa9bbfec-3195-4cf7-a22e-0120bed1f2a4/console/0.log" Apr 24 21:29:17.903985 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.903595 2578 generic.go:358] "Generic (PLEG): container finished" podID="aa9bbfec-3195-4cf7-a22e-0120bed1f2a4" containerID="17f87aea5d42c4635970ea71a8fa8b07f2cbbaa17289e3e0787bfb856b6a5ea0" exitCode=2 Apr 24 21:29:17.903985 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.903624 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c956f48c6-vl47q" event={"ID":"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4","Type":"ContainerDied","Data":"17f87aea5d42c4635970ea71a8fa8b07f2cbbaa17289e3e0787bfb856b6a5ea0"} Apr 24 21:29:17.903985 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.903656 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c956f48c6-vl47q" Apr 24 21:29:17.903985 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.903664 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c956f48c6-vl47q" event={"ID":"aa9bbfec-3195-4cf7-a22e-0120bed1f2a4","Type":"ContainerDied","Data":"8713326e687169723415faacffd010a5db9375eca5d2e02c9b0bd211dee0d798"} Apr 24 21:29:17.903985 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.903680 2578 scope.go:117] "RemoveContainer" containerID="17f87aea5d42c4635970ea71a8fa8b07f2cbbaa17289e3e0787bfb856b6a5ea0" Apr 24 21:29:17.911870 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.911845 2578 scope.go:117] "RemoveContainer" containerID="17f87aea5d42c4635970ea71a8fa8b07f2cbbaa17289e3e0787bfb856b6a5ea0" Apr 24 21:29:17.912113 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:29:17.912091 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"17f87aea5d42c4635970ea71a8fa8b07f2cbbaa17289e3e0787bfb856b6a5ea0\": container with ID starting with 17f87aea5d42c4635970ea71a8fa8b07f2cbbaa17289e3e0787bfb856b6a5ea0 not found: ID does not exist" containerID="17f87aea5d42c4635970ea71a8fa8b07f2cbbaa17289e3e0787bfb856b6a5ea0" Apr 24 21:29:17.912157 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.912120 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f87aea5d42c4635970ea71a8fa8b07f2cbbaa17289e3e0787bfb856b6a5ea0"} err="failed to get container status \"17f87aea5d42c4635970ea71a8fa8b07f2cbbaa17289e3e0787bfb856b6a5ea0\": rpc error: code = NotFound desc = could not find container \"17f87aea5d42c4635970ea71a8fa8b07f2cbbaa17289e3e0787bfb856b6a5ea0\": container with ID starting with 17f87aea5d42c4635970ea71a8fa8b07f2cbbaa17289e3e0787bfb856b6a5ea0 not found: ID does not exist" Apr 24 21:29:17.928315 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.928290 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c956f48c6-vl47q"] Apr 24 21:29:17.936880 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:17.936857 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6c956f48c6-vl47q"] Apr 24 21:29:18.421853 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:18.421828 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa9bbfec-3195-4cf7-a22e-0120bed1f2a4" path="/var/lib/kubelet/pods/aa9bbfec-3195-4cf7-a22e-0120bed1f2a4/volumes" Apr 24 21:29:35.168156 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:35.168127 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:35.187190 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:29:35.187163 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:35.969162 ip-10-0-132-124 kubenswrapper[2578]: I0424 
21:29:35.969130 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:15.547021 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.546987 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-66ffc9db74-gvw5t"] Apr 24 21:30:15.547500 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.547283 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5544c7a-a55a-4f52-807c-78716f3d81d6" containerName="console" Apr 24 21:30:15.547500 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.547293 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5544c7a-a55a-4f52-807c-78716f3d81d6" containerName="console" Apr 24 21:30:15.547500 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.547314 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa9bbfec-3195-4cf7-a22e-0120bed1f2a4" containerName="console" Apr 24 21:30:15.547500 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.547320 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa9bbfec-3195-4cf7-a22e-0120bed1f2a4" containerName="console" Apr 24 21:30:15.547500 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.547368 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d5544c7a-a55a-4f52-807c-78716f3d81d6" containerName="console" Apr 24 21:30:15.547500 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.547377 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa9bbfec-3195-4cf7-a22e-0120bed1f2a4" containerName="console" Apr 24 21:30:15.550466 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.550444 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:15.555908 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.555884 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e7f04af0-b878-4065-8684-c6af48ade737-console-config\") pod \"console-66ffc9db74-gvw5t\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:15.556030 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.555944 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f04af0-b878-4065-8684-c6af48ade737-console-serving-cert\") pod \"console-66ffc9db74-gvw5t\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:15.556030 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.555970 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e7f04af0-b878-4065-8684-c6af48ade737-oauth-serving-cert\") pod \"console-66ffc9db74-gvw5t\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:15.556030 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.555993 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hnlw\" (UniqueName: \"kubernetes.io/projected/e7f04af0-b878-4065-8684-c6af48ade737-kube-api-access-6hnlw\") pod \"console-66ffc9db74-gvw5t\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:15.556190 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.556059 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e7f04af0-b878-4065-8684-c6af48ade737-console-oauth-config\") pod \"console-66ffc9db74-gvw5t\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:15.556190 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.556106 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e7f04af0-b878-4065-8684-c6af48ade737-service-ca\") pod \"console-66ffc9db74-gvw5t\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:15.556274 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.556195 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7f04af0-b878-4065-8684-c6af48ade737-trusted-ca-bundle\") pod \"console-66ffc9db74-gvw5t\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:15.561776 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.561750 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66ffc9db74-gvw5t"] Apr 24 21:30:15.656732 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.656709 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7f04af0-b878-4065-8684-c6af48ade737-trusted-ca-bundle\") pod \"console-66ffc9db74-gvw5t\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:15.656857 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.656746 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e7f04af0-b878-4065-8684-c6af48ade737-console-config\") pod 
\"console-66ffc9db74-gvw5t\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:15.656857 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.656784 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f04af0-b878-4065-8684-c6af48ade737-console-serving-cert\") pod \"console-66ffc9db74-gvw5t\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:15.656857 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.656820 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e7f04af0-b878-4065-8684-c6af48ade737-oauth-serving-cert\") pod \"console-66ffc9db74-gvw5t\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:15.657036 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.656860 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hnlw\" (UniqueName: \"kubernetes.io/projected/e7f04af0-b878-4065-8684-c6af48ade737-kube-api-access-6hnlw\") pod \"console-66ffc9db74-gvw5t\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:15.657036 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.656896 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e7f04af0-b878-4065-8684-c6af48ade737-console-oauth-config\") pod \"console-66ffc9db74-gvw5t\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:15.657036 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.656917 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e7f04af0-b878-4065-8684-c6af48ade737-service-ca\") pod \"console-66ffc9db74-gvw5t\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:15.657465 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.657444 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e7f04af0-b878-4065-8684-c6af48ade737-console-config\") pod \"console-66ffc9db74-gvw5t\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:15.657605 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.657585 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e7f04af0-b878-4065-8684-c6af48ade737-oauth-serving-cert\") pod \"console-66ffc9db74-gvw5t\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:15.657676 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.657657 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e7f04af0-b878-4065-8684-c6af48ade737-service-ca\") pod \"console-66ffc9db74-gvw5t\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:15.657765 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.657750 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7f04af0-b878-4065-8684-c6af48ade737-trusted-ca-bundle\") pod \"console-66ffc9db74-gvw5t\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:15.659904 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.659868 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/e7f04af0-b878-4065-8684-c6af48ade737-console-oauth-config\") pod \"console-66ffc9db74-gvw5t\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:15.660004 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.659939 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f04af0-b878-4065-8684-c6af48ade737-console-serving-cert\") pod \"console-66ffc9db74-gvw5t\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:15.664716 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.664693 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hnlw\" (UniqueName: \"kubernetes.io/projected/e7f04af0-b878-4065-8684-c6af48ade737-kube-api-access-6hnlw\") pod \"console-66ffc9db74-gvw5t\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:15.859369 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.859297 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:15.975229 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:15.975207 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66ffc9db74-gvw5t"] Apr 24 21:30:15.977557 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:30:15.977515 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7f04af0_b878_4065_8684_c6af48ade737.slice/crio-e4335e13d936915ee17b00309412b71f1562ba61cde65ee434624df1326c63d3 WatchSource:0}: Error finding container e4335e13d936915ee17b00309412b71f1562ba61cde65ee434624df1326c63d3: Status 404 returned error can't find the container with id e4335e13d936915ee17b00309412b71f1562ba61cde65ee434624df1326c63d3 Apr 24 21:30:16.072760 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:16.072726 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66ffc9db74-gvw5t" event={"ID":"e7f04af0-b878-4065-8684-c6af48ade737","Type":"ContainerStarted","Data":"ab2cdf2ad7b67607be82cc6fffa8dc97d10d05c8cac589c88ec41181c620361a"} Apr 24 21:30:16.072893 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:16.072767 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66ffc9db74-gvw5t" event={"ID":"e7f04af0-b878-4065-8684-c6af48ade737","Type":"ContainerStarted","Data":"e4335e13d936915ee17b00309412b71f1562ba61cde65ee434624df1326c63d3"} Apr 24 21:30:16.099053 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:16.098998 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66ffc9db74-gvw5t" podStartSLOduration=1.098976667 podStartE2EDuration="1.098976667s" podCreationTimestamp="2026-04-24 21:30:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:30:16.096293012 +0000 UTC 
m=+162.272027686" watchObservedRunningTime="2026-04-24 21:30:16.098976667 +0000 UTC m=+162.274711340" Apr 24 21:30:25.859581 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:25.859475 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:25.859581 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:25.859522 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:25.863739 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:25.863718 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:26.103094 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:26.103068 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:30:26.158867 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:26.158838 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77479bc4bc-jjmmr"] Apr 24 21:30:34.281952 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:34.281925 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-trqmn"] Apr 24 21:30:34.285348 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:34.285333 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-trqmn" Apr 24 21:30:34.287432 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:34.287412 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 21:30:34.292612 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:34.292464 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-trqmn"] Apr 24 21:30:34.394767 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:34.394742 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/01bf52ee-b1fb-4321-b6af-07d7d9f23bf8-kubelet-config\") pod \"global-pull-secret-syncer-trqmn\" (UID: \"01bf52ee-b1fb-4321-b6af-07d7d9f23bf8\") " pod="kube-system/global-pull-secret-syncer-trqmn" Apr 24 21:30:34.394900 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:34.394778 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/01bf52ee-b1fb-4321-b6af-07d7d9f23bf8-dbus\") pod \"global-pull-secret-syncer-trqmn\" (UID: \"01bf52ee-b1fb-4321-b6af-07d7d9f23bf8\") " pod="kube-system/global-pull-secret-syncer-trqmn" Apr 24 21:30:34.394900 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:34.394810 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/01bf52ee-b1fb-4321-b6af-07d7d9f23bf8-original-pull-secret\") pod \"global-pull-secret-syncer-trqmn\" (UID: \"01bf52ee-b1fb-4321-b6af-07d7d9f23bf8\") " pod="kube-system/global-pull-secret-syncer-trqmn" Apr 24 21:30:34.495536 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:34.495495 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/01bf52ee-b1fb-4321-b6af-07d7d9f23bf8-kubelet-config\") pod \"global-pull-secret-syncer-trqmn\" (UID: \"01bf52ee-b1fb-4321-b6af-07d7d9f23bf8\") " pod="kube-system/global-pull-secret-syncer-trqmn" Apr 24 21:30:34.495536 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:34.495539 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/01bf52ee-b1fb-4321-b6af-07d7d9f23bf8-dbus\") pod \"global-pull-secret-syncer-trqmn\" (UID: \"01bf52ee-b1fb-4321-b6af-07d7d9f23bf8\") " pod="kube-system/global-pull-secret-syncer-trqmn" Apr 24 21:30:34.495762 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:34.495590 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/01bf52ee-b1fb-4321-b6af-07d7d9f23bf8-original-pull-secret\") pod \"global-pull-secret-syncer-trqmn\" (UID: \"01bf52ee-b1fb-4321-b6af-07d7d9f23bf8\") " pod="kube-system/global-pull-secret-syncer-trqmn" Apr 24 21:30:34.495762 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:34.495620 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/01bf52ee-b1fb-4321-b6af-07d7d9f23bf8-kubelet-config\") pod \"global-pull-secret-syncer-trqmn\" (UID: \"01bf52ee-b1fb-4321-b6af-07d7d9f23bf8\") " pod="kube-system/global-pull-secret-syncer-trqmn" Apr 24 21:30:34.495762 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:34.495745 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/01bf52ee-b1fb-4321-b6af-07d7d9f23bf8-dbus\") pod \"global-pull-secret-syncer-trqmn\" (UID: \"01bf52ee-b1fb-4321-b6af-07d7d9f23bf8\") " pod="kube-system/global-pull-secret-syncer-trqmn" Apr 24 21:30:34.497754 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:34.497735 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/01bf52ee-b1fb-4321-b6af-07d7d9f23bf8-original-pull-secret\") pod \"global-pull-secret-syncer-trqmn\" (UID: \"01bf52ee-b1fb-4321-b6af-07d7d9f23bf8\") " pod="kube-system/global-pull-secret-syncer-trqmn" Apr 24 21:30:34.595321 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:34.595264 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-trqmn" Apr 24 21:30:34.709923 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:34.709901 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-trqmn"] Apr 24 21:30:34.712000 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:30:34.711969 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01bf52ee_b1fb_4321_b6af_07d7d9f23bf8.slice/crio-3d1e23309902fc42a84da2d53cb214c781313dee894d2c0afdae4f25fa843f63 WatchSource:0}: Error finding container 3d1e23309902fc42a84da2d53cb214c781313dee894d2c0afdae4f25fa843f63: Status 404 returned error can't find the container with id 3d1e23309902fc42a84da2d53cb214c781313dee894d2c0afdae4f25fa843f63 Apr 24 21:30:35.129155 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:35.129114 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-trqmn" event={"ID":"01bf52ee-b1fb-4321-b6af-07d7d9f23bf8","Type":"ContainerStarted","Data":"3d1e23309902fc42a84da2d53cb214c781313dee894d2c0afdae4f25fa843f63"} Apr 24 21:30:39.143087 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:39.143053 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-trqmn" event={"ID":"01bf52ee-b1fb-4321-b6af-07d7d9f23bf8","Type":"ContainerStarted","Data":"d78a2714f27c0f56f56340b378b767d744b7a45b11916f06ce676a852c5042a4"} Apr 24 21:30:39.160243 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:39.160194 2578 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-trqmn" podStartSLOduration=1.829020871 podStartE2EDuration="5.160180617s" podCreationTimestamp="2026-04-24 21:30:34 +0000 UTC" firstStartedPulling="2026-04-24 21:30:34.713927657 +0000 UTC m=+180.889662308" lastFinishedPulling="2026-04-24 21:30:38.045087385 +0000 UTC m=+184.220822054" observedRunningTime="2026-04-24 21:30:39.158193182 +0000 UTC m=+185.333927855" watchObservedRunningTime="2026-04-24 21:30:39.160180617 +0000 UTC m=+185.335915288" Apr 24 21:30:51.177418 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:51.177379 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-77479bc4bc-jjmmr" podUID="f07cdd42-fef3-4026-999a-8f3d50c31305" containerName="console" containerID="cri-o://1c389d27dc8cbe47abbb7f52a0db3b6d5ff4d33142f3634377e45e007b3e8dff" gracePeriod=15 Apr 24 21:30:51.436607 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:51.436560 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77479bc4bc-jjmmr_f07cdd42-fef3-4026-999a-8f3d50c31305/console/0.log" Apr 24 21:30:51.436705 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:51.436618 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:30:51.537244 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:51.537217 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f07cdd42-fef3-4026-999a-8f3d50c31305-oauth-serving-cert\") pod \"f07cdd42-fef3-4026-999a-8f3d50c31305\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " Apr 24 21:30:51.537397 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:51.537260 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f07cdd42-fef3-4026-999a-8f3d50c31305-service-ca\") pod \"f07cdd42-fef3-4026-999a-8f3d50c31305\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " Apr 24 21:30:51.537397 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:51.537284 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f07cdd42-fef3-4026-999a-8f3d50c31305-console-config\") pod \"f07cdd42-fef3-4026-999a-8f3d50c31305\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " Apr 24 21:30:51.537506 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:51.537464 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f07cdd42-fef3-4026-999a-8f3d50c31305-console-oauth-config\") pod \"f07cdd42-fef3-4026-999a-8f3d50c31305\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " Apr 24 21:30:51.537506 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:51.537494 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f07cdd42-fef3-4026-999a-8f3d50c31305-console-serving-cert\") pod \"f07cdd42-fef3-4026-999a-8f3d50c31305\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " Apr 24 21:30:51.537626 ip-10-0-132-124 
kubenswrapper[2578]: I0424 21:30:51.537517 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqzc5\" (UniqueName: \"kubernetes.io/projected/f07cdd42-fef3-4026-999a-8f3d50c31305-kube-api-access-bqzc5\") pod \"f07cdd42-fef3-4026-999a-8f3d50c31305\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " Apr 24 21:30:51.537626 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:51.537585 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f07cdd42-fef3-4026-999a-8f3d50c31305-trusted-ca-bundle\") pod \"f07cdd42-fef3-4026-999a-8f3d50c31305\" (UID: \"f07cdd42-fef3-4026-999a-8f3d50c31305\") " Apr 24 21:30:51.537717 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:51.537689 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f07cdd42-fef3-4026-999a-8f3d50c31305-service-ca" (OuterVolumeSpecName: "service-ca") pod "f07cdd42-fef3-4026-999a-8f3d50c31305" (UID: "f07cdd42-fef3-4026-999a-8f3d50c31305"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:51.537893 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:51.537822 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f07cdd42-fef3-4026-999a-8f3d50c31305-console-config" (OuterVolumeSpecName: "console-config") pod "f07cdd42-fef3-4026-999a-8f3d50c31305" (UID: "f07cdd42-fef3-4026-999a-8f3d50c31305"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:51.537994 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:51.537834 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f07cdd42-fef3-4026-999a-8f3d50c31305-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f07cdd42-fef3-4026-999a-8f3d50c31305" (UID: "f07cdd42-fef3-4026-999a-8f3d50c31305"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:51.537994 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:51.537913 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f07cdd42-fef3-4026-999a-8f3d50c31305-service-ca\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:30:51.538147 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:51.538024 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f07cdd42-fef3-4026-999a-8f3d50c31305-console-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:30:51.538147 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:51.538019 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f07cdd42-fef3-4026-999a-8f3d50c31305-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f07cdd42-fef3-4026-999a-8f3d50c31305" (UID: "f07cdd42-fef3-4026-999a-8f3d50c31305"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:51.539758 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:51.539735 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f07cdd42-fef3-4026-999a-8f3d50c31305-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f07cdd42-fef3-4026-999a-8f3d50c31305" (UID: "f07cdd42-fef3-4026-999a-8f3d50c31305"). 
InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:51.539851 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:51.539805 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f07cdd42-fef3-4026-999a-8f3d50c31305-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f07cdd42-fef3-4026-999a-8f3d50c31305" (UID: "f07cdd42-fef3-4026-999a-8f3d50c31305"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:51.539851 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:51.539841 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f07cdd42-fef3-4026-999a-8f3d50c31305-kube-api-access-bqzc5" (OuterVolumeSpecName: "kube-api-access-bqzc5") pod "f07cdd42-fef3-4026-999a-8f3d50c31305" (UID: "f07cdd42-fef3-4026-999a-8f3d50c31305"). InnerVolumeSpecName "kube-api-access-bqzc5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:30:51.639378 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:51.639353 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f07cdd42-fef3-4026-999a-8f3d50c31305-trusted-ca-bundle\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:30:51.639378 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:51.639378 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f07cdd42-fef3-4026-999a-8f3d50c31305-oauth-serving-cert\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:30:51.639536 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:51.639388 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f07cdd42-fef3-4026-999a-8f3d50c31305-console-oauth-config\") on node \"ip-10-0-132-124.ec2.internal\" 
DevicePath \"\"" Apr 24 21:30:51.639536 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:51.639396 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f07cdd42-fef3-4026-999a-8f3d50c31305-console-serving-cert\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:30:51.639536 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:51.639407 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bqzc5\" (UniqueName: \"kubernetes.io/projected/f07cdd42-fef3-4026-999a-8f3d50c31305-kube-api-access-bqzc5\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:30:52.183398 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:52.183374 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77479bc4bc-jjmmr_f07cdd42-fef3-4026-999a-8f3d50c31305/console/0.log" Apr 24 21:30:52.183769 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:52.183409 2578 generic.go:358] "Generic (PLEG): container finished" podID="f07cdd42-fef3-4026-999a-8f3d50c31305" containerID="1c389d27dc8cbe47abbb7f52a0db3b6d5ff4d33142f3634377e45e007b3e8dff" exitCode=2 Apr 24 21:30:52.183769 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:52.183444 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77479bc4bc-jjmmr" event={"ID":"f07cdd42-fef3-4026-999a-8f3d50c31305","Type":"ContainerDied","Data":"1c389d27dc8cbe47abbb7f52a0db3b6d5ff4d33142f3634377e45e007b3e8dff"} Apr 24 21:30:52.183769 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:52.183471 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77479bc4bc-jjmmr" Apr 24 21:30:52.183769 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:52.183482 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77479bc4bc-jjmmr" event={"ID":"f07cdd42-fef3-4026-999a-8f3d50c31305","Type":"ContainerDied","Data":"c251caed76100adf21a4b22911179e7fd48211d4576ed57c05c44f3717368e6c"} Apr 24 21:30:52.183769 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:52.183496 2578 scope.go:117] "RemoveContainer" containerID="1c389d27dc8cbe47abbb7f52a0db3b6d5ff4d33142f3634377e45e007b3e8dff" Apr 24 21:30:52.191496 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:52.191476 2578 scope.go:117] "RemoveContainer" containerID="1c389d27dc8cbe47abbb7f52a0db3b6d5ff4d33142f3634377e45e007b3e8dff" Apr 24 21:30:52.191760 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:30:52.191743 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c389d27dc8cbe47abbb7f52a0db3b6d5ff4d33142f3634377e45e007b3e8dff\": container with ID starting with 1c389d27dc8cbe47abbb7f52a0db3b6d5ff4d33142f3634377e45e007b3e8dff not found: ID does not exist" containerID="1c389d27dc8cbe47abbb7f52a0db3b6d5ff4d33142f3634377e45e007b3e8dff" Apr 24 21:30:52.191805 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:52.191769 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c389d27dc8cbe47abbb7f52a0db3b6d5ff4d33142f3634377e45e007b3e8dff"} err="failed to get container status \"1c389d27dc8cbe47abbb7f52a0db3b6d5ff4d33142f3634377e45e007b3e8dff\": rpc error: code = NotFound desc = could not find container \"1c389d27dc8cbe47abbb7f52a0db3b6d5ff4d33142f3634377e45e007b3e8dff\": container with ID starting with 1c389d27dc8cbe47abbb7f52a0db3b6d5ff4d33142f3634377e45e007b3e8dff not found: ID does not exist" Apr 24 21:30:52.209419 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:52.209401 2578 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77479bc4bc-jjmmr"] Apr 24 21:30:52.213457 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:52.213440 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-77479bc4bc-jjmmr"] Apr 24 21:30:52.421175 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:30:52.421138 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f07cdd42-fef3-4026-999a-8f3d50c31305" path="/var/lib/kubelet/pods/f07cdd42-fef3-4026-999a-8f3d50c31305/volumes" Apr 24 21:32:34.306695 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:32:34.306669 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 21:33:55.976107 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:55.976079 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-557bb87958-ps7pj"] Apr 24 21:33:55.976560 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:55.976401 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f07cdd42-fef3-4026-999a-8f3d50c31305" containerName="console" Apr 24 21:33:55.976560 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:55.976414 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f07cdd42-fef3-4026-999a-8f3d50c31305" containerName="console" Apr 24 21:33:55.976560 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:55.976473 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f07cdd42-fef3-4026-999a-8f3d50c31305" containerName="console" Apr 24 21:33:55.979238 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:55.979221 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:33:55.992138 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:55.992115 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-557bb87958-ps7pj"]
Apr 24 21:33:56.022916 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.022892 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4553a899-e53e-4df4-a0c2-c6dd0e52fb68-console-config\") pod \"console-557bb87958-ps7pj\" (UID: \"4553a899-e53e-4df4-a0c2-c6dd0e52fb68\") " pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:33:56.023017 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.022917 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4553a899-e53e-4df4-a0c2-c6dd0e52fb68-oauth-serving-cert\") pod \"console-557bb87958-ps7pj\" (UID: \"4553a899-e53e-4df4-a0c2-c6dd0e52fb68\") " pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:33:56.023017 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.022936 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6kdv\" (UniqueName: \"kubernetes.io/projected/4553a899-e53e-4df4-a0c2-c6dd0e52fb68-kube-api-access-d6kdv\") pod \"console-557bb87958-ps7pj\" (UID: \"4553a899-e53e-4df4-a0c2-c6dd0e52fb68\") " pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:33:56.023017 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.022960 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4553a899-e53e-4df4-a0c2-c6dd0e52fb68-console-serving-cert\") pod \"console-557bb87958-ps7pj\" (UID: \"4553a899-e53e-4df4-a0c2-c6dd0e52fb68\") " pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:33:56.023017 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.022978 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4553a899-e53e-4df4-a0c2-c6dd0e52fb68-service-ca\") pod \"console-557bb87958-ps7pj\" (UID: \"4553a899-e53e-4df4-a0c2-c6dd0e52fb68\") " pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:33:56.023017 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.023010 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4553a899-e53e-4df4-a0c2-c6dd0e52fb68-trusted-ca-bundle\") pod \"console-557bb87958-ps7pj\" (UID: \"4553a899-e53e-4df4-a0c2-c6dd0e52fb68\") " pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:33:56.023192 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.023024 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4553a899-e53e-4df4-a0c2-c6dd0e52fb68-console-oauth-config\") pod \"console-557bb87958-ps7pj\" (UID: \"4553a899-e53e-4df4-a0c2-c6dd0e52fb68\") " pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:33:56.123829 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.123806 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4553a899-e53e-4df4-a0c2-c6dd0e52fb68-console-config\") pod \"console-557bb87958-ps7pj\" (UID: \"4553a899-e53e-4df4-a0c2-c6dd0e52fb68\") " pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:33:56.123829 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.123832 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4553a899-e53e-4df4-a0c2-c6dd0e52fb68-oauth-serving-cert\") pod \"console-557bb87958-ps7pj\" (UID: \"4553a899-e53e-4df4-a0c2-c6dd0e52fb68\") " pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:33:56.123975 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.123847 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6kdv\" (UniqueName: \"kubernetes.io/projected/4553a899-e53e-4df4-a0c2-c6dd0e52fb68-kube-api-access-d6kdv\") pod \"console-557bb87958-ps7pj\" (UID: \"4553a899-e53e-4df4-a0c2-c6dd0e52fb68\") " pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:33:56.123975 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.123878 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4553a899-e53e-4df4-a0c2-c6dd0e52fb68-console-serving-cert\") pod \"console-557bb87958-ps7pj\" (UID: \"4553a899-e53e-4df4-a0c2-c6dd0e52fb68\") " pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:33:56.123975 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.123903 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4553a899-e53e-4df4-a0c2-c6dd0e52fb68-service-ca\") pod \"console-557bb87958-ps7pj\" (UID: \"4553a899-e53e-4df4-a0c2-c6dd0e52fb68\") " pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:33:56.123975 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.123943 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4553a899-e53e-4df4-a0c2-c6dd0e52fb68-trusted-ca-bundle\") pod \"console-557bb87958-ps7pj\" (UID: \"4553a899-e53e-4df4-a0c2-c6dd0e52fb68\") " pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:33:56.123975 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.123967 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4553a899-e53e-4df4-a0c2-c6dd0e52fb68-console-oauth-config\") pod \"console-557bb87958-ps7pj\" (UID: \"4553a899-e53e-4df4-a0c2-c6dd0e52fb68\") " pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:33:56.124934 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.124902 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4553a899-e53e-4df4-a0c2-c6dd0e52fb68-oauth-serving-cert\") pod \"console-557bb87958-ps7pj\" (UID: \"4553a899-e53e-4df4-a0c2-c6dd0e52fb68\") " pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:33:56.125071 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.125045 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4553a899-e53e-4df4-a0c2-c6dd0e52fb68-service-ca\") pod \"console-557bb87958-ps7pj\" (UID: \"4553a899-e53e-4df4-a0c2-c6dd0e52fb68\") " pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:33:56.125144 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.125100 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4553a899-e53e-4df4-a0c2-c6dd0e52fb68-trusted-ca-bundle\") pod \"console-557bb87958-ps7pj\" (UID: \"4553a899-e53e-4df4-a0c2-c6dd0e52fb68\") " pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:33:56.127165 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.127135 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4553a899-e53e-4df4-a0c2-c6dd0e52fb68-console-serving-cert\") pod \"console-557bb87958-ps7pj\" (UID: \"4553a899-e53e-4df4-a0c2-c6dd0e52fb68\") " pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:33:56.130337 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.130310 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4553a899-e53e-4df4-a0c2-c6dd0e52fb68-console-config\") pod \"console-557bb87958-ps7pj\" (UID: \"4553a899-e53e-4df4-a0c2-c6dd0e52fb68\") " pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:33:56.130935 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.130914 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4553a899-e53e-4df4-a0c2-c6dd0e52fb68-console-oauth-config\") pod \"console-557bb87958-ps7pj\" (UID: \"4553a899-e53e-4df4-a0c2-c6dd0e52fb68\") " pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:33:56.136311 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.136290 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6kdv\" (UniqueName: \"kubernetes.io/projected/4553a899-e53e-4df4-a0c2-c6dd0e52fb68-kube-api-access-d6kdv\") pod \"console-557bb87958-ps7pj\" (UID: \"4553a899-e53e-4df4-a0c2-c6dd0e52fb68\") " pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:33:56.288361 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.288306 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:33:56.409730 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.409706 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-557bb87958-ps7pj"]
Apr 24 21:33:56.411724 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:33:56.411691 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4553a899_e53e_4df4_a0c2_c6dd0e52fb68.slice/crio-25cf6390cb5127fa9ce7f1633903aa3587eaf46e39b1261591b1351e6a8f0113 WatchSource:0}: Error finding container 25cf6390cb5127fa9ce7f1633903aa3587eaf46e39b1261591b1351e6a8f0113: Status 404 returned error can't find the container with id 25cf6390cb5127fa9ce7f1633903aa3587eaf46e39b1261591b1351e6a8f0113
Apr 24 21:33:56.413863 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.413847 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:33:56.688871 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.688794 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-557bb87958-ps7pj" event={"ID":"4553a899-e53e-4df4-a0c2-c6dd0e52fb68","Type":"ContainerStarted","Data":"1d9a10363c3057019a0de509c8e4f5b6a28f86b305e96a32df19bc1a6d6aa581"}
Apr 24 21:33:56.688871 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.688833 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-557bb87958-ps7pj" event={"ID":"4553a899-e53e-4df4-a0c2-c6dd0e52fb68","Type":"ContainerStarted","Data":"25cf6390cb5127fa9ce7f1633903aa3587eaf46e39b1261591b1351e6a8f0113"}
Apr 24 21:33:56.714914 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:33:56.714874 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-557bb87958-ps7pj" podStartSLOduration=1.714861833 podStartE2EDuration="1.714861833s" podCreationTimestamp="2026-04-24 21:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:33:56.712904982 +0000 UTC m=+382.888639634" watchObservedRunningTime="2026-04-24 21:33:56.714861833 +0000 UTC m=+382.890596505"
Apr 24 21:34:05.735610 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:05.735570 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-szfgc"]
Apr 24 21:34:05.739294 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:05.739273 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-szfgc"
Apr 24 21:34:05.745051 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:05.745026 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 24 21:34:05.745161 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:05.745026 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 24 21:34:05.745161 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:05.745029 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 24 21:34:05.745686 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:05.745668 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-dbdtf\""
Apr 24 21:34:05.757084 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:05.757062 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-szfgc"]
Apr 24 21:34:05.799316 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:05.799290 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jml5z\" (UniqueName: \"kubernetes.io/projected/4ad9ac5d-8db6-48b3-b364-07a2d30cd0e5-kube-api-access-jml5z\") pod \"s3-init-szfgc\" (UID: \"4ad9ac5d-8db6-48b3-b364-07a2d30cd0e5\") " pod="kserve/s3-init-szfgc"
Apr 24 21:34:05.899899 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:05.899875 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jml5z\" (UniqueName: \"kubernetes.io/projected/4ad9ac5d-8db6-48b3-b364-07a2d30cd0e5-kube-api-access-jml5z\") pod \"s3-init-szfgc\" (UID: \"4ad9ac5d-8db6-48b3-b364-07a2d30cd0e5\") " pod="kserve/s3-init-szfgc"
Apr 24 21:34:05.914696 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:05.914670 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jml5z\" (UniqueName: \"kubernetes.io/projected/4ad9ac5d-8db6-48b3-b364-07a2d30cd0e5-kube-api-access-jml5z\") pod \"s3-init-szfgc\" (UID: \"4ad9ac5d-8db6-48b3-b364-07a2d30cd0e5\") " pod="kserve/s3-init-szfgc"
Apr 24 21:34:06.062447 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:06.062382 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-szfgc"
Apr 24 21:34:06.191437 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:06.191409 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-szfgc"]
Apr 24 21:34:06.194554 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:34:06.194520 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ad9ac5d_8db6_48b3_b364_07a2d30cd0e5.slice/crio-1bf2e9a14a094dfd390b7e901f969bc2f0914cf0cf97b11065cc400b2eefac0b WatchSource:0}: Error finding container 1bf2e9a14a094dfd390b7e901f969bc2f0914cf0cf97b11065cc400b2eefac0b: Status 404 returned error can't find the container with id 1bf2e9a14a094dfd390b7e901f969bc2f0914cf0cf97b11065cc400b2eefac0b
Apr 24 21:34:06.289193 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:06.289169 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:34:06.289288 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:06.289207 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:34:06.293433 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:06.293413 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:34:06.717649 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:06.717566 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-szfgc" event={"ID":"4ad9ac5d-8db6-48b3-b364-07a2d30cd0e5","Type":"ContainerStarted","Data":"1bf2e9a14a094dfd390b7e901f969bc2f0914cf0cf97b11065cc400b2eefac0b"}
Apr 24 21:34:06.723517 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:06.723472 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-557bb87958-ps7pj"
Apr 24 21:34:06.820887 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:06.820856 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66ffc9db74-gvw5t"]
Apr 24 21:34:10.731363 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:10.731328 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-szfgc" event={"ID":"4ad9ac5d-8db6-48b3-b364-07a2d30cd0e5","Type":"ContainerStarted","Data":"ff54c6ad810a0bed86288c9ea72ac3da73805ab2bf8f305a5a44c948fb4ebd88"}
Apr 24 21:34:10.793336 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:10.793293 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-szfgc" podStartSLOduration=1.5554018699999999 podStartE2EDuration="5.793279991s" podCreationTimestamp="2026-04-24 21:34:05 +0000 UTC" firstStartedPulling="2026-04-24 21:34:06.196653625 +0000 UTC m=+392.372388280" lastFinishedPulling="2026-04-24 21:34:10.43453175 +0000 UTC m=+396.610266401" observedRunningTime="2026-04-24 21:34:10.791814076 +0000 UTC m=+396.967548748" watchObservedRunningTime="2026-04-24 21:34:10.793279991 +0000 UTC m=+396.969014662"
Apr 24 21:34:13.740214 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:13.740183 2578 generic.go:358] "Generic (PLEG): container finished" podID="4ad9ac5d-8db6-48b3-b364-07a2d30cd0e5" containerID="ff54c6ad810a0bed86288c9ea72ac3da73805ab2bf8f305a5a44c948fb4ebd88" exitCode=0
Apr 24 21:34:13.740609 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:13.740259 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-szfgc" event={"ID":"4ad9ac5d-8db6-48b3-b364-07a2d30cd0e5","Type":"ContainerDied","Data":"ff54c6ad810a0bed86288c9ea72ac3da73805ab2bf8f305a5a44c948fb4ebd88"}
Apr 24 21:34:14.861052 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:14.861034 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-szfgc"
Apr 24 21:34:14.986089 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:14.986061 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jml5z\" (UniqueName: \"kubernetes.io/projected/4ad9ac5d-8db6-48b3-b364-07a2d30cd0e5-kube-api-access-jml5z\") pod \"4ad9ac5d-8db6-48b3-b364-07a2d30cd0e5\" (UID: \"4ad9ac5d-8db6-48b3-b364-07a2d30cd0e5\") "
Apr 24 21:34:14.988379 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:14.988354 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ad9ac5d-8db6-48b3-b364-07a2d30cd0e5-kube-api-access-jml5z" (OuterVolumeSpecName: "kube-api-access-jml5z") pod "4ad9ac5d-8db6-48b3-b364-07a2d30cd0e5" (UID: "4ad9ac5d-8db6-48b3-b364-07a2d30cd0e5"). InnerVolumeSpecName "kube-api-access-jml5z". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:34:15.086811 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:15.086764 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jml5z\" (UniqueName: \"kubernetes.io/projected/4ad9ac5d-8db6-48b3-b364-07a2d30cd0e5-kube-api-access-jml5z\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:34:15.747348 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:15.747321 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-szfgc"
Apr 24 21:34:15.747530 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:15.747320 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-szfgc" event={"ID":"4ad9ac5d-8db6-48b3-b364-07a2d30cd0e5","Type":"ContainerDied","Data":"1bf2e9a14a094dfd390b7e901f969bc2f0914cf0cf97b11065cc400b2eefac0b"}
Apr 24 21:34:15.747530 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:15.747476 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bf2e9a14a094dfd390b7e901f969bc2f0914cf0cf97b11065cc400b2eefac0b"
Apr 24 21:34:24.907514 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:24.907482 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9"]
Apr 24 21:34:24.907879 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:24.907865 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ad9ac5d-8db6-48b3-b364-07a2d30cd0e5" containerName="s3-init"
Apr 24 21:34:24.907920 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:24.907881 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ad9ac5d-8db6-48b3-b364-07a2d30cd0e5" containerName="s3-init"
Apr 24 21:34:24.907959 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:24.907935 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ad9ac5d-8db6-48b3-b364-07a2d30cd0e5" containerName="s3-init"
Apr 24 21:34:24.910954 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:24.910940 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9"
Apr 24 21:34:24.914192 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:24.914170 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-f87qg\""
Apr 24 21:34:24.914302 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:24.914283 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 24 21:34:24.914696 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:24.914676 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-9db4b-kube-rbac-proxy-sar-config\""
Apr 24 21:34:24.915097 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:24.915081 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-9db4b-predictor-serving-cert\""
Apr 24 21:34:24.915180 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:24.915099 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 24 21:34:24.920984 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:24.920967 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9"]
Apr 24 21:34:25.062226 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.062198 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6c2ac472-afc3-42bb-98cb-b32a47f4574c-proxy-tls\") pod \"success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9\" (UID: \"6c2ac472-afc3-42bb-98cb-b32a47f4574c\") " pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9"
Apr 24 21:34:25.062362 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.062279 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8nk8\" (UniqueName: \"kubernetes.io/projected/6c2ac472-afc3-42bb-98cb-b32a47f4574c-kube-api-access-k8nk8\") pod \"success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9\" (UID: \"6c2ac472-afc3-42bb-98cb-b32a47f4574c\") " pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9"
Apr 24 21:34:25.062362 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.062323 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-9db4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6c2ac472-afc3-42bb-98cb-b32a47f4574c-success-200-isvc-9db4b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9\" (UID: \"6c2ac472-afc3-42bb-98cb-b32a47f4574c\") " pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9"
Apr 24 21:34:25.163421 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.163344 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-9db4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6c2ac472-afc3-42bb-98cb-b32a47f4574c-success-200-isvc-9db4b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9\" (UID: \"6c2ac472-afc3-42bb-98cb-b32a47f4574c\") " pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9"
Apr 24 21:34:25.163421 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.163403 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6c2ac472-afc3-42bb-98cb-b32a47f4574c-proxy-tls\") pod \"success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9\" (UID: \"6c2ac472-afc3-42bb-98cb-b32a47f4574c\") " pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9"
Apr 24 21:34:25.163694 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.163453 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8nk8\" (UniqueName: \"kubernetes.io/projected/6c2ac472-afc3-42bb-98cb-b32a47f4574c-kube-api-access-k8nk8\") pod \"success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9\" (UID: \"6c2ac472-afc3-42bb-98cb-b32a47f4574c\") " pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9"
Apr 24 21:34:25.163943 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.163924 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-9db4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6c2ac472-afc3-42bb-98cb-b32a47f4574c-success-200-isvc-9db4b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9\" (UID: \"6c2ac472-afc3-42bb-98cb-b32a47f4574c\") " pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9"
Apr 24 21:34:25.165819 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.165797 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6c2ac472-afc3-42bb-98cb-b32a47f4574c-proxy-tls\") pod \"success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9\" (UID: \"6c2ac472-afc3-42bb-98cb-b32a47f4574c\") " pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9"
Apr 24 21:34:25.172933 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.172910 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8nk8\" (UniqueName: \"kubernetes.io/projected/6c2ac472-afc3-42bb-98cb-b32a47f4574c-kube-api-access-k8nk8\") pod \"success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9\" (UID: \"6c2ac472-afc3-42bb-98cb-b32a47f4574c\") " pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9"
Apr 24 21:34:25.213970 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.213943 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f"]
Apr 24 21:34:25.217676 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.217659 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f"
Apr 24 21:34:25.220076 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.220056 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-predictor-serving-cert\""
Apr 24 21:34:25.220338 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.220325 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\""
Apr 24 21:34:25.220733 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.220716 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9"
Apr 24 21:34:25.229911 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.229890 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f"]
Apr 24 21:34:25.356920 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.356892 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9"]
Apr 24 21:34:25.359742 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:34:25.359717 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c2ac472_afc3_42bb_98cb_b32a47f4574c.slice/crio-75fd18949d304aee294cdd8671eef66bf8385c67ac12ec8d52a1fd6ef904684e WatchSource:0}: Error finding container 75fd18949d304aee294cdd8671eef66bf8385c67ac12ec8d52a1fd6ef904684e: Status 404 returned error can't find the container with id 75fd18949d304aee294cdd8671eef66bf8385c67ac12ec8d52a1fd6ef904684e
Apr 24 21:34:25.365978 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.365951 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9dd38aa-c701-426e-aa68-982add5b0621-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f\" (UID: \"d9dd38aa-c701-426e-aa68-982add5b0621\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f"
Apr 24 21:34:25.366084 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.366009 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dxlh\" (UniqueName: \"kubernetes.io/projected/d9dd38aa-c701-426e-aa68-982add5b0621-kube-api-access-5dxlh\") pod \"isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f\" (UID: \"d9dd38aa-c701-426e-aa68-982add5b0621\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f"
Apr 24 21:34:25.366084 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.366053 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9dd38aa-c701-426e-aa68-982add5b0621-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f\" (UID: \"d9dd38aa-c701-426e-aa68-982add5b0621\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f"
Apr 24 21:34:25.366206 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.366091 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9dd38aa-c701-426e-aa68-982add5b0621-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f\" (UID: \"d9dd38aa-c701-426e-aa68-982add5b0621\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f"
Apr 24 21:34:25.467504 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.467480 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9dd38aa-c701-426e-aa68-982add5b0621-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f\" (UID: \"d9dd38aa-c701-426e-aa68-982add5b0621\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f"
Apr 24 21:34:25.467681 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.467591 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9dd38aa-c701-426e-aa68-982add5b0621-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f\" (UID: \"d9dd38aa-c701-426e-aa68-982add5b0621\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f"
Apr 24 21:34:25.467681 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.467613 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dxlh\" (UniqueName: \"kubernetes.io/projected/d9dd38aa-c701-426e-aa68-982add5b0621-kube-api-access-5dxlh\") pod \"isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f\" (UID: \"d9dd38aa-c701-426e-aa68-982add5b0621\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f"
Apr 24 21:34:25.467681 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.467635 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9dd38aa-c701-426e-aa68-982add5b0621-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f\" (UID: \"d9dd38aa-c701-426e-aa68-982add5b0621\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f"
Apr 24 21:34:25.467972 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.467956 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9dd38aa-c701-426e-aa68-982add5b0621-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f\" (UID: \"d9dd38aa-c701-426e-aa68-982add5b0621\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f"
Apr 24 21:34:25.468235 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.468216 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9dd38aa-c701-426e-aa68-982add5b0621-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f\" (UID: \"d9dd38aa-c701-426e-aa68-982add5b0621\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f"
Apr 24 21:34:25.470116 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.470088 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9dd38aa-c701-426e-aa68-982add5b0621-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f\" (UID: \"d9dd38aa-c701-426e-aa68-982add5b0621\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f"
Apr 24 21:34:25.476395 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.476373 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dxlh\" (UniqueName: \"kubernetes.io/projected/d9dd38aa-c701-426e-aa68-982add5b0621-kube-api-access-5dxlh\") pod \"isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f\" (UID: \"d9dd38aa-c701-426e-aa68-982add5b0621\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f"
Apr 24 21:34:25.508569 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.508532 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r"]
Apr 24 21:34:25.513224 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.513210 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r"
Apr 24 21:34:25.515645 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.515628 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-9db4b-kube-rbac-proxy-sar-config\""
Apr 24 21:34:25.515711 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.515641 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-9db4b-predictor-serving-cert\""
Apr 24 21:34:25.523166 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.523145 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r"]
Apr 24 21:34:25.529841 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.529817 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f"
Apr 24 21:34:25.661232 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.661210 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f"]
Apr 24 21:34:25.663644 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:34:25.663612 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9dd38aa_c701_426e_aa68_982add5b0621.slice/crio-0348ead6d8fbac229076e1b39b26741b1da47f2196b00ba27d7768780f3e447c WatchSource:0}: Error finding container 0348ead6d8fbac229076e1b39b26741b1da47f2196b00ba27d7768780f3e447c: Status 404 returned error can't find the container with id 0348ead6d8fbac229076e1b39b26741b1da47f2196b00ba27d7768780f3e447c
Apr 24 21:34:25.670130 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.670110 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f7de5da-b616-41d1-90a8-4c2f4e9f81e7-proxy-tls\") pod \"error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r\" (UID: \"5f7de5da-b616-41d1-90a8-4c2f4e9f81e7\") " pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r"
Apr 24 21:34:25.670205 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.670163 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-9db4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5f7de5da-b616-41d1-90a8-4c2f4e9f81e7-error-404-isvc-9db4b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r\" (UID: \"5f7de5da-b616-41d1-90a8-4c2f4e9f81e7\") " pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r"
Apr 24 21:34:25.670205 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.670182 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bjhr\" (UniqueName: \"kubernetes.io/projected/5f7de5da-b616-41d1-90a8-4c2f4e9f81e7-kube-api-access-4bjhr\") pod \"error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r\" (UID: \"5f7de5da-b616-41d1-90a8-4c2f4e9f81e7\") " pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r"
Apr 24 21:34:25.770857 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.770784 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f7de5da-b616-41d1-90a8-4c2f4e9f81e7-proxy-tls\") pod \"error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r\" (UID: \"5f7de5da-b616-41d1-90a8-4c2f4e9f81e7\") " pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r"
Apr 24 21:34:25.771004 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.770875 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-9db4b-kube-rbac-proxy-sar-config\" (UniqueName:
\"kubernetes.io/configmap/5f7de5da-b616-41d1-90a8-4c2f4e9f81e7-error-404-isvc-9db4b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r\" (UID: \"5f7de5da-b616-41d1-90a8-4c2f4e9f81e7\") " pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" Apr 24 21:34:25.771004 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.770905 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bjhr\" (UniqueName: \"kubernetes.io/projected/5f7de5da-b616-41d1-90a8-4c2f4e9f81e7-kube-api-access-4bjhr\") pod \"error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r\" (UID: \"5f7de5da-b616-41d1-90a8-4c2f4e9f81e7\") " pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" Apr 24 21:34:25.771850 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.771821 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-9db4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5f7de5da-b616-41d1-90a8-4c2f4e9f81e7-error-404-isvc-9db4b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r\" (UID: \"5f7de5da-b616-41d1-90a8-4c2f4e9f81e7\") " pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" Apr 24 21:34:25.773583 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.773563 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f7de5da-b616-41d1-90a8-4c2f4e9f81e7-proxy-tls\") pod \"error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r\" (UID: \"5f7de5da-b616-41d1-90a8-4c2f4e9f81e7\") " pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" Apr 24 21:34:25.775990 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.775963 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9" 
event={"ID":"6c2ac472-afc3-42bb-98cb-b32a47f4574c","Type":"ContainerStarted","Data":"75fd18949d304aee294cdd8671eef66bf8385c67ac12ec8d52a1fd6ef904684e"} Apr 24 21:34:25.777297 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.777272 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f" event={"ID":"d9dd38aa-c701-426e-aa68-982add5b0621","Type":"ContainerStarted","Data":"0348ead6d8fbac229076e1b39b26741b1da47f2196b00ba27d7768780f3e447c"} Apr 24 21:34:25.779521 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.779496 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bjhr\" (UniqueName: \"kubernetes.io/projected/5f7de5da-b616-41d1-90a8-4c2f4e9f81e7-kube-api-access-4bjhr\") pod \"error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r\" (UID: \"5f7de5da-b616-41d1-90a8-4c2f4e9f81e7\") " pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" Apr 24 21:34:25.824071 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:25.824029 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" Apr 24 21:34:26.012783 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:26.012727 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r"] Apr 24 21:34:26.018391 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:34:26.018356 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f7de5da_b616_41d1_90a8_4c2f4e9f81e7.slice/crio-fddb86e021c14ee03657f370d7abde55b242444455720571052124d9925bcdae WatchSource:0}: Error finding container fddb86e021c14ee03657f370d7abde55b242444455720571052124d9925bcdae: Status 404 returned error can't find the container with id fddb86e021c14ee03657f370d7abde55b242444455720571052124d9925bcdae Apr 24 21:34:26.796836 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:26.796795 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" event={"ID":"5f7de5da-b616-41d1-90a8-4c2f4e9f81e7","Type":"ContainerStarted","Data":"fddb86e021c14ee03657f370d7abde55b242444455720571052124d9925bcdae"} Apr 24 21:34:31.852435 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:31.852307 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-66ffc9db74-gvw5t" podUID="e7f04af0-b878-4065-8684-c6af48ade737" containerName="console" containerID="cri-o://ab2cdf2ad7b67607be82cc6fffa8dc97d10d05c8cac589c88ec41181c620361a" gracePeriod=15 Apr 24 21:34:34.845990 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:34.845962 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66ffc9db74-gvw5t_e7f04af0-b878-4065-8684-c6af48ade737/console/0.log" Apr 24 21:34:34.846431 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:34.846000 2578 generic.go:358] "Generic (PLEG): container finished" 
podID="e7f04af0-b878-4065-8684-c6af48ade737" containerID="ab2cdf2ad7b67607be82cc6fffa8dc97d10d05c8cac589c88ec41181c620361a" exitCode=2 Apr 24 21:34:34.846431 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:34.846093 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66ffc9db74-gvw5t" event={"ID":"e7f04af0-b878-4065-8684-c6af48ade737","Type":"ContainerDied","Data":"ab2cdf2ad7b67607be82cc6fffa8dc97d10d05c8cac589c88ec41181c620361a"} Apr 24 21:34:37.100171 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:37.100129 2578 patch_prober.go:28] interesting pod/console-66ffc9db74-gvw5t container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.134.0.19:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 24 21:34:37.100648 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:37.100290 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-66ffc9db74-gvw5t" podUID="e7f04af0-b878-4065-8684-c6af48ade737" containerName="console" probeResult="failure" output="Get \"https://10.134.0.19:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 24 21:34:37.946996 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:37.946975 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66ffc9db74-gvw5t_e7f04af0-b878-4065-8684-c6af48ade737/console/0.log" Apr 24 21:34:37.947079 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:37.947050 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:34:38.027196 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.027166 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e7f04af0-b878-4065-8684-c6af48ade737-console-config\") pod \"e7f04af0-b878-4065-8684-c6af48ade737\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " Apr 24 21:34:38.027368 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.027212 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f04af0-b878-4065-8684-c6af48ade737-console-serving-cert\") pod \"e7f04af0-b878-4065-8684-c6af48ade737\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " Apr 24 21:34:38.027368 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.027229 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e7f04af0-b878-4065-8684-c6af48ade737-console-oauth-config\") pod \"e7f04af0-b878-4065-8684-c6af48ade737\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " Apr 24 21:34:38.027476 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.027377 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hnlw\" (UniqueName: \"kubernetes.io/projected/e7f04af0-b878-4065-8684-c6af48ade737-kube-api-access-6hnlw\") pod \"e7f04af0-b878-4065-8684-c6af48ade737\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " Apr 24 21:34:38.027476 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.027451 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e7f04af0-b878-4065-8684-c6af48ade737-service-ca\") pod \"e7f04af0-b878-4065-8684-c6af48ade737\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " Apr 24 21:34:38.027611 
ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.027513 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7f04af0-b878-4065-8684-c6af48ade737-trusted-ca-bundle\") pod \"e7f04af0-b878-4065-8684-c6af48ade737\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " Apr 24 21:34:38.027611 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.027576 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e7f04af0-b878-4065-8684-c6af48ade737-oauth-serving-cert\") pod \"e7f04af0-b878-4065-8684-c6af48ade737\" (UID: \"e7f04af0-b878-4065-8684-c6af48ade737\") " Apr 24 21:34:38.027611 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.027595 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7f04af0-b878-4065-8684-c6af48ade737-console-config" (OuterVolumeSpecName: "console-config") pod "e7f04af0-b878-4065-8684-c6af48ade737" (UID: "e7f04af0-b878-4065-8684-c6af48ade737"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:34:38.027861 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.027841 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e7f04af0-b878-4065-8684-c6af48ade737-console-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:34:38.028149 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.028119 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7f04af0-b878-4065-8684-c6af48ade737-service-ca" (OuterVolumeSpecName: "service-ca") pod "e7f04af0-b878-4065-8684-c6af48ade737" (UID: "e7f04af0-b878-4065-8684-c6af48ade737"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:34:38.028149 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.028162 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7f04af0-b878-4065-8684-c6af48ade737-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e7f04af0-b878-4065-8684-c6af48ade737" (UID: "e7f04af0-b878-4065-8684-c6af48ade737"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:34:38.028488 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.028279 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7f04af0-b878-4065-8684-c6af48ade737-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e7f04af0-b878-4065-8684-c6af48ade737" (UID: "e7f04af0-b878-4065-8684-c6af48ade737"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:34:38.029867 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.029839 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7f04af0-b878-4065-8684-c6af48ade737-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e7f04af0-b878-4065-8684-c6af48ade737" (UID: "e7f04af0-b878-4065-8684-c6af48ade737"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:34:38.030068 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.030037 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7f04af0-b878-4065-8684-c6af48ade737-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e7f04af0-b878-4065-8684-c6af48ade737" (UID: "e7f04af0-b878-4065-8684-c6af48ade737"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:34:38.030162 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.030094 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7f04af0-b878-4065-8684-c6af48ade737-kube-api-access-6hnlw" (OuterVolumeSpecName: "kube-api-access-6hnlw") pod "e7f04af0-b878-4065-8684-c6af48ade737" (UID: "e7f04af0-b878-4065-8684-c6af48ade737"). InnerVolumeSpecName "kube-api-access-6hnlw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:34:38.129271 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.129228 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6hnlw\" (UniqueName: \"kubernetes.io/projected/e7f04af0-b878-4065-8684-c6af48ade737-kube-api-access-6hnlw\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:34:38.129271 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.129271 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e7f04af0-b878-4065-8684-c6af48ade737-service-ca\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:34:38.129715 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.129288 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7f04af0-b878-4065-8684-c6af48ade737-trusted-ca-bundle\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:34:38.129715 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.129304 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e7f04af0-b878-4065-8684-c6af48ade737-oauth-serving-cert\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:34:38.129715 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.129320 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7f04af0-b878-4065-8684-c6af48ade737-console-serving-cert\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:34:38.129715 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.129336 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e7f04af0-b878-4065-8684-c6af48ade737-console-oauth-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:34:38.860718 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.860690 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66ffc9db74-gvw5t_e7f04af0-b878-4065-8684-c6af48ade737/console/0.log" Apr 24 21:34:38.860897 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.860743 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66ffc9db74-gvw5t" event={"ID":"e7f04af0-b878-4065-8684-c6af48ade737","Type":"ContainerDied","Data":"e4335e13d936915ee17b00309412b71f1562ba61cde65ee434624df1326c63d3"} Apr 24 21:34:38.860897 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.860785 2578 scope.go:117] "RemoveContainer" containerID="ab2cdf2ad7b67607be82cc6fffa8dc97d10d05c8cac589c88ec41181c620361a" Apr 24 21:34:38.860897 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.860808 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66ffc9db74-gvw5t" Apr 24 21:34:38.881773 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.881748 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66ffc9db74-gvw5t"] Apr 24 21:34:38.885753 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:38.885732 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-66ffc9db74-gvw5t"] Apr 24 21:34:40.422081 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:40.422052 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7f04af0-b878-4065-8684-c6af48ade737" path="/var/lib/kubelet/pods/e7f04af0-b878-4065-8684-c6af48ade737/volumes" Apr 24 21:34:42.881321 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:42.881280 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9" event={"ID":"6c2ac472-afc3-42bb-98cb-b32a47f4574c","Type":"ContainerStarted","Data":"82ed1ca8b04a3bd328a48a9838d51380c369b99add884cca91af08c2da979435"} Apr 24 21:34:42.883416 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:42.883386 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" event={"ID":"5f7de5da-b616-41d1-90a8-4c2f4e9f81e7","Type":"ContainerStarted","Data":"cc92a70309cbc68f4f0f2abb183c10ea8cc002547b6de0e93135d4a1b250e536"} Apr 24 21:34:42.885947 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:42.885499 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f" event={"ID":"d9dd38aa-c701-426e-aa68-982add5b0621","Type":"ContainerStarted","Data":"5ca475d0e3c013ee239fdb1713eaeb56aba4101d2e4829f254ad3c0754228434"} Apr 24 21:34:45.896265 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:45.896229 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9" event={"ID":"6c2ac472-afc3-42bb-98cb-b32a47f4574c","Type":"ContainerStarted","Data":"4a373f2fa59468fee91da1c120806ee6c5dc76102fc7ea7c631122c7a5ba8794"} Apr 24 21:34:45.896760 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:45.896387 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9" Apr 24 21:34:45.896760 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:45.896511 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9" Apr 24 21:34:45.897537 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:45.897509 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9" podUID="6c2ac472-afc3-42bb-98cb-b32a47f4574c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 21:34:45.897846 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:45.897827 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" event={"ID":"5f7de5da-b616-41d1-90a8-4c2f4e9f81e7","Type":"ContainerStarted","Data":"bf3a3a5452ba41e0a9059e617c64b6ec8c65985bd2b4437a2df525256cd8a9bc"} Apr 24 21:34:45.897996 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:45.897978 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" Apr 24 21:34:45.915891 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:45.915839 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9" podStartSLOduration=2.3696589 podStartE2EDuration="21.915823799s" podCreationTimestamp="2026-04-24 
21:34:24 +0000 UTC" firstStartedPulling="2026-04-24 21:34:25.362304854 +0000 UTC m=+411.538039505" lastFinishedPulling="2026-04-24 21:34:44.908469739 +0000 UTC m=+431.084204404" observedRunningTime="2026-04-24 21:34:45.913464681 +0000 UTC m=+432.089199353" watchObservedRunningTime="2026-04-24 21:34:45.915823799 +0000 UTC m=+432.091558472" Apr 24 21:34:45.937352 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:45.937307 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" podStartSLOduration=2.056866803 podStartE2EDuration="20.93729404s" podCreationTimestamp="2026-04-24 21:34:25 +0000 UTC" firstStartedPulling="2026-04-24 21:34:26.021229486 +0000 UTC m=+412.196964141" lastFinishedPulling="2026-04-24 21:34:44.901656714 +0000 UTC m=+431.077391378" observedRunningTime="2026-04-24 21:34:45.935675652 +0000 UTC m=+432.111410325" watchObservedRunningTime="2026-04-24 21:34:45.93729404 +0000 UTC m=+432.113028713" Apr 24 21:34:46.901740 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:46.901705 2578 generic.go:358] "Generic (PLEG): container finished" podID="d9dd38aa-c701-426e-aa68-982add5b0621" containerID="5ca475d0e3c013ee239fdb1713eaeb56aba4101d2e4829f254ad3c0754228434" exitCode=0 Apr 24 21:34:46.902135 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:46.901784 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f" event={"ID":"d9dd38aa-c701-426e-aa68-982add5b0621","Type":"ContainerDied","Data":"5ca475d0e3c013ee239fdb1713eaeb56aba4101d2e4829f254ad3c0754228434"} Apr 24 21:34:46.902309 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:46.902282 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" Apr 24 21:34:46.902443 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:46.902292 2578 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9" podUID="6c2ac472-afc3-42bb-98cb-b32a47f4574c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 21:34:46.903595 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:46.903575 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" podUID="5f7de5da-b616-41d1-90a8-4c2f4e9f81e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 24 21:34:47.904995 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:47.904959 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" podUID="5f7de5da-b616-41d1-90a8-4c2f4e9f81e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 24 21:34:51.906662 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:51.906584 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9" Apr 24 21:34:51.907133 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:51.907109 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9" podUID="6c2ac472-afc3-42bb-98cb-b32a47f4574c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 21:34:52.909198 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:52.909165 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" Apr 24 21:34:52.909697 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:52.909670 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" podUID="5f7de5da-b616-41d1-90a8-4c2f4e9f81e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 24 21:34:53.927636 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:53.927607 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f" event={"ID":"d9dd38aa-c701-426e-aa68-982add5b0621","Type":"ContainerStarted","Data":"af33fbce5bdfca5e0de757d65310b5c92b84ceb33a3caa40e131c3a39e273f52"} Apr 24 21:34:53.927994 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:53.927643 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f" event={"ID":"d9dd38aa-c701-426e-aa68-982add5b0621","Type":"ContainerStarted","Data":"36f38f95a80abec3528c1889d6c381db8da6bf7c320db9176ad13b8e6bcb7f8e"} Apr 24 21:34:53.927994 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:53.927906 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f" Apr 24 21:34:53.954047 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:53.953994 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f" podStartSLOduration=1.573512299 podStartE2EDuration="28.953978407s" podCreationTimestamp="2026-04-24 21:34:25 +0000 UTC" firstStartedPulling="2026-04-24 21:34:25.665459287 +0000 UTC m=+411.841193938" lastFinishedPulling="2026-04-24 21:34:53.045925395 +0000 UTC m=+439.221660046" observedRunningTime="2026-04-24 21:34:53.953367441 +0000 UTC m=+440.129102110" watchObservedRunningTime="2026-04-24 21:34:53.953978407 +0000 UTC m=+440.129713081" Apr 24 21:34:54.931051 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:54.931017 2578 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f" Apr 24 21:34:54.932220 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:54.932196 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f" podUID="d9dd38aa-c701-426e-aa68-982add5b0621" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 21:34:55.933896 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:34:55.933861 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f" podUID="d9dd38aa-c701-426e-aa68-982add5b0621" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 21:35:00.938777 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:00.938748 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f" Apr 24 21:35:00.939314 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:00.939288 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f" podUID="d9dd38aa-c701-426e-aa68-982add5b0621" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 21:35:01.907203 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:01.907160 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9" podUID="6c2ac472-afc3-42bb-98cb-b32a47f4574c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 21:35:02.910532 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:02.910491 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" podUID="5f7de5da-b616-41d1-90a8-4c2f4e9f81e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 24 21:35:10.939366 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:10.939319 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f" podUID="d9dd38aa-c701-426e-aa68-982add5b0621" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 21:35:11.907221 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:11.907189 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9" podUID="6c2ac472-afc3-42bb-98cb-b32a47f4574c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 21:35:12.910424 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:12.910387 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" podUID="5f7de5da-b616-41d1-90a8-4c2f4e9f81e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 24 21:35:20.939783 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:20.939738 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f" podUID="d9dd38aa-c701-426e-aa68-982add5b0621" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 21:35:21.907641 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:21.907607 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9" 
podUID="6c2ac472-afc3-42bb-98cb-b32a47f4574c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 21:35:22.910147 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:22.910111 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" podUID="5f7de5da-b616-41d1-90a8-4c2f4e9f81e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 24 21:35:30.939716 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:30.939678 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f" podUID="d9dd38aa-c701-426e-aa68-982add5b0621" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 21:35:31.908301 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:31.908272 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9" Apr 24 21:35:32.910962 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:32.910937 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" Apr 24 21:35:40.940285 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:40.940245 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f" podUID="d9dd38aa-c701-426e-aa68-982add5b0621" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 21:35:50.940020 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:50.939977 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f" 
podUID="d9dd38aa-c701-426e-aa68-982add5b0621" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 21:35:55.287014 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.286979 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9"] Apr 24 21:35:55.287370 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.287261 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9" podUID="6c2ac472-afc3-42bb-98cb-b32a47f4574c" containerName="kserve-container" containerID="cri-o://82ed1ca8b04a3bd328a48a9838d51380c369b99add884cca91af08c2da979435" gracePeriod=30 Apr 24 21:35:55.287370 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.287296 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9" podUID="6c2ac472-afc3-42bb-98cb-b32a47f4574c" containerName="kube-rbac-proxy" containerID="cri-o://4a373f2fa59468fee91da1c120806ee6c5dc76102fc7ea7c631122c7a5ba8794" gracePeriod=30 Apr 24 21:35:55.367583 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.367537 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r"] Apr 24 21:35:55.367851 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.367830 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" podUID="5f7de5da-b616-41d1-90a8-4c2f4e9f81e7" containerName="kserve-container" containerID="cri-o://cc92a70309cbc68f4f0f2abb183c10ea8cc002547b6de0e93135d4a1b250e536" gracePeriod=30 Apr 24 21:35:55.367924 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.367854 2578 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" podUID="5f7de5da-b616-41d1-90a8-4c2f4e9f81e7" containerName="kube-rbac-proxy" containerID="cri-o://bf3a3a5452ba41e0a9059e617c64b6ec8c65985bd2b4437a2df525256cd8a9bc" gracePeriod=30 Apr 24 21:35:55.382220 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.382197 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc"] Apr 24 21:35:55.382582 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.382537 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7f04af0-b878-4065-8684-c6af48ade737" containerName="console" Apr 24 21:35:55.382582 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.382578 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f04af0-b878-4065-8684-c6af48ade737" containerName="console" Apr 24 21:35:55.382713 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.382639 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7f04af0-b878-4065-8684-c6af48ade737" containerName="console" Apr 24 21:35:55.385218 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.385204 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" Apr 24 21:35:55.388096 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.388079 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-6d3ba-predictor-serving-cert\"" Apr 24 21:35:55.388183 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.388143 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-6d3ba-kube-rbac-proxy-sar-config\"" Apr 24 21:35:55.396327 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.396308 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc"] Apr 24 21:35:55.469215 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.469186 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d-proxy-tls\") pod \"success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc\" (UID: \"2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d\") " pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" Apr 24 21:35:55.469369 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.469237 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-6d3ba-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d-success-200-isvc-6d3ba-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc\" (UID: \"2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d\") " pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" Apr 24 21:35:55.469369 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.469313 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-672qn\" (UniqueName: \"kubernetes.io/projected/2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d-kube-api-access-672qn\") pod \"success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc\" (UID: \"2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d\") " pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" Apr 24 21:35:55.531110 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.531082 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt"] Apr 24 21:35:55.534193 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.534177 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" Apr 24 21:35:55.536574 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.536534 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-6d3ba-predictor-serving-cert\"" Apr 24 21:35:55.536673 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.536574 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-6d3ba-kube-rbac-proxy-sar-config\"" Apr 24 21:35:55.546515 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.546495 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt"] Apr 24 21:35:55.570429 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.570406 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-672qn\" (UniqueName: \"kubernetes.io/projected/2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d-kube-api-access-672qn\") pod \"success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc\" (UID: \"2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d\") " pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" Apr 24 21:35:55.570528 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.570469 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d-proxy-tls\") pod \"success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc\" (UID: \"2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d\") " pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" Apr 24 21:35:55.570528 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.570497 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-6d3ba-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d-success-200-isvc-6d3ba-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc\" (UID: \"2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d\") " pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" Apr 24 21:35:55.571086 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.571060 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-6d3ba-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d-success-200-isvc-6d3ba-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc\" (UID: \"2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d\") " pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" Apr 24 21:35:55.572979 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.572961 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d-proxy-tls\") pod \"success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc\" (UID: \"2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d\") " pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" Apr 24 21:35:55.582230 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.582211 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-672qn\" (UniqueName: \"kubernetes.io/projected/2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d-kube-api-access-672qn\") pod \"success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc\" (UID: \"2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d\") " pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" Apr 24 21:35:55.671054 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.671025 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-6d3ba-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/be31fcea-aebe-4dbd-b1e7-4776f3785a40-error-404-isvc-6d3ba-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt\" (UID: \"be31fcea-aebe-4dbd-b1e7-4776f3785a40\") " pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" Apr 24 21:35:55.671191 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.671088 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrtvr\" (UniqueName: \"kubernetes.io/projected/be31fcea-aebe-4dbd-b1e7-4776f3785a40-kube-api-access-vrtvr\") pod \"error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt\" (UID: \"be31fcea-aebe-4dbd-b1e7-4776f3785a40\") " pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" Apr 24 21:35:55.671191 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.671124 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be31fcea-aebe-4dbd-b1e7-4776f3785a40-proxy-tls\") pod \"error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt\" (UID: \"be31fcea-aebe-4dbd-b1e7-4776f3785a40\") " pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" Apr 24 21:35:55.695945 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.695919 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" Apr 24 21:35:55.772465 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.772435 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-6d3ba-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/be31fcea-aebe-4dbd-b1e7-4776f3785a40-error-404-isvc-6d3ba-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt\" (UID: \"be31fcea-aebe-4dbd-b1e7-4776f3785a40\") " pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" Apr 24 21:35:55.772610 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.772504 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrtvr\" (UniqueName: \"kubernetes.io/projected/be31fcea-aebe-4dbd-b1e7-4776f3785a40-kube-api-access-vrtvr\") pod \"error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt\" (UID: \"be31fcea-aebe-4dbd-b1e7-4776f3785a40\") " pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" Apr 24 21:35:55.772610 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.772527 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be31fcea-aebe-4dbd-b1e7-4776f3785a40-proxy-tls\") pod \"error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt\" (UID: \"be31fcea-aebe-4dbd-b1e7-4776f3785a40\") " pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" Apr 24 21:35:55.773226 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.773198 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-6d3ba-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/be31fcea-aebe-4dbd-b1e7-4776f3785a40-error-404-isvc-6d3ba-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt\" (UID: \"be31fcea-aebe-4dbd-b1e7-4776f3785a40\") " 
pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" Apr 24 21:35:55.775373 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.774956 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be31fcea-aebe-4dbd-b1e7-4776f3785a40-proxy-tls\") pod \"error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt\" (UID: \"be31fcea-aebe-4dbd-b1e7-4776f3785a40\") " pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" Apr 24 21:35:55.781508 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.781482 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrtvr\" (UniqueName: \"kubernetes.io/projected/be31fcea-aebe-4dbd-b1e7-4776f3785a40-kube-api-access-vrtvr\") pod \"error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt\" (UID: \"be31fcea-aebe-4dbd-b1e7-4776f3785a40\") " pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" Apr 24 21:35:55.820477 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.820360 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc"] Apr 24 21:35:55.823316 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:35:55.823291 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fa5eddf_8404_4732_93a3_3f6f1e0b7d1d.slice/crio-02a39f6597acd15ba7165b941dbc340a811d02f4435e1a875a593633e4797f2e WatchSource:0}: Error finding container 02a39f6597acd15ba7165b941dbc340a811d02f4435e1a875a593633e4797f2e: Status 404 returned error can't find the container with id 02a39f6597acd15ba7165b941dbc340a811d02f4435e1a875a593633e4797f2e Apr 24 21:35:55.843330 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.843312 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" Apr 24 21:35:55.970357 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:55.970331 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt"] Apr 24 21:35:55.973099 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:35:55.973059 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe31fcea_aebe_4dbd_b1e7_4776f3785a40.slice/crio-ad6252d115442cbe65499008a990bc0ef8257070a197aad456e12e86ff5f3286 WatchSource:0}: Error finding container ad6252d115442cbe65499008a990bc0ef8257070a197aad456e12e86ff5f3286: Status 404 returned error can't find the container with id ad6252d115442cbe65499008a990bc0ef8257070a197aad456e12e86ff5f3286 Apr 24 21:35:56.115566 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:56.115515 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" event={"ID":"2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d","Type":"ContainerStarted","Data":"df8bfbee38cb6e1c1ace3d2bb11b7da729d8c3281df3fd64456edf0145626e2e"} Apr 24 21:35:56.115697 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:56.115585 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" event={"ID":"2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d","Type":"ContainerStarted","Data":"6322705f57d26155dac574a55351d2e4af29b20990db47cb2249e4e48c47785c"} Apr 24 21:35:56.115697 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:56.115600 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" event={"ID":"2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d","Type":"ContainerStarted","Data":"02a39f6597acd15ba7165b941dbc340a811d02f4435e1a875a593633e4797f2e"} Apr 24 21:35:56.116466 ip-10-0-132-124 
kubenswrapper[2578]: I0424 21:35:56.116441 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" Apr 24 21:35:56.116576 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:56.116475 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" Apr 24 21:35:56.119620 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:56.119588 2578 generic.go:358] "Generic (PLEG): container finished" podID="6c2ac472-afc3-42bb-98cb-b32a47f4574c" containerID="4a373f2fa59468fee91da1c120806ee6c5dc76102fc7ea7c631122c7a5ba8794" exitCode=2 Apr 24 21:35:56.119717 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:56.119661 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9" event={"ID":"6c2ac472-afc3-42bb-98cb-b32a47f4574c","Type":"ContainerDied","Data":"4a373f2fa59468fee91da1c120806ee6c5dc76102fc7ea7c631122c7a5ba8794"} Apr 24 21:35:56.121171 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:56.121144 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" podUID="2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 21:35:56.122601 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:56.122473 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" event={"ID":"be31fcea-aebe-4dbd-b1e7-4776f3785a40","Type":"ContainerStarted","Data":"0b71176b6557597a843d05ffbce07c80b1ff6e4b093957b8a56f23074e9197ba"} Apr 24 21:35:56.122601 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:56.122502 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" event={"ID":"be31fcea-aebe-4dbd-b1e7-4776f3785a40","Type":"ContainerStarted","Data":"2aeb9abf2a6647091b1a5a05508147e018673ccd2a7b7ad769a72fcc6a12ac60"} Apr 24 21:35:56.122601 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:56.122516 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" event={"ID":"be31fcea-aebe-4dbd-b1e7-4776f3785a40","Type":"ContainerStarted","Data":"ad6252d115442cbe65499008a990bc0ef8257070a197aad456e12e86ff5f3286"} Apr 24 21:35:56.122810 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:56.122677 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" Apr 24 21:35:56.124224 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:56.124200 2578 generic.go:358] "Generic (PLEG): container finished" podID="5f7de5da-b616-41d1-90a8-4c2f4e9f81e7" containerID="bf3a3a5452ba41e0a9059e617c64b6ec8c65985bd2b4437a2df525256cd8a9bc" exitCode=2 Apr 24 21:35:56.124318 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:56.124238 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" event={"ID":"5f7de5da-b616-41d1-90a8-4c2f4e9f81e7","Type":"ContainerDied","Data":"bf3a3a5452ba41e0a9059e617c64b6ec8c65985bd2b4437a2df525256cd8a9bc"} Apr 24 21:35:56.134355 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:56.134311 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" podStartSLOduration=1.134294807 podStartE2EDuration="1.134294807s" podCreationTimestamp="2026-04-24 21:35:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:35:56.133641714 +0000 UTC m=+502.309376388" 
watchObservedRunningTime="2026-04-24 21:35:56.134294807 +0000 UTC m=+502.310029481" Apr 24 21:35:56.153513 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:56.153474 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" podStartSLOduration=1.153462561 podStartE2EDuration="1.153462561s" podCreationTimestamp="2026-04-24 21:35:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:35:56.151904144 +0000 UTC m=+502.327638817" watchObservedRunningTime="2026-04-24 21:35:56.153462561 +0000 UTC m=+502.329197225" Apr 24 21:35:56.902732 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:56.902693 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9" podUID="6c2ac472-afc3-42bb-98cb-b32a47f4574c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.23:8643/healthz\": dial tcp 10.134.0.23:8643: connect: connection refused" Apr 24 21:35:57.127889 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:57.127861 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" Apr 24 21:35:57.128057 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:57.127952 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" podUID="2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 21:35:57.128840 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:57.128818 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" podUID="be31fcea-aebe-4dbd-b1e7-4776f3785a40" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 24 21:35:57.905363 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:57.905327 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" podUID="5f7de5da-b616-41d1-90a8-4c2f4e9f81e7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.25:8643/healthz\": dial tcp 10.134.0.25:8643: connect: connection refused" Apr 24 21:35:58.130696 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:58.130654 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" podUID="2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 21:35:58.130696 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:58.130691 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" podUID="be31fcea-aebe-4dbd-b1e7-4776f3785a40" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 24 21:35:58.934657 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:58.934633 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9" Apr 24 21:35:59.003162 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.003135 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6c2ac472-afc3-42bb-98cb-b32a47f4574c-proxy-tls\") pod \"6c2ac472-afc3-42bb-98cb-b32a47f4574c\" (UID: \"6c2ac472-afc3-42bb-98cb-b32a47f4574c\") " Apr 24 21:35:59.003245 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.003209 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8nk8\" (UniqueName: \"kubernetes.io/projected/6c2ac472-afc3-42bb-98cb-b32a47f4574c-kube-api-access-k8nk8\") pod \"6c2ac472-afc3-42bb-98cb-b32a47f4574c\" (UID: \"6c2ac472-afc3-42bb-98cb-b32a47f4574c\") " Apr 24 21:35:59.003245 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.003240 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-9db4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6c2ac472-afc3-42bb-98cb-b32a47f4574c-success-200-isvc-9db4b-kube-rbac-proxy-sar-config\") pod \"6c2ac472-afc3-42bb-98cb-b32a47f4574c\" (UID: \"6c2ac472-afc3-42bb-98cb-b32a47f4574c\") " Apr 24 21:35:59.003674 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.003644 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c2ac472-afc3-42bb-98cb-b32a47f4574c-success-200-isvc-9db4b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-9db4b-kube-rbac-proxy-sar-config") pod "6c2ac472-afc3-42bb-98cb-b32a47f4574c" (UID: "6c2ac472-afc3-42bb-98cb-b32a47f4574c"). InnerVolumeSpecName "success-200-isvc-9db4b-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:35:59.005421 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.005391 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c2ac472-afc3-42bb-98cb-b32a47f4574c-kube-api-access-k8nk8" (OuterVolumeSpecName: "kube-api-access-k8nk8") pod "6c2ac472-afc3-42bb-98cb-b32a47f4574c" (UID: "6c2ac472-afc3-42bb-98cb-b32a47f4574c"). InnerVolumeSpecName "kube-api-access-k8nk8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:35:59.005503 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.005432 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c2ac472-afc3-42bb-98cb-b32a47f4574c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6c2ac472-afc3-42bb-98cb-b32a47f4574c" (UID: "6c2ac472-afc3-42bb-98cb-b32a47f4574c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:35:59.006925 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.006909 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" Apr 24 21:35:59.104205 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.104176 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-9db4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5f7de5da-b616-41d1-90a8-4c2f4e9f81e7-error-404-isvc-9db4b-kube-rbac-proxy-sar-config\") pod \"5f7de5da-b616-41d1-90a8-4c2f4e9f81e7\" (UID: \"5f7de5da-b616-41d1-90a8-4c2f4e9f81e7\") " Apr 24 21:35:59.104329 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.104261 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bjhr\" (UniqueName: \"kubernetes.io/projected/5f7de5da-b616-41d1-90a8-4c2f4e9f81e7-kube-api-access-4bjhr\") pod \"5f7de5da-b616-41d1-90a8-4c2f4e9f81e7\" (UID: \"5f7de5da-b616-41d1-90a8-4c2f4e9f81e7\") " Apr 24 21:35:59.104329 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.104285 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f7de5da-b616-41d1-90a8-4c2f4e9f81e7-proxy-tls\") pod \"5f7de5da-b616-41d1-90a8-4c2f4e9f81e7\" (UID: \"5f7de5da-b616-41d1-90a8-4c2f4e9f81e7\") " Apr 24 21:35:59.104409 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.104392 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f7de5da-b616-41d1-90a8-4c2f4e9f81e7-error-404-isvc-9db4b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-9db4b-kube-rbac-proxy-sar-config") pod "5f7de5da-b616-41d1-90a8-4c2f4e9f81e7" (UID: "5f7de5da-b616-41d1-90a8-4c2f4e9f81e7"). InnerVolumeSpecName "error-404-isvc-9db4b-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:35:59.104614 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.104596 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6c2ac472-afc3-42bb-98cb-b32a47f4574c-proxy-tls\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:35:59.104658 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.104622 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k8nk8\" (UniqueName: \"kubernetes.io/projected/6c2ac472-afc3-42bb-98cb-b32a47f4574c-kube-api-access-k8nk8\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:35:59.104658 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.104636 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-9db4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5f7de5da-b616-41d1-90a8-4c2f4e9f81e7-error-404-isvc-9db4b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:35:59.104658 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.104647 2578 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-9db4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6c2ac472-afc3-42bb-98cb-b32a47f4574c-success-200-isvc-9db4b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:35:59.106456 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.106432 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f7de5da-b616-41d1-90a8-4c2f4e9f81e7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5f7de5da-b616-41d1-90a8-4c2f4e9f81e7" (UID: "5f7de5da-b616-41d1-90a8-4c2f4e9f81e7"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:35:59.106456 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.106433 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f7de5da-b616-41d1-90a8-4c2f4e9f81e7-kube-api-access-4bjhr" (OuterVolumeSpecName: "kube-api-access-4bjhr") pod "5f7de5da-b616-41d1-90a8-4c2f4e9f81e7" (UID: "5f7de5da-b616-41d1-90a8-4c2f4e9f81e7"). InnerVolumeSpecName "kube-api-access-4bjhr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:35:59.134779 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.134752 2578 generic.go:358] "Generic (PLEG): container finished" podID="5f7de5da-b616-41d1-90a8-4c2f4e9f81e7" containerID="cc92a70309cbc68f4f0f2abb183c10ea8cc002547b6de0e93135d4a1b250e536" exitCode=0 Apr 24 21:35:59.134899 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.134822 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" Apr 24 21:35:59.134899 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.134835 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" event={"ID":"5f7de5da-b616-41d1-90a8-4c2f4e9f81e7","Type":"ContainerDied","Data":"cc92a70309cbc68f4f0f2abb183c10ea8cc002547b6de0e93135d4a1b250e536"} Apr 24 21:35:59.134899 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.134876 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r" event={"ID":"5f7de5da-b616-41d1-90a8-4c2f4e9f81e7","Type":"ContainerDied","Data":"fddb86e021c14ee03657f370d7abde55b242444455720571052124d9925bcdae"} Apr 24 21:35:59.134899 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.134892 2578 scope.go:117] "RemoveContainer" containerID="bf3a3a5452ba41e0a9059e617c64b6ec8c65985bd2b4437a2df525256cd8a9bc" Apr 24 21:35:59.136341 
ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.136317 2578 generic.go:358] "Generic (PLEG): container finished" podID="6c2ac472-afc3-42bb-98cb-b32a47f4574c" containerID="82ed1ca8b04a3bd328a48a9838d51380c369b99add884cca91af08c2da979435" exitCode=0 Apr 24 21:35:59.136442 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.136367 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9" event={"ID":"6c2ac472-afc3-42bb-98cb-b32a47f4574c","Type":"ContainerDied","Data":"82ed1ca8b04a3bd328a48a9838d51380c369b99add884cca91af08c2da979435"} Apr 24 21:35:59.136442 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.136378 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9" Apr 24 21:35:59.136442 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.136388 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9" event={"ID":"6c2ac472-afc3-42bb-98cb-b32a47f4574c","Type":"ContainerDied","Data":"75fd18949d304aee294cdd8671eef66bf8385c67ac12ec8d52a1fd6ef904684e"} Apr 24 21:35:59.144104 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.143953 2578 scope.go:117] "RemoveContainer" containerID="cc92a70309cbc68f4f0f2abb183c10ea8cc002547b6de0e93135d4a1b250e536" Apr 24 21:35:59.151125 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.151111 2578 scope.go:117] "RemoveContainer" containerID="bf3a3a5452ba41e0a9059e617c64b6ec8c65985bd2b4437a2df525256cd8a9bc" Apr 24 21:35:59.151340 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:35:59.151324 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf3a3a5452ba41e0a9059e617c64b6ec8c65985bd2b4437a2df525256cd8a9bc\": container with ID starting with bf3a3a5452ba41e0a9059e617c64b6ec8c65985bd2b4437a2df525256cd8a9bc not found: ID 
does not exist" containerID="bf3a3a5452ba41e0a9059e617c64b6ec8c65985bd2b4437a2df525256cd8a9bc" Apr 24 21:35:59.151387 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.151347 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf3a3a5452ba41e0a9059e617c64b6ec8c65985bd2b4437a2df525256cd8a9bc"} err="failed to get container status \"bf3a3a5452ba41e0a9059e617c64b6ec8c65985bd2b4437a2df525256cd8a9bc\": rpc error: code = NotFound desc = could not find container \"bf3a3a5452ba41e0a9059e617c64b6ec8c65985bd2b4437a2df525256cd8a9bc\": container with ID starting with bf3a3a5452ba41e0a9059e617c64b6ec8c65985bd2b4437a2df525256cd8a9bc not found: ID does not exist" Apr 24 21:35:59.151387 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.151361 2578 scope.go:117] "RemoveContainer" containerID="cc92a70309cbc68f4f0f2abb183c10ea8cc002547b6de0e93135d4a1b250e536" Apr 24 21:35:59.151599 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:35:59.151581 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc92a70309cbc68f4f0f2abb183c10ea8cc002547b6de0e93135d4a1b250e536\": container with ID starting with cc92a70309cbc68f4f0f2abb183c10ea8cc002547b6de0e93135d4a1b250e536 not found: ID does not exist" containerID="cc92a70309cbc68f4f0f2abb183c10ea8cc002547b6de0e93135d4a1b250e536" Apr 24 21:35:59.151655 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.151605 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc92a70309cbc68f4f0f2abb183c10ea8cc002547b6de0e93135d4a1b250e536"} err="failed to get container status \"cc92a70309cbc68f4f0f2abb183c10ea8cc002547b6de0e93135d4a1b250e536\": rpc error: code = NotFound desc = could not find container \"cc92a70309cbc68f4f0f2abb183c10ea8cc002547b6de0e93135d4a1b250e536\": container with ID starting with cc92a70309cbc68f4f0f2abb183c10ea8cc002547b6de0e93135d4a1b250e536 not found: ID does not 
exist" Apr 24 21:35:59.151655 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.151622 2578 scope.go:117] "RemoveContainer" containerID="4a373f2fa59468fee91da1c120806ee6c5dc76102fc7ea7c631122c7a5ba8794" Apr 24 21:35:59.158184 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.158167 2578 scope.go:117] "RemoveContainer" containerID="82ed1ca8b04a3bd328a48a9838d51380c369b99add884cca91af08c2da979435" Apr 24 21:35:59.162447 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.162427 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9"] Apr 24 21:35:59.165793 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.165717 2578 scope.go:117] "RemoveContainer" containerID="4a373f2fa59468fee91da1c120806ee6c5dc76102fc7ea7c631122c7a5ba8794" Apr 24 21:35:59.166215 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:35:59.166183 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a373f2fa59468fee91da1c120806ee6c5dc76102fc7ea7c631122c7a5ba8794\": container with ID starting with 4a373f2fa59468fee91da1c120806ee6c5dc76102fc7ea7c631122c7a5ba8794 not found: ID does not exist" containerID="4a373f2fa59468fee91da1c120806ee6c5dc76102fc7ea7c631122c7a5ba8794" Apr 24 21:35:59.166312 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.166213 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a373f2fa59468fee91da1c120806ee6c5dc76102fc7ea7c631122c7a5ba8794"} err="failed to get container status \"4a373f2fa59468fee91da1c120806ee6c5dc76102fc7ea7c631122c7a5ba8794\": rpc error: code = NotFound desc = could not find container \"4a373f2fa59468fee91da1c120806ee6c5dc76102fc7ea7c631122c7a5ba8794\": container with ID starting with 4a373f2fa59468fee91da1c120806ee6c5dc76102fc7ea7c631122c7a5ba8794 not found: ID does not exist" Apr 24 21:35:59.166312 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.166233 2578 
scope.go:117] "RemoveContainer" containerID="82ed1ca8b04a3bd328a48a9838d51380c369b99add884cca91af08c2da979435" Apr 24 21:35:59.166509 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:35:59.166491 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82ed1ca8b04a3bd328a48a9838d51380c369b99add884cca91af08c2da979435\": container with ID starting with 82ed1ca8b04a3bd328a48a9838d51380c369b99add884cca91af08c2da979435 not found: ID does not exist" containerID="82ed1ca8b04a3bd328a48a9838d51380c369b99add884cca91af08c2da979435" Apr 24 21:35:59.166575 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.166513 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82ed1ca8b04a3bd328a48a9838d51380c369b99add884cca91af08c2da979435"} err="failed to get container status \"82ed1ca8b04a3bd328a48a9838d51380c369b99add884cca91af08c2da979435\": rpc error: code = NotFound desc = could not find container \"82ed1ca8b04a3bd328a48a9838d51380c369b99add884cca91af08c2da979435\": container with ID starting with 82ed1ca8b04a3bd328a48a9838d51380c369b99add884cca91af08c2da979435 not found: ID does not exist" Apr 24 21:35:59.167950 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.167931 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9db4b-predictor-5f9575b58c-pcsk9"] Apr 24 21:35:59.178668 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.178650 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r"] Apr 24 21:35:59.182634 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.182616 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9db4b-predictor-7ffc8c88fb-lb79r"] Apr 24 21:35:59.205798 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.205778 2578 reconciler_common.go:299] "Volume detached for volume 
\"kube-api-access-4bjhr\" (UniqueName: \"kubernetes.io/projected/5f7de5da-b616-41d1-90a8-4c2f4e9f81e7-kube-api-access-4bjhr\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:35:59.205891 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:35:59.205805 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f7de5da-b616-41d1-90a8-4c2f4e9f81e7-proxy-tls\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:36:00.423166 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:00.423137 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f7de5da-b616-41d1-90a8-4c2f4e9f81e7" path="/var/lib/kubelet/pods/5f7de5da-b616-41d1-90a8-4c2f4e9f81e7/volumes" Apr 24 21:36:00.423531 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:00.423518 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c2ac472-afc3-42bb-98cb-b32a47f4574c" path="/var/lib/kubelet/pods/6c2ac472-afc3-42bb-98cb-b32a47f4574c/volumes" Apr 24 21:36:00.940302 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:00.940275 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f" Apr 24 21:36:03.135004 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:03.134974 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" Apr 24 21:36:03.135445 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:03.135405 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" Apr 24 21:36:03.135513 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:03.135491 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" podUID="2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 21:36:03.135862 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:03.135845 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" podUID="be31fcea-aebe-4dbd-b1e7-4776f3785a40" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 24 21:36:13.135793 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:13.135754 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" podUID="be31fcea-aebe-4dbd-b1e7-4776f3785a40" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 24 21:36:13.136145 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:13.135754 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" podUID="2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 21:36:23.135800 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:23.135715 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" podUID="2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 21:36:23.136127 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:23.135846 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" podUID="be31fcea-aebe-4dbd-b1e7-4776f3785a40" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection 
refused" Apr 24 21:36:33.136565 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:33.136507 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" podUID="2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 21:36:33.136565 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:33.136520 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" podUID="be31fcea-aebe-4dbd-b1e7-4776f3785a40" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 24 21:36:35.386613 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.386579 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq"] Apr 24 21:36:35.386978 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.386966 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c2ac472-afc3-42bb-98cb-b32a47f4574c" containerName="kserve-container" Apr 24 21:36:35.387021 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.386979 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c2ac472-afc3-42bb-98cb-b32a47f4574c" containerName="kserve-container" Apr 24 21:36:35.387021 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.386991 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c2ac472-afc3-42bb-98cb-b32a47f4574c" containerName="kube-rbac-proxy" Apr 24 21:36:35.387021 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.386997 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c2ac472-afc3-42bb-98cb-b32a47f4574c" containerName="kube-rbac-proxy" Apr 24 21:36:35.387021 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.387005 2578 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="5f7de5da-b616-41d1-90a8-4c2f4e9f81e7" containerName="kserve-container" Apr 24 21:36:35.387021 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.387011 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f7de5da-b616-41d1-90a8-4c2f4e9f81e7" containerName="kserve-container" Apr 24 21:36:35.387021 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.387021 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f7de5da-b616-41d1-90a8-4c2f4e9f81e7" containerName="kube-rbac-proxy" Apr 24 21:36:35.387189 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.387027 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f7de5da-b616-41d1-90a8-4c2f4e9f81e7" containerName="kube-rbac-proxy" Apr 24 21:36:35.387189 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.387080 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f7de5da-b616-41d1-90a8-4c2f4e9f81e7" containerName="kserve-container" Apr 24 21:36:35.387189 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.387087 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c2ac472-afc3-42bb-98cb-b32a47f4574c" containerName="kube-rbac-proxy" Apr 24 21:36:35.387189 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.387092 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c2ac472-afc3-42bb-98cb-b32a47f4574c" containerName="kserve-container" Apr 24 21:36:35.387189 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.387098 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f7de5da-b616-41d1-90a8-4c2f4e9f81e7" containerName="kube-rbac-proxy" Apr 24 21:36:35.391500 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.391483 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" Apr 24 21:36:35.398838 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.398819 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-54155-predictor-serving-cert\"" Apr 24 21:36:35.398923 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.398885 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-54155-kube-rbac-proxy-sar-config\"" Apr 24 21:36:35.422675 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.422650 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq"] Apr 24 21:36:35.443159 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.443134 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f"] Apr 24 21:36:35.443436 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.443411 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f" podUID="d9dd38aa-c701-426e-aa68-982add5b0621" containerName="kserve-container" containerID="cri-o://36f38f95a80abec3528c1889d6c381db8da6bf7c320db9176ad13b8e6bcb7f8e" gracePeriod=30 Apr 24 21:36:35.443494 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.443440 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f" podUID="d9dd38aa-c701-426e-aa68-982add5b0621" containerName="kube-rbac-proxy" containerID="cri-o://af33fbce5bdfca5e0de757d65310b5c92b84ceb33a3caa40e131c3a39e273f52" gracePeriod=30 Apr 24 21:36:35.506745 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.506721 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a38b40de-e304-4264-9768-c8b049b630a2-proxy-tls\") pod \"success-200-isvc-54155-predictor-744c89d589-bgsvq\" (UID: \"a38b40de-e304-4264-9768-c8b049b630a2\") " pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" Apr 24 21:36:35.506843 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.506770 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-54155-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a38b40de-e304-4264-9768-c8b049b630a2-success-200-isvc-54155-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-54155-predictor-744c89d589-bgsvq\" (UID: \"a38b40de-e304-4264-9768-c8b049b630a2\") " pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" Apr 24 21:36:35.506843 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.506828 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc2rw\" (UniqueName: \"kubernetes.io/projected/a38b40de-e304-4264-9768-c8b049b630a2-kube-api-access-lc2rw\") pod \"success-200-isvc-54155-predictor-744c89d589-bgsvq\" (UID: \"a38b40de-e304-4264-9768-c8b049b630a2\") " pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" Apr 24 21:36:35.579580 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.579535 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl"] Apr 24 21:36:35.582874 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.582856 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" Apr 24 21:36:35.586431 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.586410 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-54155-predictor-serving-cert\"" Apr 24 21:36:35.588232 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.588214 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-54155-kube-rbac-proxy-sar-config\"" Apr 24 21:36:35.607432 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.607411 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a38b40de-e304-4264-9768-c8b049b630a2-proxy-tls\") pod \"success-200-isvc-54155-predictor-744c89d589-bgsvq\" (UID: \"a38b40de-e304-4264-9768-c8b049b630a2\") " pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" Apr 24 21:36:35.607515 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.607456 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-54155-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a38b40de-e304-4264-9768-c8b049b630a2-success-200-isvc-54155-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-54155-predictor-744c89d589-bgsvq\" (UID: \"a38b40de-e304-4264-9768-c8b049b630a2\") " pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" Apr 24 21:36:35.607515 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.607492 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lc2rw\" (UniqueName: \"kubernetes.io/projected/a38b40de-e304-4264-9768-c8b049b630a2-kube-api-access-lc2rw\") pod \"success-200-isvc-54155-predictor-744c89d589-bgsvq\" (UID: \"a38b40de-e304-4264-9768-c8b049b630a2\") " 
pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" Apr 24 21:36:35.607636 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:36:35.607559 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-54155-predictor-serving-cert: secret "success-200-isvc-54155-predictor-serving-cert" not found Apr 24 21:36:35.607636 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:36:35.607625 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a38b40de-e304-4264-9768-c8b049b630a2-proxy-tls podName:a38b40de-e304-4264-9768-c8b049b630a2 nodeName:}" failed. No retries permitted until 2026-04-24 21:36:36.107608667 +0000 UTC m=+542.283343318 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a38b40de-e304-4264-9768-c8b049b630a2-proxy-tls") pod "success-200-isvc-54155-predictor-744c89d589-bgsvq" (UID: "a38b40de-e304-4264-9768-c8b049b630a2") : secret "success-200-isvc-54155-predictor-serving-cert" not found Apr 24 21:36:35.608097 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.608080 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-54155-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a38b40de-e304-4264-9768-c8b049b630a2-success-200-isvc-54155-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-54155-predictor-744c89d589-bgsvq\" (UID: \"a38b40de-e304-4264-9768-c8b049b630a2\") " pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" Apr 24 21:36:35.620517 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.620490 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl"] Apr 24 21:36:35.629146 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.629119 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc2rw\" (UniqueName: 
\"kubernetes.io/projected/a38b40de-e304-4264-9768-c8b049b630a2-kube-api-access-lc2rw\") pod \"success-200-isvc-54155-predictor-744c89d589-bgsvq\" (UID: \"a38b40de-e304-4264-9768-c8b049b630a2\") " pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" Apr 24 21:36:35.708870 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.708794 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-54155-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/affb13ef-1273-40ee-a013-bddd85341559-error-404-isvc-54155-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-54155-predictor-db578f77d-7fmtl\" (UID: \"affb13ef-1273-40ee-a013-bddd85341559\") " pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" Apr 24 21:36:35.708870 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.708860 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq2h7\" (UniqueName: \"kubernetes.io/projected/affb13ef-1273-40ee-a013-bddd85341559-kube-api-access-rq2h7\") pod \"error-404-isvc-54155-predictor-db578f77d-7fmtl\" (UID: \"affb13ef-1273-40ee-a013-bddd85341559\") " pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" Apr 24 21:36:35.709040 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.708926 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/affb13ef-1273-40ee-a013-bddd85341559-proxy-tls\") pod \"error-404-isvc-54155-predictor-db578f77d-7fmtl\" (UID: \"affb13ef-1273-40ee-a013-bddd85341559\") " pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" Apr 24 21:36:35.809326 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.809299 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rq2h7\" (UniqueName: 
\"kubernetes.io/projected/affb13ef-1273-40ee-a013-bddd85341559-kube-api-access-rq2h7\") pod \"error-404-isvc-54155-predictor-db578f77d-7fmtl\" (UID: \"affb13ef-1273-40ee-a013-bddd85341559\") " pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" Apr 24 21:36:35.809486 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.809336 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/affb13ef-1273-40ee-a013-bddd85341559-proxy-tls\") pod \"error-404-isvc-54155-predictor-db578f77d-7fmtl\" (UID: \"affb13ef-1273-40ee-a013-bddd85341559\") " pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" Apr 24 21:36:35.809486 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.809431 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-54155-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/affb13ef-1273-40ee-a013-bddd85341559-error-404-isvc-54155-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-54155-predictor-db578f77d-7fmtl\" (UID: \"affb13ef-1273-40ee-a013-bddd85341559\") " pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" Apr 24 21:36:35.810053 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.810032 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-54155-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/affb13ef-1273-40ee-a013-bddd85341559-error-404-isvc-54155-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-54155-predictor-db578f77d-7fmtl\" (UID: \"affb13ef-1273-40ee-a013-bddd85341559\") " pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" Apr 24 21:36:35.811919 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.811893 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/affb13ef-1273-40ee-a013-bddd85341559-proxy-tls\") pod 
\"error-404-isvc-54155-predictor-db578f77d-7fmtl\" (UID: \"affb13ef-1273-40ee-a013-bddd85341559\") " pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl"
Apr 24 21:36:35.819034 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.819008 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq2h7\" (UniqueName: \"kubernetes.io/projected/affb13ef-1273-40ee-a013-bddd85341559-kube-api-access-rq2h7\") pod \"error-404-isvc-54155-predictor-db578f77d-7fmtl\" (UID: \"affb13ef-1273-40ee-a013-bddd85341559\") " pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl"
Apr 24 21:36:35.892688 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.892663 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl"
Apr 24 21:36:35.934907 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:35.934867 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f" podUID="d9dd38aa-c701-426e-aa68-982add5b0621" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.24:8643/healthz\": dial tcp 10.134.0.24:8643: connect: connection refused"
Apr 24 21:36:36.013702 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:36.011944 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl"]
Apr 24 21:36:36.015765 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:36:36.015718 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaffb13ef_1273_40ee_a013_bddd85341559.slice/crio-d052f4abbfcc43b05349df39b8b88d956a04a5fc0e2581b5bfe3bf4d04ea7dce WatchSource:0}: Error finding container d052f4abbfcc43b05349df39b8b88d956a04a5fc0e2581b5bfe3bf4d04ea7dce: Status 404 returned error can't find the container with id d052f4abbfcc43b05349df39b8b88d956a04a5fc0e2581b5bfe3bf4d04ea7dce
Apr 24 21:36:36.112446 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:36.112398 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a38b40de-e304-4264-9768-c8b049b630a2-proxy-tls\") pod \"success-200-isvc-54155-predictor-744c89d589-bgsvq\" (UID: \"a38b40de-e304-4264-9768-c8b049b630a2\") " pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq"
Apr 24 21:36:36.115221 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:36.115199 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a38b40de-e304-4264-9768-c8b049b630a2-proxy-tls\") pod \"success-200-isvc-54155-predictor-744c89d589-bgsvq\" (UID: \"a38b40de-e304-4264-9768-c8b049b630a2\") " pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq"
Apr 24 21:36:36.253761 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:36.253686 2578 generic.go:358] "Generic (PLEG): container finished" podID="d9dd38aa-c701-426e-aa68-982add5b0621" containerID="af33fbce5bdfca5e0de757d65310b5c92b84ceb33a3caa40e131c3a39e273f52" exitCode=2
Apr 24 21:36:36.253897 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:36.253761 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f" event={"ID":"d9dd38aa-c701-426e-aa68-982add5b0621","Type":"ContainerDied","Data":"af33fbce5bdfca5e0de757d65310b5c92b84ceb33a3caa40e131c3a39e273f52"}
Apr 24 21:36:36.255120 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:36.255100 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" event={"ID":"affb13ef-1273-40ee-a013-bddd85341559","Type":"ContainerStarted","Data":"b994dd1d206ee5ecf0e220f27fc735c508698c06e51600ea4b5c16204b441fc9"}
Apr 24 21:36:36.255238 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:36.255126 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" event={"ID":"affb13ef-1273-40ee-a013-bddd85341559","Type":"ContainerStarted","Data":"3115bf990807279ae2915a75e9f0a50583c57ea0d0228ac9683938602367a5b4"}
Apr 24 21:36:36.255238 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:36.255139 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" event={"ID":"affb13ef-1273-40ee-a013-bddd85341559","Type":"ContainerStarted","Data":"d052f4abbfcc43b05349df39b8b88d956a04a5fc0e2581b5bfe3bf4d04ea7dce"}
Apr 24 21:36:36.255238 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:36.255222 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl"
Apr 24 21:36:36.272607 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:36.272569 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" podStartSLOduration=1.272557307 podStartE2EDuration="1.272557307s" podCreationTimestamp="2026-04-24 21:36:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:36:36.270644024 +0000 UTC m=+542.446378695" watchObservedRunningTime="2026-04-24 21:36:36.272557307 +0000 UTC m=+542.448291971"
Apr 24 21:36:36.300838 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:36.300820 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq"
Apr 24 21:36:36.429323 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:36.429298 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq"]
Apr 24 21:36:37.259241 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:37.259201 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" event={"ID":"a38b40de-e304-4264-9768-c8b049b630a2","Type":"ContainerStarted","Data":"1e4679e9efcea4572a60f0ea755f23a34ab25d98c99f2f48e6626e94816e3c5b"}
Apr 24 21:36:37.259241 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:37.259239 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" event={"ID":"a38b40de-e304-4264-9768-c8b049b630a2","Type":"ContainerStarted","Data":"f19cb8316382301bf87d716cfdbedae4f85d6c042030e383ffb835ef1df05f2d"}
Apr 24 21:36:37.259241 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:37.259248 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" event={"ID":"a38b40de-e304-4264-9768-c8b049b630a2","Type":"ContainerStarted","Data":"bf53ee51c19f4aaba039d56b71155d628aad9ab0bb2e414fd31e279269416df3"}
Apr 24 21:36:37.259571 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:37.259377 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl"
Apr 24 21:36:37.259571 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:37.259425 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq"
Apr 24 21:36:37.259571 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:37.259438 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq"
Apr 24 21:36:37.260508 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:37.260488 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" podUID="a38b40de-e304-4264-9768-c8b049b630a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 24 21:36:37.260582 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:37.260488 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" podUID="affb13ef-1273-40ee-a013-bddd85341559" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 24 21:36:37.277760 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:37.277716 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" podStartSLOduration=2.277704441 podStartE2EDuration="2.277704441s" podCreationTimestamp="2026-04-24 21:36:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:36:37.276060677 +0000 UTC m=+543.451795362" watchObservedRunningTime="2026-04-24 21:36:37.277704441 +0000 UTC m=+543.453439161"
Apr 24 21:36:38.262334 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:38.262292 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" podUID="a38b40de-e304-4264-9768-c8b049b630a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 24 21:36:38.262727 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:38.262299 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" podUID="affb13ef-1273-40ee-a013-bddd85341559" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 24 21:36:40.269030 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:40.269004 2578 generic.go:358] "Generic (PLEG): container finished" podID="d9dd38aa-c701-426e-aa68-982add5b0621" containerID="36f38f95a80abec3528c1889d6c381db8da6bf7c320db9176ad13b8e6bcb7f8e" exitCode=0
Apr 24 21:36:40.269310 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:40.269051 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f" event={"ID":"d9dd38aa-c701-426e-aa68-982add5b0621","Type":"ContainerDied","Data":"36f38f95a80abec3528c1889d6c381db8da6bf7c320db9176ad13b8e6bcb7f8e"}
Apr 24 21:36:40.383797 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:40.383777 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f"
Apr 24 21:36:40.453742 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:40.453713 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9dd38aa-c701-426e-aa68-982add5b0621-proxy-tls\") pod \"d9dd38aa-c701-426e-aa68-982add5b0621\" (UID: \"d9dd38aa-c701-426e-aa68-982add5b0621\") "
Apr 24 21:36:40.453869 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:40.453780 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9dd38aa-c701-426e-aa68-982add5b0621-kserve-provision-location\") pod \"d9dd38aa-c701-426e-aa68-982add5b0621\" (UID: \"d9dd38aa-c701-426e-aa68-982add5b0621\") "
Apr 24 21:36:40.453869 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:40.453802 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9dd38aa-c701-426e-aa68-982add5b0621-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"d9dd38aa-c701-426e-aa68-982add5b0621\" (UID: \"d9dd38aa-c701-426e-aa68-982add5b0621\") "
Apr 24 21:36:40.453869 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:40.453848 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dxlh\" (UniqueName: \"kubernetes.io/projected/d9dd38aa-c701-426e-aa68-982add5b0621-kube-api-access-5dxlh\") pod \"d9dd38aa-c701-426e-aa68-982add5b0621\" (UID: \"d9dd38aa-c701-426e-aa68-982add5b0621\") "
Apr 24 21:36:40.454101 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:40.454079 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9dd38aa-c701-426e-aa68-982add5b0621-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d9dd38aa-c701-426e-aa68-982add5b0621" (UID: "d9dd38aa-c701-426e-aa68-982add5b0621"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:36:40.454191 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:40.454170 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9dd38aa-c701-426e-aa68-982add5b0621-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config") pod "d9dd38aa-c701-426e-aa68-982add5b0621" (UID: "d9dd38aa-c701-426e-aa68-982add5b0621"). InnerVolumeSpecName "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:36:40.455774 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:40.455754 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9dd38aa-c701-426e-aa68-982add5b0621-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d9dd38aa-c701-426e-aa68-982add5b0621" (UID: "d9dd38aa-c701-426e-aa68-982add5b0621"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:36:40.455948 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:40.455926 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9dd38aa-c701-426e-aa68-982add5b0621-kube-api-access-5dxlh" (OuterVolumeSpecName: "kube-api-access-5dxlh") pod "d9dd38aa-c701-426e-aa68-982add5b0621" (UID: "d9dd38aa-c701-426e-aa68-982add5b0621"). InnerVolumeSpecName "kube-api-access-5dxlh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:36:40.554339 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:40.554316 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9dd38aa-c701-426e-aa68-982add5b0621-proxy-tls\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:36:40.554339 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:40.554338 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9dd38aa-c701-426e-aa68-982add5b0621-kserve-provision-location\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:36:40.554481 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:40.554348 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9dd38aa-c701-426e-aa68-982add5b0621-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:36:40.554481 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:40.554358 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5dxlh\" (UniqueName: \"kubernetes.io/projected/d9dd38aa-c701-426e-aa68-982add5b0621-kube-api-access-5dxlh\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:36:41.273447 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:41.273413 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f" event={"ID":"d9dd38aa-c701-426e-aa68-982add5b0621","Type":"ContainerDied","Data":"0348ead6d8fbac229076e1b39b26741b1da47f2196b00ba27d7768780f3e447c"}
Apr 24 21:36:41.273926 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:41.273457 2578 scope.go:117] "RemoveContainer" containerID="af33fbce5bdfca5e0de757d65310b5c92b84ceb33a3caa40e131c3a39e273f52"
Apr 24 21:36:41.273926 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:41.273496 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f"
Apr 24 21:36:41.281668 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:41.281651 2578 scope.go:117] "RemoveContainer" containerID="36f38f95a80abec3528c1889d6c381db8da6bf7c320db9176ad13b8e6bcb7f8e"
Apr 24 21:36:41.288637 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:41.288624 2578 scope.go:117] "RemoveContainer" containerID="5ca475d0e3c013ee239fdb1713eaeb56aba4101d2e4829f254ad3c0754228434"
Apr 24 21:36:41.296627 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:41.296576 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f"]
Apr 24 21:36:41.301197 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:41.301177 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-58d89b86bf-hch9f"]
Apr 24 21:36:42.421793 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:42.421764 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9dd38aa-c701-426e-aa68-982add5b0621" path="/var/lib/kubelet/pods/d9dd38aa-c701-426e-aa68-982add5b0621/volumes"
Apr 24 21:36:43.136724 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:43.136695 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc"
Apr 24 21:36:43.136909 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:43.136763 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt"
Apr 24 21:36:43.267114 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:43.267086 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl"
Apr 24 21:36:43.267248 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:43.267139 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq"
Apr 24 21:36:43.267664 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:43.267638 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" podUID="a38b40de-e304-4264-9768-c8b049b630a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 24 21:36:43.267935 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:43.267912 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" podUID="affb13ef-1273-40ee-a013-bddd85341559" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 24 21:36:53.267769 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:53.267725 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" podUID="a38b40de-e304-4264-9768-c8b049b630a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 24 21:36:53.268145 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:36:53.267834 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" podUID="affb13ef-1273-40ee-a013-bddd85341559" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 24 21:37:03.268492 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:37:03.268457 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" podUID="a38b40de-e304-4264-9768-c8b049b630a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 24 21:37:03.268972 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:37:03.268456 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" podUID="affb13ef-1273-40ee-a013-bddd85341559" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 24 21:37:13.268015 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:37:13.267976 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" podUID="affb13ef-1273-40ee-a013-bddd85341559" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 24 21:37:13.268015 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:37:13.267997 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" podUID="a38b40de-e304-4264-9768-c8b049b630a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 24 21:37:23.268516 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:37:23.268489 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq"
Apr 24 21:37:23.268992 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:37:23.268569 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl"
Apr 24 21:45:10.294211 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.294180 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc"]
Apr 24 21:45:10.294809 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.294565 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" podUID="2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d" containerName="kserve-container" containerID="cri-o://6322705f57d26155dac574a55351d2e4af29b20990db47cb2249e4e48c47785c" gracePeriod=30
Apr 24 21:45:10.294809 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.294622 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" podUID="2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d" containerName="kube-rbac-proxy" containerID="cri-o://df8bfbee38cb6e1c1ace3d2bb11b7da729d8c3281df3fd64456edf0145626e2e" gracePeriod=30
Apr 24 21:45:10.368058 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.368021 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc"]
Apr 24 21:45:10.368412 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.368398 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9dd38aa-c701-426e-aa68-982add5b0621" containerName="kserve-container"
Apr 24 21:45:10.368412 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.368414 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9dd38aa-c701-426e-aa68-982add5b0621" containerName="kserve-container"
Apr 24 21:45:10.368571 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.368435 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9dd38aa-c701-426e-aa68-982add5b0621" containerName="storage-initializer"
Apr 24 21:45:10.368571 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.368444 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9dd38aa-c701-426e-aa68-982add5b0621" containerName="storage-initializer"
Apr 24 21:45:10.368571 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.368455 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9dd38aa-c701-426e-aa68-982add5b0621" containerName="kube-rbac-proxy"
Apr 24 21:45:10.368571 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.368462 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9dd38aa-c701-426e-aa68-982add5b0621" containerName="kube-rbac-proxy"
Apr 24 21:45:10.368571 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.368515 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9dd38aa-c701-426e-aa68-982add5b0621" containerName="kserve-container"
Apr 24 21:45:10.368571 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.368530 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9dd38aa-c701-426e-aa68-982add5b0621" containerName="kube-rbac-proxy"
Apr 24 21:45:10.372189 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.372171 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc"
Apr 24 21:45:10.374613 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.374593 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-dd3fc-predictor-serving-cert\""
Apr 24 21:45:10.374613 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.374606 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-dd3fc-kube-rbac-proxy-sar-config\""
Apr 24 21:45:10.391670 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.391647 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc"]
Apr 24 21:45:10.407106 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.407086 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt"]
Apr 24 21:45:10.407430 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.407406 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" podUID="be31fcea-aebe-4dbd-b1e7-4776f3785a40" containerName="kserve-container" containerID="cri-o://2aeb9abf2a6647091b1a5a05508147e018673ccd2a7b7ad769a72fcc6a12ac60" gracePeriod=30
Apr 24 21:45:10.407503 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.407471 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" podUID="be31fcea-aebe-4dbd-b1e7-4776f3785a40" containerName="kube-rbac-proxy" containerID="cri-o://0b71176b6557597a843d05ffbce07c80b1ff6e4b093957b8a56f23074e9197ba" gracePeriod=30
Apr 24 21:45:10.433253 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.433230 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/214a2c49-4165-4a5f-a071-b7c230595d4c-proxy-tls\") pod \"success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc\" (UID: \"214a2c49-4165-4a5f-a071-b7c230595d4c\") " pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc"
Apr 24 21:45:10.433347 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.433268 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-dd3fc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/214a2c49-4165-4a5f-a071-b7c230595d4c-success-200-isvc-dd3fc-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc\" (UID: \"214a2c49-4165-4a5f-a071-b7c230595d4c\") " pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc"
Apr 24 21:45:10.433347 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.433315 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7bsq\" (UniqueName: \"kubernetes.io/projected/214a2c49-4165-4a5f-a071-b7c230595d4c-kube-api-access-l7bsq\") pod \"success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc\" (UID: \"214a2c49-4165-4a5f-a071-b7c230595d4c\") " pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc"
Apr 24 21:45:10.465568 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.465529 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt"]
Apr 24 21:45:10.469164 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.469149 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt"
Apr 24 21:45:10.474637 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.474618 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-dd3fc-predictor-serving-cert\""
Apr 24 21:45:10.474731 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.474626 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-dd3fc-kube-rbac-proxy-sar-config\""
Apr 24 21:45:10.481741 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.481720 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt"]
Apr 24 21:45:10.533805 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.533776 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-dd3fc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/214a2c49-4165-4a5f-a071-b7c230595d4c-success-200-isvc-dd3fc-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc\" (UID: \"214a2c49-4165-4a5f-a071-b7c230595d4c\") " pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc"
Apr 24 21:45:10.533921 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.533824 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7bsq\" (UniqueName: \"kubernetes.io/projected/214a2c49-4165-4a5f-a071-b7c230595d4c-kube-api-access-l7bsq\") pod \"success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc\" (UID: \"214a2c49-4165-4a5f-a071-b7c230595d4c\") " pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc"
Apr 24 21:45:10.533921 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.533867 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9824b\" (UniqueName: \"kubernetes.io/projected/8cb4eb76-33f9-42a7-94bd-501aeeab32f1-kube-api-access-9824b\") pod \"error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt\" (UID: \"8cb4eb76-33f9-42a7-94bd-501aeeab32f1\") " pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt"
Apr 24 21:45:10.533921 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.533891 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-dd3fc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8cb4eb76-33f9-42a7-94bd-501aeeab32f1-error-404-isvc-dd3fc-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt\" (UID: \"8cb4eb76-33f9-42a7-94bd-501aeeab32f1\") " pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt"
Apr 24 21:45:10.533921 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.533915 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8cb4eb76-33f9-42a7-94bd-501aeeab32f1-proxy-tls\") pod \"error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt\" (UID: \"8cb4eb76-33f9-42a7-94bd-501aeeab32f1\") " pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt"
Apr 24 21:45:10.534121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.533934 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/214a2c49-4165-4a5f-a071-b7c230595d4c-proxy-tls\") pod \"success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc\" (UID: \"214a2c49-4165-4a5f-a071-b7c230595d4c\") " pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc"
Apr 24 21:45:10.534423 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.534396 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-dd3fc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/214a2c49-4165-4a5f-a071-b7c230595d4c-success-200-isvc-dd3fc-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc\" (UID: \"214a2c49-4165-4a5f-a071-b7c230595d4c\") " pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc"
Apr 24 21:45:10.536496 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.536467 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/214a2c49-4165-4a5f-a071-b7c230595d4c-proxy-tls\") pod \"success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc\" (UID: \"214a2c49-4165-4a5f-a071-b7c230595d4c\") " pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc"
Apr 24 21:45:10.542446 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.542427 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7bsq\" (UniqueName: \"kubernetes.io/projected/214a2c49-4165-4a5f-a071-b7c230595d4c-kube-api-access-l7bsq\") pod \"success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc\" (UID: \"214a2c49-4165-4a5f-a071-b7c230595d4c\") " pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc"
Apr 24 21:45:10.634565 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.634457 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8cb4eb76-33f9-42a7-94bd-501aeeab32f1-proxy-tls\") pod \"error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt\" (UID: \"8cb4eb76-33f9-42a7-94bd-501aeeab32f1\") " pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt"
Apr 24 21:45:10.634718 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.634610 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9824b\" (UniqueName: \"kubernetes.io/projected/8cb4eb76-33f9-42a7-94bd-501aeeab32f1-kube-api-access-9824b\") pod \"error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt\" (UID: \"8cb4eb76-33f9-42a7-94bd-501aeeab32f1\") " pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt"
Apr 24 21:45:10.634718 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.634651 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-dd3fc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8cb4eb76-33f9-42a7-94bd-501aeeab32f1-error-404-isvc-dd3fc-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt\" (UID: \"8cb4eb76-33f9-42a7-94bd-501aeeab32f1\") " pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt"
Apr 24 21:45:10.635206 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.635182 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-dd3fc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8cb4eb76-33f9-42a7-94bd-501aeeab32f1-error-404-isvc-dd3fc-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt\" (UID: \"8cb4eb76-33f9-42a7-94bd-501aeeab32f1\") " pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt"
Apr 24 21:45:10.636940 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.636918 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8cb4eb76-33f9-42a7-94bd-501aeeab32f1-proxy-tls\") pod \"error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt\" (UID: \"8cb4eb76-33f9-42a7-94bd-501aeeab32f1\") " pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt"
Apr 24 21:45:10.643931 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.643912 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9824b\" (UniqueName: \"kubernetes.io/projected/8cb4eb76-33f9-42a7-94bd-501aeeab32f1-kube-api-access-9824b\") pod \"error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt\" (UID: \"8cb4eb76-33f9-42a7-94bd-501aeeab32f1\") " pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt"
Apr 24 21:45:10.682705 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.682684 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc"
Apr 24 21:45:10.748445 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.747841 2578 generic.go:358] "Generic (PLEG): container finished" podID="be31fcea-aebe-4dbd-b1e7-4776f3785a40" containerID="0b71176b6557597a843d05ffbce07c80b1ff6e4b093957b8a56f23074e9197ba" exitCode=2
Apr 24 21:45:10.748445 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.747971 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" event={"ID":"be31fcea-aebe-4dbd-b1e7-4776f3785a40","Type":"ContainerDied","Data":"0b71176b6557597a843d05ffbce07c80b1ff6e4b093957b8a56f23074e9197ba"}
Apr 24 21:45:10.750447 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.750417 2578 generic.go:358] "Generic (PLEG): container finished" podID="2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d" containerID="df8bfbee38cb6e1c1ace3d2bb11b7da729d8c3281df3fd64456edf0145626e2e" exitCode=2
Apr 24 21:45:10.750593 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.750488 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" event={"ID":"2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d","Type":"ContainerDied","Data":"df8bfbee38cb6e1c1ace3d2bb11b7da729d8c3281df3fd64456edf0145626e2e"}
Apr 24 21:45:10.780221 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.779163 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt"
Apr 24 21:45:10.807070 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.807040 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc"]
Apr 24 21:45:10.808257 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:45:10.808231 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod214a2c49_4165_4a5f_a071_b7c230595d4c.slice/crio-218728898a1b7a6d35127c6f9ab6c372ff296cc17c7b80f0918982186d7e8c45 WatchSource:0}: Error finding container 218728898a1b7a6d35127c6f9ab6c372ff296cc17c7b80f0918982186d7e8c45: Status 404 returned error can't find the container with id 218728898a1b7a6d35127c6f9ab6c372ff296cc17c7b80f0918982186d7e8c45
Apr 24 21:45:10.809803 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.809786 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:45:10.902108 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:10.902089 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt"]
Apr 24 21:45:10.904614 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:45:10.904589 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cb4eb76_33f9_42a7_94bd_501aeeab32f1.slice/crio-ea546a526967d05f7f5ef9deb48a50c197d5dd24015213f2aa78f8722de3e8b6 WatchSource:0}: Error finding container 
ea546a526967d05f7f5ef9deb48a50c197d5dd24015213f2aa78f8722de3e8b6: Status 404 returned error can't find the container with id ea546a526967d05f7f5ef9deb48a50c197d5dd24015213f2aa78f8722de3e8b6 Apr 24 21:45:11.755806 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:11.755769 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc" event={"ID":"214a2c49-4165-4a5f-a071-b7c230595d4c","Type":"ContainerStarted","Data":"1a61f067534f8c337287a97065a4b1082cd8b1e628830657aa33da1c13507fbd"} Apr 24 21:45:11.755806 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:11.755808 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc" event={"ID":"214a2c49-4165-4a5f-a071-b7c230595d4c","Type":"ContainerStarted","Data":"aa7cc3c983d247ef8bf8fc586e5c91cf2fbe74f05c42e82d95aee83bc7eab7d2"} Apr 24 21:45:11.756248 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:11.755824 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc" event={"ID":"214a2c49-4165-4a5f-a071-b7c230595d4c","Type":"ContainerStarted","Data":"218728898a1b7a6d35127c6f9ab6c372ff296cc17c7b80f0918982186d7e8c45"} Apr 24 21:45:11.756248 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:11.755920 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc" Apr 24 21:45:11.759854 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:11.759826 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt" event={"ID":"8cb4eb76-33f9-42a7-94bd-501aeeab32f1","Type":"ContainerStarted","Data":"0f1ebd0aaae7fafc83fa8993825fcf9df8e8dbbf1753eea5dcded7e3d6ec865a"} Apr 24 21:45:11.759993 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:11.759863 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt" event={"ID":"8cb4eb76-33f9-42a7-94bd-501aeeab32f1","Type":"ContainerStarted","Data":"c70ef077ad67565464b1fb1220e70c3aaeb8e7f24694d952d6a0637fbb73c926"} Apr 24 21:45:11.759993 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:11.759876 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt" event={"ID":"8cb4eb76-33f9-42a7-94bd-501aeeab32f1","Type":"ContainerStarted","Data":"ea546a526967d05f7f5ef9deb48a50c197d5dd24015213f2aa78f8722de3e8b6"} Apr 24 21:45:11.759993 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:11.759932 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt" Apr 24 21:45:11.777273 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:11.777237 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc" podStartSLOduration=1.777225361 podStartE2EDuration="1.777225361s" podCreationTimestamp="2026-04-24 21:45:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:45:11.775004097 +0000 UTC m=+1057.950738768" watchObservedRunningTime="2026-04-24 21:45:11.777225361 +0000 UTC m=+1057.952960034" Apr 24 21:45:11.794379 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:11.794336 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt" podStartSLOduration=1.794320963 podStartE2EDuration="1.794320963s" podCreationTimestamp="2026-04-24 21:45:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:45:11.792941426 +0000 UTC m=+1057.968676098" 
watchObservedRunningTime="2026-04-24 21:45:11.794320963 +0000 UTC m=+1057.970055866" Apr 24 21:45:12.764293 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:12.764261 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt" Apr 24 21:45:12.764716 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:12.764468 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc" Apr 24 21:45:12.765447 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:12.765423 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc" podUID="214a2c49-4165-4a5f-a071-b7c230595d4c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 24 21:45:12.765447 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:12.765438 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt" podUID="8cb4eb76-33f9-42a7-94bd-501aeeab32f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 21:45:13.131616 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.131497 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" podUID="2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.26:8643/healthz\": dial tcp 10.134.0.26:8643: connect: connection refused" Apr 24 21:45:13.131616 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.131504 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" podUID="be31fcea-aebe-4dbd-b1e7-4776f3785a40" 
containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.27:8643/healthz\": dial tcp 10.134.0.27:8643: connect: connection refused" Apr 24 21:45:13.135816 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.135796 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" podUID="2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 24 21:45:13.135878 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.135835 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" podUID="be31fcea-aebe-4dbd-b1e7-4776f3785a40" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 24 21:45:13.435776 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.435751 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" Apr 24 21:45:13.459535 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.459505 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-6d3ba-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d-success-200-isvc-6d3ba-kube-rbac-proxy-sar-config\") pod \"2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d\" (UID: \"2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d\") " Apr 24 21:45:13.459682 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.459588 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d-proxy-tls\") pod \"2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d\" (UID: \"2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d\") " Apr 24 21:45:13.459682 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.459621 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-672qn\" (UniqueName: \"kubernetes.io/projected/2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d-kube-api-access-672qn\") pod \"2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d\" (UID: \"2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d\") " Apr 24 21:45:13.459912 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.459888 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d-success-200-isvc-6d3ba-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-6d3ba-kube-rbac-proxy-sar-config") pod "2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d" (UID: "2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d"). InnerVolumeSpecName "success-200-isvc-6d3ba-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:45:13.461905 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.461873 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d" (UID: "2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:45:13.461905 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.461878 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d-kube-api-access-672qn" (OuterVolumeSpecName: "kube-api-access-672qn") pod "2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d" (UID: "2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d"). InnerVolumeSpecName "kube-api-access-672qn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:45:13.545288 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.545257 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" Apr 24 21:45:13.560256 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.560236 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be31fcea-aebe-4dbd-b1e7-4776f3785a40-proxy-tls\") pod \"be31fcea-aebe-4dbd-b1e7-4776f3785a40\" (UID: \"be31fcea-aebe-4dbd-b1e7-4776f3785a40\") " Apr 24 21:45:13.560365 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.560293 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrtvr\" (UniqueName: \"kubernetes.io/projected/be31fcea-aebe-4dbd-b1e7-4776f3785a40-kube-api-access-vrtvr\") pod \"be31fcea-aebe-4dbd-b1e7-4776f3785a40\" (UID: \"be31fcea-aebe-4dbd-b1e7-4776f3785a40\") " Apr 24 21:45:13.560432 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.560366 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-6d3ba-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/be31fcea-aebe-4dbd-b1e7-4776f3785a40-error-404-isvc-6d3ba-kube-rbac-proxy-sar-config\") pod \"be31fcea-aebe-4dbd-b1e7-4776f3785a40\" (UID: \"be31fcea-aebe-4dbd-b1e7-4776f3785a40\") " Apr 24 21:45:13.560710 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.560600 2578 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-6d3ba-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d-success-200-isvc-6d3ba-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:45:13.560710 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.560621 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d-proxy-tls\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:45:13.560710 
ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.560637 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-672qn\" (UniqueName: \"kubernetes.io/projected/2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d-kube-api-access-672qn\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:45:13.560838 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.560719 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be31fcea-aebe-4dbd-b1e7-4776f3785a40-error-404-isvc-6d3ba-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-6d3ba-kube-rbac-proxy-sar-config") pod "be31fcea-aebe-4dbd-b1e7-4776f3785a40" (UID: "be31fcea-aebe-4dbd-b1e7-4776f3785a40"). InnerVolumeSpecName "error-404-isvc-6d3ba-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:45:13.562352 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.562331 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be31fcea-aebe-4dbd-b1e7-4776f3785a40-kube-api-access-vrtvr" (OuterVolumeSpecName: "kube-api-access-vrtvr") pod "be31fcea-aebe-4dbd-b1e7-4776f3785a40" (UID: "be31fcea-aebe-4dbd-b1e7-4776f3785a40"). InnerVolumeSpecName "kube-api-access-vrtvr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:45:13.562421 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.562374 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be31fcea-aebe-4dbd-b1e7-4776f3785a40-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "be31fcea-aebe-4dbd-b1e7-4776f3785a40" (UID: "be31fcea-aebe-4dbd-b1e7-4776f3785a40"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:45:13.661567 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.661523 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be31fcea-aebe-4dbd-b1e7-4776f3785a40-proxy-tls\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:45:13.661567 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.661570 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vrtvr\" (UniqueName: \"kubernetes.io/projected/be31fcea-aebe-4dbd-b1e7-4776f3785a40-kube-api-access-vrtvr\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:45:13.661744 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.661586 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-6d3ba-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/be31fcea-aebe-4dbd-b1e7-4776f3785a40-error-404-isvc-6d3ba-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:45:13.768442 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.768415 2578 generic.go:358] "Generic (PLEG): container finished" podID="be31fcea-aebe-4dbd-b1e7-4776f3785a40" containerID="2aeb9abf2a6647091b1a5a05508147e018673ccd2a7b7ad769a72fcc6a12ac60" exitCode=0 Apr 24 21:45:13.768886 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.768491 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" Apr 24 21:45:13.768886 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.768510 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" event={"ID":"be31fcea-aebe-4dbd-b1e7-4776f3785a40","Type":"ContainerDied","Data":"2aeb9abf2a6647091b1a5a05508147e018673ccd2a7b7ad769a72fcc6a12ac60"} Apr 24 21:45:13.768886 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.768561 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt" event={"ID":"be31fcea-aebe-4dbd-b1e7-4776f3785a40","Type":"ContainerDied","Data":"ad6252d115442cbe65499008a990bc0ef8257070a197aad456e12e86ff5f3286"} Apr 24 21:45:13.768886 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.768580 2578 scope.go:117] "RemoveContainer" containerID="0b71176b6557597a843d05ffbce07c80b1ff6e4b093957b8a56f23074e9197ba" Apr 24 21:45:13.770048 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.769985 2578 generic.go:358] "Generic (PLEG): container finished" podID="2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d" containerID="6322705f57d26155dac574a55351d2e4af29b20990db47cb2249e4e48c47785c" exitCode=0 Apr 24 21:45:13.770048 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.770035 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" event={"ID":"2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d","Type":"ContainerDied","Data":"6322705f57d26155dac574a55351d2e4af29b20990db47cb2249e4e48c47785c"} Apr 24 21:45:13.770048 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.770063 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" Apr 24 21:45:13.770262 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.770065 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc" event={"ID":"2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d","Type":"ContainerDied","Data":"02a39f6597acd15ba7165b941dbc340a811d02f4435e1a875a593633e4797f2e"} Apr 24 21:45:13.770581 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.770534 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt" podUID="8cb4eb76-33f9-42a7-94bd-501aeeab32f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 21:45:13.770673 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.770534 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc" podUID="214a2c49-4165-4a5f-a071-b7c230595d4c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 24 21:45:13.778082 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.777947 2578 scope.go:117] "RemoveContainer" containerID="2aeb9abf2a6647091b1a5a05508147e018673ccd2a7b7ad769a72fcc6a12ac60" Apr 24 21:45:13.785146 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.785132 2578 scope.go:117] "RemoveContainer" containerID="0b71176b6557597a843d05ffbce07c80b1ff6e4b093957b8a56f23074e9197ba" Apr 24 21:45:13.785399 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:45:13.785380 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b71176b6557597a843d05ffbce07c80b1ff6e4b093957b8a56f23074e9197ba\": container with ID starting with 0b71176b6557597a843d05ffbce07c80b1ff6e4b093957b8a56f23074e9197ba not found: 
ID does not exist" containerID="0b71176b6557597a843d05ffbce07c80b1ff6e4b093957b8a56f23074e9197ba" Apr 24 21:45:13.785481 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.785408 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b71176b6557597a843d05ffbce07c80b1ff6e4b093957b8a56f23074e9197ba"} err="failed to get container status \"0b71176b6557597a843d05ffbce07c80b1ff6e4b093957b8a56f23074e9197ba\": rpc error: code = NotFound desc = could not find container \"0b71176b6557597a843d05ffbce07c80b1ff6e4b093957b8a56f23074e9197ba\": container with ID starting with 0b71176b6557597a843d05ffbce07c80b1ff6e4b093957b8a56f23074e9197ba not found: ID does not exist" Apr 24 21:45:13.785481 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.785430 2578 scope.go:117] "RemoveContainer" containerID="2aeb9abf2a6647091b1a5a05508147e018673ccd2a7b7ad769a72fcc6a12ac60" Apr 24 21:45:13.785720 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:45:13.785705 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2aeb9abf2a6647091b1a5a05508147e018673ccd2a7b7ad769a72fcc6a12ac60\": container with ID starting with 2aeb9abf2a6647091b1a5a05508147e018673ccd2a7b7ad769a72fcc6a12ac60 not found: ID does not exist" containerID="2aeb9abf2a6647091b1a5a05508147e018673ccd2a7b7ad769a72fcc6a12ac60" Apr 24 21:45:13.785763 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.785726 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aeb9abf2a6647091b1a5a05508147e018673ccd2a7b7ad769a72fcc6a12ac60"} err="failed to get container status \"2aeb9abf2a6647091b1a5a05508147e018673ccd2a7b7ad769a72fcc6a12ac60\": rpc error: code = NotFound desc = could not find container \"2aeb9abf2a6647091b1a5a05508147e018673ccd2a7b7ad769a72fcc6a12ac60\": container with ID starting with 2aeb9abf2a6647091b1a5a05508147e018673ccd2a7b7ad769a72fcc6a12ac60 not found: ID does not 
exist" Apr 24 21:45:13.785763 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.785744 2578 scope.go:117] "RemoveContainer" containerID="df8bfbee38cb6e1c1ace3d2bb11b7da729d8c3281df3fd64456edf0145626e2e" Apr 24 21:45:13.792735 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.792723 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt"] Apr 24 21:45:13.792803 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.792752 2578 scope.go:117] "RemoveContainer" containerID="6322705f57d26155dac574a55351d2e4af29b20990db47cb2249e4e48c47785c" Apr 24 21:45:13.794505 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.794485 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6d3ba-predictor-5d779b8c86-cqxlt"] Apr 24 21:45:13.799611 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.799592 2578 scope.go:117] "RemoveContainer" containerID="df8bfbee38cb6e1c1ace3d2bb11b7da729d8c3281df3fd64456edf0145626e2e" Apr 24 21:45:13.799854 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:45:13.799837 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df8bfbee38cb6e1c1ace3d2bb11b7da729d8c3281df3fd64456edf0145626e2e\": container with ID starting with df8bfbee38cb6e1c1ace3d2bb11b7da729d8c3281df3fd64456edf0145626e2e not found: ID does not exist" containerID="df8bfbee38cb6e1c1ace3d2bb11b7da729d8c3281df3fd64456edf0145626e2e" Apr 24 21:45:13.799925 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.799863 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df8bfbee38cb6e1c1ace3d2bb11b7da729d8c3281df3fd64456edf0145626e2e"} err="failed to get container status \"df8bfbee38cb6e1c1ace3d2bb11b7da729d8c3281df3fd64456edf0145626e2e\": rpc error: code = NotFound desc = could not find container \"df8bfbee38cb6e1c1ace3d2bb11b7da729d8c3281df3fd64456edf0145626e2e\": 
container with ID starting with df8bfbee38cb6e1c1ace3d2bb11b7da729d8c3281df3fd64456edf0145626e2e not found: ID does not exist" Apr 24 21:45:13.799925 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.799883 2578 scope.go:117] "RemoveContainer" containerID="6322705f57d26155dac574a55351d2e4af29b20990db47cb2249e4e48c47785c" Apr 24 21:45:13.800139 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:45:13.800121 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6322705f57d26155dac574a55351d2e4af29b20990db47cb2249e4e48c47785c\": container with ID starting with 6322705f57d26155dac574a55351d2e4af29b20990db47cb2249e4e48c47785c not found: ID does not exist" containerID="6322705f57d26155dac574a55351d2e4af29b20990db47cb2249e4e48c47785c" Apr 24 21:45:13.800180 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.800147 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6322705f57d26155dac574a55351d2e4af29b20990db47cb2249e4e48c47785c"} err="failed to get container status \"6322705f57d26155dac574a55351d2e4af29b20990db47cb2249e4e48c47785c\": rpc error: code = NotFound desc = could not find container \"6322705f57d26155dac574a55351d2e4af29b20990db47cb2249e4e48c47785c\": container with ID starting with 6322705f57d26155dac574a55351d2e4af29b20990db47cb2249e4e48c47785c not found: ID does not exist" Apr 24 21:45:13.804231 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.804212 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc"] Apr 24 21:45:13.807963 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:13.807943 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6d3ba-predictor-5578786d5c-2xfmc"] Apr 24 21:45:14.422061 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:14.422025 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d" path="/var/lib/kubelet/pods/2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d/volumes" Apr 24 21:45:14.422496 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:14.422478 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be31fcea-aebe-4dbd-b1e7-4776f3785a40" path="/var/lib/kubelet/pods/be31fcea-aebe-4dbd-b1e7-4776f3785a40/volumes" Apr 24 21:45:18.775260 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:18.775223 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt" Apr 24 21:45:18.775712 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:18.775612 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc" Apr 24 21:45:18.775712 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:18.775673 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt" podUID="8cb4eb76-33f9-42a7-94bd-501aeeab32f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 21:45:18.776057 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:18.776038 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc" podUID="214a2c49-4165-4a5f-a071-b7c230595d4c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 24 21:45:28.776442 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:28.776356 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc" podUID="214a2c49-4165-4a5f-a071-b7c230595d4c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 24 
21:45:28.776897 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:28.776354 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt" podUID="8cb4eb76-33f9-42a7-94bd-501aeeab32f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 21:45:38.775787 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:38.775751 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt" podUID="8cb4eb76-33f9-42a7-94bd-501aeeab32f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 21:45:38.776137 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:38.776057 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc" podUID="214a2c49-4165-4a5f-a071-b7c230595d4c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 24 21:45:48.776265 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:48.776229 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc" podUID="214a2c49-4165-4a5f-a071-b7c230595d4c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 24 21:45:48.776265 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:48.776243 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt" podUID="8cb4eb76-33f9-42a7-94bd-501aeeab32f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 21:45:50.133842 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.133808 2578 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl"] Apr 24 21:45:50.134222 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.134078 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" podUID="affb13ef-1273-40ee-a013-bddd85341559" containerName="kserve-container" containerID="cri-o://3115bf990807279ae2915a75e9f0a50583c57ea0d0228ac9683938602367a5b4" gracePeriod=30 Apr 24 21:45:50.134222 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.134131 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" podUID="affb13ef-1273-40ee-a013-bddd85341559" containerName="kube-rbac-proxy" containerID="cri-o://b994dd1d206ee5ecf0e220f27fc735c508698c06e51600ea4b5c16204b441fc9" gracePeriod=30 Apr 24 21:45:50.195904 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.195878 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq"] Apr 24 21:45:50.196152 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.196126 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" podUID="a38b40de-e304-4264-9768-c8b049b630a2" containerName="kserve-container" containerID="cri-o://f19cb8316382301bf87d716cfdbedae4f85d6c042030e383ffb835ef1df05f2d" gracePeriod=30 Apr 24 21:45:50.196234 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.196162 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" podUID="a38b40de-e304-4264-9768-c8b049b630a2" containerName="kube-rbac-proxy" containerID="cri-o://1e4679e9efcea4572a60f0ea755f23a34ab25d98c99f2f48e6626e94816e3c5b" gracePeriod=30 Apr 24 21:45:50.212476 ip-10-0-132-124 
kubenswrapper[2578]: I0424 21:45:50.212451 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf"] Apr 24 21:45:50.212930 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.212913 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be31fcea-aebe-4dbd-b1e7-4776f3785a40" containerName="kserve-container" Apr 24 21:45:50.213014 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.212932 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="be31fcea-aebe-4dbd-b1e7-4776f3785a40" containerName="kserve-container" Apr 24 21:45:50.213014 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.212946 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d" containerName="kserve-container" Apr 24 21:45:50.213014 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.212954 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d" containerName="kserve-container" Apr 24 21:45:50.213014 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.212979 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be31fcea-aebe-4dbd-b1e7-4776f3785a40" containerName="kube-rbac-proxy" Apr 24 21:45:50.213014 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.212988 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="be31fcea-aebe-4dbd-b1e7-4776f3785a40" containerName="kube-rbac-proxy" Apr 24 21:45:50.213014 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.213003 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d" containerName="kube-rbac-proxy" Apr 24 21:45:50.213014 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.213010 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d" containerName="kube-rbac-proxy" Apr 24 21:45:50.213334 
ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.213095 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="be31fcea-aebe-4dbd-b1e7-4776f3785a40" containerName="kserve-container" Apr 24 21:45:50.213334 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.213106 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d" containerName="kserve-container" Apr 24 21:45:50.213334 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.213115 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="be31fcea-aebe-4dbd-b1e7-4776f3785a40" containerName="kube-rbac-proxy" Apr 24 21:45:50.213334 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.213128 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2fa5eddf-8404-4732-93a3-3f6f1e0b7d1d" containerName="kube-rbac-proxy" Apr 24 21:45:50.222291 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.222273 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" Apr 24 21:45:50.224415 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.224397 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-6445b-predictor-serving-cert\"" Apr 24 21:45:50.224658 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.224638 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-6445b-kube-rbac-proxy-sar-config\"" Apr 24 21:45:50.224938 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.224920 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf"] Apr 24 21:45:50.311196 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.311173 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw"] Apr 
24 21:45:50.323791 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.323767 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" Apr 24 21:45:50.324578 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.324531 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw"] Apr 24 21:45:50.326008 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.325988 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-6445b-kube-rbac-proxy-sar-config\"" Apr 24 21:45:50.326100 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.326063 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-6445b-predictor-serving-cert\"" Apr 24 21:45:50.377068 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.377043 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-6445b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9c0e3ab9-32b9-4155-ad77-6da651f96cf3-success-200-isvc-6445b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-6445b-predictor-769c496d67-hc2tf\" (UID: \"9c0e3ab9-32b9-4155-ad77-6da651f96cf3\") " pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" Apr 24 21:45:50.377188 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.377088 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c0e3ab9-32b9-4155-ad77-6da651f96cf3-proxy-tls\") pod \"success-200-isvc-6445b-predictor-769c496d67-hc2tf\" (UID: \"9c0e3ab9-32b9-4155-ad77-6da651f96cf3\") " pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" Apr 24 21:45:50.377188 ip-10-0-132-124 kubenswrapper[2578]: I0424 
21:45:50.377145 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zck7g\" (UniqueName: \"kubernetes.io/projected/9c0e3ab9-32b9-4155-ad77-6da651f96cf3-kube-api-access-zck7g\") pod \"success-200-isvc-6445b-predictor-769c496d67-hc2tf\" (UID: \"9c0e3ab9-32b9-4155-ad77-6da651f96cf3\") " pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" Apr 24 21:45:50.478345 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.478320 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjp4t\" (UniqueName: \"kubernetes.io/projected/0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c-kube-api-access-cjp4t\") pod \"error-404-isvc-6445b-predictor-566b89cd56-67vrw\" (UID: \"0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c\") " pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" Apr 24 21:45:50.478464 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.478359 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-6445b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9c0e3ab9-32b9-4155-ad77-6da651f96cf3-success-200-isvc-6445b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-6445b-predictor-769c496d67-hc2tf\" (UID: \"9c0e3ab9-32b9-4155-ad77-6da651f96cf3\") " pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" Apr 24 21:45:50.478464 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.478410 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-6445b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c-error-404-isvc-6445b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-6445b-predictor-566b89cd56-67vrw\" (UID: \"0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c\") " pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" Apr 24 
21:45:50.478464 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.478444 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c0e3ab9-32b9-4155-ad77-6da651f96cf3-proxy-tls\") pod \"success-200-isvc-6445b-predictor-769c496d67-hc2tf\" (UID: \"9c0e3ab9-32b9-4155-ad77-6da651f96cf3\") " pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" Apr 24 21:45:50.478649 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.478485 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zck7g\" (UniqueName: \"kubernetes.io/projected/9c0e3ab9-32b9-4155-ad77-6da651f96cf3-kube-api-access-zck7g\") pod \"success-200-isvc-6445b-predictor-769c496d67-hc2tf\" (UID: \"9c0e3ab9-32b9-4155-ad77-6da651f96cf3\") " pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" Apr 24 21:45:50.478649 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.478520 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c-proxy-tls\") pod \"error-404-isvc-6445b-predictor-566b89cd56-67vrw\" (UID: \"0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c\") " pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" Apr 24 21:45:50.478649 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:45:50.478531 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-6445b-predictor-serving-cert: secret "success-200-isvc-6445b-predictor-serving-cert" not found Apr 24 21:45:50.478649 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:45:50.478615 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c0e3ab9-32b9-4155-ad77-6da651f96cf3-proxy-tls podName:9c0e3ab9-32b9-4155-ad77-6da651f96cf3 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:45:50.978595728 +0000 UTC m=+1097.154330382 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9c0e3ab9-32b9-4155-ad77-6da651f96cf3-proxy-tls") pod "success-200-isvc-6445b-predictor-769c496d67-hc2tf" (UID: "9c0e3ab9-32b9-4155-ad77-6da651f96cf3") : secret "success-200-isvc-6445b-predictor-serving-cert" not found Apr 24 21:45:50.479043 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.479024 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-6445b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9c0e3ab9-32b9-4155-ad77-6da651f96cf3-success-200-isvc-6445b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-6445b-predictor-769c496d67-hc2tf\" (UID: \"9c0e3ab9-32b9-4155-ad77-6da651f96cf3\") " pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" Apr 24 21:45:50.486943 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.486923 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zck7g\" (UniqueName: \"kubernetes.io/projected/9c0e3ab9-32b9-4155-ad77-6da651f96cf3-kube-api-access-zck7g\") pod \"success-200-isvc-6445b-predictor-769c496d67-hc2tf\" (UID: \"9c0e3ab9-32b9-4155-ad77-6da651f96cf3\") " pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" Apr 24 21:45:50.579700 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.579667 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjp4t\" (UniqueName: \"kubernetes.io/projected/0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c-kube-api-access-cjp4t\") pod \"error-404-isvc-6445b-predictor-566b89cd56-67vrw\" (UID: \"0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c\") " pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" Apr 24 21:45:50.579859 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.579719 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"error-404-isvc-6445b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c-error-404-isvc-6445b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-6445b-predictor-566b89cd56-67vrw\" (UID: \"0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c\") " pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" Apr 24 21:45:50.579859 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.579780 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c-proxy-tls\") pod \"error-404-isvc-6445b-predictor-566b89cd56-67vrw\" (UID: \"0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c\") " pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" Apr 24 21:45:50.580344 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.580321 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-6445b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c-error-404-isvc-6445b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-6445b-predictor-566b89cd56-67vrw\" (UID: \"0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c\") " pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" Apr 24 21:45:50.582198 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.582177 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c-proxy-tls\") pod \"error-404-isvc-6445b-predictor-566b89cd56-67vrw\" (UID: \"0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c\") " pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" Apr 24 21:45:50.587518 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.587500 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjp4t\" (UniqueName: 
\"kubernetes.io/projected/0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c-kube-api-access-cjp4t\") pod \"error-404-isvc-6445b-predictor-566b89cd56-67vrw\" (UID: \"0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c\") " pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" Apr 24 21:45:50.633941 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.633916 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" Apr 24 21:45:50.755290 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.755219 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw"] Apr 24 21:45:50.758163 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:45:50.758137 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b3e97f0_f7b9_44f1_b0b1_5bbaf899a83c.slice/crio-8724de544827b22ea4caca1794e84dbee4761f51f10ae296cbdf4e33bf40a2f0 WatchSource:0}: Error finding container 8724de544827b22ea4caca1794e84dbee4761f51f10ae296cbdf4e33bf40a2f0: Status 404 returned error can't find the container with id 8724de544827b22ea4caca1794e84dbee4761f51f10ae296cbdf4e33bf40a2f0 Apr 24 21:45:50.892336 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.892305 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" event={"ID":"0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c","Type":"ContainerStarted","Data":"bc5998219fd61823d5c7357979332a001dc13168962724855aad4153160a79f0"} Apr 24 21:45:50.892454 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.892346 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" event={"ID":"0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c","Type":"ContainerStarted","Data":"e6ea65887cddbd0681bab6e25a2976a5777f9e05d30fbfc61264d4bd569b83fd"} Apr 24 
21:45:50.892454 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.892360 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" event={"ID":"0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c","Type":"ContainerStarted","Data":"8724de544827b22ea4caca1794e84dbee4761f51f10ae296cbdf4e33bf40a2f0"} Apr 24 21:45:50.892454 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.892407 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" Apr 24 21:45:50.892454 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.892440 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" Apr 24 21:45:50.893708 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.893687 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" podUID="0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 24 21:45:50.893968 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.893952 2578 generic.go:358] "Generic (PLEG): container finished" podID="affb13ef-1273-40ee-a013-bddd85341559" containerID="b994dd1d206ee5ecf0e220f27fc735c508698c06e51600ea4b5c16204b441fc9" exitCode=2 Apr 24 21:45:50.894024 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.894009 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" event={"ID":"affb13ef-1273-40ee-a013-bddd85341559","Type":"ContainerDied","Data":"b994dd1d206ee5ecf0e220f27fc735c508698c06e51600ea4b5c16204b441fc9"} Apr 24 21:45:50.895455 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.895433 2578 generic.go:358] "Generic (PLEG): container finished" 
podID="a38b40de-e304-4264-9768-c8b049b630a2" containerID="1e4679e9efcea4572a60f0ea755f23a34ab25d98c99f2f48e6626e94816e3c5b" exitCode=2 Apr 24 21:45:50.895579 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.895481 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" event={"ID":"a38b40de-e304-4264-9768-c8b049b630a2","Type":"ContainerDied","Data":"1e4679e9efcea4572a60f0ea755f23a34ab25d98c99f2f48e6626e94816e3c5b"} Apr 24 21:45:50.910331 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.910296 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" podStartSLOduration=0.910286278 podStartE2EDuration="910.286278ms" podCreationTimestamp="2026-04-24 21:45:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:45:50.907979567 +0000 UTC m=+1097.083714250" watchObservedRunningTime="2026-04-24 21:45:50.910286278 +0000 UTC m=+1097.086020949" Apr 24 21:45:50.984139 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.984105 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c0e3ab9-32b9-4155-ad77-6da651f96cf3-proxy-tls\") pod \"success-200-isvc-6445b-predictor-769c496d67-hc2tf\" (UID: \"9c0e3ab9-32b9-4155-ad77-6da651f96cf3\") " pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" Apr 24 21:45:50.986284 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:50.986264 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c0e3ab9-32b9-4155-ad77-6da651f96cf3-proxy-tls\") pod \"success-200-isvc-6445b-predictor-769c496d67-hc2tf\" (UID: \"9c0e3ab9-32b9-4155-ad77-6da651f96cf3\") " pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" 
Apr 24 21:45:51.132957 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:51.132878 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" Apr 24 21:45:51.459338 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:51.459316 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf"] Apr 24 21:45:51.461529 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:45:51.461502 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c0e3ab9_32b9_4155_ad77_6da651f96cf3.slice/crio-aa0a9cb3d0124da3d2d539cdcb0dc1ac74f1332fcd310fa435d8e6d1ad2a80be WatchSource:0}: Error finding container aa0a9cb3d0124da3d2d539cdcb0dc1ac74f1332fcd310fa435d8e6d1ad2a80be: Status 404 returned error can't find the container with id aa0a9cb3d0124da3d2d539cdcb0dc1ac74f1332fcd310fa435d8e6d1ad2a80be Apr 24 21:45:51.900856 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:51.900814 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" event={"ID":"9c0e3ab9-32b9-4155-ad77-6da651f96cf3","Type":"ContainerStarted","Data":"3384a3d260de18f8d106a5edd3222ed5a5e84cc0c6c923521cfc22612f143d1f"} Apr 24 21:45:51.900856 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:51.900858 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" event={"ID":"9c0e3ab9-32b9-4155-ad77-6da651f96cf3","Type":"ContainerStarted","Data":"c73fab23f1586e4deafa8f52788683db84bc0c8503ca0f94bb3c06a4f756000d"} Apr 24 21:45:51.901089 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:51.900878 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" 
event={"ID":"9c0e3ab9-32b9-4155-ad77-6da651f96cf3","Type":"ContainerStarted","Data":"aa0a9cb3d0124da3d2d539cdcb0dc1ac74f1332fcd310fa435d8e6d1ad2a80be"} Apr 24 21:45:51.901089 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:51.900895 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" Apr 24 21:45:51.901089 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:51.900909 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" Apr 24 21:45:51.907084 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:51.901467 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" podUID="0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 24 21:45:51.907084 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:51.902240 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" podUID="9c0e3ab9-32b9-4155-ad77-6da651f96cf3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 21:45:51.921984 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:51.921931 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" podStartSLOduration=1.921914595 podStartE2EDuration="1.921914595s" podCreationTimestamp="2026-04-24 21:45:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:45:51.919465534 +0000 UTC m=+1098.095200207" watchObservedRunningTime="2026-04-24 21:45:51.921914595 +0000 UTC m=+1098.097649301" 
Apr 24 21:45:52.904657 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:52.904617 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" podUID="9c0e3ab9-32b9-4155-ad77-6da651f96cf3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 21:45:53.262903 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.262860 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" podUID="affb13ef-1273-40ee-a013-bddd85341559" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.29:8643/healthz\": dial tcp 10.134.0.29:8643: connect: connection refused" Apr 24 21:45:53.263030 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.262865 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" podUID="a38b40de-e304-4264-9768-c8b049b630a2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 24 21:45:53.268195 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.268175 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" podUID="a38b40de-e304-4264-9768-c8b049b630a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 21:45:53.268244 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.268181 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" podUID="affb13ef-1273-40ee-a013-bddd85341559" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 24 21:45:53.697619 
ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.697597 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" Apr 24 21:45:53.810681 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.810618 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/affb13ef-1273-40ee-a013-bddd85341559-proxy-tls\") pod \"affb13ef-1273-40ee-a013-bddd85341559\" (UID: \"affb13ef-1273-40ee-a013-bddd85341559\") " Apr 24 21:45:53.810681 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.810651 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-54155-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/affb13ef-1273-40ee-a013-bddd85341559-error-404-isvc-54155-kube-rbac-proxy-sar-config\") pod \"affb13ef-1273-40ee-a013-bddd85341559\" (UID: \"affb13ef-1273-40ee-a013-bddd85341559\") " Apr 24 21:45:53.810872 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.810722 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq2h7\" (UniqueName: \"kubernetes.io/projected/affb13ef-1273-40ee-a013-bddd85341559-kube-api-access-rq2h7\") pod \"affb13ef-1273-40ee-a013-bddd85341559\" (UID: \"affb13ef-1273-40ee-a013-bddd85341559\") " Apr 24 21:45:53.811045 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.811013 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/affb13ef-1273-40ee-a013-bddd85341559-error-404-isvc-54155-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-54155-kube-rbac-proxy-sar-config") pod "affb13ef-1273-40ee-a013-bddd85341559" (UID: "affb13ef-1273-40ee-a013-bddd85341559"). InnerVolumeSpecName "error-404-isvc-54155-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:45:53.812804 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.812779 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/affb13ef-1273-40ee-a013-bddd85341559-kube-api-access-rq2h7" (OuterVolumeSpecName: "kube-api-access-rq2h7") pod "affb13ef-1273-40ee-a013-bddd85341559" (UID: "affb13ef-1273-40ee-a013-bddd85341559"). InnerVolumeSpecName "kube-api-access-rq2h7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:45:53.813033 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.812949 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/affb13ef-1273-40ee-a013-bddd85341559-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "affb13ef-1273-40ee-a013-bddd85341559" (UID: "affb13ef-1273-40ee-a013-bddd85341559"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:45:53.909295 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.909256 2578 generic.go:358] "Generic (PLEG): container finished" podID="affb13ef-1273-40ee-a013-bddd85341559" containerID="3115bf990807279ae2915a75e9f0a50583c57ea0d0228ac9683938602367a5b4" exitCode=0
Apr 24 21:45:53.909694 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.909332 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl"
Apr 24 21:45:53.909694 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.909339 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" event={"ID":"affb13ef-1273-40ee-a013-bddd85341559","Type":"ContainerDied","Data":"3115bf990807279ae2915a75e9f0a50583c57ea0d0228ac9683938602367a5b4"}
Apr 24 21:45:53.909694 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.909387 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl" event={"ID":"affb13ef-1273-40ee-a013-bddd85341559","Type":"ContainerDied","Data":"d052f4abbfcc43b05349df39b8b88d956a04a5fc0e2581b5bfe3bf4d04ea7dce"}
Apr 24 21:45:53.909694 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.909409 2578 scope.go:117] "RemoveContainer" containerID="b994dd1d206ee5ecf0e220f27fc735c508698c06e51600ea4b5c16204b441fc9"
Apr 24 21:45:53.911318 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.911287 2578 generic.go:358] "Generic (PLEG): container finished" podID="a38b40de-e304-4264-9768-c8b049b630a2" containerID="f19cb8316382301bf87d716cfdbedae4f85d6c042030e383ffb835ef1df05f2d" exitCode=0
Apr 24 21:45:53.911414 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.911334 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" event={"ID":"a38b40de-e304-4264-9768-c8b049b630a2","Type":"ContainerDied","Data":"f19cb8316382301bf87d716cfdbedae4f85d6c042030e383ffb835ef1df05f2d"}
Apr 24 21:45:53.911457 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.911411 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rq2h7\" (UniqueName: \"kubernetes.io/projected/affb13ef-1273-40ee-a013-bddd85341559-kube-api-access-rq2h7\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:45:53.911457 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.911434 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/affb13ef-1273-40ee-a013-bddd85341559-proxy-tls\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:45:53.911457 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.911451 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-54155-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/affb13ef-1273-40ee-a013-bddd85341559-error-404-isvc-54155-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:45:53.919239 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.919218 2578 scope.go:117] "RemoveContainer" containerID="3115bf990807279ae2915a75e9f0a50583c57ea0d0228ac9683938602367a5b4"
Apr 24 21:45:53.926088 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.926073 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq"
Apr 24 21:45:53.927117 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.927104 2578 scope.go:117] "RemoveContainer" containerID="b994dd1d206ee5ecf0e220f27fc735c508698c06e51600ea4b5c16204b441fc9"
Apr 24 21:45:53.927340 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:45:53.927321 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b994dd1d206ee5ecf0e220f27fc735c508698c06e51600ea4b5c16204b441fc9\": container with ID starting with b994dd1d206ee5ecf0e220f27fc735c508698c06e51600ea4b5c16204b441fc9 not found: ID does not exist" containerID="b994dd1d206ee5ecf0e220f27fc735c508698c06e51600ea4b5c16204b441fc9"
Apr 24 21:45:53.927410 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.927347 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b994dd1d206ee5ecf0e220f27fc735c508698c06e51600ea4b5c16204b441fc9"} err="failed to get container status \"b994dd1d206ee5ecf0e220f27fc735c508698c06e51600ea4b5c16204b441fc9\": rpc error: code = NotFound desc = could not find container \"b994dd1d206ee5ecf0e220f27fc735c508698c06e51600ea4b5c16204b441fc9\": container with ID starting with b994dd1d206ee5ecf0e220f27fc735c508698c06e51600ea4b5c16204b441fc9 not found: ID does not exist"
Apr 24 21:45:53.927410 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.927364 2578 scope.go:117] "RemoveContainer" containerID="3115bf990807279ae2915a75e9f0a50583c57ea0d0228ac9683938602367a5b4"
Apr 24 21:45:53.927639 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:45:53.927619 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3115bf990807279ae2915a75e9f0a50583c57ea0d0228ac9683938602367a5b4\": container with ID starting with 3115bf990807279ae2915a75e9f0a50583c57ea0d0228ac9683938602367a5b4 not found: ID does not exist" containerID="3115bf990807279ae2915a75e9f0a50583c57ea0d0228ac9683938602367a5b4"
Apr 24 21:45:53.927701 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.927648 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3115bf990807279ae2915a75e9f0a50583c57ea0d0228ac9683938602367a5b4"} err="failed to get container status \"3115bf990807279ae2915a75e9f0a50583c57ea0d0228ac9683938602367a5b4\": rpc error: code = NotFound desc = could not find container \"3115bf990807279ae2915a75e9f0a50583c57ea0d0228ac9683938602367a5b4\": container with ID starting with 3115bf990807279ae2915a75e9f0a50583c57ea0d0228ac9683938602367a5b4 not found: ID does not exist"
Apr 24 21:45:53.931902 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.931883 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl"]
Apr 24 21:45:53.937854 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:53.937829 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-54155-predictor-db578f77d-7fmtl"]
Apr 24 21:45:54.113628 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:54.113564 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-54155-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a38b40de-e304-4264-9768-c8b049b630a2-success-200-isvc-54155-kube-rbac-proxy-sar-config\") pod \"a38b40de-e304-4264-9768-c8b049b630a2\" (UID: \"a38b40de-e304-4264-9768-c8b049b630a2\") "
Apr 24 21:45:54.113628 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:54.113616 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a38b40de-e304-4264-9768-c8b049b630a2-proxy-tls\") pod \"a38b40de-e304-4264-9768-c8b049b630a2\" (UID: \"a38b40de-e304-4264-9768-c8b049b630a2\") "
Apr 24 21:45:54.113792 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:54.113670 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc2rw\" (UniqueName: \"kubernetes.io/projected/a38b40de-e304-4264-9768-c8b049b630a2-kube-api-access-lc2rw\") pod \"a38b40de-e304-4264-9768-c8b049b630a2\" (UID: \"a38b40de-e304-4264-9768-c8b049b630a2\") "
Apr 24 21:45:54.113938 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:54.113912 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a38b40de-e304-4264-9768-c8b049b630a2-success-200-isvc-54155-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-54155-kube-rbac-proxy-sar-config") pod "a38b40de-e304-4264-9768-c8b049b630a2" (UID: "a38b40de-e304-4264-9768-c8b049b630a2"). InnerVolumeSpecName "success-200-isvc-54155-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:45:54.115785 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:54.115761 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a38b40de-e304-4264-9768-c8b049b630a2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a38b40de-e304-4264-9768-c8b049b630a2" (UID: "a38b40de-e304-4264-9768-c8b049b630a2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:45:54.115879 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:54.115826 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a38b40de-e304-4264-9768-c8b049b630a2-kube-api-access-lc2rw" (OuterVolumeSpecName: "kube-api-access-lc2rw") pod "a38b40de-e304-4264-9768-c8b049b630a2" (UID: "a38b40de-e304-4264-9768-c8b049b630a2"). InnerVolumeSpecName "kube-api-access-lc2rw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:45:54.214885 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:54.214860 2578 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-54155-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a38b40de-e304-4264-9768-c8b049b630a2-success-200-isvc-54155-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:45:54.214885 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:54.214884 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a38b40de-e304-4264-9768-c8b049b630a2-proxy-tls\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:45:54.215025 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:54.214894 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lc2rw\" (UniqueName: \"kubernetes.io/projected/a38b40de-e304-4264-9768-c8b049b630a2-kube-api-access-lc2rw\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:45:54.423067 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:54.423000 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="affb13ef-1273-40ee-a013-bddd85341559" path="/var/lib/kubelet/pods/affb13ef-1273-40ee-a013-bddd85341559/volumes"
Apr 24 21:45:54.917577 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:54.917524 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq" event={"ID":"a38b40de-e304-4264-9768-c8b049b630a2","Type":"ContainerDied","Data":"bf53ee51c19f4aaba039d56b71155d628aad9ab0bb2e414fd31e279269416df3"}
Apr 24 21:45:54.918030 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:54.917594 2578 scope.go:117] "RemoveContainer" containerID="1e4679e9efcea4572a60f0ea755f23a34ab25d98c99f2f48e6626e94816e3c5b"
Apr 24 21:45:54.918030 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:54.917530 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq"
Apr 24 21:45:54.925600 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:54.925581 2578 scope.go:117] "RemoveContainer" containerID="f19cb8316382301bf87d716cfdbedae4f85d6c042030e383ffb835ef1df05f2d"
Apr 24 21:45:54.934141 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:54.934118 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq"]
Apr 24 21:45:54.938321 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:54.938303 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-54155-predictor-744c89d589-bgsvq"]
Apr 24 21:45:56.421975 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:56.421938 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a38b40de-e304-4264-9768-c8b049b630a2" path="/var/lib/kubelet/pods/a38b40de-e304-4264-9768-c8b049b630a2/volumes"
Apr 24 21:45:56.905709 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:56.905684 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw"
Apr 24 21:45:56.906179 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:56.906149 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" podUID="0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 21:45:57.909212 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:57.909179 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf"
Apr 24 21:45:57.909731 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:57.909709 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" podUID="9c0e3ab9-32b9-4155-ad77-6da651f96cf3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 24 21:45:58.776707 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:58.776677 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc"
Apr 24 21:45:58.777073 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:45:58.777057 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt"
Apr 24 21:46:06.906166 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:06.906128 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" podUID="0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 21:46:07.910680 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:07.910636 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" podUID="9c0e3ab9-32b9-4155-ad77-6da651f96cf3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 24 21:46:16.906198 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:16.906158 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" podUID="0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 21:46:17.910662 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:17.910625 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" podUID="9c0e3ab9-32b9-4155-ad77-6da651f96cf3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 24 21:46:20.564358 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.564328 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc"]
Apr 24 21:46:20.564812 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.564624 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc" podUID="214a2c49-4165-4a5f-a071-b7c230595d4c" containerName="kserve-container" containerID="cri-o://aa7cc3c983d247ef8bf8fc586e5c91cf2fbe74f05c42e82d95aee83bc7eab7d2" gracePeriod=30
Apr 24 21:46:20.564812 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.564677 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc" podUID="214a2c49-4165-4a5f-a071-b7c230595d4c" containerName="kube-rbac-proxy" containerID="cri-o://1a61f067534f8c337287a97065a4b1082cd8b1e628830657aa33da1c13507fbd" gracePeriod=30
Apr 24 21:46:20.589214 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.589192 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t"]
Apr 24 21:46:20.589556 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.589532 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="affb13ef-1273-40ee-a013-bddd85341559" containerName="kube-rbac-proxy"
Apr 24 21:46:20.589603 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.589562 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="affb13ef-1273-40ee-a013-bddd85341559" containerName="kube-rbac-proxy"
Apr 24 21:46:20.589603 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.589572 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="affb13ef-1273-40ee-a013-bddd85341559" containerName="kserve-container"
Apr 24 21:46:20.589603 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.589577 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="affb13ef-1273-40ee-a013-bddd85341559" containerName="kserve-container"
Apr 24 21:46:20.589603 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.589601 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a38b40de-e304-4264-9768-c8b049b630a2" containerName="kube-rbac-proxy"
Apr 24 21:46:20.589720 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.589606 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a38b40de-e304-4264-9768-c8b049b630a2" containerName="kube-rbac-proxy"
Apr 24 21:46:20.589720 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.589615 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a38b40de-e304-4264-9768-c8b049b630a2" containerName="kserve-container"
Apr 24 21:46:20.589720 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.589620 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a38b40de-e304-4264-9768-c8b049b630a2" containerName="kserve-container"
Apr 24 21:46:20.589720 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.589669 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a38b40de-e304-4264-9768-c8b049b630a2" containerName="kserve-container"
Apr 24 21:46:20.589720 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.589678 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="affb13ef-1273-40ee-a013-bddd85341559" containerName="kube-rbac-proxy"
Apr 24 21:46:20.589720 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.589686 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a38b40de-e304-4264-9768-c8b049b630a2" containerName="kube-rbac-proxy"
Apr 24 21:46:20.589720 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.589694 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="affb13ef-1273-40ee-a013-bddd85341559" containerName="kserve-container"
Apr 24 21:46:20.592737 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.592721 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t"
Apr 24 21:46:20.594673 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.594646 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-a9fd5-kube-rbac-proxy-sar-config\""
Apr 24 21:46:20.594775 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.594748 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-a9fd5-predictor-serving-cert\""
Apr 24 21:46:20.600998 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.600974 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t"]
Apr 24 21:46:20.632197 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.632166 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt"]
Apr 24 21:46:20.632531 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.632483 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt" podUID="8cb4eb76-33f9-42a7-94bd-501aeeab32f1" containerName="kserve-container" containerID="cri-o://c70ef077ad67565464b1fb1220e70c3aaeb8e7f24694d952d6a0637fbb73c926" gracePeriod=30
Apr 24 21:46:20.632658 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.632524 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt" podUID="8cb4eb76-33f9-42a7-94bd-501aeeab32f1" containerName="kube-rbac-proxy" containerID="cri-o://0f1ebd0aaae7fafc83fa8993825fcf9df8e8dbbf1753eea5dcded7e3d6ec865a" gracePeriod=30
Apr 24 21:46:20.690188 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.690133 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5"]
Apr 24 21:46:20.693573 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.693557 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5"
Apr 24 21:46:20.695851 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.695826 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-a9fd5-predictor-serving-cert\""
Apr 24 21:46:20.695969 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.695908 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-a9fd5-kube-rbac-proxy-sar-config\""
Apr 24 21:46:20.702181 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.702160 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5"]
Apr 24 21:46:20.721634 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.721608 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f409ce2b-d8c4-4c32-a764-32955142f14e-proxy-tls\") pod \"success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t\" (UID: \"f409ce2b-d8c4-4c32-a764-32955142f14e\") " pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t"
Apr 24 21:46:20.721767 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.721642 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-a9fd5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f409ce2b-d8c4-4c32-a764-32955142f14e-success-200-isvc-a9fd5-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t\" (UID: \"f409ce2b-d8c4-4c32-a764-32955142f14e\") " pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t"
Apr 24 21:46:20.721767 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.721677 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-244xd\" (UniqueName: \"kubernetes.io/projected/f409ce2b-d8c4-4c32-a764-32955142f14e-kube-api-access-244xd\") pod \"success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t\" (UID: \"f409ce2b-d8c4-4c32-a764-32955142f14e\") " pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t"
Apr 24 21:46:20.822587 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.822498 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f409ce2b-d8c4-4c32-a764-32955142f14e-proxy-tls\") pod \"success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t\" (UID: \"f409ce2b-d8c4-4c32-a764-32955142f14e\") " pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t"
Apr 24 21:46:20.822587 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.822533 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-a9fd5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f409ce2b-d8c4-4c32-a764-32955142f14e-success-200-isvc-a9fd5-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t\" (UID: \"f409ce2b-d8c4-4c32-a764-32955142f14e\") " pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t"
Apr 24 21:46:20.822587 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.822579 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-a9fd5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/989e357d-91ff-427f-a266-dcbde106f0fa-error-404-isvc-a9fd5-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5\" (UID: \"989e357d-91ff-427f-a266-dcbde106f0fa\") " pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5"
Apr 24 21:46:20.822839 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.822605 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/989e357d-91ff-427f-a266-dcbde106f0fa-proxy-tls\") pod \"error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5\" (UID: \"989e357d-91ff-427f-a266-dcbde106f0fa\") " pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5"
Apr 24 21:46:20.822839 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.822637 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-244xd\" (UniqueName: \"kubernetes.io/projected/f409ce2b-d8c4-4c32-a764-32955142f14e-kube-api-access-244xd\") pod \"success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t\" (UID: \"f409ce2b-d8c4-4c32-a764-32955142f14e\") " pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t"
Apr 24 21:46:20.822839 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.822692 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w24ln\" (UniqueName: \"kubernetes.io/projected/989e357d-91ff-427f-a266-dcbde106f0fa-kube-api-access-w24ln\") pod \"error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5\" (UID: \"989e357d-91ff-427f-a266-dcbde106f0fa\") " pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5"
Apr 24 21:46:20.823150 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.823133 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-a9fd5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f409ce2b-d8c4-4c32-a764-32955142f14e-success-200-isvc-a9fd5-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t\" (UID: \"f409ce2b-d8c4-4c32-a764-32955142f14e\") " pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t"
Apr 24 21:46:20.825083 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.825063 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f409ce2b-d8c4-4c32-a764-32955142f14e-proxy-tls\") pod \"success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t\" (UID: \"f409ce2b-d8c4-4c32-a764-32955142f14e\") " pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t"
Apr 24 21:46:20.830572 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.830532 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-244xd\" (UniqueName: \"kubernetes.io/projected/f409ce2b-d8c4-4c32-a764-32955142f14e-kube-api-access-244xd\") pod \"success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t\" (UID: \"f409ce2b-d8c4-4c32-a764-32955142f14e\") " pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t"
Apr 24 21:46:20.903941 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.903917 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t"
Apr 24 21:46:20.923649 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.923624 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/989e357d-91ff-427f-a266-dcbde106f0fa-proxy-tls\") pod \"error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5\" (UID: \"989e357d-91ff-427f-a266-dcbde106f0fa\") " pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5"
Apr 24 21:46:20.923773 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.923676 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w24ln\" (UniqueName: \"kubernetes.io/projected/989e357d-91ff-427f-a266-dcbde106f0fa-kube-api-access-w24ln\") pod \"error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5\" (UID: \"989e357d-91ff-427f-a266-dcbde106f0fa\") " pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5"
Apr 24 21:46:20.923773 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.923764 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-a9fd5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/989e357d-91ff-427f-a266-dcbde106f0fa-error-404-isvc-a9fd5-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5\" (UID: \"989e357d-91ff-427f-a266-dcbde106f0fa\") " pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5"
Apr 24 21:46:20.923866 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:46:20.923766 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-serving-cert: secret "error-404-isvc-a9fd5-predictor-serving-cert" not found
Apr 24 21:46:20.926224 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:46:20.924382 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/989e357d-91ff-427f-a266-dcbde106f0fa-proxy-tls podName:989e357d-91ff-427f-a266-dcbde106f0fa nodeName:}" failed. No retries permitted until 2026-04-24 21:46:21.424354413 +0000 UTC m=+1127.600089073 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/989e357d-91ff-427f-a266-dcbde106f0fa-proxy-tls") pod "error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5" (UID: "989e357d-91ff-427f-a266-dcbde106f0fa") : secret "error-404-isvc-a9fd5-predictor-serving-cert" not found
Apr 24 21:46:20.926224 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.924662 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-a9fd5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/989e357d-91ff-427f-a266-dcbde106f0fa-error-404-isvc-a9fd5-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5\" (UID: \"989e357d-91ff-427f-a266-dcbde106f0fa\") " pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5"
Apr 24 21:46:20.933778 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:20.933754 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w24ln\" (UniqueName: \"kubernetes.io/projected/989e357d-91ff-427f-a266-dcbde106f0fa-kube-api-access-w24ln\") pod \"error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5\" (UID: \"989e357d-91ff-427f-a266-dcbde106f0fa\") " pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5"
Apr 24 21:46:21.004720 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:21.004686 2578 generic.go:358] "Generic (PLEG): container finished" podID="8cb4eb76-33f9-42a7-94bd-501aeeab32f1" containerID="0f1ebd0aaae7fafc83fa8993825fcf9df8e8dbbf1753eea5dcded7e3d6ec865a" exitCode=2
Apr 24 21:46:21.004882 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:21.004727 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt" event={"ID":"8cb4eb76-33f9-42a7-94bd-501aeeab32f1","Type":"ContainerDied","Data":"0f1ebd0aaae7fafc83fa8993825fcf9df8e8dbbf1753eea5dcded7e3d6ec865a"}
Apr 24 21:46:21.006845 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:21.006822 2578 generic.go:358] "Generic (PLEG): container finished" podID="214a2c49-4165-4a5f-a071-b7c230595d4c" containerID="1a61f067534f8c337287a97065a4b1082cd8b1e628830657aa33da1c13507fbd" exitCode=2
Apr 24 21:46:21.006946 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:21.006887 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc" event={"ID":"214a2c49-4165-4a5f-a071-b7c230595d4c","Type":"ContainerDied","Data":"1a61f067534f8c337287a97065a4b1082cd8b1e628830657aa33da1c13507fbd"}
Apr 24 21:46:21.027121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:21.027100 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t"]
Apr 24 21:46:21.029164 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:46:21.029139 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf409ce2b_d8c4_4c32_a764_32955142f14e.slice/crio-060c3c8773ac1c7448b3c2b7ed92b7704202b7bb8bbc976786f94ec328595a5d WatchSource:0}: Error finding container 060c3c8773ac1c7448b3c2b7ed92b7704202b7bb8bbc976786f94ec328595a5d: Status 404 returned error can't find the container with id 060c3c8773ac1c7448b3c2b7ed92b7704202b7bb8bbc976786f94ec328595a5d
Apr 24 21:46:21.429094 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:21.429012 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/989e357d-91ff-427f-a266-dcbde106f0fa-proxy-tls\") pod \"error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5\" (UID: \"989e357d-91ff-427f-a266-dcbde106f0fa\") " pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5"
Apr 24 21:46:21.431455 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:21.431431 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/989e357d-91ff-427f-a266-dcbde106f0fa-proxy-tls\") pod \"error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5\" (UID: \"989e357d-91ff-427f-a266-dcbde106f0fa\") " pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5"
Apr 24 21:46:21.604364 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:21.604332 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5"
Apr 24 21:46:21.723177 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:21.723145 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5"]
Apr 24 21:46:21.724169 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:46:21.724146 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod989e357d_91ff_427f_a266_dcbde106f0fa.slice/crio-b1c748120a15b8c30f267cd979edf3261e3fed69ab513153d819b607635703e7 WatchSource:0}: Error finding container b1c748120a15b8c30f267cd979edf3261e3fed69ab513153d819b607635703e7: Status 404 returned error can't find the container with id b1c748120a15b8c30f267cd979edf3261e3fed69ab513153d819b607635703e7
Apr 24 21:46:22.012360 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:22.012263 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5" event={"ID":"989e357d-91ff-427f-a266-dcbde106f0fa","Type":"ContainerStarted","Data":"5c269eedc030daf5b98390b669232f683a51ab6916b443ddb1d53e1835044511"}
Apr 24 21:46:22.012360 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:22.012312 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5" event={"ID":"989e357d-91ff-427f-a266-dcbde106f0fa","Type":"ContainerStarted","Data":"6c46b927068c8ad73b3eedf70de51f6ab5babb6a2f4c7c4607ba0b8e5247ef3e"}
Apr 24 21:46:22.012360 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:22.012329 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5" event={"ID":"989e357d-91ff-427f-a266-dcbde106f0fa","Type":"ContainerStarted","Data":"b1c748120a15b8c30f267cd979edf3261e3fed69ab513153d819b607635703e7"}
Apr 24 21:46:22.014664 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:22.014630 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5"
Apr 24 21:46:22.014787 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:22.014674 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5"
Apr 24 21:46:22.014787 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:22.014722 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5" podUID="989e357d-91ff-427f-a266-dcbde106f0fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 24 21:46:22.017141 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:22.017109 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t" event={"ID":"f409ce2b-d8c4-4c32-a764-32955142f14e","Type":"ContainerStarted","Data":"ed34458ebfb6f0250796f542d304b6158fe8fc047e41b45c7f3864b92b349537"}
Apr 24 21:46:22.017253 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:22.017144 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t" event={"ID":"f409ce2b-d8c4-4c32-a764-32955142f14e","Type":"ContainerStarted","Data":"4eeab9aac3ce69206fe21e872b43b8900bfd6f2ef49e60bf7de5260ba5f9cf4a"}
Apr 24 21:46:22.017253 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:22.017159 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t" event={"ID":"f409ce2b-d8c4-4c32-a764-32955142f14e","Type":"ContainerStarted","Data":"060c3c8773ac1c7448b3c2b7ed92b7704202b7bb8bbc976786f94ec328595a5d"}
Apr 24 21:46:22.017717 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:22.017691 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t"
Apr 24 21:46:22.017808 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:22.017724 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t"
Apr 24 21:46:22.018353 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:22.018323 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t" podUID="f409ce2b-d8c4-4c32-a764-32955142f14e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 24 21:46:22.032963 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:22.032911 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5" podStartSLOduration=2.032895018 podStartE2EDuration="2.032895018s" podCreationTimestamp="2026-04-24 21:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:46:22.031527462 +0000 UTC m=+1128.207262138"
watchObservedRunningTime="2026-04-24 21:46:22.032895018 +0000 UTC m=+1128.208629695" Apr 24 21:46:22.048647 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:22.048604 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t" podStartSLOduration=2.048593924 podStartE2EDuration="2.048593924s" podCreationTimestamp="2026-04-24 21:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:46:22.047826914 +0000 UTC m=+1128.223561600" watchObservedRunningTime="2026-04-24 21:46:22.048593924 +0000 UTC m=+1128.224328597" Apr 24 21:46:23.024667 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:23.024628 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t" podUID="f409ce2b-d8c4-4c32-a764-32955142f14e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:46:23.025122 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:23.024808 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5" podUID="989e357d-91ff-427f-a266-dcbde106f0fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 21:46:23.770822 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:23.770788 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc" podUID="214a2c49-4165-4a5f-a071-b7c230595d4c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.30:8643/healthz\": dial tcp 10.134.0.30:8643: connect: connection refused" Apr 24 21:46:23.770981 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:23.770787 2578 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt" podUID="8cb4eb76-33f9-42a7-94bd-501aeeab32f1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused" Apr 24 21:46:24.028038 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:24.027952 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t" podUID="f409ce2b-d8c4-4c32-a764-32955142f14e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:46:24.028439 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:24.028056 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5" podUID="989e357d-91ff-427f-a266-dcbde106f0fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 21:46:24.421109 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:24.421077 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt" Apr 24 21:46:24.423730 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:24.423710 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc" Apr 24 21:46:24.554815 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:24.554739 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7bsq\" (UniqueName: \"kubernetes.io/projected/214a2c49-4165-4a5f-a071-b7c230595d4c-kube-api-access-l7bsq\") pod \"214a2c49-4165-4a5f-a071-b7c230595d4c\" (UID: \"214a2c49-4165-4a5f-a071-b7c230595d4c\") " Apr 24 21:46:24.554815 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:24.554781 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-dd3fc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/214a2c49-4165-4a5f-a071-b7c230595d4c-success-200-isvc-dd3fc-kube-rbac-proxy-sar-config\") pod \"214a2c49-4165-4a5f-a071-b7c230595d4c\" (UID: \"214a2c49-4165-4a5f-a071-b7c230595d4c\") " Apr 24 21:46:24.554815 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:24.554799 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9824b\" (UniqueName: \"kubernetes.io/projected/8cb4eb76-33f9-42a7-94bd-501aeeab32f1-kube-api-access-9824b\") pod \"8cb4eb76-33f9-42a7-94bd-501aeeab32f1\" (UID: \"8cb4eb76-33f9-42a7-94bd-501aeeab32f1\") " Apr 24 21:46:24.555072 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:24.554819 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8cb4eb76-33f9-42a7-94bd-501aeeab32f1-proxy-tls\") pod \"8cb4eb76-33f9-42a7-94bd-501aeeab32f1\" (UID: \"8cb4eb76-33f9-42a7-94bd-501aeeab32f1\") " Apr 24 21:46:24.555072 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:24.554849 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-dd3fc-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/8cb4eb76-33f9-42a7-94bd-501aeeab32f1-error-404-isvc-dd3fc-kube-rbac-proxy-sar-config\") pod \"8cb4eb76-33f9-42a7-94bd-501aeeab32f1\" (UID: \"8cb4eb76-33f9-42a7-94bd-501aeeab32f1\") " Apr 24 21:46:24.555072 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:24.554959 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/214a2c49-4165-4a5f-a071-b7c230595d4c-proxy-tls\") pod \"214a2c49-4165-4a5f-a071-b7c230595d4c\" (UID: \"214a2c49-4165-4a5f-a071-b7c230595d4c\") " Apr 24 21:46:24.555242 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:24.555217 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/214a2c49-4165-4a5f-a071-b7c230595d4c-success-200-isvc-dd3fc-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-dd3fc-kube-rbac-proxy-sar-config") pod "214a2c49-4165-4a5f-a071-b7c230595d4c" (UID: "214a2c49-4165-4a5f-a071-b7c230595d4c"). InnerVolumeSpecName "success-200-isvc-dd3fc-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:46:24.555313 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:24.555271 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cb4eb76-33f9-42a7-94bd-501aeeab32f1-error-404-isvc-dd3fc-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-dd3fc-kube-rbac-proxy-sar-config") pod "8cb4eb76-33f9-42a7-94bd-501aeeab32f1" (UID: "8cb4eb76-33f9-42a7-94bd-501aeeab32f1"). InnerVolumeSpecName "error-404-isvc-dd3fc-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:46:24.557028 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:24.556997 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/214a2c49-4165-4a5f-a071-b7c230595d4c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "214a2c49-4165-4a5f-a071-b7c230595d4c" (UID: "214a2c49-4165-4a5f-a071-b7c230595d4c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:46:24.557121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:24.557036 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cb4eb76-33f9-42a7-94bd-501aeeab32f1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8cb4eb76-33f9-42a7-94bd-501aeeab32f1" (UID: "8cb4eb76-33f9-42a7-94bd-501aeeab32f1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:46:24.557121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:24.557062 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/214a2c49-4165-4a5f-a071-b7c230595d4c-kube-api-access-l7bsq" (OuterVolumeSpecName: "kube-api-access-l7bsq") pod "214a2c49-4165-4a5f-a071-b7c230595d4c" (UID: "214a2c49-4165-4a5f-a071-b7c230595d4c"). InnerVolumeSpecName "kube-api-access-l7bsq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:46:24.557121 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:24.557068 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cb4eb76-33f9-42a7-94bd-501aeeab32f1-kube-api-access-9824b" (OuterVolumeSpecName: "kube-api-access-9824b") pod "8cb4eb76-33f9-42a7-94bd-501aeeab32f1" (UID: "8cb4eb76-33f9-42a7-94bd-501aeeab32f1"). InnerVolumeSpecName "kube-api-access-9824b". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:46:24.655794 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:24.655764 2578 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-dd3fc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/214a2c49-4165-4a5f-a071-b7c230595d4c-success-200-isvc-dd3fc-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:46:24.655794 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:24.655793 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9824b\" (UniqueName: \"kubernetes.io/projected/8cb4eb76-33f9-42a7-94bd-501aeeab32f1-kube-api-access-9824b\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:46:24.655982 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:24.655804 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8cb4eb76-33f9-42a7-94bd-501aeeab32f1-proxy-tls\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:46:24.655982 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:24.655814 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-dd3fc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8cb4eb76-33f9-42a7-94bd-501aeeab32f1-error-404-isvc-dd3fc-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:46:24.655982 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:24.655824 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/214a2c49-4165-4a5f-a071-b7c230595d4c-proxy-tls\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:46:24.655982 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:24.655833 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l7bsq\" (UniqueName: 
\"kubernetes.io/projected/214a2c49-4165-4a5f-a071-b7c230595d4c-kube-api-access-l7bsq\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:46:25.032572 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:25.032520 2578 generic.go:358] "Generic (PLEG): container finished" podID="214a2c49-4165-4a5f-a071-b7c230595d4c" containerID="aa7cc3c983d247ef8bf8fc586e5c91cf2fbe74f05c42e82d95aee83bc7eab7d2" exitCode=0 Apr 24 21:46:25.032991 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:25.032586 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc" event={"ID":"214a2c49-4165-4a5f-a071-b7c230595d4c","Type":"ContainerDied","Data":"aa7cc3c983d247ef8bf8fc586e5c91cf2fbe74f05c42e82d95aee83bc7eab7d2"} Apr 24 21:46:25.032991 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:25.032620 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc" Apr 24 21:46:25.032991 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:25.032628 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc" event={"ID":"214a2c49-4165-4a5f-a071-b7c230595d4c","Type":"ContainerDied","Data":"218728898a1b7a6d35127c6f9ab6c372ff296cc17c7b80f0918982186d7e8c45"} Apr 24 21:46:25.032991 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:25.032645 2578 scope.go:117] "RemoveContainer" containerID="1a61f067534f8c337287a97065a4b1082cd8b1e628830657aa33da1c13507fbd" Apr 24 21:46:25.034090 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:25.034067 2578 generic.go:358] "Generic (PLEG): container finished" podID="8cb4eb76-33f9-42a7-94bd-501aeeab32f1" containerID="c70ef077ad67565464b1fb1220e70c3aaeb8e7f24694d952d6a0637fbb73c926" exitCode=0 Apr 24 21:46:25.034204 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:25.034118 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt" event={"ID":"8cb4eb76-33f9-42a7-94bd-501aeeab32f1","Type":"ContainerDied","Data":"c70ef077ad67565464b1fb1220e70c3aaeb8e7f24694d952d6a0637fbb73c926"} Apr 24 21:46:25.034204 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:25.034125 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt" Apr 24 21:46:25.034204 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:25.034141 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt" event={"ID":"8cb4eb76-33f9-42a7-94bd-501aeeab32f1","Type":"ContainerDied","Data":"ea546a526967d05f7f5ef9deb48a50c197d5dd24015213f2aa78f8722de3e8b6"} Apr 24 21:46:25.040615 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:25.040596 2578 scope.go:117] "RemoveContainer" containerID="aa7cc3c983d247ef8bf8fc586e5c91cf2fbe74f05c42e82d95aee83bc7eab7d2" Apr 24 21:46:25.048193 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:25.048175 2578 scope.go:117] "RemoveContainer" containerID="1a61f067534f8c337287a97065a4b1082cd8b1e628830657aa33da1c13507fbd" Apr 24 21:46:25.048418 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:46:25.048402 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a61f067534f8c337287a97065a4b1082cd8b1e628830657aa33da1c13507fbd\": container with ID starting with 1a61f067534f8c337287a97065a4b1082cd8b1e628830657aa33da1c13507fbd not found: ID does not exist" containerID="1a61f067534f8c337287a97065a4b1082cd8b1e628830657aa33da1c13507fbd" Apr 24 21:46:25.048483 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:25.048428 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a61f067534f8c337287a97065a4b1082cd8b1e628830657aa33da1c13507fbd"} err="failed to get container status 
\"1a61f067534f8c337287a97065a4b1082cd8b1e628830657aa33da1c13507fbd\": rpc error: code = NotFound desc = could not find container \"1a61f067534f8c337287a97065a4b1082cd8b1e628830657aa33da1c13507fbd\": container with ID starting with 1a61f067534f8c337287a97065a4b1082cd8b1e628830657aa33da1c13507fbd not found: ID does not exist" Apr 24 21:46:25.048483 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:25.048451 2578 scope.go:117] "RemoveContainer" containerID="aa7cc3c983d247ef8bf8fc586e5c91cf2fbe74f05c42e82d95aee83bc7eab7d2" Apr 24 21:46:25.048702 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:46:25.048685 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa7cc3c983d247ef8bf8fc586e5c91cf2fbe74f05c42e82d95aee83bc7eab7d2\": container with ID starting with aa7cc3c983d247ef8bf8fc586e5c91cf2fbe74f05c42e82d95aee83bc7eab7d2 not found: ID does not exist" containerID="aa7cc3c983d247ef8bf8fc586e5c91cf2fbe74f05c42e82d95aee83bc7eab7d2" Apr 24 21:46:25.048764 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:25.048708 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa7cc3c983d247ef8bf8fc586e5c91cf2fbe74f05c42e82d95aee83bc7eab7d2"} err="failed to get container status \"aa7cc3c983d247ef8bf8fc586e5c91cf2fbe74f05c42e82d95aee83bc7eab7d2\": rpc error: code = NotFound desc = could not find container \"aa7cc3c983d247ef8bf8fc586e5c91cf2fbe74f05c42e82d95aee83bc7eab7d2\": container with ID starting with aa7cc3c983d247ef8bf8fc586e5c91cf2fbe74f05c42e82d95aee83bc7eab7d2 not found: ID does not exist" Apr 24 21:46:25.048764 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:25.048722 2578 scope.go:117] "RemoveContainer" containerID="0f1ebd0aaae7fafc83fa8993825fcf9df8e8dbbf1753eea5dcded7e3d6ec865a" Apr 24 21:46:25.055516 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:25.055500 2578 scope.go:117] "RemoveContainer" 
containerID="c70ef077ad67565464b1fb1220e70c3aaeb8e7f24694d952d6a0637fbb73c926" Apr 24 21:46:25.062058 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:25.062045 2578 scope.go:117] "RemoveContainer" containerID="0f1ebd0aaae7fafc83fa8993825fcf9df8e8dbbf1753eea5dcded7e3d6ec865a" Apr 24 21:46:25.062286 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:46:25.062269 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f1ebd0aaae7fafc83fa8993825fcf9df8e8dbbf1753eea5dcded7e3d6ec865a\": container with ID starting with 0f1ebd0aaae7fafc83fa8993825fcf9df8e8dbbf1753eea5dcded7e3d6ec865a not found: ID does not exist" containerID="0f1ebd0aaae7fafc83fa8993825fcf9df8e8dbbf1753eea5dcded7e3d6ec865a" Apr 24 21:46:25.062323 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:25.062292 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f1ebd0aaae7fafc83fa8993825fcf9df8e8dbbf1753eea5dcded7e3d6ec865a"} err="failed to get container status \"0f1ebd0aaae7fafc83fa8993825fcf9df8e8dbbf1753eea5dcded7e3d6ec865a\": rpc error: code = NotFound desc = could not find container \"0f1ebd0aaae7fafc83fa8993825fcf9df8e8dbbf1753eea5dcded7e3d6ec865a\": container with ID starting with 0f1ebd0aaae7fafc83fa8993825fcf9df8e8dbbf1753eea5dcded7e3d6ec865a not found: ID does not exist" Apr 24 21:46:25.062323 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:25.062306 2578 scope.go:117] "RemoveContainer" containerID="c70ef077ad67565464b1fb1220e70c3aaeb8e7f24694d952d6a0637fbb73c926" Apr 24 21:46:25.062525 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:46:25.062507 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c70ef077ad67565464b1fb1220e70c3aaeb8e7f24694d952d6a0637fbb73c926\": container with ID starting with c70ef077ad67565464b1fb1220e70c3aaeb8e7f24694d952d6a0637fbb73c926 not found: ID does not exist" 
containerID="c70ef077ad67565464b1fb1220e70c3aaeb8e7f24694d952d6a0637fbb73c926" Apr 24 21:46:25.062622 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:25.062530 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c70ef077ad67565464b1fb1220e70c3aaeb8e7f24694d952d6a0637fbb73c926"} err="failed to get container status \"c70ef077ad67565464b1fb1220e70c3aaeb8e7f24694d952d6a0637fbb73c926\": rpc error: code = NotFound desc = could not find container \"c70ef077ad67565464b1fb1220e70c3aaeb8e7f24694d952d6a0637fbb73c926\": container with ID starting with c70ef077ad67565464b1fb1220e70c3aaeb8e7f24694d952d6a0637fbb73c926 not found: ID does not exist" Apr 24 21:46:25.067411 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:25.067390 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc"] Apr 24 21:46:25.069057 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:25.069039 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dd3fc-predictor-7dd9d668df-b8wpc"] Apr 24 21:46:25.079464 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:25.079442 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt"] Apr 24 21:46:25.080894 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:25.080876 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dd3fc-predictor-5bbf46867d-gzdnt"] Apr 24 21:46:26.423201 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:26.423167 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="214a2c49-4165-4a5f-a071-b7c230595d4c" path="/var/lib/kubelet/pods/214a2c49-4165-4a5f-a071-b7c230595d4c/volumes" Apr 24 21:46:26.423605 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:26.423593 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8cb4eb76-33f9-42a7-94bd-501aeeab32f1" path="/var/lib/kubelet/pods/8cb4eb76-33f9-42a7-94bd-501aeeab32f1/volumes" Apr 24 21:46:26.906651 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:26.906613 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" podUID="0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 24 21:46:27.910478 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:27.910440 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" podUID="9c0e3ab9-32b9-4155-ad77-6da651f96cf3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 21:46:29.033129 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:29.033101 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t" Apr 24 21:46:29.033524 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:29.033359 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5" Apr 24 21:46:29.033734 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:29.033712 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t" podUID="f409ce2b-d8c4-4c32-a764-32955142f14e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:46:29.033792 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:29.033716 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5" podUID="989e357d-91ff-427f-a266-dcbde106f0fa" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 21:46:36.907597 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:36.907570 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" Apr 24 21:46:37.911158 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:37.911120 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" Apr 24 21:46:39.034033 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:39.033990 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5" podUID="989e357d-91ff-427f-a266-dcbde106f0fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 21:46:39.034380 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:39.033990 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t" podUID="f409ce2b-d8c4-4c32-a764-32955142f14e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 24 21:46:49.034491 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:49.034455 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5" podUID="989e357d-91ff-427f-a266-dcbde106f0fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 21:46:49.034853 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:49.034460 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t" podUID="f409ce2b-d8c4-4c32-a764-32955142f14e" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 24 21:46:59.034679 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:59.034595 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5" podUID="989e357d-91ff-427f-a266-dcbde106f0fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 24 21:46:59.035098 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:46:59.034595 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t" podUID="f409ce2b-d8c4-4c32-a764-32955142f14e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 24 21:47:00.439294 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.439262 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9"]
Apr 24 21:47:00.439812 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.439793 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="214a2c49-4165-4a5f-a071-b7c230595d4c" containerName="kube-rbac-proxy"
Apr 24 21:47:00.439890 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.439815 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="214a2c49-4165-4a5f-a071-b7c230595d4c" containerName="kube-rbac-proxy"
Apr 24 21:47:00.439890 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.439828 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="214a2c49-4165-4a5f-a071-b7c230595d4c" containerName="kserve-container"
Apr 24 21:47:00.439890 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.439836 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="214a2c49-4165-4a5f-a071-b7c230595d4c" containerName="kserve-container"
Apr 24 21:47:00.439890 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.439859 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8cb4eb76-33f9-42a7-94bd-501aeeab32f1" containerName="kserve-container"
Apr 24 21:47:00.439890 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.439867 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cb4eb76-33f9-42a7-94bd-501aeeab32f1" containerName="kserve-container"
Apr 24 21:47:00.439890 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.439889 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8cb4eb76-33f9-42a7-94bd-501aeeab32f1" containerName="kube-rbac-proxy"
Apr 24 21:47:00.440180 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.439897 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cb4eb76-33f9-42a7-94bd-501aeeab32f1" containerName="kube-rbac-proxy"
Apr 24 21:47:00.440180 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.439972 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="214a2c49-4165-4a5f-a071-b7c230595d4c" containerName="kserve-container"
Apr 24 21:47:00.440180 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.439987 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="214a2c49-4165-4a5f-a071-b7c230595d4c" containerName="kube-rbac-proxy"
Apr 24 21:47:00.440180 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.439999 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="8cb4eb76-33f9-42a7-94bd-501aeeab32f1" containerName="kserve-container"
Apr 24 21:47:00.440180 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.440010 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="8cb4eb76-33f9-42a7-94bd-501aeeab32f1" containerName="kube-rbac-proxy"
Apr 24 21:47:00.443256 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.443236 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9"
Apr 24 21:47:00.444114 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.444096 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf"]
Apr 24 21:47:00.444339 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.444319 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" podUID="9c0e3ab9-32b9-4155-ad77-6da651f96cf3" containerName="kserve-container" containerID="cri-o://c73fab23f1586e4deafa8f52788683db84bc0c8503ca0f94bb3c06a4f756000d" gracePeriod=30
Apr 24 21:47:00.444420 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.444329 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" podUID="9c0e3ab9-32b9-4155-ad77-6da651f96cf3" containerName="kube-rbac-proxy" containerID="cri-o://3384a3d260de18f8d106a5edd3222ed5a5e84cc0c6c923521cfc22612f143d1f" gracePeriod=30
Apr 24 21:47:00.445334 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.445315 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-d1134-kube-rbac-proxy-sar-config\""
Apr 24 21:47:00.445420 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.445402 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-d1134-predictor-serving-cert\""
Apr 24 21:47:00.452028 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.452008 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9"]
Apr 24 21:47:00.536338 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.536312 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh"]
Apr 24 21:47:00.539970 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.539950 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh"
Apr 24 21:47:00.541965 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.541944 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-d1134-predictor-serving-cert\""
Apr 24 21:47:00.542050 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.542025 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-d1134-kube-rbac-proxy-sar-config\""
Apr 24 21:47:00.551563 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.551520 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw"]
Apr 24 21:47:00.551934 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.551888 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" podUID="0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c" containerName="kserve-container" containerID="cri-o://e6ea65887cddbd0681bab6e25a2976a5777f9e05d30fbfc61264d4bd569b83fd" gracePeriod=30
Apr 24 21:47:00.552021 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.551949 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" podUID="0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c" containerName="kube-rbac-proxy" containerID="cri-o://bc5998219fd61823d5c7357979332a001dc13168962724855aad4153160a79f0" gracePeriod=30
Apr 24 21:47:00.553814 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.553796 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh"]
Apr 24 21:47:00.558859 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.558829 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjvjp\" (UniqueName: \"kubernetes.io/projected/7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67-kube-api-access-tjvjp\") pod \"success-200-isvc-d1134-predictor-b6cdf5487-xvsr9\" (UID: \"7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67\") " pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9"
Apr 24 21:47:00.559146 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.559121 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-d1134-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67-success-200-isvc-d1134-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d1134-predictor-b6cdf5487-xvsr9\" (UID: \"7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67\") " pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9"
Apr 24 21:47:00.559249 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.559165 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67-proxy-tls\") pod \"success-200-isvc-d1134-predictor-b6cdf5487-xvsr9\" (UID: \"7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67\") " pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9"
Apr 24 21:47:00.660325 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.660294 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlbkv\" (UniqueName: \"kubernetes.io/projected/303b11b3-a44c-4cd3-896a-000a60c34a09-kube-api-access-zlbkv\") pod \"error-404-isvc-d1134-predictor-796fc776b4-5ssqh\" (UID: \"303b11b3-a44c-4cd3-896a-000a60c34a09\") " pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh"
Apr 24 21:47:00.660464 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.660345 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tjvjp\" (UniqueName: \"kubernetes.io/projected/7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67-kube-api-access-tjvjp\") pod \"success-200-isvc-d1134-predictor-b6cdf5487-xvsr9\" (UID: \"7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67\") " pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9"
Apr 24 21:47:00.660464 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.660419 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-d1134-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/303b11b3-a44c-4cd3-896a-000a60c34a09-error-404-isvc-d1134-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d1134-predictor-796fc776b4-5ssqh\" (UID: \"303b11b3-a44c-4cd3-896a-000a60c34a09\") " pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh"
Apr 24 21:47:00.660574 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.660488 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/303b11b3-a44c-4cd3-896a-000a60c34a09-proxy-tls\") pod \"error-404-isvc-d1134-predictor-796fc776b4-5ssqh\" (UID: \"303b11b3-a44c-4cd3-896a-000a60c34a09\") " pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh"
Apr 24 21:47:00.660574 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.660519 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-d1134-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67-success-200-isvc-d1134-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d1134-predictor-b6cdf5487-xvsr9\" (UID: \"7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67\") " pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9"
Apr 24 21:47:00.660674 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.660563 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67-proxy-tls\") pod \"success-200-isvc-d1134-predictor-b6cdf5487-xvsr9\" (UID: \"7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67\") " pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9"
Apr 24 21:47:00.661192 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.661163 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-d1134-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67-success-200-isvc-d1134-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d1134-predictor-b6cdf5487-xvsr9\" (UID: \"7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67\") " pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9"
Apr 24 21:47:00.662838 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.662822 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67-proxy-tls\") pod \"success-200-isvc-d1134-predictor-b6cdf5487-xvsr9\" (UID: \"7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67\") " pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9"
Apr 24 21:47:00.667715 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.667698 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjvjp\" (UniqueName: \"kubernetes.io/projected/7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67-kube-api-access-tjvjp\") pod \"success-200-isvc-d1134-predictor-b6cdf5487-xvsr9\" (UID: \"7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67\") " pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9"
Apr 24 21:47:00.761379 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.761352 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlbkv\" (UniqueName: \"kubernetes.io/projected/303b11b3-a44c-4cd3-896a-000a60c34a09-kube-api-access-zlbkv\") pod \"error-404-isvc-d1134-predictor-796fc776b4-5ssqh\" (UID: \"303b11b3-a44c-4cd3-896a-000a60c34a09\") " pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh"
Apr 24 21:47:00.761535 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.761396 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-d1134-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/303b11b3-a44c-4cd3-896a-000a60c34a09-error-404-isvc-d1134-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d1134-predictor-796fc776b4-5ssqh\" (UID: \"303b11b3-a44c-4cd3-896a-000a60c34a09\") " pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh"
Apr 24 21:47:00.761535 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.761431 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/303b11b3-a44c-4cd3-896a-000a60c34a09-proxy-tls\") pod \"error-404-isvc-d1134-predictor-796fc776b4-5ssqh\" (UID: \"303b11b3-a44c-4cd3-896a-000a60c34a09\") " pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh"
Apr 24 21:47:00.761535 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:47:00.761530 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-d1134-predictor-serving-cert: secret "error-404-isvc-d1134-predictor-serving-cert" not found
Apr 24 21:47:00.761686 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:47:00.761609 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/303b11b3-a44c-4cd3-896a-000a60c34a09-proxy-tls podName:303b11b3-a44c-4cd3-896a-000a60c34a09 nodeName:}" failed. No retries permitted until 2026-04-24 21:47:01.261594118 +0000 UTC m=+1167.437328770 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/303b11b3-a44c-4cd3-896a-000a60c34a09-proxy-tls") pod "error-404-isvc-d1134-predictor-796fc776b4-5ssqh" (UID: "303b11b3-a44c-4cd3-896a-000a60c34a09") : secret "error-404-isvc-d1134-predictor-serving-cert" not found
Apr 24 21:47:00.761873 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.761855 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9"
Apr 24 21:47:00.762184 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.762163 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-d1134-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/303b11b3-a44c-4cd3-896a-000a60c34a09-error-404-isvc-d1134-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d1134-predictor-796fc776b4-5ssqh\" (UID: \"303b11b3-a44c-4cd3-896a-000a60c34a09\") " pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh"
Apr 24 21:47:00.769839 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.769821 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlbkv\" (UniqueName: \"kubernetes.io/projected/303b11b3-a44c-4cd3-896a-000a60c34a09-kube-api-access-zlbkv\") pod \"error-404-isvc-d1134-predictor-796fc776b4-5ssqh\" (UID: \"303b11b3-a44c-4cd3-896a-000a60c34a09\") " pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh"
Apr 24 21:47:00.880070 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:00.880047 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9"]
Apr 24 21:47:00.882171 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:47:00.882145 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7eeb6208_ba7c_40e7_9ccf_0b38b2a3bc67.slice/crio-e6df9842eccca33b182d0b97c70f96b071d7f6c4885c43af457b88463c843c61 WatchSource:0}: Error finding container e6df9842eccca33b182d0b97c70f96b071d7f6c4885c43af457b88463c843c61: Status 404 returned error can't find the container with id e6df9842eccca33b182d0b97c70f96b071d7f6c4885c43af457b88463c843c61
Apr 24 21:47:01.150981 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:01.150897 2578 generic.go:358] "Generic (PLEG): container finished" podID="9c0e3ab9-32b9-4155-ad77-6da651f96cf3" containerID="3384a3d260de18f8d106a5edd3222ed5a5e84cc0c6c923521cfc22612f143d1f" exitCode=2
Apr 24 21:47:01.150981 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:01.150961 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" event={"ID":"9c0e3ab9-32b9-4155-ad77-6da651f96cf3","Type":"ContainerDied","Data":"3384a3d260de18f8d106a5edd3222ed5a5e84cc0c6c923521cfc22612f143d1f"}
Apr 24 21:47:01.152539 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:01.152505 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9" event={"ID":"7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67","Type":"ContainerStarted","Data":"5e60521ff8d9a49349d4a74cb1b9371f07c9c7f4d593ac1d29733b46d8c7ced3"}
Apr 24 21:47:01.152695 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:01.152566 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9" event={"ID":"7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67","Type":"ContainerStarted","Data":"cbb3bf76ee8c3b68de1630a74ae63c604bf1545e3b1ac03a2eda9af223f609a6"}
Apr 24 21:47:01.152695 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:01.152588 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9"
Apr 24 21:47:01.152695 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:01.152601 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9" event={"ID":"7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67","Type":"ContainerStarted","Data":"e6df9842eccca33b182d0b97c70f96b071d7f6c4885c43af457b88463c843c61"}
Apr 24 21:47:01.153981 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:01.153960 2578 generic.go:358] "Generic (PLEG): container finished" podID="0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c" containerID="bc5998219fd61823d5c7357979332a001dc13168962724855aad4153160a79f0" exitCode=2
Apr 24 21:47:01.154082 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:01.153989 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" event={"ID":"0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c","Type":"ContainerDied","Data":"bc5998219fd61823d5c7357979332a001dc13168962724855aad4153160a79f0"}
Apr 24 21:47:01.172561 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:01.172514 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9" podStartSLOduration=1.17250265 podStartE2EDuration="1.17250265s" podCreationTimestamp="2026-04-24 21:47:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:47:01.171485126 +0000 UTC m=+1167.347219799" watchObservedRunningTime="2026-04-24 21:47:01.17250265 +0000 UTC m=+1167.348237322"
Apr 24 21:47:01.266423 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:01.266388 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/303b11b3-a44c-4cd3-896a-000a60c34a09-proxy-tls\") pod \"error-404-isvc-d1134-predictor-796fc776b4-5ssqh\" (UID: \"303b11b3-a44c-4cd3-896a-000a60c34a09\") " pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh"
Apr 24 21:47:01.268810 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:01.268793 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/303b11b3-a44c-4cd3-896a-000a60c34a09-proxy-tls\") pod \"error-404-isvc-d1134-predictor-796fc776b4-5ssqh\" (UID: \"303b11b3-a44c-4cd3-896a-000a60c34a09\") " pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh"
Apr 24 21:47:01.451937 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:01.451853 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh"
Apr 24 21:47:01.592571 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:01.592528 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh"]
Apr 24 21:47:01.595442 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:47:01.595414 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod303b11b3_a44c_4cd3_896a_000a60c34a09.slice/crio-8dd9eee6a63f499dd38f0d7f5836ca8408bbbb03cf0acb17aac8caf6890d60a5 WatchSource:0}: Error finding container 8dd9eee6a63f499dd38f0d7f5836ca8408bbbb03cf0acb17aac8caf6890d60a5: Status 404 returned error can't find the container with id 8dd9eee6a63f499dd38f0d7f5836ca8408bbbb03cf0acb17aac8caf6890d60a5
Apr 24 21:47:01.902018 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:01.901974 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" podUID="0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.33:8643/healthz\": dial tcp 10.134.0.33:8643: connect: connection refused"
Apr 24 21:47:02.158934 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:02.158897 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh" event={"ID":"303b11b3-a44c-4cd3-896a-000a60c34a09","Type":"ContainerStarted","Data":"bd21e692fe3f5e9dda57d1b564826bc10d06ae99da96fb71aaa1defa4ed9dd47"}
Apr 24 21:47:02.159128 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:02.159072 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh"
Apr 24 21:47:02.159128 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:02.159098 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh"
Apr 24 21:47:02.159128 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:02.159110 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh" event={"ID":"303b11b3-a44c-4cd3-896a-000a60c34a09","Type":"ContainerStarted","Data":"f2bdfdc9a55f57341d5987125674ae6dd139516489fa12a7df1f87aa9930c4e6"}
Apr 24 21:47:02.159128 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:02.159125 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh" event={"ID":"303b11b3-a44c-4cd3-896a-000a60c34a09","Type":"ContainerStarted","Data":"8dd9eee6a63f499dd38f0d7f5836ca8408bbbb03cf0acb17aac8caf6890d60a5"}
Apr 24 21:47:02.159388 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:02.159361 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9"
Apr 24 21:47:02.160393 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:02.160370 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh" podUID="303b11b3-a44c-4cd3-896a-000a60c34a09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 24 21:47:02.160393 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:02.160386 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9" podUID="7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 24 21:47:02.178985 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:02.178932 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh" podStartSLOduration=2.1789173809999998 podStartE2EDuration="2.178917381s" podCreationTimestamp="2026-04-24 21:47:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:47:02.177107812 +0000 UTC m=+1168.352842484" watchObservedRunningTime="2026-04-24 21:47:02.178917381 +0000 UTC m=+1168.354652055"
Apr 24 21:47:02.905284 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:02.905247 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" podUID="9c0e3ab9-32b9-4155-ad77-6da651f96cf3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.32:8643/healthz\": dial tcp 10.134.0.32:8643: connect: connection refused"
Apr 24 21:47:03.168265 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:03.168167 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9" podUID="7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 24 21:47:03.168265 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:03.168245 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh" podUID="303b11b3-a44c-4cd3-896a-000a60c34a09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 24 21:47:03.980242 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:03.980222 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw"
Apr 24 21:47:03.989388 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:03.989364 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c-proxy-tls\") pod \"0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c\" (UID: \"0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c\") "
Apr 24 21:47:03.989516 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:03.989398 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjp4t\" (UniqueName: \"kubernetes.io/projected/0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c-kube-api-access-cjp4t\") pod \"0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c\" (UID: \"0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c\") "
Apr 24 21:47:03.989516 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:03.989422 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-6445b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c-error-404-isvc-6445b-kube-rbac-proxy-sar-config\") pod \"0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c\" (UID: \"0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c\") "
Apr 24 21:47:03.989859 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:03.989826 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c-error-404-isvc-6445b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-6445b-kube-rbac-proxy-sar-config") pod "0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c" (UID: "0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c"). InnerVolumeSpecName "error-404-isvc-6445b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:47:03.991448 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:03.991425 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c" (UID: "0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:47:03.991755 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:03.991734 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c-kube-api-access-cjp4t" (OuterVolumeSpecName: "kube-api-access-cjp4t") pod "0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c" (UID: "0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c"). InnerVolumeSpecName "kube-api-access-cjp4t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:47:04.084025 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.084006 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf"
Apr 24 21:47:04.090641 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.090623 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-6445b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9c0e3ab9-32b9-4155-ad77-6da651f96cf3-success-200-isvc-6445b-kube-rbac-proxy-sar-config\") pod \"9c0e3ab9-32b9-4155-ad77-6da651f96cf3\" (UID: \"9c0e3ab9-32b9-4155-ad77-6da651f96cf3\") "
Apr 24 21:47:04.090716 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.090664 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zck7g\" (UniqueName: \"kubernetes.io/projected/9c0e3ab9-32b9-4155-ad77-6da651f96cf3-kube-api-access-zck7g\") pod \"9c0e3ab9-32b9-4155-ad77-6da651f96cf3\" (UID: \"9c0e3ab9-32b9-4155-ad77-6da651f96cf3\") "
Apr 24 21:47:04.090716 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.090706 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c0e3ab9-32b9-4155-ad77-6da651f96cf3-proxy-tls\") pod \"9c0e3ab9-32b9-4155-ad77-6da651f96cf3\" (UID: \"9c0e3ab9-32b9-4155-ad77-6da651f96cf3\") "
Apr 24 21:47:04.090986 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.090971 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c-proxy-tls\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:47:04.091028 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.090999 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cjp4t\" (UniqueName: \"kubernetes.io/projected/0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c-kube-api-access-cjp4t\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:47:04.091028 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.091012 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-6445b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c-error-404-isvc-6445b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:47:04.091028 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.090974 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c0e3ab9-32b9-4155-ad77-6da651f96cf3-success-200-isvc-6445b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-6445b-kube-rbac-proxy-sar-config") pod "9c0e3ab9-32b9-4155-ad77-6da651f96cf3" (UID: "9c0e3ab9-32b9-4155-ad77-6da651f96cf3"). InnerVolumeSpecName "success-200-isvc-6445b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:47:04.092722 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.092698 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c0e3ab9-32b9-4155-ad77-6da651f96cf3-kube-api-access-zck7g" (OuterVolumeSpecName: "kube-api-access-zck7g") pod "9c0e3ab9-32b9-4155-ad77-6da651f96cf3" (UID: "9c0e3ab9-32b9-4155-ad77-6da651f96cf3"). InnerVolumeSpecName "kube-api-access-zck7g". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:47:04.092781 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.092715 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c0e3ab9-32b9-4155-ad77-6da651f96cf3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9c0e3ab9-32b9-4155-ad77-6da651f96cf3" (UID: "9c0e3ab9-32b9-4155-ad77-6da651f96cf3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:47:04.172264 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.172188 2578 generic.go:358] "Generic (PLEG): container finished" podID="0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c" containerID="e6ea65887cddbd0681bab6e25a2976a5777f9e05d30fbfc61264d4bd569b83fd" exitCode=0
Apr 24 21:47:04.172264 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.172254 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw"
Apr 24 21:47:04.172461 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.172279 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" event={"ID":"0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c","Type":"ContainerDied","Data":"e6ea65887cddbd0681bab6e25a2976a5777f9e05d30fbfc61264d4bd569b83fd"}
Apr 24 21:47:04.172461 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.172321 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw" event={"ID":"0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c","Type":"ContainerDied","Data":"8724de544827b22ea4caca1794e84dbee4761f51f10ae296cbdf4e33bf40a2f0"}
Apr 24 21:47:04.172461 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.172341 2578 scope.go:117] "RemoveContainer" containerID="bc5998219fd61823d5c7357979332a001dc13168962724855aad4153160a79f0"
Apr 24 21:47:04.173705 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.173681 2578 generic.go:358] "Generic (PLEG): container finished" podID="9c0e3ab9-32b9-4155-ad77-6da651f96cf3" containerID="c73fab23f1586e4deafa8f52788683db84bc0c8503ca0f94bb3c06a4f756000d" exitCode=0
Apr 24 21:47:04.173821 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.173728 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" event={"ID":"9c0e3ab9-32b9-4155-ad77-6da651f96cf3","Type":"ContainerDied","Data":"c73fab23f1586e4deafa8f52788683db84bc0c8503ca0f94bb3c06a4f756000d"}
Apr 24 21:47:04.173821 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.173749 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf" event={"ID":"9c0e3ab9-32b9-4155-ad77-6da651f96cf3","Type":"ContainerDied","Data":"aa0a9cb3d0124da3d2d539cdcb0dc1ac74f1332fcd310fa435d8e6d1ad2a80be"}
Apr 24 21:47:04.173821 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.173778 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf"
Apr 24 21:47:04.181601 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.181407 2578 scope.go:117] "RemoveContainer" containerID="e6ea65887cddbd0681bab6e25a2976a5777f9e05d30fbfc61264d4bd569b83fd"
Apr 24 21:47:04.188415 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.188308 2578 scope.go:117] "RemoveContainer" containerID="bc5998219fd61823d5c7357979332a001dc13168962724855aad4153160a79f0"
Apr 24 21:47:04.188593 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:47:04.188569 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc5998219fd61823d5c7357979332a001dc13168962724855aad4153160a79f0\": container with ID starting with bc5998219fd61823d5c7357979332a001dc13168962724855aad4153160a79f0 not found: ID does not exist" containerID="bc5998219fd61823d5c7357979332a001dc13168962724855aad4153160a79f0"
Apr 24 21:47:04.188664 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.188600 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc5998219fd61823d5c7357979332a001dc13168962724855aad4153160a79f0"} err="failed to get container status \"bc5998219fd61823d5c7357979332a001dc13168962724855aad4153160a79f0\": rpc error: code = NotFound desc = could not find container \"bc5998219fd61823d5c7357979332a001dc13168962724855aad4153160a79f0\": container with ID starting with bc5998219fd61823d5c7357979332a001dc13168962724855aad4153160a79f0 not found: ID does not exist"
Apr 24 21:47:04.188664 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.188616 2578 scope.go:117] "RemoveContainer" containerID="e6ea65887cddbd0681bab6e25a2976a5777f9e05d30fbfc61264d4bd569b83fd"
Apr 24 21:47:04.188868 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:47:04.188851 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6ea65887cddbd0681bab6e25a2976a5777f9e05d30fbfc61264d4bd569b83fd\": container with ID starting with e6ea65887cddbd0681bab6e25a2976a5777f9e05d30fbfc61264d4bd569b83fd not found: ID does not exist" containerID="e6ea65887cddbd0681bab6e25a2976a5777f9e05d30fbfc61264d4bd569b83fd"
Apr 24 21:47:04.188925 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.188872 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6ea65887cddbd0681bab6e25a2976a5777f9e05d30fbfc61264d4bd569b83fd"} err="failed to get container status \"e6ea65887cddbd0681bab6e25a2976a5777f9e05d30fbfc61264d4bd569b83fd\": rpc error: code = NotFound desc = could not find container \"e6ea65887cddbd0681bab6e25a2976a5777f9e05d30fbfc61264d4bd569b83fd\": container with ID starting with e6ea65887cddbd0681bab6e25a2976a5777f9e05d30fbfc61264d4bd569b83fd not found: ID does not exist"
Apr 24 21:47:04.188925 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.188894 2578 scope.go:117] "RemoveContainer" containerID="3384a3d260de18f8d106a5edd3222ed5a5e84cc0c6c923521cfc22612f143d1f"
Apr 24 21:47:04.191905 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.191890 2578 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-6445b-kube-rbac-proxy-sar-config\" (UniqueName:
\"kubernetes.io/configmap/9c0e3ab9-32b9-4155-ad77-6da651f96cf3-success-200-isvc-6445b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:47:04.191971 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.191909 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zck7g\" (UniqueName: \"kubernetes.io/projected/9c0e3ab9-32b9-4155-ad77-6da651f96cf3-kube-api-access-zck7g\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:47:04.191971 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.191919 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c0e3ab9-32b9-4155-ad77-6da651f96cf3-proxy-tls\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:47:04.195797 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.195771 2578 scope.go:117] "RemoveContainer" containerID="c73fab23f1586e4deafa8f52788683db84bc0c8503ca0f94bb3c06a4f756000d" Apr 24 21:47:04.197423 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.197403 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw"] Apr 24 21:47:04.203144 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.203130 2578 scope.go:117] "RemoveContainer" containerID="3384a3d260de18f8d106a5edd3222ed5a5e84cc0c6c923521cfc22612f143d1f" Apr 24 21:47:04.203399 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:47:04.203383 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3384a3d260de18f8d106a5edd3222ed5a5e84cc0c6c923521cfc22612f143d1f\": container with ID starting with 3384a3d260de18f8d106a5edd3222ed5a5e84cc0c6c923521cfc22612f143d1f not found: ID does not exist" containerID="3384a3d260de18f8d106a5edd3222ed5a5e84cc0c6c923521cfc22612f143d1f" Apr 24 21:47:04.203447 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.203404 2578 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3384a3d260de18f8d106a5edd3222ed5a5e84cc0c6c923521cfc22612f143d1f"} err="failed to get container status \"3384a3d260de18f8d106a5edd3222ed5a5e84cc0c6c923521cfc22612f143d1f\": rpc error: code = NotFound desc = could not find container \"3384a3d260de18f8d106a5edd3222ed5a5e84cc0c6c923521cfc22612f143d1f\": container with ID starting with 3384a3d260de18f8d106a5edd3222ed5a5e84cc0c6c923521cfc22612f143d1f not found: ID does not exist" Apr 24 21:47:04.203447 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.203418 2578 scope.go:117] "RemoveContainer" containerID="c73fab23f1586e4deafa8f52788683db84bc0c8503ca0f94bb3c06a4f756000d" Apr 24 21:47:04.203586 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.203568 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6445b-predictor-566b89cd56-67vrw"] Apr 24 21:47:04.203699 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:47:04.203684 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c73fab23f1586e4deafa8f52788683db84bc0c8503ca0f94bb3c06a4f756000d\": container with ID starting with c73fab23f1586e4deafa8f52788683db84bc0c8503ca0f94bb3c06a4f756000d not found: ID does not exist" containerID="c73fab23f1586e4deafa8f52788683db84bc0c8503ca0f94bb3c06a4f756000d" Apr 24 21:47:04.203741 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.203705 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c73fab23f1586e4deafa8f52788683db84bc0c8503ca0f94bb3c06a4f756000d"} err="failed to get container status \"c73fab23f1586e4deafa8f52788683db84bc0c8503ca0f94bb3c06a4f756000d\": rpc error: code = NotFound desc = could not find container \"c73fab23f1586e4deafa8f52788683db84bc0c8503ca0f94bb3c06a4f756000d\": container with ID starting with c73fab23f1586e4deafa8f52788683db84bc0c8503ca0f94bb3c06a4f756000d 
not found: ID does not exist" Apr 24 21:47:04.220874 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.219999 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf"] Apr 24 21:47:04.222047 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.222029 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6445b-predictor-769c496d67-hc2tf"] Apr 24 21:47:04.423387 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.423313 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c" path="/var/lib/kubelet/pods/0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c/volumes" Apr 24 21:47:04.423823 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:04.423808 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c0e3ab9-32b9-4155-ad77-6da651f96cf3" path="/var/lib/kubelet/pods/9c0e3ab9-32b9-4155-ad77-6da651f96cf3/volumes" Apr 24 21:47:08.172373 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:08.172342 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9" Apr 24 21:47:08.172918 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:08.172614 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh" Apr 24 21:47:08.173011 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:08.172981 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9" podUID="7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 21:47:08.173255 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:08.173233 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh" podUID="303b11b3-a44c-4cd3-896a-000a60c34a09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 21:47:09.034729 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:09.034694 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t" Apr 24 21:47:09.034920 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:09.034904 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5" Apr 24 21:47:18.173091 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:18.173051 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9" podUID="7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 21:47:18.173441 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:18.173198 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh" podUID="303b11b3-a44c-4cd3-896a-000a60c34a09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 21:47:28.173324 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:28.173289 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh" podUID="303b11b3-a44c-4cd3-896a-000a60c34a09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 21:47:28.173691 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:28.173294 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9" podUID="7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 21:47:38.173010 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:38.172973 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9" podUID="7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 21:47:38.173392 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:38.173196 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh" podUID="303b11b3-a44c-4cd3-896a-000a60c34a09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 24 21:47:48.173480 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:48.173454 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9" Apr 24 21:47:48.174283 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:47:48.174267 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh" Apr 24 21:55:35.538310 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.538279 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t"] Apr 24 21:55:35.540922 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.538528 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t" podUID="f409ce2b-d8c4-4c32-a764-32955142f14e" containerName="kserve-container" 
containerID="cri-o://4eeab9aac3ce69206fe21e872b43b8900bfd6f2ef49e60bf7de5260ba5f9cf4a" gracePeriod=30 Apr 24 21:55:35.540922 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.538613 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t" podUID="f409ce2b-d8c4-4c32-a764-32955142f14e" containerName="kube-rbac-proxy" containerID="cri-o://ed34458ebfb6f0250796f542d304b6158fe8fc047e41b45c7f3864b92b349537" gracePeriod=30 Apr 24 21:55:35.619487 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.619453 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc"] Apr 24 21:55:35.620025 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.619995 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c" containerName="kserve-container" Apr 24 21:55:35.620025 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.620019 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c" containerName="kserve-container" Apr 24 21:55:35.620191 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.620032 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c" containerName="kube-rbac-proxy" Apr 24 21:55:35.620191 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.620038 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c" containerName="kube-rbac-proxy" Apr 24 21:55:35.620191 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.620054 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c0e3ab9-32b9-4155-ad77-6da651f96cf3" containerName="kserve-container" Apr 24 21:55:35.620191 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.620059 2578 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9c0e3ab9-32b9-4155-ad77-6da651f96cf3" containerName="kserve-container" Apr 24 21:55:35.620191 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.620065 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c0e3ab9-32b9-4155-ad77-6da651f96cf3" containerName="kube-rbac-proxy" Apr 24 21:55:35.620191 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.620071 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c0e3ab9-32b9-4155-ad77-6da651f96cf3" containerName="kube-rbac-proxy" Apr 24 21:55:35.620191 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.620134 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c0e3ab9-32b9-4155-ad77-6da651f96cf3" containerName="kserve-container" Apr 24 21:55:35.620191 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.620146 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c" containerName="kserve-container" Apr 24 21:55:35.620191 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.620153 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c0e3ab9-32b9-4155-ad77-6da651f96cf3" containerName="kube-rbac-proxy" Apr 24 21:55:35.620191 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.620162 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b3e97f0-f7b9-44f1-b0b1-5bbaf899a83c" containerName="kube-rbac-proxy" Apr 24 21:55:35.623395 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.623376 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" Apr 24 21:55:35.625940 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.625921 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-6842b-kube-rbac-proxy-sar-config\"" Apr 24 21:55:35.626088 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.626065 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-6842b-predictor-serving-cert\"" Apr 24 21:55:35.640752 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.640731 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5"] Apr 24 21:55:35.641064 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.641033 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5" podUID="989e357d-91ff-427f-a266-dcbde106f0fa" containerName="kube-rbac-proxy" containerID="cri-o://5c269eedc030daf5b98390b669232f683a51ab6916b443ddb1d53e1835044511" gracePeriod=30 Apr 24 21:55:35.641147 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.641058 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5" podUID="989e357d-91ff-427f-a266-dcbde106f0fa" containerName="kserve-container" containerID="cri-o://6c46b927068c8ad73b3eedf70de51f6ab5babb6a2f4c7c4607ba0b8e5247ef3e" gracePeriod=30 Apr 24 21:55:35.644407 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.644374 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc"] Apr 24 21:55:35.665798 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.665774 2578 generic.go:358] "Generic (PLEG): container finished" 
podID="f409ce2b-d8c4-4c32-a764-32955142f14e" containerID="ed34458ebfb6f0250796f542d304b6158fe8fc047e41b45c7f3864b92b349537" exitCode=2 Apr 24 21:55:35.665936 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.665842 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t" event={"ID":"f409ce2b-d8c4-4c32-a764-32955142f14e","Type":"ContainerDied","Data":"ed34458ebfb6f0250796f542d304b6158fe8fc047e41b45c7f3864b92b349537"} Apr 24 21:55:35.702908 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.702876 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65"] Apr 24 21:55:35.706444 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.706425 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" Apr 24 21:55:35.709315 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.709297 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-6842b-predictor-serving-cert\"" Apr 24 21:55:35.710631 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.710610 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-6842b-kube-rbac-proxy-sar-config\"" Apr 24 21:55:35.718673 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.718656 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65"] Apr 24 21:55:35.721234 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.721211 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-6842b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/61ed92de-cc0f-4baf-9b8d-8bb6e4019d55-error-404-isvc-6842b-kube-rbac-proxy-sar-config\") pod 
\"error-404-isvc-6842b-predictor-565dd576f8-rwz65\" (UID: \"61ed92de-cc0f-4baf-9b8d-8bb6e4019d55\") " pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" Apr 24 21:55:35.721317 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.721249 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgbs4\" (UniqueName: \"kubernetes.io/projected/9302faf7-7f03-465f-8e33-8ea9cdf22fdd-kube-api-access-bgbs4\") pod \"success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc\" (UID: \"9302faf7-7f03-465f-8e33-8ea9cdf22fdd\") " pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" Apr 24 21:55:35.721317 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.721276 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9302faf7-7f03-465f-8e33-8ea9cdf22fdd-proxy-tls\") pod \"success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc\" (UID: \"9302faf7-7f03-465f-8e33-8ea9cdf22fdd\") " pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" Apr 24 21:55:35.721415 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.721387 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkfnd\" (UniqueName: \"kubernetes.io/projected/61ed92de-cc0f-4baf-9b8d-8bb6e4019d55-kube-api-access-dkfnd\") pod \"error-404-isvc-6842b-predictor-565dd576f8-rwz65\" (UID: \"61ed92de-cc0f-4baf-9b8d-8bb6e4019d55\") " pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" Apr 24 21:55:35.721465 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.721439 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61ed92de-cc0f-4baf-9b8d-8bb6e4019d55-proxy-tls\") pod \"error-404-isvc-6842b-predictor-565dd576f8-rwz65\" (UID: 
\"61ed92de-cc0f-4baf-9b8d-8bb6e4019d55\") " pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" Apr 24 21:55:35.721502 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.721488 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-6842b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9302faf7-7f03-465f-8e33-8ea9cdf22fdd-success-200-isvc-6842b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc\" (UID: \"9302faf7-7f03-465f-8e33-8ea9cdf22fdd\") " pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" Apr 24 21:55:35.822534 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.822462 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-6842b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/61ed92de-cc0f-4baf-9b8d-8bb6e4019d55-error-404-isvc-6842b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-6842b-predictor-565dd576f8-rwz65\" (UID: \"61ed92de-cc0f-4baf-9b8d-8bb6e4019d55\") " pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" Apr 24 21:55:35.822534 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.822497 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgbs4\" (UniqueName: \"kubernetes.io/projected/9302faf7-7f03-465f-8e33-8ea9cdf22fdd-kube-api-access-bgbs4\") pod \"success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc\" (UID: \"9302faf7-7f03-465f-8e33-8ea9cdf22fdd\") " pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" Apr 24 21:55:35.822534 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.822517 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9302faf7-7f03-465f-8e33-8ea9cdf22fdd-proxy-tls\") pod \"success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc\" (UID: 
\"9302faf7-7f03-465f-8e33-8ea9cdf22fdd\") " pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" Apr 24 21:55:35.822823 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.822586 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkfnd\" (UniqueName: \"kubernetes.io/projected/61ed92de-cc0f-4baf-9b8d-8bb6e4019d55-kube-api-access-dkfnd\") pod \"error-404-isvc-6842b-predictor-565dd576f8-rwz65\" (UID: \"61ed92de-cc0f-4baf-9b8d-8bb6e4019d55\") " pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" Apr 24 21:55:35.822823 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.822628 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61ed92de-cc0f-4baf-9b8d-8bb6e4019d55-proxy-tls\") pod \"error-404-isvc-6842b-predictor-565dd576f8-rwz65\" (UID: \"61ed92de-cc0f-4baf-9b8d-8bb6e4019d55\") " pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" Apr 24 21:55:35.822823 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.822683 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-6842b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9302faf7-7f03-465f-8e33-8ea9cdf22fdd-success-200-isvc-6842b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc\" (UID: \"9302faf7-7f03-465f-8e33-8ea9cdf22fdd\") " pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" Apr 24 21:55:35.823198 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.823177 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-6842b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/61ed92de-cc0f-4baf-9b8d-8bb6e4019d55-error-404-isvc-6842b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-6842b-predictor-565dd576f8-rwz65\" (UID: \"61ed92de-cc0f-4baf-9b8d-8bb6e4019d55\") " 
pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" Apr 24 21:55:35.823421 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.823258 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-6842b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9302faf7-7f03-465f-8e33-8ea9cdf22fdd-success-200-isvc-6842b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc\" (UID: \"9302faf7-7f03-465f-8e33-8ea9cdf22fdd\") " pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" Apr 24 21:55:35.825035 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.825015 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61ed92de-cc0f-4baf-9b8d-8bb6e4019d55-proxy-tls\") pod \"error-404-isvc-6842b-predictor-565dd576f8-rwz65\" (UID: \"61ed92de-cc0f-4baf-9b8d-8bb6e4019d55\") " pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" Apr 24 21:55:35.825119 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.825051 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9302faf7-7f03-465f-8e33-8ea9cdf22fdd-proxy-tls\") pod \"success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc\" (UID: \"9302faf7-7f03-465f-8e33-8ea9cdf22fdd\") " pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" Apr 24 21:55:35.832907 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.832887 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkfnd\" (UniqueName: \"kubernetes.io/projected/61ed92de-cc0f-4baf-9b8d-8bb6e4019d55-kube-api-access-dkfnd\") pod \"error-404-isvc-6842b-predictor-565dd576f8-rwz65\" (UID: \"61ed92de-cc0f-4baf-9b8d-8bb6e4019d55\") " pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" Apr 24 21:55:35.833223 ip-10-0-132-124 kubenswrapper[2578]: I0424 
21:55:35.833174 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgbs4\" (UniqueName: \"kubernetes.io/projected/9302faf7-7f03-465f-8e33-8ea9cdf22fdd-kube-api-access-bgbs4\") pod \"success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc\" (UID: \"9302faf7-7f03-465f-8e33-8ea9cdf22fdd\") " pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" Apr 24 21:55:35.932671 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:35.932635 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" Apr 24 21:55:36.017084 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:36.017044 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" Apr 24 21:55:36.057199 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:36.057172 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc"] Apr 24 21:55:36.062790 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:55:36.062757 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9302faf7_7f03_465f_8e33_8ea9cdf22fdd.slice/crio-c32a7d21821afac3daa7a88986df8b6b389f74a1b6bc0881d4519f323bd1dc87 WatchSource:0}: Error finding container c32a7d21821afac3daa7a88986df8b6b389f74a1b6bc0881d4519f323bd1dc87: Status 404 returned error can't find the container with id c32a7d21821afac3daa7a88986df8b6b389f74a1b6bc0881d4519f323bd1dc87 Apr 24 21:55:36.065728 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:36.065708 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:55:36.142916 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:36.142895 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65"] Apr 24 21:55:36.144724 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:55:36.144699 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61ed92de_cc0f_4baf_9b8d_8bb6e4019d55.slice/crio-b5d6e2c0e881a0b67ceac8cadcad1317b3c10d2e80ffdec25cf7eea32f3a8739 WatchSource:0}: Error finding container b5d6e2c0e881a0b67ceac8cadcad1317b3c10d2e80ffdec25cf7eea32f3a8739: Status 404 returned error can't find the container with id b5d6e2c0e881a0b67ceac8cadcad1317b3c10d2e80ffdec25cf7eea32f3a8739 Apr 24 21:55:36.670720 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:36.670684 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" event={"ID":"9302faf7-7f03-465f-8e33-8ea9cdf22fdd","Type":"ContainerStarted","Data":"cc03c0c86c5628c072509eecab56623f992e060ff3268dabe2cd9c800bc82eeb"} Apr 24 21:55:36.670720 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:36.670720 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" event={"ID":"9302faf7-7f03-465f-8e33-8ea9cdf22fdd","Type":"ContainerStarted","Data":"a8450e82ec91990d4486549521f5c59a4b9dce46258412261072c6811100d628"} Apr 24 21:55:36.671181 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:36.670732 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" event={"ID":"9302faf7-7f03-465f-8e33-8ea9cdf22fdd","Type":"ContainerStarted","Data":"c32a7d21821afac3daa7a88986df8b6b389f74a1b6bc0881d4519f323bd1dc87"} Apr 24 21:55:36.671181 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:36.670875 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" Apr 24 21:55:36.671181 ip-10-0-132-124 
kubenswrapper[2578]: I0424 21:55:36.670900 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" Apr 24 21:55:36.671920 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:36.671899 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" podUID="9302faf7-7f03-465f-8e33-8ea9cdf22fdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 21:55:36.672317 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:36.672296 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" event={"ID":"61ed92de-cc0f-4baf-9b8d-8bb6e4019d55","Type":"ContainerStarted","Data":"182b8170e65a387c0435978b15a23ff190aeda7a78dc03c6923b469fb4ef7146"} Apr 24 21:55:36.672405 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:36.672322 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" event={"ID":"61ed92de-cc0f-4baf-9b8d-8bb6e4019d55","Type":"ContainerStarted","Data":"3da2cc6cb8c12f6b3d02d0ea2958d39f19390f275a129aff3c3f04b369ad8b92"} Apr 24 21:55:36.672405 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:36.672331 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" event={"ID":"61ed92de-cc0f-4baf-9b8d-8bb6e4019d55","Type":"ContainerStarted","Data":"b5d6e2c0e881a0b67ceac8cadcad1317b3c10d2e80ffdec25cf7eea32f3a8739"} Apr 24 21:55:36.672492 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:36.672407 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" Apr 24 21:55:36.673858 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:36.673820 2578 generic.go:358] 
"Generic (PLEG): container finished" podID="989e357d-91ff-427f-a266-dcbde106f0fa" containerID="5c269eedc030daf5b98390b669232f683a51ab6916b443ddb1d53e1835044511" exitCode=2 Apr 24 21:55:36.673985 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:36.673867 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5" event={"ID":"989e357d-91ff-427f-a266-dcbde106f0fa","Type":"ContainerDied","Data":"5c269eedc030daf5b98390b669232f683a51ab6916b443ddb1d53e1835044511"} Apr 24 21:55:36.690660 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:36.690621 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" podStartSLOduration=1.6906091989999998 podStartE2EDuration="1.690609199s" podCreationTimestamp="2026-04-24 21:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:55:36.688301181 +0000 UTC m=+1682.864035864" watchObservedRunningTime="2026-04-24 21:55:36.690609199 +0000 UTC m=+1682.866343872" Apr 24 21:55:36.714050 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:36.713994 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" podStartSLOduration=1.713979403 podStartE2EDuration="1.713979403s" podCreationTimestamp="2026-04-24 21:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:55:36.712648354 +0000 UTC m=+1682.888383027" watchObservedRunningTime="2026-04-24 21:55:36.713979403 +0000 UTC m=+1682.889714081" Apr 24 21:55:37.677853 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:37.677820 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" Apr 24 21:55:37.677853 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:37.677846 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" podUID="9302faf7-7f03-465f-8e33-8ea9cdf22fdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 21:55:37.679051 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:37.679026 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" podUID="61ed92de-cc0f-4baf-9b8d-8bb6e4019d55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 24 21:55:38.589394 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.589369 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t" Apr 24 21:55:38.645674 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.645638 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-a9fd5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f409ce2b-d8c4-4c32-a764-32955142f14e-success-200-isvc-a9fd5-kube-rbac-proxy-sar-config\") pod \"f409ce2b-d8c4-4c32-a764-32955142f14e\" (UID: \"f409ce2b-d8c4-4c32-a764-32955142f14e\") " Apr 24 21:55:38.645803 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.645732 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f409ce2b-d8c4-4c32-a764-32955142f14e-proxy-tls\") pod \"f409ce2b-d8c4-4c32-a764-32955142f14e\" (UID: \"f409ce2b-d8c4-4c32-a764-32955142f14e\") " Apr 24 21:55:38.645803 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.645773 2578 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-244xd\" (UniqueName: \"kubernetes.io/projected/f409ce2b-d8c4-4c32-a764-32955142f14e-kube-api-access-244xd\") pod \"f409ce2b-d8c4-4c32-a764-32955142f14e\" (UID: \"f409ce2b-d8c4-4c32-a764-32955142f14e\") " Apr 24 21:55:38.646062 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.646035 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f409ce2b-d8c4-4c32-a764-32955142f14e-success-200-isvc-a9fd5-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-a9fd5-kube-rbac-proxy-sar-config") pod "f409ce2b-d8c4-4c32-a764-32955142f14e" (UID: "f409ce2b-d8c4-4c32-a764-32955142f14e"). InnerVolumeSpecName "success-200-isvc-a9fd5-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:55:38.648234 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.648204 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f409ce2b-d8c4-4c32-a764-32955142f14e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f409ce2b-d8c4-4c32-a764-32955142f14e" (UID: "f409ce2b-d8c4-4c32-a764-32955142f14e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:55:38.648333 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.648247 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f409ce2b-d8c4-4c32-a764-32955142f14e-kube-api-access-244xd" (OuterVolumeSpecName: "kube-api-access-244xd") pod "f409ce2b-d8c4-4c32-a764-32955142f14e" (UID: "f409ce2b-d8c4-4c32-a764-32955142f14e"). InnerVolumeSpecName "kube-api-access-244xd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:55:38.672447 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.672430 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5" Apr 24 21:55:38.682715 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.682694 2578 generic.go:358] "Generic (PLEG): container finished" podID="989e357d-91ff-427f-a266-dcbde106f0fa" containerID="6c46b927068c8ad73b3eedf70de51f6ab5babb6a2f4c7c4607ba0b8e5247ef3e" exitCode=0 Apr 24 21:55:38.683042 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.682758 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5" event={"ID":"989e357d-91ff-427f-a266-dcbde106f0fa","Type":"ContainerDied","Data":"6c46b927068c8ad73b3eedf70de51f6ab5babb6a2f4c7c4607ba0b8e5247ef3e"} Apr 24 21:55:38.683042 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.682779 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5" event={"ID":"989e357d-91ff-427f-a266-dcbde106f0fa","Type":"ContainerDied","Data":"b1c748120a15b8c30f267cd979edf3261e3fed69ab513153d819b607635703e7"} Apr 24 21:55:38.683042 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.682759 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5" Apr 24 21:55:38.683042 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.682795 2578 scope.go:117] "RemoveContainer" containerID="5c269eedc030daf5b98390b669232f683a51ab6916b443ddb1d53e1835044511" Apr 24 21:55:38.684051 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.684028 2578 generic.go:358] "Generic (PLEG): container finished" podID="f409ce2b-d8c4-4c32-a764-32955142f14e" containerID="4eeab9aac3ce69206fe21e872b43b8900bfd6f2ef49e60bf7de5260ba5f9cf4a" exitCode=0 Apr 24 21:55:38.684164 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.684110 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t" Apr 24 21:55:38.684239 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.684215 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t" event={"ID":"f409ce2b-d8c4-4c32-a764-32955142f14e","Type":"ContainerDied","Data":"4eeab9aac3ce69206fe21e872b43b8900bfd6f2ef49e60bf7de5260ba5f9cf4a"} Apr 24 21:55:38.684285 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.684252 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t" event={"ID":"f409ce2b-d8c4-4c32-a764-32955142f14e","Type":"ContainerDied","Data":"060c3c8773ac1c7448b3c2b7ed92b7704202b7bb8bbc976786f94ec328595a5d"} Apr 24 21:55:38.684470 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.684449 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" podUID="61ed92de-cc0f-4baf-9b8d-8bb6e4019d55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 24 21:55:38.690651 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.690632 2578 scope.go:117] "RemoveContainer" containerID="6c46b927068c8ad73b3eedf70de51f6ab5babb6a2f4c7c4607ba0b8e5247ef3e" Apr 24 21:55:38.699061 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.699039 2578 scope.go:117] "RemoveContainer" containerID="5c269eedc030daf5b98390b669232f683a51ab6916b443ddb1d53e1835044511" Apr 24 21:55:38.699326 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:55:38.699309 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c269eedc030daf5b98390b669232f683a51ab6916b443ddb1d53e1835044511\": container with ID starting with 5c269eedc030daf5b98390b669232f683a51ab6916b443ddb1d53e1835044511 not found: ID does not exist" 
containerID="5c269eedc030daf5b98390b669232f683a51ab6916b443ddb1d53e1835044511" Apr 24 21:55:38.699377 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.699335 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c269eedc030daf5b98390b669232f683a51ab6916b443ddb1d53e1835044511"} err="failed to get container status \"5c269eedc030daf5b98390b669232f683a51ab6916b443ddb1d53e1835044511\": rpc error: code = NotFound desc = could not find container \"5c269eedc030daf5b98390b669232f683a51ab6916b443ddb1d53e1835044511\": container with ID starting with 5c269eedc030daf5b98390b669232f683a51ab6916b443ddb1d53e1835044511 not found: ID does not exist" Apr 24 21:55:38.699377 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.699351 2578 scope.go:117] "RemoveContainer" containerID="6c46b927068c8ad73b3eedf70de51f6ab5babb6a2f4c7c4607ba0b8e5247ef3e" Apr 24 21:55:38.699595 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:55:38.699576 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c46b927068c8ad73b3eedf70de51f6ab5babb6a2f4c7c4607ba0b8e5247ef3e\": container with ID starting with 6c46b927068c8ad73b3eedf70de51f6ab5babb6a2f4c7c4607ba0b8e5247ef3e not found: ID does not exist" containerID="6c46b927068c8ad73b3eedf70de51f6ab5babb6a2f4c7c4607ba0b8e5247ef3e" Apr 24 21:55:38.699640 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.699602 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c46b927068c8ad73b3eedf70de51f6ab5babb6a2f4c7c4607ba0b8e5247ef3e"} err="failed to get container status \"6c46b927068c8ad73b3eedf70de51f6ab5babb6a2f4c7c4607ba0b8e5247ef3e\": rpc error: code = NotFound desc = could not find container \"6c46b927068c8ad73b3eedf70de51f6ab5babb6a2f4c7c4607ba0b8e5247ef3e\": container with ID starting with 6c46b927068c8ad73b3eedf70de51f6ab5babb6a2f4c7c4607ba0b8e5247ef3e not found: ID does not exist" Apr 24 
21:55:38.699640 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.699617 2578 scope.go:117] "RemoveContainer" containerID="ed34458ebfb6f0250796f542d304b6158fe8fc047e41b45c7f3864b92b349537" Apr 24 21:55:38.706288 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.706273 2578 scope.go:117] "RemoveContainer" containerID="4eeab9aac3ce69206fe21e872b43b8900bfd6f2ef49e60bf7de5260ba5f9cf4a" Apr 24 21:55:38.722203 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.719973 2578 scope.go:117] "RemoveContainer" containerID="ed34458ebfb6f0250796f542d304b6158fe8fc047e41b45c7f3864b92b349537" Apr 24 21:55:38.722203 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.721834 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t"] Apr 24 21:55:38.722203 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:55:38.721896 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed34458ebfb6f0250796f542d304b6158fe8fc047e41b45c7f3864b92b349537\": container with ID starting with ed34458ebfb6f0250796f542d304b6158fe8fc047e41b45c7f3864b92b349537 not found: ID does not exist" containerID="ed34458ebfb6f0250796f542d304b6158fe8fc047e41b45c7f3864b92b349537" Apr 24 21:55:38.722203 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.721964 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed34458ebfb6f0250796f542d304b6158fe8fc047e41b45c7f3864b92b349537"} err="failed to get container status \"ed34458ebfb6f0250796f542d304b6158fe8fc047e41b45c7f3864b92b349537\": rpc error: code = NotFound desc = could not find container \"ed34458ebfb6f0250796f542d304b6158fe8fc047e41b45c7f3864b92b349537\": container with ID starting with ed34458ebfb6f0250796f542d304b6158fe8fc047e41b45c7f3864b92b349537 not found: ID does not exist" Apr 24 21:55:38.722203 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.721995 2578 scope.go:117] 
"RemoveContainer" containerID="4eeab9aac3ce69206fe21e872b43b8900bfd6f2ef49e60bf7de5260ba5f9cf4a" Apr 24 21:55:38.722627 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:55:38.722602 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eeab9aac3ce69206fe21e872b43b8900bfd6f2ef49e60bf7de5260ba5f9cf4a\": container with ID starting with 4eeab9aac3ce69206fe21e872b43b8900bfd6f2ef49e60bf7de5260ba5f9cf4a not found: ID does not exist" containerID="4eeab9aac3ce69206fe21e872b43b8900bfd6f2ef49e60bf7de5260ba5f9cf4a" Apr 24 21:55:38.722726 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.722632 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eeab9aac3ce69206fe21e872b43b8900bfd6f2ef49e60bf7de5260ba5f9cf4a"} err="failed to get container status \"4eeab9aac3ce69206fe21e872b43b8900bfd6f2ef49e60bf7de5260ba5f9cf4a\": rpc error: code = NotFound desc = could not find container \"4eeab9aac3ce69206fe21e872b43b8900bfd6f2ef49e60bf7de5260ba5f9cf4a\": container with ID starting with 4eeab9aac3ce69206fe21e872b43b8900bfd6f2ef49e60bf7de5260ba5f9cf4a not found: ID does not exist" Apr 24 21:55:38.723476 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.723454 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a9fd5-predictor-864584f7cf-fgd5t"] Apr 24 21:55:38.747124 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.747105 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-a9fd5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/989e357d-91ff-427f-a266-dcbde106f0fa-error-404-isvc-a9fd5-kube-rbac-proxy-sar-config\") pod \"989e357d-91ff-427f-a266-dcbde106f0fa\" (UID: \"989e357d-91ff-427f-a266-dcbde106f0fa\") " Apr 24 21:55:38.747196 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.747135 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/989e357d-91ff-427f-a266-dcbde106f0fa-proxy-tls\") pod \"989e357d-91ff-427f-a266-dcbde106f0fa\" (UID: \"989e357d-91ff-427f-a266-dcbde106f0fa\") " Apr 24 21:55:38.747196 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.747163 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w24ln\" (UniqueName: \"kubernetes.io/projected/989e357d-91ff-427f-a266-dcbde106f0fa-kube-api-access-w24ln\") pod \"989e357d-91ff-427f-a266-dcbde106f0fa\" (UID: \"989e357d-91ff-427f-a266-dcbde106f0fa\") " Apr 24 21:55:38.747438 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.747418 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/989e357d-91ff-427f-a266-dcbde106f0fa-error-404-isvc-a9fd5-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-a9fd5-kube-rbac-proxy-sar-config") pod "989e357d-91ff-427f-a266-dcbde106f0fa" (UID: "989e357d-91ff-427f-a266-dcbde106f0fa"). InnerVolumeSpecName "error-404-isvc-a9fd5-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:55:38.747535 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.747512 2578 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-a9fd5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f409ce2b-d8c4-4c32-a764-32955142f14e-success-200-isvc-a9fd5-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:55:38.747535 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.747533 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-a9fd5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/989e357d-91ff-427f-a266-dcbde106f0fa-error-404-isvc-a9fd5-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:55:38.747718 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.747568 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f409ce2b-d8c4-4c32-a764-32955142f14e-proxy-tls\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:55:38.747718 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.747578 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-244xd\" (UniqueName: \"kubernetes.io/projected/f409ce2b-d8c4-4c32-a764-32955142f14e-kube-api-access-244xd\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:55:38.749327 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.749309 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/989e357d-91ff-427f-a266-dcbde106f0fa-kube-api-access-w24ln" (OuterVolumeSpecName: "kube-api-access-w24ln") pod "989e357d-91ff-427f-a266-dcbde106f0fa" (UID: "989e357d-91ff-427f-a266-dcbde106f0fa"). InnerVolumeSpecName "kube-api-access-w24ln". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:55:38.749384 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.749330 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/989e357d-91ff-427f-a266-dcbde106f0fa-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "989e357d-91ff-427f-a266-dcbde106f0fa" (UID: "989e357d-91ff-427f-a266-dcbde106f0fa"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:55:38.848074 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.848052 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/989e357d-91ff-427f-a266-dcbde106f0fa-proxy-tls\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:55:38.848074 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:38.848072 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w24ln\" (UniqueName: \"kubernetes.io/projected/989e357d-91ff-427f-a266-dcbde106f0fa-kube-api-access-w24ln\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 21:55:39.003970 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:39.003947 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5"] Apr 24 21:55:39.011519 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:39.011498 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a9fd5-predictor-869bc478f6-jwtd5"] Apr 24 21:55:40.421267 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:40.421238 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="989e357d-91ff-427f-a266-dcbde106f0fa" path="/var/lib/kubelet/pods/989e357d-91ff-427f-a266-dcbde106f0fa/volumes" Apr 24 21:55:40.421658 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:40.421642 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f409ce2b-d8c4-4c32-a764-32955142f14e" path="/var/lib/kubelet/pods/f409ce2b-d8c4-4c32-a764-32955142f14e/volumes" Apr 24 21:55:42.682532 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:42.682507 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" Apr 24 21:55:42.682970 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:42.682931 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" podUID="9302faf7-7f03-465f-8e33-8ea9cdf22fdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 21:55:43.688407 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:43.688382 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" Apr 24 21:55:43.688957 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:43.688930 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" podUID="61ed92de-cc0f-4baf-9b8d-8bb6e4019d55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 24 21:55:52.683418 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:52.683339 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" podUID="9302faf7-7f03-465f-8e33-8ea9cdf22fdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 21:55:53.689133 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:55:53.689097 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" podUID="61ed92de-cc0f-4baf-9b8d-8bb6e4019d55" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 24 21:56:02.682929 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:02.682887 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" podUID="9302faf7-7f03-465f-8e33-8ea9cdf22fdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 21:56:03.688894 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:03.688857 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" podUID="61ed92de-cc0f-4baf-9b8d-8bb6e4019d55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 24 21:56:12.683802 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:12.683763 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" podUID="9302faf7-7f03-465f-8e33-8ea9cdf22fdd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 24 21:56:13.688891 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:13.688855 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" podUID="61ed92de-cc0f-4baf-9b8d-8bb6e4019d55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 24 21:56:15.215437 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.215389 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh"] Apr 24 21:56:15.215819 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.215788 2578 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh" podUID="303b11b3-a44c-4cd3-896a-000a60c34a09" containerName="kserve-container" containerID="cri-o://f2bdfdc9a55f57341d5987125674ae6dd139516489fa12a7df1f87aa9930c4e6" gracePeriod=30 Apr 24 21:56:15.215929 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.215867 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh" podUID="303b11b3-a44c-4cd3-896a-000a60c34a09" containerName="kube-rbac-proxy" containerID="cri-o://bd21e692fe3f5e9dda57d1b564826bc10d06ae99da96fb71aaa1defa4ed9dd47" gracePeriod=30 Apr 24 21:56:15.302138 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.302107 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj"] Apr 24 21:56:15.302524 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.302507 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="989e357d-91ff-427f-a266-dcbde106f0fa" containerName="kube-rbac-proxy" Apr 24 21:56:15.302587 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.302531 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="989e357d-91ff-427f-a266-dcbde106f0fa" containerName="kube-rbac-proxy" Apr 24 21:56:15.302623 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.302599 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f409ce2b-d8c4-4c32-a764-32955142f14e" containerName="kube-rbac-proxy" Apr 24 21:56:15.302623 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.302609 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f409ce2b-d8c4-4c32-a764-32955142f14e" containerName="kube-rbac-proxy" Apr 24 21:56:15.302623 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.302620 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f409ce2b-d8c4-4c32-a764-32955142f14e" 
containerName="kserve-container"
Apr 24 21:56:15.302715 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.302628 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f409ce2b-d8c4-4c32-a764-32955142f14e" containerName="kserve-container"
Apr 24 21:56:15.302715 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.302637 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="989e357d-91ff-427f-a266-dcbde106f0fa" containerName="kserve-container"
Apr 24 21:56:15.302715 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.302642 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="989e357d-91ff-427f-a266-dcbde106f0fa" containerName="kserve-container"
Apr 24 21:56:15.302715 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.302709 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f409ce2b-d8c4-4c32-a764-32955142f14e" containerName="kserve-container"
Apr 24 21:56:15.302827 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.302719 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="989e357d-91ff-427f-a266-dcbde106f0fa" containerName="kserve-container"
Apr 24 21:56:15.302827 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.302725 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="989e357d-91ff-427f-a266-dcbde106f0fa" containerName="kube-rbac-proxy"
Apr 24 21:56:15.302827 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.302732 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f409ce2b-d8c4-4c32-a764-32955142f14e" containerName="kube-rbac-proxy"
Apr 24 21:56:15.305944 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.305925 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj"
Apr 24 21:56:15.308321 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.308305 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-fc125-kube-rbac-proxy-sar-config\""
Apr 24 21:56:15.308721 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.308703 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-fc125-predictor-serving-cert\""
Apr 24 21:56:15.321026 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.321001 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj"]
Apr 24 21:56:15.327376 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.327354 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9"]
Apr 24 21:56:15.328270 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.327692 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9" podUID="7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67" containerName="kserve-container" containerID="cri-o://cbb3bf76ee8c3b68de1630a74ae63c604bf1545e3b1ac03a2eda9af223f609a6" gracePeriod=30
Apr 24 21:56:15.328270 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.327825 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9" podUID="7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67" containerName="kube-rbac-proxy" containerID="cri-o://5e60521ff8d9a49349d4a74cb1b9371f07c9c7f4d593ac1d29733b46d8c7ced3" gracePeriod=30
Apr 24 21:56:15.409190 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.409160 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk"]
Apr 24 21:56:15.412741 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.412719 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk"
Apr 24 21:56:15.415149 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.415129 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-fc125-predictor-serving-cert\""
Apr 24 21:56:15.415290 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.415151 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-fc125-kube-rbac-proxy-sar-config\""
Apr 24 21:56:15.424560 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.424524 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk"]
Apr 24 21:56:15.456823 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.456731 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-fc125-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42e97302-b125-4622-918c-666a6a28d9c9-success-200-isvc-fc125-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-fc125-predictor-67497cd975-mrbkj\" (UID: \"42e97302-b125-4622-918c-666a6a28d9c9\") " pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj"
Apr 24 21:56:15.456956 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.456844 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn87r\" (UniqueName: \"kubernetes.io/projected/42e97302-b125-4622-918c-666a6a28d9c9-kube-api-access-pn87r\") pod \"success-200-isvc-fc125-predictor-67497cd975-mrbkj\" (UID: \"42e97302-b125-4622-918c-666a6a28d9c9\") " pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj"
Apr 24 21:56:15.456956 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.456921 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42e97302-b125-4622-918c-666a6a28d9c9-proxy-tls\") pod \"success-200-isvc-fc125-predictor-67497cd975-mrbkj\" (UID: \"42e97302-b125-4622-918c-666a6a28d9c9\") " pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj"
Apr 24 21:56:15.557493 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.557463 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-fc125-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42e97302-b125-4622-918c-666a6a28d9c9-success-200-isvc-fc125-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-fc125-predictor-67497cd975-mrbkj\" (UID: \"42e97302-b125-4622-918c-666a6a28d9c9\") " pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj"
Apr 24 21:56:15.557707 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.557516 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-fc125-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d-error-404-isvc-fc125-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-fc125-predictor-7967db9f76-wrnsk\" (UID: \"f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d\") " pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk"
Apr 24 21:56:15.557707 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.557559 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pn87r\" (UniqueName: \"kubernetes.io/projected/42e97302-b125-4622-918c-666a6a28d9c9-kube-api-access-pn87r\") pod \"success-200-isvc-fc125-predictor-67497cd975-mrbkj\" (UID: \"42e97302-b125-4622-918c-666a6a28d9c9\") " pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj"
Apr 24 21:56:15.557842 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.557751 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d-proxy-tls\") pod \"error-404-isvc-fc125-predictor-7967db9f76-wrnsk\" (UID: \"f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d\") " pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk"
Apr 24 21:56:15.558020 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.557984 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42e97302-b125-4622-918c-666a6a28d9c9-proxy-tls\") pod \"success-200-isvc-fc125-predictor-67497cd975-mrbkj\" (UID: \"42e97302-b125-4622-918c-666a6a28d9c9\") " pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj"
Apr 24 21:56:15.558146 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.558035 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7szk\" (UniqueName: \"kubernetes.io/projected/f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d-kube-api-access-b7szk\") pod \"error-404-isvc-fc125-predictor-7967db9f76-wrnsk\" (UID: \"f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d\") " pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk"
Apr 24 21:56:15.558224 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.558202 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-fc125-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42e97302-b125-4622-918c-666a6a28d9c9-success-200-isvc-fc125-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-fc125-predictor-67497cd975-mrbkj\" (UID: \"42e97302-b125-4622-918c-666a6a28d9c9\") " pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj"
Apr 24 21:56:15.560329 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.560310 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42e97302-b125-4622-918c-666a6a28d9c9-proxy-tls\") pod \"success-200-isvc-fc125-predictor-67497cd975-mrbkj\" (UID: \"42e97302-b125-4622-918c-666a6a28d9c9\") " pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj"
Apr 24 21:56:15.566528 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.566510 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn87r\" (UniqueName: \"kubernetes.io/projected/42e97302-b125-4622-918c-666a6a28d9c9-kube-api-access-pn87r\") pod \"success-200-isvc-fc125-predictor-67497cd975-mrbkj\" (UID: \"42e97302-b125-4622-918c-666a6a28d9c9\") " pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj"
Apr 24 21:56:15.616465 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.616446 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj"
Apr 24 21:56:15.659682 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.659643 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d-proxy-tls\") pod \"error-404-isvc-fc125-predictor-7967db9f76-wrnsk\" (UID: \"f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d\") " pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk"
Apr 24 21:56:15.659825 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.659722 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7szk\" (UniqueName: \"kubernetes.io/projected/f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d-kube-api-access-b7szk\") pod \"error-404-isvc-fc125-predictor-7967db9f76-wrnsk\" (UID: \"f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d\") " pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk"
Apr 24 21:56:15.659825 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.659818 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-fc125-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d-error-404-isvc-fc125-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-fc125-predictor-7967db9f76-wrnsk\" (UID: \"f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d\") " pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk"
Apr 24 21:56:15.660533 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.660500 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-fc125-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d-error-404-isvc-fc125-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-fc125-predictor-7967db9f76-wrnsk\" (UID: \"f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d\") " pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk"
Apr 24 21:56:15.662076 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.662053 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d-proxy-tls\") pod \"error-404-isvc-fc125-predictor-7967db9f76-wrnsk\" (UID: \"f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d\") " pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk"
Apr 24 21:56:15.668965 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.668923 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7szk\" (UniqueName: \"kubernetes.io/projected/f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d-kube-api-access-b7szk\") pod \"error-404-isvc-fc125-predictor-7967db9f76-wrnsk\" (UID: \"f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d\") " pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk"
Apr 24 21:56:15.723125 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.723085 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk"
Apr 24 21:56:15.742934 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.742912 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj"]
Apr 24 21:56:15.745688 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:56:15.745663 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42e97302_b125_4622_918c_666a6a28d9c9.slice/crio-86b69108c80c140101c332af0abe45ea6b7d87c592a8714d112539088c80e1e4 WatchSource:0}: Error finding container 86b69108c80c140101c332af0abe45ea6b7d87c592a8714d112539088c80e1e4: Status 404 returned error can't find the container with id 86b69108c80c140101c332af0abe45ea6b7d87c592a8714d112539088c80e1e4
Apr 24 21:56:15.807652 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.807575 2578 generic.go:358] "Generic (PLEG): container finished" podID="7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67" containerID="5e60521ff8d9a49349d4a74cb1b9371f07c9c7f4d593ac1d29733b46d8c7ced3" exitCode=2
Apr 24 21:56:15.807764 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.807681 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9" event={"ID":"7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67","Type":"ContainerDied","Data":"5e60521ff8d9a49349d4a74cb1b9371f07c9c7f4d593ac1d29733b46d8c7ced3"}
Apr 24 21:56:15.809457 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.809428 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj" event={"ID":"42e97302-b125-4622-918c-666a6a28d9c9","Type":"ContainerStarted","Data":"86b69108c80c140101c332af0abe45ea6b7d87c592a8714d112539088c80e1e4"}
Apr 24 21:56:15.811273 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.811250 2578 generic.go:358] "Generic (PLEG): container finished" podID="303b11b3-a44c-4cd3-896a-000a60c34a09" containerID="bd21e692fe3f5e9dda57d1b564826bc10d06ae99da96fb71aaa1defa4ed9dd47" exitCode=2
Apr 24 21:56:15.811368 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.811274 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh" event={"ID":"303b11b3-a44c-4cd3-896a-000a60c34a09","Type":"ContainerDied","Data":"bd21e692fe3f5e9dda57d1b564826bc10d06ae99da96fb71aaa1defa4ed9dd47"}
Apr 24 21:56:15.852770 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:15.852745 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk"]
Apr 24 21:56:15.853952 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:56:15.853925 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf13ffeb5_f1e9_487a_946d_0c5a5d04cb3d.slice/crio-5d1cd1e298a2d52b2b0268baa9ce70cbcb3b1f607c9d2aa9a2b9267bf3c7c055 WatchSource:0}: Error finding container 5d1cd1e298a2d52b2b0268baa9ce70cbcb3b1f607c9d2aa9a2b9267bf3c7c055: Status 404 returned error can't find the container with id 5d1cd1e298a2d52b2b0268baa9ce70cbcb3b1f607c9d2aa9a2b9267bf3c7c055
Apr 24 21:56:16.817859 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:16.817813 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj" event={"ID":"42e97302-b125-4622-918c-666a6a28d9c9","Type":"ContainerStarted","Data":"58340add28dfe75c3c7875035a32d8363435ad22ecb235b29803fce0f708b2ba"}
Apr 24 21:56:16.818328 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:16.817868 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj" event={"ID":"42e97302-b125-4622-918c-666a6a28d9c9","Type":"ContainerStarted","Data":"19207ed01113a2372277da2dbf34164fa2771fffbe5d097a1a047b005978f239"}
Apr 24 21:56:16.820492 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:16.820433 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj" podUID="42e97302-b125-4622-918c-666a6a28d9c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 24 21:56:16.820716 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:16.820523 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj"
Apr 24 21:56:16.820797 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:16.820734 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj"
Apr 24 21:56:16.821771 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:16.821751 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk" event={"ID":"f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d","Type":"ContainerStarted","Data":"60dcc603dcadf01183a278279395cc124482beb1437a1b8d4df8e3d7ce2df221"}
Apr 24 21:56:16.821884 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:16.821776 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk" event={"ID":"f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d","Type":"ContainerStarted","Data":"0bcfa3f4fe95dac8bb726e47906b151e6a96fd2d6495c0cb1a2a18ab61a96bb3"}
Apr 24 21:56:16.821884 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:16.821785 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk" event={"ID":"f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d","Type":"ContainerStarted","Data":"5d1cd1e298a2d52b2b0268baa9ce70cbcb3b1f607c9d2aa9a2b9267bf3c7c055"}
Apr 24 21:56:16.821999 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:16.821915 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk"
Apr 24 21:56:16.838223 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:16.838184 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj" podStartSLOduration=1.8381733919999999 podStartE2EDuration="1.838173392s" podCreationTimestamp="2026-04-24 21:56:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:56:16.836321242 +0000 UTC m=+1723.012055916" watchObservedRunningTime="2026-04-24 21:56:16.838173392 +0000 UTC m=+1723.013908064"
Apr 24 21:56:16.854634 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:16.854593 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk" podStartSLOduration=1.854582026 podStartE2EDuration="1.854582026s" podCreationTimestamp="2026-04-24 21:56:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:56:16.852772654 +0000 UTC m=+1723.028507325" watchObservedRunningTime="2026-04-24 21:56:16.854582026 +0000 UTC m=+1723.030316698"
Apr 24 21:56:17.825555 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:17.825495 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj" podUID="42e97302-b125-4622-918c-666a6a28d9c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 24 21:56:17.825555 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:17.825521 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk"
Apr 24 21:56:17.826427 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:17.826406 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk" podUID="f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 24 21:56:18.168710 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:18.168628 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9" podUID="7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.36:8643/healthz\": dial tcp 10.134.0.36:8643: connect: connection refused"
Apr 24 21:56:18.168856 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:18.168624 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh" podUID="303b11b3-a44c-4cd3-896a-000a60c34a09" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.37:8643/healthz\": dial tcp 10.134.0.37:8643: connect: connection refused"
Apr 24 21:56:18.172935 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:18.172911 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9" podUID="7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 24 21:56:18.174140 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:18.174121 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh" podUID="303b11b3-a44c-4cd3-896a-000a60c34a09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 24 21:56:18.830322 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:18.830290 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk" podUID="f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 24 21:56:19.170028 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.170007 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9"
Apr 24 21:56:19.289501 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.289475 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67-proxy-tls\") pod \"7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67\" (UID: \"7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67\") "
Apr 24 21:56:19.289671 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.289616 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-d1134-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67-success-200-isvc-d1134-kube-rbac-proxy-sar-config\") pod \"7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67\" (UID: \"7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67\") "
Apr 24 21:56:19.289671 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.289640 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjvjp\" (UniqueName: \"kubernetes.io/projected/7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67-kube-api-access-tjvjp\") pod \"7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67\" (UID: \"7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67\") "
Apr 24 21:56:19.289945 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.289921 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67-success-200-isvc-d1134-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-d1134-kube-rbac-proxy-sar-config") pod "7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67" (UID: "7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67"). InnerVolumeSpecName "success-200-isvc-d1134-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:56:19.291624 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.291603 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67-kube-api-access-tjvjp" (OuterVolumeSpecName: "kube-api-access-tjvjp") pod "7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67" (UID: "7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67"). InnerVolumeSpecName "kube-api-access-tjvjp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:56:19.291719 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.291702 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67" (UID: "7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:56:19.391134 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.391108 2578 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-d1134-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67-success-200-isvc-d1134-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:56:19.391245 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.391138 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tjvjp\" (UniqueName: \"kubernetes.io/projected/7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67-kube-api-access-tjvjp\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:56:19.391245 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.391150 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67-proxy-tls\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:56:19.452793 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.452773 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh"
Apr 24 21:56:19.593489 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.593462 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlbkv\" (UniqueName: \"kubernetes.io/projected/303b11b3-a44c-4cd3-896a-000a60c34a09-kube-api-access-zlbkv\") pod \"303b11b3-a44c-4cd3-896a-000a60c34a09\" (UID: \"303b11b3-a44c-4cd3-896a-000a60c34a09\") "
Apr 24 21:56:19.593655 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.593510 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/303b11b3-a44c-4cd3-896a-000a60c34a09-proxy-tls\") pod \"303b11b3-a44c-4cd3-896a-000a60c34a09\" (UID: \"303b11b3-a44c-4cd3-896a-000a60c34a09\") "
Apr 24 21:56:19.593655 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.593637 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-d1134-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/303b11b3-a44c-4cd3-896a-000a60c34a09-error-404-isvc-d1134-kube-rbac-proxy-sar-config\") pod \"303b11b3-a44c-4cd3-896a-000a60c34a09\" (UID: \"303b11b3-a44c-4cd3-896a-000a60c34a09\") "
Apr 24 21:56:19.593961 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.593938 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/303b11b3-a44c-4cd3-896a-000a60c34a09-error-404-isvc-d1134-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-d1134-kube-rbac-proxy-sar-config") pod "303b11b3-a44c-4cd3-896a-000a60c34a09" (UID: "303b11b3-a44c-4cd3-896a-000a60c34a09"). InnerVolumeSpecName "error-404-isvc-d1134-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:56:19.595559 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.595530 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/303b11b3-a44c-4cd3-896a-000a60c34a09-kube-api-access-zlbkv" (OuterVolumeSpecName: "kube-api-access-zlbkv") pod "303b11b3-a44c-4cd3-896a-000a60c34a09" (UID: "303b11b3-a44c-4cd3-896a-000a60c34a09"). InnerVolumeSpecName "kube-api-access-zlbkv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:56:19.595620 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.595580 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/303b11b3-a44c-4cd3-896a-000a60c34a09-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "303b11b3-a44c-4cd3-896a-000a60c34a09" (UID: "303b11b3-a44c-4cd3-896a-000a60c34a09"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:56:19.694249 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.694187 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/303b11b3-a44c-4cd3-896a-000a60c34a09-proxy-tls\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:56:19.694249 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.694208 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-d1134-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/303b11b3-a44c-4cd3-896a-000a60c34a09-error-404-isvc-d1134-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:56:19.694249 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.694218 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zlbkv\" (UniqueName: \"kubernetes.io/projected/303b11b3-a44c-4cd3-896a-000a60c34a09-kube-api-access-zlbkv\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:56:19.834932 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.834907 2578 generic.go:358] "Generic (PLEG): container finished" podID="7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67" containerID="cbb3bf76ee8c3b68de1630a74ae63c604bf1545e3b1ac03a2eda9af223f609a6" exitCode=0
Apr 24 21:56:19.835269 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.834990 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9"
Apr 24 21:56:19.835269 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.834997 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9" event={"ID":"7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67","Type":"ContainerDied","Data":"cbb3bf76ee8c3b68de1630a74ae63c604bf1545e3b1ac03a2eda9af223f609a6"}
Apr 24 21:56:19.835269 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.835037 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9" event={"ID":"7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67","Type":"ContainerDied","Data":"e6df9842eccca33b182d0b97c70f96b071d7f6c4885c43af457b88463c843c61"}
Apr 24 21:56:19.835269 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.835056 2578 scope.go:117] "RemoveContainer" containerID="5e60521ff8d9a49349d4a74cb1b9371f07c9c7f4d593ac1d29733b46d8c7ced3"
Apr 24 21:56:19.836514 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.836491 2578 generic.go:358] "Generic (PLEG): container finished" podID="303b11b3-a44c-4cd3-896a-000a60c34a09" containerID="f2bdfdc9a55f57341d5987125674ae6dd139516489fa12a7df1f87aa9930c4e6" exitCode=0
Apr 24 21:56:19.836643 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.836522 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh" event={"ID":"303b11b3-a44c-4cd3-896a-000a60c34a09","Type":"ContainerDied","Data":"f2bdfdc9a55f57341d5987125674ae6dd139516489fa12a7df1f87aa9930c4e6"}
Apr 24 21:56:19.836643 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.836539 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh" event={"ID":"303b11b3-a44c-4cd3-896a-000a60c34a09","Type":"ContainerDied","Data":"8dd9eee6a63f499dd38f0d7f5836ca8408bbbb03cf0acb17aac8caf6890d60a5"}
Apr 24 21:56:19.836643 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.836580 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh"
Apr 24 21:56:19.843841 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.843825 2578 scope.go:117] "RemoveContainer" containerID="cbb3bf76ee8c3b68de1630a74ae63c604bf1545e3b1ac03a2eda9af223f609a6"
Apr 24 21:56:19.851837 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.851823 2578 scope.go:117] "RemoveContainer" containerID="5e60521ff8d9a49349d4a74cb1b9371f07c9c7f4d593ac1d29733b46d8c7ced3"
Apr 24 21:56:19.852063 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:56:19.852043 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e60521ff8d9a49349d4a74cb1b9371f07c9c7f4d593ac1d29733b46d8c7ced3\": container with ID starting with 5e60521ff8d9a49349d4a74cb1b9371f07c9c7f4d593ac1d29733b46d8c7ced3 not found: ID does not exist" containerID="5e60521ff8d9a49349d4a74cb1b9371f07c9c7f4d593ac1d29733b46d8c7ced3"
Apr 24 21:56:19.852113 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.852069 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e60521ff8d9a49349d4a74cb1b9371f07c9c7f4d593ac1d29733b46d8c7ced3"} err="failed to get container status \"5e60521ff8d9a49349d4a74cb1b9371f07c9c7f4d593ac1d29733b46d8c7ced3\": rpc error: code = NotFound desc = could not find container \"5e60521ff8d9a49349d4a74cb1b9371f07c9c7f4d593ac1d29733b46d8c7ced3\": container with ID starting with 5e60521ff8d9a49349d4a74cb1b9371f07c9c7f4d593ac1d29733b46d8c7ced3 not found: ID does not exist"
Apr 24 21:56:19.852113 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.852086 2578 scope.go:117] "RemoveContainer" containerID="cbb3bf76ee8c3b68de1630a74ae63c604bf1545e3b1ac03a2eda9af223f609a6"
Apr 24 21:56:19.852313 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:56:19.852296 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbb3bf76ee8c3b68de1630a74ae63c604bf1545e3b1ac03a2eda9af223f609a6\": container with ID starting with cbb3bf76ee8c3b68de1630a74ae63c604bf1545e3b1ac03a2eda9af223f609a6 not found: ID does not exist" containerID="cbb3bf76ee8c3b68de1630a74ae63c604bf1545e3b1ac03a2eda9af223f609a6"
Apr 24 21:56:19.852381 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.852322 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbb3bf76ee8c3b68de1630a74ae63c604bf1545e3b1ac03a2eda9af223f609a6"} err="failed to get container status \"cbb3bf76ee8c3b68de1630a74ae63c604bf1545e3b1ac03a2eda9af223f609a6\": rpc error: code = NotFound desc = could not find container \"cbb3bf76ee8c3b68de1630a74ae63c604bf1545e3b1ac03a2eda9af223f609a6\": container with ID starting with cbb3bf76ee8c3b68de1630a74ae63c604bf1545e3b1ac03a2eda9af223f609a6 not found: ID does not exist"
Apr 24 21:56:19.852381 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.852346 2578 scope.go:117] "RemoveContainer" containerID="bd21e692fe3f5e9dda57d1b564826bc10d06ae99da96fb71aaa1defa4ed9dd47"
Apr 24 21:56:19.857498 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.857477 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9"]
Apr 24 21:56:19.860243 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.860221 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d1134-predictor-b6cdf5487-xvsr9"]
Apr 24 21:56:19.860715 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.860701 2578 scope.go:117] "RemoveContainer" containerID="f2bdfdc9a55f57341d5987125674ae6dd139516489fa12a7df1f87aa9930c4e6"
Apr 24 21:56:19.867058 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.867043 2578 scope.go:117] "RemoveContainer" containerID="bd21e692fe3f5e9dda57d1b564826bc10d06ae99da96fb71aaa1defa4ed9dd47"
Apr 24 21:56:19.867301 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:56:19.867282 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd21e692fe3f5e9dda57d1b564826bc10d06ae99da96fb71aaa1defa4ed9dd47\": container with ID starting with bd21e692fe3f5e9dda57d1b564826bc10d06ae99da96fb71aaa1defa4ed9dd47 not found: ID does not exist" containerID="bd21e692fe3f5e9dda57d1b564826bc10d06ae99da96fb71aaa1defa4ed9dd47"
Apr 24 21:56:19.867350 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.867306 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd21e692fe3f5e9dda57d1b564826bc10d06ae99da96fb71aaa1defa4ed9dd47"} err="failed to get container status \"bd21e692fe3f5e9dda57d1b564826bc10d06ae99da96fb71aaa1defa4ed9dd47\": rpc error: code = NotFound desc = could not find container \"bd21e692fe3f5e9dda57d1b564826bc10d06ae99da96fb71aaa1defa4ed9dd47\": container with ID starting with bd21e692fe3f5e9dda57d1b564826bc10d06ae99da96fb71aaa1defa4ed9dd47 not found: ID does not exist"
Apr 24 21:56:19.867350 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.867322 2578 scope.go:117] "RemoveContainer" containerID="f2bdfdc9a55f57341d5987125674ae6dd139516489fa12a7df1f87aa9930c4e6"
Apr 24 21:56:19.867503 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:56:19.867489 2578 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f2bdfdc9a55f57341d5987125674ae6dd139516489fa12a7df1f87aa9930c4e6\": container with ID starting with f2bdfdc9a55f57341d5987125674ae6dd139516489fa12a7df1f87aa9930c4e6 not found: ID does not exist" containerID="f2bdfdc9a55f57341d5987125674ae6dd139516489fa12a7df1f87aa9930c4e6" Apr 24 21:56:19.867561 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.867507 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2bdfdc9a55f57341d5987125674ae6dd139516489fa12a7df1f87aa9930c4e6"} err="failed to get container status \"f2bdfdc9a55f57341d5987125674ae6dd139516489fa12a7df1f87aa9930c4e6\": rpc error: code = NotFound desc = could not find container \"f2bdfdc9a55f57341d5987125674ae6dd139516489fa12a7df1f87aa9930c4e6\": container with ID starting with f2bdfdc9a55f57341d5987125674ae6dd139516489fa12a7df1f87aa9930c4e6 not found: ID does not exist" Apr 24 21:56:19.869472 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.869455 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh"] Apr 24 21:56:19.873606 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:19.873588 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d1134-predictor-796fc776b4-5ssqh"] Apr 24 21:56:20.421847 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:20.421811 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="303b11b3-a44c-4cd3-896a-000a60c34a09" path="/var/lib/kubelet/pods/303b11b3-a44c-4cd3-896a-000a60c34a09/volumes" Apr 24 21:56:20.422214 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:20.422200 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67" path="/var/lib/kubelet/pods/7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67/volumes" Apr 24 21:56:22.683405 ip-10-0-132-124 kubenswrapper[2578]: I0424 
21:56:22.683379 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" Apr 24 21:56:22.829731 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:22.829706 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj" Apr 24 21:56:22.830208 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:22.830187 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj" podUID="42e97302-b125-4622-918c-666a6a28d9c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 21:56:23.690425 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:23.690394 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" Apr 24 21:56:23.834392 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:23.834365 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk" Apr 24 21:56:23.834831 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:23.834809 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk" podUID="f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 24 21:56:32.831197 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:32.831159 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj" podUID="42e97302-b125-4622-918c-666a6a28d9c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 
21:56:33.834814 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:33.834777 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk" podUID="f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 24 21:56:42.830956 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:42.830913 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj" podUID="42e97302-b125-4622-918c-666a6a28d9c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 21:56:43.835740 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:43.835693 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk" podUID="f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 24 21:56:45.782166 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.782127 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65"] Apr 24 21:56:45.782679 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.782619 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" podUID="61ed92de-cc0f-4baf-9b8d-8bb6e4019d55" containerName="kserve-container" containerID="cri-o://3da2cc6cb8c12f6b3d02d0ea2958d39f19390f275a129aff3c3f04b369ad8b92" gracePeriod=30 Apr 24 21:56:45.782917 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.782861 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" 
podUID="61ed92de-cc0f-4baf-9b8d-8bb6e4019d55" containerName="kube-rbac-proxy" containerID="cri-o://182b8170e65a387c0435978b15a23ff190aeda7a78dc03c6923b469fb4ef7146" gracePeriod=30 Apr 24 21:56:45.831344 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.831314 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc"] Apr 24 21:56:45.831674 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.831633 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" podUID="9302faf7-7f03-465f-8e33-8ea9cdf22fdd" containerName="kserve-container" containerID="cri-o://a8450e82ec91990d4486549521f5c59a4b9dce46258412261072c6811100d628" gracePeriod=30 Apr 24 21:56:45.831793 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.831676 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" podUID="9302faf7-7f03-465f-8e33-8ea9cdf22fdd" containerName="kube-rbac-proxy" containerID="cri-o://cc03c0c86c5628c072509eecab56623f992e060ff3268dabe2cd9c800bc82eeb" gracePeriod=30 Apr 24 21:56:45.879307 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.879282 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf"] Apr 24 21:56:45.879701 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.879689 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67" containerName="kserve-container" Apr 24 21:56:45.879749 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.879705 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67" containerName="kserve-container" Apr 24 21:56:45.879749 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.879728 2578 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="303b11b3-a44c-4cd3-896a-000a60c34a09" containerName="kube-rbac-proxy" Apr 24 21:56:45.879749 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.879734 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="303b11b3-a44c-4cd3-896a-000a60c34a09" containerName="kube-rbac-proxy" Apr 24 21:56:45.879835 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.879743 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67" containerName="kube-rbac-proxy" Apr 24 21:56:45.879835 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.879759 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67" containerName="kube-rbac-proxy" Apr 24 21:56:45.879835 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.879765 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="303b11b3-a44c-4cd3-896a-000a60c34a09" containerName="kserve-container" Apr 24 21:56:45.879835 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.879770 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="303b11b3-a44c-4cd3-896a-000a60c34a09" containerName="kserve-container" Apr 24 21:56:45.879835 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.879824 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="303b11b3-a44c-4cd3-896a-000a60c34a09" containerName="kserve-container" Apr 24 21:56:45.879835 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.879834 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="303b11b3-a44c-4cd3-896a-000a60c34a09" containerName="kube-rbac-proxy" Apr 24 21:56:45.880012 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.879842 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67" containerName="kserve-container" Apr 24 21:56:45.880012 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.879848 2578 
memory_manager.go:356] "RemoveStaleState removing state" podUID="7eeb6208-ba7c-40e7-9ccf-0b38b2a3bc67" containerName="kube-rbac-proxy" Apr 24 21:56:45.884430 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.884414 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" Apr 24 21:56:45.886889 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.886866 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-6941b-kube-rbac-proxy-sar-config\"" Apr 24 21:56:45.887302 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.887286 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-6941b-predictor-serving-cert\"" Apr 24 21:56:45.893362 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.893342 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf"] Apr 24 21:56:45.926948 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.926917 2578 generic.go:358] "Generic (PLEG): container finished" podID="61ed92de-cc0f-4baf-9b8d-8bb6e4019d55" containerID="182b8170e65a387c0435978b15a23ff190aeda7a78dc03c6923b469fb4ef7146" exitCode=2 Apr 24 21:56:45.927113 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.926993 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" event={"ID":"61ed92de-cc0f-4baf-9b8d-8bb6e4019d55","Type":"ContainerDied","Data":"182b8170e65a387c0435978b15a23ff190aeda7a78dc03c6923b469fb4ef7146"} Apr 24 21:56:45.983827 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.983800 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx"] Apr 24 21:56:45.987285 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.987266 2578 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" Apr 24 21:56:45.989500 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.989481 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-6941b-predictor-serving-cert\"" Apr 24 21:56:45.989633 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.989506 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-6941b-kube-rbac-proxy-sar-config\"" Apr 24 21:56:45.996163 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:45.996138 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx"] Apr 24 21:56:46.011024 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.011003 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-6941b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0e9e117e-b849-44a2-b9c2-e2290e170fe1-success-200-isvc-6941b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf\" (UID: \"0e9e117e-b849-44a2-b9c2-e2290e170fe1\") " pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" Apr 24 21:56:46.011110 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.011038 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0e9e117e-b849-44a2-b9c2-e2290e170fe1-proxy-tls\") pod \"success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf\" (UID: \"0e9e117e-b849-44a2-b9c2-e2290e170fe1\") " pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" Apr 24 21:56:46.011110 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.011081 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-88mdk\" (UniqueName: \"kubernetes.io/projected/0e9e117e-b849-44a2-b9c2-e2290e170fe1-kube-api-access-88mdk\") pod \"success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf\" (UID: \"0e9e117e-b849-44a2-b9c2-e2290e170fe1\") " pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" Apr 24 21:56:46.112651 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.112573 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88mdk\" (UniqueName: \"kubernetes.io/projected/0e9e117e-b849-44a2-b9c2-e2290e170fe1-kube-api-access-88mdk\") pod \"success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf\" (UID: \"0e9e117e-b849-44a2-b9c2-e2290e170fe1\") " pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" Apr 24 21:56:46.112801 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.112703 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-6941b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6719c2fc-f689-4854-9eb7-9dbba8fe4bfb-error-404-isvc-6941b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-6941b-predictor-db66cc574-zc9wx\" (UID: \"6719c2fc-f689-4854-9eb7-9dbba8fe4bfb\") " pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" Apr 24 21:56:46.112801 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.112730 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q7dq\" (UniqueName: \"kubernetes.io/projected/6719c2fc-f689-4854-9eb7-9dbba8fe4bfb-kube-api-access-6q7dq\") pod \"error-404-isvc-6941b-predictor-db66cc574-zc9wx\" (UID: \"6719c2fc-f689-4854-9eb7-9dbba8fe4bfb\") " pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" Apr 24 21:56:46.112801 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.112782 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"success-200-isvc-6941b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0e9e117e-b849-44a2-b9c2-e2290e170fe1-success-200-isvc-6941b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf\" (UID: \"0e9e117e-b849-44a2-b9c2-e2290e170fe1\") " pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" Apr 24 21:56:46.113017 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.112851 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0e9e117e-b849-44a2-b9c2-e2290e170fe1-proxy-tls\") pod \"success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf\" (UID: \"0e9e117e-b849-44a2-b9c2-e2290e170fe1\") " pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" Apr 24 21:56:46.113017 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.112885 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6719c2fc-f689-4854-9eb7-9dbba8fe4bfb-proxy-tls\") pod \"error-404-isvc-6941b-predictor-db66cc574-zc9wx\" (UID: \"6719c2fc-f689-4854-9eb7-9dbba8fe4bfb\") " pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" Apr 24 21:56:46.113137 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:56:46.113020 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-6941b-predictor-serving-cert: secret "success-200-isvc-6941b-predictor-serving-cert" not found Apr 24 21:56:46.113137 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:56:46.113076 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e9e117e-b849-44a2-b9c2-e2290e170fe1-proxy-tls podName:0e9e117e-b849-44a2-b9c2-e2290e170fe1 nodeName:}" failed. No retries permitted until 2026-04-24 21:56:46.613058857 +0000 UTC m=+1752.788793509 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/0e9e117e-b849-44a2-b9c2-e2290e170fe1-proxy-tls") pod "success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" (UID: "0e9e117e-b849-44a2-b9c2-e2290e170fe1") : secret "success-200-isvc-6941b-predictor-serving-cert" not found Apr 24 21:56:46.113520 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.113492 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-6941b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0e9e117e-b849-44a2-b9c2-e2290e170fe1-success-200-isvc-6941b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf\" (UID: \"0e9e117e-b849-44a2-b9c2-e2290e170fe1\") " pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" Apr 24 21:56:46.124255 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.124234 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88mdk\" (UniqueName: \"kubernetes.io/projected/0e9e117e-b849-44a2-b9c2-e2290e170fe1-kube-api-access-88mdk\") pod \"success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf\" (UID: \"0e9e117e-b849-44a2-b9c2-e2290e170fe1\") " pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" Apr 24 21:56:46.213711 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.213671 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-6941b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6719c2fc-f689-4854-9eb7-9dbba8fe4bfb-error-404-isvc-6941b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-6941b-predictor-db66cc574-zc9wx\" (UID: \"6719c2fc-f689-4854-9eb7-9dbba8fe4bfb\") " pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" Apr 24 21:56:46.213846 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.213725 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6q7dq\" (UniqueName: \"kubernetes.io/projected/6719c2fc-f689-4854-9eb7-9dbba8fe4bfb-kube-api-access-6q7dq\") pod \"error-404-isvc-6941b-predictor-db66cc574-zc9wx\" (UID: \"6719c2fc-f689-4854-9eb7-9dbba8fe4bfb\") " pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" Apr 24 21:56:46.213846 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.213806 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6719c2fc-f689-4854-9eb7-9dbba8fe4bfb-proxy-tls\") pod \"error-404-isvc-6941b-predictor-db66cc574-zc9wx\" (UID: \"6719c2fc-f689-4854-9eb7-9dbba8fe4bfb\") " pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" Apr 24 21:56:46.213977 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:56:46.213950 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-6941b-predictor-serving-cert: secret "error-404-isvc-6941b-predictor-serving-cert" not found Apr 24 21:56:46.214055 ip-10-0-132-124 kubenswrapper[2578]: E0424 21:56:46.214015 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6719c2fc-f689-4854-9eb7-9dbba8fe4bfb-proxy-tls podName:6719c2fc-f689-4854-9eb7-9dbba8fe4bfb nodeName:}" failed. No retries permitted until 2026-04-24 21:56:46.713996989 +0000 UTC m=+1752.889731662 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6719c2fc-f689-4854-9eb7-9dbba8fe4bfb-proxy-tls") pod "error-404-isvc-6941b-predictor-db66cc574-zc9wx" (UID: "6719c2fc-f689-4854-9eb7-9dbba8fe4bfb") : secret "error-404-isvc-6941b-predictor-serving-cert" not found Apr 24 21:56:46.214299 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.214274 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-6941b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6719c2fc-f689-4854-9eb7-9dbba8fe4bfb-error-404-isvc-6941b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-6941b-predictor-db66cc574-zc9wx\" (UID: \"6719c2fc-f689-4854-9eb7-9dbba8fe4bfb\") " pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" Apr 24 21:56:46.222916 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.222896 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q7dq\" (UniqueName: \"kubernetes.io/projected/6719c2fc-f689-4854-9eb7-9dbba8fe4bfb-kube-api-access-6q7dq\") pod \"error-404-isvc-6941b-predictor-db66cc574-zc9wx\" (UID: \"6719c2fc-f689-4854-9eb7-9dbba8fe4bfb\") " pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" Apr 24 21:56:46.618392 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.618356 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0e9e117e-b849-44a2-b9c2-e2290e170fe1-proxy-tls\") pod \"success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf\" (UID: \"0e9e117e-b849-44a2-b9c2-e2290e170fe1\") " pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" Apr 24 21:56:46.621525 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.621484 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0e9e117e-b849-44a2-b9c2-e2290e170fe1-proxy-tls\") pod 
\"success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf\" (UID: \"0e9e117e-b849-44a2-b9c2-e2290e170fe1\") " pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" Apr 24 21:56:46.719027 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.718991 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6719c2fc-f689-4854-9eb7-9dbba8fe4bfb-proxy-tls\") pod \"error-404-isvc-6941b-predictor-db66cc574-zc9wx\" (UID: \"6719c2fc-f689-4854-9eb7-9dbba8fe4bfb\") " pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" Apr 24 21:56:46.721892 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.721872 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6719c2fc-f689-4854-9eb7-9dbba8fe4bfb-proxy-tls\") pod \"error-404-isvc-6941b-predictor-db66cc574-zc9wx\" (UID: \"6719c2fc-f689-4854-9eb7-9dbba8fe4bfb\") " pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" Apr 24 21:56:46.795885 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.795856 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" Apr 24 21:56:46.898105 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.898074 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx"
Apr 24 21:56:46.922795 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.922759 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf"]
Apr 24 21:56:46.925975 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:56:46.925926 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e9e117e_b849_44a2_b9c2_e2290e170fe1.slice/crio-618c2d4781e81f337b672b479e7db1ba17da748098b300d738c9d0309d4ca6d8 WatchSource:0}: Error finding container 618c2d4781e81f337b672b479e7db1ba17da748098b300d738c9d0309d4ca6d8: Status 404 returned error can't find the container with id 618c2d4781e81f337b672b479e7db1ba17da748098b300d738c9d0309d4ca6d8
Apr 24 21:56:46.931298 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.931273 2578 generic.go:358] "Generic (PLEG): container finished" podID="9302faf7-7f03-465f-8e33-8ea9cdf22fdd" containerID="cc03c0c86c5628c072509eecab56623f992e060ff3268dabe2cd9c800bc82eeb" exitCode=2
Apr 24 21:56:46.931384 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.931331 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" event={"ID":"9302faf7-7f03-465f-8e33-8ea9cdf22fdd","Type":"ContainerDied","Data":"cc03c0c86c5628c072509eecab56623f992e060ff3268dabe2cd9c800bc82eeb"}
Apr 24 21:56:46.932337 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:46.932318 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" event={"ID":"0e9e117e-b849-44a2-b9c2-e2290e170fe1","Type":"ContainerStarted","Data":"618c2d4781e81f337b672b479e7db1ba17da748098b300d738c9d0309d4ca6d8"}
Apr 24 21:56:47.031239 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:47.031212 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx"]
Apr 24 21:56:47.039703 ip-10-0-132-124 kubenswrapper[2578]: W0424 21:56:47.039679 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6719c2fc_f689_4854_9eb7_9dbba8fe4bfb.slice/crio-39a4a6ff3266b05be22c54c617c3d9ee6ea206fd0e20041837556664e0bd2353 WatchSource:0}: Error finding container 39a4a6ff3266b05be22c54c617c3d9ee6ea206fd0e20041837556664e0bd2353: Status 404 returned error can't find the container with id 39a4a6ff3266b05be22c54c617c3d9ee6ea206fd0e20041837556664e0bd2353
Apr 24 21:56:47.678109 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:47.678068 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" podUID="9302faf7-7f03-465f-8e33-8ea9cdf22fdd" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.38:8643/healthz\": dial tcp 10.134.0.38:8643: connect: connection refused"
Apr 24 21:56:47.937640 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:47.937537 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" event={"ID":"6719c2fc-f689-4854-9eb7-9dbba8fe4bfb","Type":"ContainerStarted","Data":"fab1c0623472f14b433ff90b13a34d3da14645c9f22c64f8ddf08d5da888ed87"}
Apr 24 21:56:47.937640 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:47.937590 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" event={"ID":"6719c2fc-f689-4854-9eb7-9dbba8fe4bfb","Type":"ContainerStarted","Data":"50c6e66b089388c00050961bccf7dc895e75b13e0a02f80e4c2e01d60517c8f5"}
Apr 24 21:56:47.937640 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:47.937601 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" event={"ID":"6719c2fc-f689-4854-9eb7-9dbba8fe4bfb","Type":"ContainerStarted","Data":"39a4a6ff3266b05be22c54c617c3d9ee6ea206fd0e20041837556664e0bd2353"}
Apr 24 21:56:47.938221 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:47.937727 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx"
Apr 24 21:56:47.939170 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:47.939152 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" event={"ID":"0e9e117e-b849-44a2-b9c2-e2290e170fe1","Type":"ContainerStarted","Data":"27cad8b62e3626d9a94ee0b15467851c4afd87dd5f8b7e885f9b6e6067e1483f"}
Apr 24 21:56:47.939240 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:47.939177 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" event={"ID":"0e9e117e-b849-44a2-b9c2-e2290e170fe1","Type":"ContainerStarted","Data":"8515fc0affc42a483da93846dfeee417534359fa0d6f93f01901ef80bc7e0502"}
Apr 24 21:56:47.939292 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:47.939278 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf"
Apr 24 21:56:47.958194 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:47.958152 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" podStartSLOduration=2.958139964 podStartE2EDuration="2.958139964s" podCreationTimestamp="2026-04-24 21:56:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:56:47.955900799 +0000 UTC m=+1754.131635472" watchObservedRunningTime="2026-04-24 21:56:47.958139964 +0000 UTC m=+1754.133874637"
Apr 24 21:56:47.973426 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:47.973384 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" podStartSLOduration=2.973371393 podStartE2EDuration="2.973371393s" podCreationTimestamp="2026-04-24 21:56:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:56:47.972074417 +0000 UTC m=+1754.147809089" watchObservedRunningTime="2026-04-24 21:56:47.973371393 +0000 UTC m=+1754.149106135"
Apr 24 21:56:48.685386 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:48.685356 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" podUID="61ed92de-cc0f-4baf-9b8d-8bb6e4019d55" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.39:8643/healthz\": dial tcp 10.134.0.39:8643: connect: connection refused"
Apr 24 21:56:48.944868 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:48.944844 2578 generic.go:358] "Generic (PLEG): container finished" podID="61ed92de-cc0f-4baf-9b8d-8bb6e4019d55" containerID="3da2cc6cb8c12f6b3d02d0ea2958d39f19390f275a129aff3c3f04b369ad8b92" exitCode=0
Apr 24 21:56:48.945240 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:48.944919 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" event={"ID":"61ed92de-cc0f-4baf-9b8d-8bb6e4019d55","Type":"ContainerDied","Data":"3da2cc6cb8c12f6b3d02d0ea2958d39f19390f275a129aff3c3f04b369ad8b92"}
Apr 24 21:56:48.946727 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:48.946684 2578 generic.go:358] "Generic (PLEG): container finished" podID="9302faf7-7f03-465f-8e33-8ea9cdf22fdd" containerID="a8450e82ec91990d4486549521f5c59a4b9dce46258412261072c6811100d628" exitCode=0
Apr 24 21:56:48.946864 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:48.946753 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" event={"ID":"9302faf7-7f03-465f-8e33-8ea9cdf22fdd","Type":"ContainerDied","Data":"a8450e82ec91990d4486549521f5c59a4b9dce46258412261072c6811100d628"}
Apr 24 21:56:48.947413 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:48.947343 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx"
Apr 24 21:56:48.947413 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:48.947375 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf"
Apr 24 21:56:48.948137 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:48.948102 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" podUID="0e9e117e-b849-44a2-b9c2-e2290e170fe1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Apr 24 21:56:48.948262 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:48.948234 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" podUID="6719c2fc-f689-4854-9eb7-9dbba8fe4bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 24 21:56:49.031203 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.031182 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65"
Apr 24 21:56:49.067082 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.067060 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc"
Apr 24 21:56:49.140142 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.140114 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgbs4\" (UniqueName: \"kubernetes.io/projected/9302faf7-7f03-465f-8e33-8ea9cdf22fdd-kube-api-access-bgbs4\") pod \"9302faf7-7f03-465f-8e33-8ea9cdf22fdd\" (UID: \"9302faf7-7f03-465f-8e33-8ea9cdf22fdd\") "
Apr 24 21:56:49.140313 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.140159 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-6842b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9302faf7-7f03-465f-8e33-8ea9cdf22fdd-success-200-isvc-6842b-kube-rbac-proxy-sar-config\") pod \"9302faf7-7f03-465f-8e33-8ea9cdf22fdd\" (UID: \"9302faf7-7f03-465f-8e33-8ea9cdf22fdd\") "
Apr 24 21:56:49.140313 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.140184 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61ed92de-cc0f-4baf-9b8d-8bb6e4019d55-proxy-tls\") pod \"61ed92de-cc0f-4baf-9b8d-8bb6e4019d55\" (UID: \"61ed92de-cc0f-4baf-9b8d-8bb6e4019d55\") "
Apr 24 21:56:49.140313 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.140246 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9302faf7-7f03-465f-8e33-8ea9cdf22fdd-proxy-tls\") pod \"9302faf7-7f03-465f-8e33-8ea9cdf22fdd\" (UID: \"9302faf7-7f03-465f-8e33-8ea9cdf22fdd\") "
Apr 24 21:56:49.140313 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.140267 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-6842b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/61ed92de-cc0f-4baf-9b8d-8bb6e4019d55-error-404-isvc-6842b-kube-rbac-proxy-sar-config\") pod \"61ed92de-cc0f-4baf-9b8d-8bb6e4019d55\" (UID: \"61ed92de-cc0f-4baf-9b8d-8bb6e4019d55\") "
Apr 24 21:56:49.140313 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.140292 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkfnd\" (UniqueName: \"kubernetes.io/projected/61ed92de-cc0f-4baf-9b8d-8bb6e4019d55-kube-api-access-dkfnd\") pod \"61ed92de-cc0f-4baf-9b8d-8bb6e4019d55\" (UID: \"61ed92de-cc0f-4baf-9b8d-8bb6e4019d55\") "
Apr 24 21:56:49.140612 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.140568 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9302faf7-7f03-465f-8e33-8ea9cdf22fdd-success-200-isvc-6842b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-6842b-kube-rbac-proxy-sar-config") pod "9302faf7-7f03-465f-8e33-8ea9cdf22fdd" (UID: "9302faf7-7f03-465f-8e33-8ea9cdf22fdd"). InnerVolumeSpecName "success-200-isvc-6842b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:56:49.140670 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.140645 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61ed92de-cc0f-4baf-9b8d-8bb6e4019d55-error-404-isvc-6842b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-6842b-kube-rbac-proxy-sar-config") pod "61ed92de-cc0f-4baf-9b8d-8bb6e4019d55" (UID: "61ed92de-cc0f-4baf-9b8d-8bb6e4019d55"). InnerVolumeSpecName "error-404-isvc-6842b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:56:49.142444 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.142418 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9302faf7-7f03-465f-8e33-8ea9cdf22fdd-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9302faf7-7f03-465f-8e33-8ea9cdf22fdd" (UID: "9302faf7-7f03-465f-8e33-8ea9cdf22fdd"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:56:49.142599 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.142532 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61ed92de-cc0f-4baf-9b8d-8bb6e4019d55-kube-api-access-dkfnd" (OuterVolumeSpecName: "kube-api-access-dkfnd") pod "61ed92de-cc0f-4baf-9b8d-8bb6e4019d55" (UID: "61ed92de-cc0f-4baf-9b8d-8bb6e4019d55"). InnerVolumeSpecName "kube-api-access-dkfnd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:56:49.142829 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.142808 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61ed92de-cc0f-4baf-9b8d-8bb6e4019d55-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "61ed92de-cc0f-4baf-9b8d-8bb6e4019d55" (UID: "61ed92de-cc0f-4baf-9b8d-8bb6e4019d55"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:56:49.142829 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.142810 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9302faf7-7f03-465f-8e33-8ea9cdf22fdd-kube-api-access-bgbs4" (OuterVolumeSpecName: "kube-api-access-bgbs4") pod "9302faf7-7f03-465f-8e33-8ea9cdf22fdd" (UID: "9302faf7-7f03-465f-8e33-8ea9cdf22fdd"). InnerVolumeSpecName "kube-api-access-bgbs4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:56:49.241705 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.241671 2578 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-6842b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9302faf7-7f03-465f-8e33-8ea9cdf22fdd-success-200-isvc-6842b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:56:49.241705 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.241702 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61ed92de-cc0f-4baf-9b8d-8bb6e4019d55-proxy-tls\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:56:49.241893 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.241716 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9302faf7-7f03-465f-8e33-8ea9cdf22fdd-proxy-tls\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:56:49.241893 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.241728 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-6842b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/61ed92de-cc0f-4baf-9b8d-8bb6e4019d55-error-404-isvc-6842b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:56:49.241893 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.241741 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dkfnd\" (UniqueName: \"kubernetes.io/projected/61ed92de-cc0f-4baf-9b8d-8bb6e4019d55-kube-api-access-dkfnd\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:56:49.241893 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.241754 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bgbs4\" (UniqueName: \"kubernetes.io/projected/9302faf7-7f03-465f-8e33-8ea9cdf22fdd-kube-api-access-bgbs4\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 21:56:49.952358 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.952322 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65" event={"ID":"61ed92de-cc0f-4baf-9b8d-8bb6e4019d55","Type":"ContainerDied","Data":"b5d6e2c0e881a0b67ceac8cadcad1317b3c10d2e80ffdec25cf7eea32f3a8739"}
Apr 24 21:56:49.952358 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.952343 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65"
Apr 24 21:56:49.952852 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.952375 2578 scope.go:117] "RemoveContainer" containerID="182b8170e65a387c0435978b15a23ff190aeda7a78dc03c6923b469fb4ef7146"
Apr 24 21:56:49.953763 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.953737 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc" event={"ID":"9302faf7-7f03-465f-8e33-8ea9cdf22fdd","Type":"ContainerDied","Data":"c32a7d21821afac3daa7a88986df8b6b389f74a1b6bc0881d4519f323bd1dc87"}
Apr 24 21:56:49.953857 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.953744 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc"
Apr 24 21:56:49.954143 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.954117 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" podUID="6719c2fc-f689-4854-9eb7-9dbba8fe4bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 24 21:56:49.954517 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.954474 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" podUID="0e9e117e-b849-44a2-b9c2-e2290e170fe1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Apr 24 21:56:49.961264 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.961141 2578 scope.go:117] "RemoveContainer" containerID="3da2cc6cb8c12f6b3d02d0ea2958d39f19390f275a129aff3c3f04b369ad8b92"
Apr 24 21:56:49.968605 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.968591 2578 scope.go:117] "RemoveContainer" containerID="cc03c0c86c5628c072509eecab56623f992e060ff3268dabe2cd9c800bc82eeb"
Apr 24 21:56:49.975278 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.975262 2578 scope.go:117] "RemoveContainer" containerID="a8450e82ec91990d4486549521f5c59a4b9dce46258412261072c6811100d628"
Apr 24 21:56:49.992639 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:49.992619 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc"]
Apr 24 21:56:50.004756 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:50.004735 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6842b-predictor-7bd4c8d8df-c8zpc"]
Apr 24 21:56:50.025332 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:50.025301 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65"]
Apr 24 21:56:50.032250 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:50.032213 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6842b-predictor-565dd576f8-rwz65"]
Apr 24 21:56:50.426675 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:50.426640 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61ed92de-cc0f-4baf-9b8d-8bb6e4019d55" path="/var/lib/kubelet/pods/61ed92de-cc0f-4baf-9b8d-8bb6e4019d55/volumes"
Apr 24 21:56:50.427152 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:50.427134 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9302faf7-7f03-465f-8e33-8ea9cdf22fdd" path="/var/lib/kubelet/pods/9302faf7-7f03-465f-8e33-8ea9cdf22fdd/volumes"
Apr 24 21:56:52.830459 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:52.830427 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj" podUID="42e97302-b125-4622-918c-666a6a28d9c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 24 21:56:53.835698 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:53.835662 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk" podUID="f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Apr 24 21:56:54.959069 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:54.959031 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf"
Apr 24 21:56:54.959511 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:54.959429 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx"
Apr 24 21:56:54.959612 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:54.959529 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" podUID="0e9e117e-b849-44a2-b9c2-e2290e170fe1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Apr 24 21:56:54.959922 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:56:54.959897 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" podUID="6719c2fc-f689-4854-9eb7-9dbba8fe4bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 24 21:57:02.831529 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:57:02.831502 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj"
Apr 24 21:57:03.835592 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:57:03.835566 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk"
Apr 24 21:57:04.960501 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:57:04.960461 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" podUID="6719c2fc-f689-4854-9eb7-9dbba8fe4bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 24 21:57:04.960879 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:57:04.960461 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" podUID="0e9e117e-b849-44a2-b9c2-e2290e170fe1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Apr 24 21:57:14.959975 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:57:14.959936 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" podUID="0e9e117e-b849-44a2-b9c2-e2290e170fe1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Apr 24 21:57:14.960358 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:57:14.959935 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" podUID="6719c2fc-f689-4854-9eb7-9dbba8fe4bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 24 21:57:24.960330 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:57:24.960244 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" podUID="0e9e117e-b849-44a2-b9c2-e2290e170fe1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Apr 24 21:57:24.960330 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:57:24.960264 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" podUID="6719c2fc-f689-4854-9eb7-9dbba8fe4bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 24 21:57:34.960805 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:57:34.960721 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf"
Apr 24 21:57:34.960805 ip-10-0-132-124 kubenswrapper[2578]: I0424 21:57:34.960773 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx"
Apr 24 22:06:00.561812 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:00.561781 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf"]
Apr 24 22:06:00.564481 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:00.562045 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" podUID="0e9e117e-b849-44a2-b9c2-e2290e170fe1" containerName="kserve-container" containerID="cri-o://8515fc0affc42a483da93846dfeee417534359fa0d6f93f01901ef80bc7e0502" gracePeriod=30
Apr 24 22:06:00.564481 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:00.562089 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" podUID="0e9e117e-b849-44a2-b9c2-e2290e170fe1" containerName="kube-rbac-proxy" containerID="cri-o://27cad8b62e3626d9a94ee0b15467851c4afd87dd5f8b7e885f9b6e6067e1483f" gracePeriod=30
Apr 24 22:06:00.608477 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:00.608450 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx"]
Apr 24 22:06:00.608805 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:00.608752 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" podUID="6719c2fc-f689-4854-9eb7-9dbba8fe4bfb" containerName="kserve-container" containerID="cri-o://50c6e66b089388c00050961bccf7dc895e75b13e0a02f80e4c2e01d60517c8f5" gracePeriod=30
Apr 24 22:06:00.608805 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:00.608776 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" podUID="6719c2fc-f689-4854-9eb7-9dbba8fe4bfb" containerName="kube-rbac-proxy" containerID="cri-o://fab1c0623472f14b433ff90b13a34d3da14645c9f22c64f8ddf08d5da888ed87" gracePeriod=30
Apr 24 22:06:01.573033 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:01.573000 2578 generic.go:358] "Generic (PLEG): container finished" podID="6719c2fc-f689-4854-9eb7-9dbba8fe4bfb" containerID="fab1c0623472f14b433ff90b13a34d3da14645c9f22c64f8ddf08d5da888ed87" exitCode=2
Apr 24 22:06:01.573447 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:01.573058 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" event={"ID":"6719c2fc-f689-4854-9eb7-9dbba8fe4bfb","Type":"ContainerDied","Data":"fab1c0623472f14b433ff90b13a34d3da14645c9f22c64f8ddf08d5da888ed87"}
Apr 24 22:06:01.574421 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:01.574399 2578 generic.go:358] "Generic (PLEG): container finished" podID="0e9e117e-b849-44a2-b9c2-e2290e170fe1" containerID="27cad8b62e3626d9a94ee0b15467851c4afd87dd5f8b7e885f9b6e6067e1483f" exitCode=2
Apr 24 22:06:01.574533 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:01.574464 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" event={"ID":"0e9e117e-b849-44a2-b9c2-e2290e170fe1","Type":"ContainerDied","Data":"27cad8b62e3626d9a94ee0b15467851c4afd87dd5f8b7e885f9b6e6067e1483f"}
Apr 24 22:06:03.581041 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:03.581016 2578 generic.go:358] "Generic (PLEG): container finished" podID="0e9e117e-b849-44a2-b9c2-e2290e170fe1" containerID="8515fc0affc42a483da93846dfeee417534359fa0d6f93f01901ef80bc7e0502" exitCode=0
Apr 24 22:06:03.581310 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:03.581051 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" event={"ID":"0e9e117e-b849-44a2-b9c2-e2290e170fe1","Type":"ContainerDied","Data":"8515fc0affc42a483da93846dfeee417534359fa0d6f93f01901ef80bc7e0502"}
Apr 24 22:06:03.771642 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:03.771622 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf"
Apr 24 22:06:03.774529 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:03.774514 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx"
Apr 24 22:06:03.914233 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:03.914207 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-6941b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6719c2fc-f689-4854-9eb7-9dbba8fe4bfb-error-404-isvc-6941b-kube-rbac-proxy-sar-config\") pod \"6719c2fc-f689-4854-9eb7-9dbba8fe4bfb\" (UID: \"6719c2fc-f689-4854-9eb7-9dbba8fe4bfb\") "
Apr 24 22:06:03.914412 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:03.914270 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0e9e117e-b849-44a2-b9c2-e2290e170fe1-proxy-tls\") pod \"0e9e117e-b849-44a2-b9c2-e2290e170fe1\" (UID: \"0e9e117e-b849-44a2-b9c2-e2290e170fe1\") "
Apr 24 22:06:03.914412 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:03.914324 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6719c2fc-f689-4854-9eb7-9dbba8fe4bfb-proxy-tls\") pod \"6719c2fc-f689-4854-9eb7-9dbba8fe4bfb\" (UID: \"6719c2fc-f689-4854-9eb7-9dbba8fe4bfb\") "
Apr 24 22:06:03.914412 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:03.914359 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-6941b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0e9e117e-b849-44a2-b9c2-e2290e170fe1-success-200-isvc-6941b-kube-rbac-proxy-sar-config\") pod \"0e9e117e-b849-44a2-b9c2-e2290e170fe1\" (UID: \"0e9e117e-b849-44a2-b9c2-e2290e170fe1\") "
Apr 24 22:06:03.914612 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:03.914416 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q7dq\" (UniqueName: \"kubernetes.io/projected/6719c2fc-f689-4854-9eb7-9dbba8fe4bfb-kube-api-access-6q7dq\") pod \"6719c2fc-f689-4854-9eb7-9dbba8fe4bfb\" (UID: \"6719c2fc-f689-4854-9eb7-9dbba8fe4bfb\") "
Apr 24 22:06:03.914612 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:03.914441 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88mdk\" (UniqueName: \"kubernetes.io/projected/0e9e117e-b849-44a2-b9c2-e2290e170fe1-kube-api-access-88mdk\") pod \"0e9e117e-b849-44a2-b9c2-e2290e170fe1\" (UID: \"0e9e117e-b849-44a2-b9c2-e2290e170fe1\") "
Apr 24 22:06:03.914709 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:03.914603 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6719c2fc-f689-4854-9eb7-9dbba8fe4bfb-error-404-isvc-6941b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-6941b-kube-rbac-proxy-sar-config") pod "6719c2fc-f689-4854-9eb7-9dbba8fe4bfb" (UID: "6719c2fc-f689-4854-9eb7-9dbba8fe4bfb"). InnerVolumeSpecName "error-404-isvc-6941b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:06:03.914761 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:03.914722 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-6941b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6719c2fc-f689-4854-9eb7-9dbba8fe4bfb-error-404-isvc-6941b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 22:06:03.914816 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:03.914776 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e9e117e-b849-44a2-b9c2-e2290e170fe1-success-200-isvc-6941b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-6941b-kube-rbac-proxy-sar-config") pod "0e9e117e-b849-44a2-b9c2-e2290e170fe1" (UID: "0e9e117e-b849-44a2-b9c2-e2290e170fe1"). InnerVolumeSpecName "success-200-isvc-6941b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:06:03.916524 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:03.916496 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6719c2fc-f689-4854-9eb7-9dbba8fe4bfb-kube-api-access-6q7dq" (OuterVolumeSpecName: "kube-api-access-6q7dq") pod "6719c2fc-f689-4854-9eb7-9dbba8fe4bfb" (UID: "6719c2fc-f689-4854-9eb7-9dbba8fe4bfb"). InnerVolumeSpecName "kube-api-access-6q7dq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:06:03.916653 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:03.916533 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e9e117e-b849-44a2-b9c2-e2290e170fe1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0e9e117e-b849-44a2-b9c2-e2290e170fe1" (UID: "0e9e117e-b849-44a2-b9c2-e2290e170fe1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:06:03.916653 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:03.916538 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6719c2fc-f689-4854-9eb7-9dbba8fe4bfb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6719c2fc-f689-4854-9eb7-9dbba8fe4bfb" (UID: "6719c2fc-f689-4854-9eb7-9dbba8fe4bfb"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:06:03.916653 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:03.916570 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e9e117e-b849-44a2-b9c2-e2290e170fe1-kube-api-access-88mdk" (OuterVolumeSpecName: "kube-api-access-88mdk") pod "0e9e117e-b849-44a2-b9c2-e2290e170fe1" (UID: "0e9e117e-b849-44a2-b9c2-e2290e170fe1"). InnerVolumeSpecName "kube-api-access-88mdk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:06:04.015799 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:04.015744 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0e9e117e-b849-44a2-b9c2-e2290e170fe1-proxy-tls\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 22:06:04.015799 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:04.015766 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6719c2fc-f689-4854-9eb7-9dbba8fe4bfb-proxy-tls\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 22:06:04.015799 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:04.015774 2578 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-6941b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0e9e117e-b849-44a2-b9c2-e2290e170fe1-success-200-isvc-6941b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 22:06:04.015799 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:04.015784 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6q7dq\" (UniqueName: \"kubernetes.io/projected/6719c2fc-f689-4854-9eb7-9dbba8fe4bfb-kube-api-access-6q7dq\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 22:06:04.015799 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:04.015796 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-88mdk\" (UniqueName: \"kubernetes.io/projected/0e9e117e-b849-44a2-b9c2-e2290e170fe1-kube-api-access-88mdk\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\""
Apr 24 22:06:04.585231 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:04.585198 2578 generic.go:358] "Generic (PLEG): container finished" podID="6719c2fc-f689-4854-9eb7-9dbba8fe4bfb" containerID="50c6e66b089388c00050961bccf7dc895e75b13e0a02f80e4c2e01d60517c8f5" exitCode=0
Apr 24 22:06:04.585629 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:04.585262 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" event={"ID":"6719c2fc-f689-4854-9eb7-9dbba8fe4bfb","Type":"ContainerDied","Data":"50c6e66b089388c00050961bccf7dc895e75b13e0a02f80e4c2e01d60517c8f5"}
Apr 24 22:06:04.585629 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:04.585269 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx"
Apr 24 22:06:04.585629 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:04.585292 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx" event={"ID":"6719c2fc-f689-4854-9eb7-9dbba8fe4bfb","Type":"ContainerDied","Data":"39a4a6ff3266b05be22c54c617c3d9ee6ea206fd0e20041837556664e0bd2353"}
Apr 24 22:06:04.585629 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:04.585307 2578 scope.go:117] "RemoveContainer" containerID="fab1c0623472f14b433ff90b13a34d3da14645c9f22c64f8ddf08d5da888ed87"
Apr 24 22:06:04.586722 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:04.586699 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" event={"ID":"0e9e117e-b849-44a2-b9c2-e2290e170fe1","Type":"ContainerDied","Data":"618c2d4781e81f337b672b479e7db1ba17da748098b300d738c9d0309d4ca6d8"}
Apr 24 22:06:04.586811 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:04.586756 2578 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf" Apr 24 22:06:04.595208 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:04.595190 2578 scope.go:117] "RemoveContainer" containerID="50c6e66b089388c00050961bccf7dc895e75b13e0a02f80e4c2e01d60517c8f5" Apr 24 22:06:04.602304 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:04.602281 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf"] Apr 24 22:06:04.602868 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:04.602854 2578 scope.go:117] "RemoveContainer" containerID="fab1c0623472f14b433ff90b13a34d3da14645c9f22c64f8ddf08d5da888ed87" Apr 24 22:06:04.603129 ip-10-0-132-124 kubenswrapper[2578]: E0424 22:06:04.603111 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fab1c0623472f14b433ff90b13a34d3da14645c9f22c64f8ddf08d5da888ed87\": container with ID starting with fab1c0623472f14b433ff90b13a34d3da14645c9f22c64f8ddf08d5da888ed87 not found: ID does not exist" containerID="fab1c0623472f14b433ff90b13a34d3da14645c9f22c64f8ddf08d5da888ed87" Apr 24 22:06:04.603173 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:04.603141 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fab1c0623472f14b433ff90b13a34d3da14645c9f22c64f8ddf08d5da888ed87"} err="failed to get container status \"fab1c0623472f14b433ff90b13a34d3da14645c9f22c64f8ddf08d5da888ed87\": rpc error: code = NotFound desc = could not find container \"fab1c0623472f14b433ff90b13a34d3da14645c9f22c64f8ddf08d5da888ed87\": container with ID starting with fab1c0623472f14b433ff90b13a34d3da14645c9f22c64f8ddf08d5da888ed87 not found: ID does not exist" Apr 24 22:06:04.603173 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:04.603167 2578 scope.go:117] "RemoveContainer" 
containerID="50c6e66b089388c00050961bccf7dc895e75b13e0a02f80e4c2e01d60517c8f5" Apr 24 22:06:04.603408 ip-10-0-132-124 kubenswrapper[2578]: E0424 22:06:04.603389 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50c6e66b089388c00050961bccf7dc895e75b13e0a02f80e4c2e01d60517c8f5\": container with ID starting with 50c6e66b089388c00050961bccf7dc895e75b13e0a02f80e4c2e01d60517c8f5 not found: ID does not exist" containerID="50c6e66b089388c00050961bccf7dc895e75b13e0a02f80e4c2e01d60517c8f5" Apr 24 22:06:04.603459 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:04.603415 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50c6e66b089388c00050961bccf7dc895e75b13e0a02f80e4c2e01d60517c8f5"} err="failed to get container status \"50c6e66b089388c00050961bccf7dc895e75b13e0a02f80e4c2e01d60517c8f5\": rpc error: code = NotFound desc = could not find container \"50c6e66b089388c00050961bccf7dc895e75b13e0a02f80e4c2e01d60517c8f5\": container with ID starting with 50c6e66b089388c00050961bccf7dc895e75b13e0a02f80e4c2e01d60517c8f5 not found: ID does not exist" Apr 24 22:06:04.603459 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:04.603434 2578 scope.go:117] "RemoveContainer" containerID="27cad8b62e3626d9a94ee0b15467851c4afd87dd5f8b7e885f9b6e6067e1483f" Apr 24 22:06:04.606078 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:04.606057 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6941b-predictor-58bfbfdd8f-nrzjf"] Apr 24 22:06:04.610430 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:04.610409 2578 scope.go:117] "RemoveContainer" containerID="8515fc0affc42a483da93846dfeee417534359fa0d6f93f01901ef80bc7e0502" Apr 24 22:06:04.615375 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:04.615338 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx"] Apr 
24 22:06:04.617283 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:04.617260 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6941b-predictor-db66cc574-zc9wx"] Apr 24 22:06:06.421155 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:06.421122 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e9e117e-b849-44a2-b9c2-e2290e170fe1" path="/var/lib/kubelet/pods/0e9e117e-b849-44a2-b9c2-e2290e170fe1/volumes" Apr 24 22:06:06.421605 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:06:06.421507 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6719c2fc-f689-4854-9eb7-9dbba8fe4bfb" path="/var/lib/kubelet/pods/6719c2fc-f689-4854-9eb7-9dbba8fe4bfb/volumes" Apr 24 22:13:34.805395 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:34.805361 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj"] Apr 24 22:13:34.807788 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:34.805642 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj" podUID="42e97302-b125-4622-918c-666a6a28d9c9" containerName="kserve-container" containerID="cri-o://58340add28dfe75c3c7875035a32d8363435ad22ecb235b29803fce0f708b2ba" gracePeriod=30 Apr 24 22:13:34.807788 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:34.805683 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj" podUID="42e97302-b125-4622-918c-666a6a28d9c9" containerName="kube-rbac-proxy" containerID="cri-o://19207ed01113a2372277da2dbf34164fa2771fffbe5d097a1a047b005978f239" gracePeriod=30 Apr 24 22:13:34.855137 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:34.855110 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk"] Apr 24 
22:13:34.855377 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:34.855356 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk" podUID="f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d" containerName="kserve-container" containerID="cri-o://0bcfa3f4fe95dac8bb726e47906b151e6a96fd2d6495c0cb1a2a18ab61a96bb3" gracePeriod=30 Apr 24 22:13:34.855433 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:34.855407 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk" podUID="f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d" containerName="kube-rbac-proxy" containerID="cri-o://60dcc603dcadf01183a278279395cc124482beb1437a1b8d4df8e3d7ce2df221" gracePeriod=30 Apr 24 22:13:35.874406 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:35.874369 2578 generic.go:358] "Generic (PLEG): container finished" podID="42e97302-b125-4622-918c-666a6a28d9c9" containerID="19207ed01113a2372277da2dbf34164fa2771fffbe5d097a1a047b005978f239" exitCode=2 Apr 24 22:13:35.874828 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:35.874441 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj" event={"ID":"42e97302-b125-4622-918c-666a6a28d9c9","Type":"ContainerDied","Data":"19207ed01113a2372277da2dbf34164fa2771fffbe5d097a1a047b005978f239"} Apr 24 22:13:35.875877 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:35.875855 2578 generic.go:358] "Generic (PLEG): container finished" podID="f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d" containerID="60dcc603dcadf01183a278279395cc124482beb1437a1b8d4df8e3d7ce2df221" exitCode=2 Apr 24 22:13:35.875991 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:35.875928 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk" 
event={"ID":"f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d","Type":"ContainerDied","Data":"60dcc603dcadf01183a278279395cc124482beb1437a1b8d4df8e3d7ce2df221"} Apr 24 22:13:36.196372 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.196298 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gjxgj/must-gather-cm2l7"] Apr 24 22:13:36.196649 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.196638 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9302faf7-7f03-465f-8e33-8ea9cdf22fdd" containerName="kserve-container" Apr 24 22:13:36.196700 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.196651 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9302faf7-7f03-465f-8e33-8ea9cdf22fdd" containerName="kserve-container" Apr 24 22:13:36.196700 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.196664 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6719c2fc-f689-4854-9eb7-9dbba8fe4bfb" containerName="kube-rbac-proxy" Apr 24 22:13:36.196700 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.196670 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6719c2fc-f689-4854-9eb7-9dbba8fe4bfb" containerName="kube-rbac-proxy" Apr 24 22:13:36.196700 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.196684 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6719c2fc-f689-4854-9eb7-9dbba8fe4bfb" containerName="kserve-container" Apr 24 22:13:36.196700 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.196691 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6719c2fc-f689-4854-9eb7-9dbba8fe4bfb" containerName="kserve-container" Apr 24 22:13:36.196700 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.196698 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9302faf7-7f03-465f-8e33-8ea9cdf22fdd" containerName="kube-rbac-proxy" Apr 24 22:13:36.196887 ip-10-0-132-124 kubenswrapper[2578]: I0424 
22:13:36.196703 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9302faf7-7f03-465f-8e33-8ea9cdf22fdd" containerName="kube-rbac-proxy" Apr 24 22:13:36.196887 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.196710 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e9e117e-b849-44a2-b9c2-e2290e170fe1" containerName="kserve-container" Apr 24 22:13:36.196887 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.196714 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9e117e-b849-44a2-b9c2-e2290e170fe1" containerName="kserve-container" Apr 24 22:13:36.196887 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.196722 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="61ed92de-cc0f-4baf-9b8d-8bb6e4019d55" containerName="kserve-container" Apr 24 22:13:36.196887 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.196727 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="61ed92de-cc0f-4baf-9b8d-8bb6e4019d55" containerName="kserve-container" Apr 24 22:13:36.196887 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.196733 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e9e117e-b849-44a2-b9c2-e2290e170fe1" containerName="kube-rbac-proxy" Apr 24 22:13:36.196887 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.196738 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9e117e-b849-44a2-b9c2-e2290e170fe1" containerName="kube-rbac-proxy" Apr 24 22:13:36.196887 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.196745 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="61ed92de-cc0f-4baf-9b8d-8bb6e4019d55" containerName="kube-rbac-proxy" Apr 24 22:13:36.196887 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.196750 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="61ed92de-cc0f-4baf-9b8d-8bb6e4019d55" containerName="kube-rbac-proxy" Apr 24 22:13:36.196887 ip-10-0-132-124 
kubenswrapper[2578]: I0424 22:13:36.196802 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="61ed92de-cc0f-4baf-9b8d-8bb6e4019d55" containerName="kube-rbac-proxy" Apr 24 22:13:36.196887 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.196817 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e9e117e-b849-44a2-b9c2-e2290e170fe1" containerName="kube-rbac-proxy" Apr 24 22:13:36.196887 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.196825 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6719c2fc-f689-4854-9eb7-9dbba8fe4bfb" containerName="kserve-container" Apr 24 22:13:36.196887 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.196834 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="9302faf7-7f03-465f-8e33-8ea9cdf22fdd" containerName="kserve-container" Apr 24 22:13:36.196887 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.196839 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="61ed92de-cc0f-4baf-9b8d-8bb6e4019d55" containerName="kserve-container" Apr 24 22:13:36.196887 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.196846 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e9e117e-b849-44a2-b9c2-e2290e170fe1" containerName="kserve-container" Apr 24 22:13:36.196887 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.196851 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6719c2fc-f689-4854-9eb7-9dbba8fe4bfb" containerName="kube-rbac-proxy" Apr 24 22:13:36.196887 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.196859 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="9302faf7-7f03-465f-8e33-8ea9cdf22fdd" containerName="kube-rbac-proxy" Apr 24 22:13:36.199750 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.199735 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gjxgj/must-gather-cm2l7" Apr 24 22:13:36.201844 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.201824 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gjxgj\"/\"openshift-service-ca.crt\"" Apr 24 22:13:36.201956 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.201824 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gjxgj\"/\"default-dockercfg-jhlvb\"" Apr 24 22:13:36.202731 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.202713 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gjxgj\"/\"kube-root-ca.crt\"" Apr 24 22:13:36.216151 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.216126 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gjxgj/must-gather-cm2l7"] Apr 24 22:13:36.335258 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.335220 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f2f6e05b-f6b9-43be-b038-8bf4416d2140-must-gather-output\") pod \"must-gather-cm2l7\" (UID: \"f2f6e05b-f6b9-43be-b038-8bf4416d2140\") " pod="openshift-must-gather-gjxgj/must-gather-cm2l7" Apr 24 22:13:36.335258 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.335253 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6xn2\" (UniqueName: \"kubernetes.io/projected/f2f6e05b-f6b9-43be-b038-8bf4416d2140-kube-api-access-v6xn2\") pod \"must-gather-cm2l7\" (UID: \"f2f6e05b-f6b9-43be-b038-8bf4416d2140\") " pod="openshift-must-gather-gjxgj/must-gather-cm2l7" Apr 24 22:13:36.436355 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.436327 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/f2f6e05b-f6b9-43be-b038-8bf4416d2140-must-gather-output\") pod \"must-gather-cm2l7\" (UID: \"f2f6e05b-f6b9-43be-b038-8bf4416d2140\") " pod="openshift-must-gather-gjxgj/must-gather-cm2l7" Apr 24 22:13:36.436355 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.436353 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v6xn2\" (UniqueName: \"kubernetes.io/projected/f2f6e05b-f6b9-43be-b038-8bf4416d2140-kube-api-access-v6xn2\") pod \"must-gather-cm2l7\" (UID: \"f2f6e05b-f6b9-43be-b038-8bf4416d2140\") " pod="openshift-must-gather-gjxgj/must-gather-cm2l7" Apr 24 22:13:36.436701 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.436682 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f2f6e05b-f6b9-43be-b038-8bf4416d2140-must-gather-output\") pod \"must-gather-cm2l7\" (UID: \"f2f6e05b-f6b9-43be-b038-8bf4416d2140\") " pod="openshift-must-gather-gjxgj/must-gather-cm2l7" Apr 24 22:13:36.445213 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.445191 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6xn2\" (UniqueName: \"kubernetes.io/projected/f2f6e05b-f6b9-43be-b038-8bf4416d2140-kube-api-access-v6xn2\") pod \"must-gather-cm2l7\" (UID: \"f2f6e05b-f6b9-43be-b038-8bf4416d2140\") " pod="openshift-must-gather-gjxgj/must-gather-cm2l7" Apr 24 22:13:36.522289 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.522267 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gjxgj/must-gather-cm2l7" Apr 24 22:13:36.638710 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.638675 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gjxgj/must-gather-cm2l7"] Apr 24 22:13:36.644506 ip-10-0-132-124 kubenswrapper[2578]: W0424 22:13:36.644479 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2f6e05b_f6b9_43be_b038_8bf4416d2140.slice/crio-8e50e34fbb24a354e5b7eed530a57af994bf02c993062364902a86d1e004dcae WatchSource:0}: Error finding container 8e50e34fbb24a354e5b7eed530a57af994bf02c993062364902a86d1e004dcae: Status 404 returned error can't find the container with id 8e50e34fbb24a354e5b7eed530a57af994bf02c993062364902a86d1e004dcae Apr 24 22:13:36.645992 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.645976 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:13:36.879847 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:36.879755 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gjxgj/must-gather-cm2l7" event={"ID":"f2f6e05b-f6b9-43be-b038-8bf4416d2140","Type":"ContainerStarted","Data":"8e50e34fbb24a354e5b7eed530a57af994bf02c993062364902a86d1e004dcae"} Apr 24 22:13:37.825993 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:37.825962 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj" podUID="42e97302-b125-4622-918c-666a6a28d9c9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.40:8643/healthz\": dial tcp 10.134.0.40:8643: connect: connection refused" Apr 24 22:13:37.886217 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:37.886186 2578 generic.go:358] "Generic (PLEG): container finished" podID="f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d" 
containerID="0bcfa3f4fe95dac8bb726e47906b151e6a96fd2d6495c0cb1a2a18ab61a96bb3" exitCode=0 Apr 24 22:13:37.886626 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:37.886278 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk" event={"ID":"f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d","Type":"ContainerDied","Data":"0bcfa3f4fe95dac8bb726e47906b151e6a96fd2d6495c0cb1a2a18ab61a96bb3"} Apr 24 22:13:38.066210 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.066152 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk" Apr 24 22:13:38.070150 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.070133 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj" Apr 24 22:13:38.152649 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.152611 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn87r\" (UniqueName: \"kubernetes.io/projected/42e97302-b125-4622-918c-666a6a28d9c9-kube-api-access-pn87r\") pod \"42e97302-b125-4622-918c-666a6a28d9c9\" (UID: \"42e97302-b125-4622-918c-666a6a28d9c9\") " Apr 24 22:13:38.152799 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.152663 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-fc125-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d-error-404-isvc-fc125-kube-rbac-proxy-sar-config\") pod \"f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d\" (UID: \"f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d\") " Apr 24 22:13:38.152799 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.152781 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/42e97302-b125-4622-918c-666a6a28d9c9-proxy-tls\") pod \"42e97302-b125-4622-918c-666a6a28d9c9\" (UID: \"42e97302-b125-4622-918c-666a6a28d9c9\") " Apr 24 22:13:38.152940 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.152817 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7szk\" (UniqueName: \"kubernetes.io/projected/f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d-kube-api-access-b7szk\") pod \"f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d\" (UID: \"f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d\") " Apr 24 22:13:38.152940 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.152852 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d-proxy-tls\") pod \"f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d\" (UID: \"f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d\") " Apr 24 22:13:38.152940 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.152917 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-fc125-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42e97302-b125-4622-918c-666a6a28d9c9-success-200-isvc-fc125-kube-rbac-proxy-sar-config\") pod \"42e97302-b125-4622-918c-666a6a28d9c9\" (UID: \"42e97302-b125-4622-918c-666a6a28d9c9\") " Apr 24 22:13:38.153161 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.153138 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d-error-404-isvc-fc125-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-fc125-kube-rbac-proxy-sar-config") pod "f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d" (UID: "f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d"). InnerVolumeSpecName "error-404-isvc-fc125-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:13:38.153434 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.153404 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42e97302-b125-4622-918c-666a6a28d9c9-success-200-isvc-fc125-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-fc125-kube-rbac-proxy-sar-config") pod "42e97302-b125-4622-918c-666a6a28d9c9" (UID: "42e97302-b125-4622-918c-666a6a28d9c9"). InnerVolumeSpecName "success-200-isvc-fc125-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:13:38.155434 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.155400 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d" (UID: "f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:13:38.155582 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.155528 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d-kube-api-access-b7szk" (OuterVolumeSpecName: "kube-api-access-b7szk") pod "f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d" (UID: "f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d"). InnerVolumeSpecName "kube-api-access-b7szk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:13:38.155714 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.155681 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42e97302-b125-4622-918c-666a6a28d9c9-kube-api-access-pn87r" (OuterVolumeSpecName: "kube-api-access-pn87r") pod "42e97302-b125-4622-918c-666a6a28d9c9" (UID: "42e97302-b125-4622-918c-666a6a28d9c9"). InnerVolumeSpecName "kube-api-access-pn87r". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:13:38.157575 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.157513 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42e97302-b125-4622-918c-666a6a28d9c9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "42e97302-b125-4622-918c-666a6a28d9c9" (UID: "42e97302-b125-4622-918c-666a6a28d9c9"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:13:38.254063 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.254028 2578 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-fc125-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42e97302-b125-4622-918c-666a6a28d9c9-success-200-isvc-fc125-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 22:13:38.254214 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.254067 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pn87r\" (UniqueName: \"kubernetes.io/projected/42e97302-b125-4622-918c-666a6a28d9c9-kube-api-access-pn87r\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 22:13:38.254214 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.254089 2578 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-fc125-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d-error-404-isvc-fc125-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 22:13:38.254214 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.254107 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42e97302-b125-4622-918c-666a6a28d9c9-proxy-tls\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 22:13:38.254214 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.254122 2578 reconciler_common.go:299] 
"Volume detached for volume \"kube-api-access-b7szk\" (UniqueName: \"kubernetes.io/projected/f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d-kube-api-access-b7szk\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 22:13:38.254214 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.254137 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d-proxy-tls\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 22:13:38.892897 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.892859 2578 generic.go:358] "Generic (PLEG): container finished" podID="42e97302-b125-4622-918c-666a6a28d9c9" containerID="58340add28dfe75c3c7875035a32d8363435ad22ecb235b29803fce0f708b2ba" exitCode=0 Apr 24 22:13:38.893341 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.892939 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj" event={"ID":"42e97302-b125-4622-918c-666a6a28d9c9","Type":"ContainerDied","Data":"58340add28dfe75c3c7875035a32d8363435ad22ecb235b29803fce0f708b2ba"} Apr 24 22:13:38.893341 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.892948 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj" Apr 24 22:13:38.893341 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.892970 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj" event={"ID":"42e97302-b125-4622-918c-666a6a28d9c9","Type":"ContainerDied","Data":"86b69108c80c140101c332af0abe45ea6b7d87c592a8714d112539088c80e1e4"} Apr 24 22:13:38.893341 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.892988 2578 scope.go:117] "RemoveContainer" containerID="19207ed01113a2372277da2dbf34164fa2771fffbe5d097a1a047b005978f239" Apr 24 22:13:38.894768 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.894744 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk" event={"ID":"f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d","Type":"ContainerDied","Data":"5d1cd1e298a2d52b2b0268baa9ce70cbcb3b1f607c9d2aa9a2b9267bf3c7c055"} Apr 24 22:13:38.894768 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.894765 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk" Apr 24 22:13:38.901241 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.901227 2578 scope.go:117] "RemoveContainer" containerID="58340add28dfe75c3c7875035a32d8363435ad22ecb235b29803fce0f708b2ba" Apr 24 22:13:38.907721 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.907706 2578 scope.go:117] "RemoveContainer" containerID="19207ed01113a2372277da2dbf34164fa2771fffbe5d097a1a047b005978f239" Apr 24 22:13:38.907950 ip-10-0-132-124 kubenswrapper[2578]: E0424 22:13:38.907932 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19207ed01113a2372277da2dbf34164fa2771fffbe5d097a1a047b005978f239\": container with ID starting with 19207ed01113a2372277da2dbf34164fa2771fffbe5d097a1a047b005978f239 not found: ID does not exist" containerID="19207ed01113a2372277da2dbf34164fa2771fffbe5d097a1a047b005978f239" Apr 24 22:13:38.908004 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.907977 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19207ed01113a2372277da2dbf34164fa2771fffbe5d097a1a047b005978f239"} err="failed to get container status \"19207ed01113a2372277da2dbf34164fa2771fffbe5d097a1a047b005978f239\": rpc error: code = NotFound desc = could not find container \"19207ed01113a2372277da2dbf34164fa2771fffbe5d097a1a047b005978f239\": container with ID starting with 19207ed01113a2372277da2dbf34164fa2771fffbe5d097a1a047b005978f239 not found: ID does not exist" Apr 24 22:13:38.908004 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.907996 2578 scope.go:117] "RemoveContainer" containerID="58340add28dfe75c3c7875035a32d8363435ad22ecb235b29803fce0f708b2ba" Apr 24 22:13:38.908196 ip-10-0-132-124 kubenswrapper[2578]: E0424 22:13:38.908179 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"58340add28dfe75c3c7875035a32d8363435ad22ecb235b29803fce0f708b2ba\": container with ID starting with 58340add28dfe75c3c7875035a32d8363435ad22ecb235b29803fce0f708b2ba not found: ID does not exist" containerID="58340add28dfe75c3c7875035a32d8363435ad22ecb235b29803fce0f708b2ba" Apr 24 22:13:38.908240 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.908202 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58340add28dfe75c3c7875035a32d8363435ad22ecb235b29803fce0f708b2ba"} err="failed to get container status \"58340add28dfe75c3c7875035a32d8363435ad22ecb235b29803fce0f708b2ba\": rpc error: code = NotFound desc = could not find container \"58340add28dfe75c3c7875035a32d8363435ad22ecb235b29803fce0f708b2ba\": container with ID starting with 58340add28dfe75c3c7875035a32d8363435ad22ecb235b29803fce0f708b2ba not found: ID does not exist" Apr 24 22:13:38.908240 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.908217 2578 scope.go:117] "RemoveContainer" containerID="60dcc603dcadf01183a278279395cc124482beb1437a1b8d4df8e3d7ce2df221" Apr 24 22:13:38.911039 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.911019 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj"] Apr 24 22:13:38.913303 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.913284 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-fc125-predictor-67497cd975-mrbkj"] Apr 24 22:13:38.916172 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.916030 2578 scope.go:117] "RemoveContainer" containerID="0bcfa3f4fe95dac8bb726e47906b151e6a96fd2d6495c0cb1a2a18ab61a96bb3" Apr 24 22:13:38.926912 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:38.926896 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk"] Apr 24 22:13:38.929309 ip-10-0-132-124 kubenswrapper[2578]: I0424 
22:13:38.929292 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-fc125-predictor-7967db9f76-wrnsk"] Apr 24 22:13:40.423069 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:40.423034 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42e97302-b125-4622-918c-666a6a28d9c9" path="/var/lib/kubelet/pods/42e97302-b125-4622-918c-666a6a28d9c9/volumes" Apr 24 22:13:40.423604 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:40.423580 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d" path="/var/lib/kubelet/pods/f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d/volumes" Apr 24 22:13:41.909018 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:41.908934 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gjxgj/must-gather-cm2l7" event={"ID":"f2f6e05b-f6b9-43be-b038-8bf4416d2140","Type":"ContainerStarted","Data":"84ad019065a3afd18ae95b40907018121b50664d5ca422c51360d31027056208"} Apr 24 22:13:41.909018 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:41.908972 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gjxgj/must-gather-cm2l7" event={"ID":"f2f6e05b-f6b9-43be-b038-8bf4416d2140","Type":"ContainerStarted","Data":"97362ea2b1f6220ba80bccaa21be847a211323bf965cfbd152711b774e5345b7"} Apr 24 22:13:41.925948 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:13:41.925904 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gjxgj/must-gather-cm2l7" podStartSLOduration=1.074862418 podStartE2EDuration="5.92588497s" podCreationTimestamp="2026-04-24 22:13:36 +0000 UTC" firstStartedPulling="2026-04-24 22:13:36.646118682 +0000 UTC m=+2762.821853334" lastFinishedPulling="2026-04-24 22:13:41.497141231 +0000 UTC m=+2767.672875886" observedRunningTime="2026-04-24 22:13:41.925017661 +0000 UTC m=+2768.100752334" watchObservedRunningTime="2026-04-24 22:13:41.92588497 +0000 UTC 
m=+2768.101619645" Apr 24 22:14:01.977352 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:01.977320 2578 generic.go:358] "Generic (PLEG): container finished" podID="f2f6e05b-f6b9-43be-b038-8bf4416d2140" containerID="97362ea2b1f6220ba80bccaa21be847a211323bf965cfbd152711b774e5345b7" exitCode=0 Apr 24 22:14:01.977749 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:01.977370 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gjxgj/must-gather-cm2l7" event={"ID":"f2f6e05b-f6b9-43be-b038-8bf4416d2140","Type":"ContainerDied","Data":"97362ea2b1f6220ba80bccaa21be847a211323bf965cfbd152711b774e5345b7"} Apr 24 22:14:01.977749 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:01.977687 2578 scope.go:117] "RemoveContainer" containerID="97362ea2b1f6220ba80bccaa21be847a211323bf965cfbd152711b774e5345b7" Apr 24 22:14:02.129189 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.129161 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gjxgj_must-gather-cm2l7_f2f6e05b-f6b9-43be-b038-8bf4416d2140/gather/0.log" Apr 24 22:14:02.680921 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.680845 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ltmfk/must-gather-xdbk5"] Apr 24 22:14:02.681160 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.681149 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d" containerName="kube-rbac-proxy" Apr 24 22:14:02.681201 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.681162 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d" containerName="kube-rbac-proxy" Apr 24 22:14:02.681201 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.681176 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42e97302-b125-4622-918c-666a6a28d9c9" containerName="kube-rbac-proxy" Apr 24 22:14:02.681201 ip-10-0-132-124 
kubenswrapper[2578]: I0424 22:14:02.681181 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e97302-b125-4622-918c-666a6a28d9c9" containerName="kube-rbac-proxy" Apr 24 22:14:02.681201 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.681187 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d" containerName="kserve-container" Apr 24 22:14:02.681201 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.681192 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d" containerName="kserve-container" Apr 24 22:14:02.681350 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.681206 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42e97302-b125-4622-918c-666a6a28d9c9" containerName="kserve-container" Apr 24 22:14:02.681350 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.681211 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e97302-b125-4622-918c-666a6a28d9c9" containerName="kserve-container" Apr 24 22:14:02.681350 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.681258 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="42e97302-b125-4622-918c-666a6a28d9c9" containerName="kube-rbac-proxy" Apr 24 22:14:02.681350 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.681269 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="42e97302-b125-4622-918c-666a6a28d9c9" containerName="kserve-container" Apr 24 22:14:02.681350 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.681275 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d" containerName="kube-rbac-proxy" Apr 24 22:14:02.681350 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.681282 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f13ffeb5-f1e9-487a-946d-0c5a5d04cb3d" containerName="kserve-container" Apr 24 22:14:02.683374 
ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.683361 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ltmfk/must-gather-xdbk5" Apr 24 22:14:02.685679 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.685653 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-ltmfk\"/\"default-dockercfg-9d9fp\"" Apr 24 22:14:02.685813 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.685687 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-ltmfk\"/\"openshift-service-ca.crt\"" Apr 24 22:14:02.685813 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.685660 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-ltmfk\"/\"kube-root-ca.crt\"" Apr 24 22:14:02.693358 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.693336 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ltmfk/must-gather-xdbk5"] Apr 24 22:14:02.781782 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.781756 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8a8e468e-12a1-4396-8bad-59663e81e438-must-gather-output\") pod \"must-gather-xdbk5\" (UID: \"8a8e468e-12a1-4396-8bad-59663e81e438\") " pod="openshift-must-gather-ltmfk/must-gather-xdbk5" Apr 24 22:14:02.781903 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.781788 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkvg5\" (UniqueName: \"kubernetes.io/projected/8a8e468e-12a1-4396-8bad-59663e81e438-kube-api-access-bkvg5\") pod \"must-gather-xdbk5\" (UID: \"8a8e468e-12a1-4396-8bad-59663e81e438\") " pod="openshift-must-gather-ltmfk/must-gather-xdbk5" Apr 24 22:14:02.882608 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.882585 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8a8e468e-12a1-4396-8bad-59663e81e438-must-gather-output\") pod \"must-gather-xdbk5\" (UID: \"8a8e468e-12a1-4396-8bad-59663e81e438\") " pod="openshift-must-gather-ltmfk/must-gather-xdbk5" Apr 24 22:14:02.882708 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.882616 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkvg5\" (UniqueName: \"kubernetes.io/projected/8a8e468e-12a1-4396-8bad-59663e81e438-kube-api-access-bkvg5\") pod \"must-gather-xdbk5\" (UID: \"8a8e468e-12a1-4396-8bad-59663e81e438\") " pod="openshift-must-gather-ltmfk/must-gather-xdbk5" Apr 24 22:14:02.882895 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.882879 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8a8e468e-12a1-4396-8bad-59663e81e438-must-gather-output\") pod \"must-gather-xdbk5\" (UID: \"8a8e468e-12a1-4396-8bad-59663e81e438\") " pod="openshift-must-gather-ltmfk/must-gather-xdbk5" Apr 24 22:14:02.891812 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.891786 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkvg5\" (UniqueName: \"kubernetes.io/projected/8a8e468e-12a1-4396-8bad-59663e81e438-kube-api-access-bkvg5\") pod \"must-gather-xdbk5\" (UID: \"8a8e468e-12a1-4396-8bad-59663e81e438\") " pod="openshift-must-gather-ltmfk/must-gather-xdbk5" Apr 24 22:14:02.992623 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:02.992603 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ltmfk/must-gather-xdbk5" Apr 24 22:14:03.106495 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:03.106471 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ltmfk/must-gather-xdbk5"] Apr 24 22:14:03.108936 ip-10-0-132-124 kubenswrapper[2578]: W0424 22:14:03.108903 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a8e468e_12a1_4396_8bad_59663e81e438.slice/crio-9a98d14bd308b94b6d6cb6802210ec7343b1e746e336ba46f4ef75a7d4b3aa48 WatchSource:0}: Error finding container 9a98d14bd308b94b6d6cb6802210ec7343b1e746e336ba46f4ef75a7d4b3aa48: Status 404 returned error can't find the container with id 9a98d14bd308b94b6d6cb6802210ec7343b1e746e336ba46f4ef75a7d4b3aa48 Apr 24 22:14:03.990410 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:03.990361 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltmfk/must-gather-xdbk5" event={"ID":"8a8e468e-12a1-4396-8bad-59663e81e438","Type":"ContainerStarted","Data":"9a98d14bd308b94b6d6cb6802210ec7343b1e746e336ba46f4ef75a7d4b3aa48"} Apr 24 22:14:04.995990 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:04.995952 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltmfk/must-gather-xdbk5" event={"ID":"8a8e468e-12a1-4396-8bad-59663e81e438","Type":"ContainerStarted","Data":"b7c0f6b3673a8347ab97239696e39bd18fea90cb406089d0b39f643601764da6"} Apr 24 22:14:04.996410 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:04.995997 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltmfk/must-gather-xdbk5" event={"ID":"8a8e468e-12a1-4396-8bad-59663e81e438","Type":"ContainerStarted","Data":"16c54cb70a5c3c2b8c7d17cfb795c8d60ff31e565625ebca378f9536410c4177"} Apr 24 22:14:05.014129 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:05.014066 2578 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-ltmfk/must-gather-xdbk5" podStartSLOduration=2.194303745 podStartE2EDuration="3.014049862s" podCreationTimestamp="2026-04-24 22:14:02 +0000 UTC" firstStartedPulling="2026-04-24 22:14:03.110588138 +0000 UTC m=+2789.286322789" lastFinishedPulling="2026-04-24 22:14:03.930334251 +0000 UTC m=+2790.106068906" observedRunningTime="2026-04-24 22:14:05.01246533 +0000 UTC m=+2791.188200003" watchObservedRunningTime="2026-04-24 22:14:05.014049862 +0000 UTC m=+2791.189784534" Apr 24 22:14:05.400118 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:05.400046 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-trqmn_01bf52ee-b1fb-4321-b6af-07d7d9f23bf8/global-pull-secret-syncer/0.log" Apr 24 22:14:05.476659 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:05.476623 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-csj8d_83486ef0-fe96-4f97-a0e5-bec233422715/konnectivity-agent/0.log" Apr 24 22:14:05.579338 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:05.579309 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-124.ec2.internal_455f5e57cbc63caaa90072de2a2bd596/haproxy/0.log" Apr 24 22:14:07.534835 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:07.534795 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gjxgj/must-gather-cm2l7"] Apr 24 22:14:07.535408 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:07.535148 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-gjxgj/must-gather-cm2l7" podUID="f2f6e05b-f6b9-43be-b038-8bf4416d2140" containerName="copy" containerID="cri-o://84ad019065a3afd18ae95b40907018121b50664d5ca422c51360d31027056208" gracePeriod=2 Apr 24 22:14:07.537433 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:07.537405 2578 status_manager.go:895] "Failed to get status for pod" 
podUID="f2f6e05b-f6b9-43be-b038-8bf4416d2140" pod="openshift-must-gather-gjxgj/must-gather-cm2l7" err="pods \"must-gather-cm2l7\" is forbidden: User \"system:node:ip-10-0-132-124.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-gjxgj\": no relationship found between node 'ip-10-0-132-124.ec2.internal' and this object" Apr 24 22:14:07.538103 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:07.538077 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gjxgj/must-gather-cm2l7"] Apr 24 22:14:07.884153 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:07.884070 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gjxgj_must-gather-cm2l7_f2f6e05b-f6b9-43be-b038-8bf4416d2140/copy/0.log" Apr 24 22:14:07.884968 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:07.884704 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gjxgj/must-gather-cm2l7" Apr 24 22:14:07.887004 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:07.886965 2578 status_manager.go:895] "Failed to get status for pod" podUID="f2f6e05b-f6b9-43be-b038-8bf4416d2140" pod="openshift-must-gather-gjxgj/must-gather-cm2l7" err="pods \"must-gather-cm2l7\" is forbidden: User \"system:node:ip-10-0-132-124.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-gjxgj\": no relationship found between node 'ip-10-0-132-124.ec2.internal' and this object" Apr 24 22:14:07.934102 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:07.932685 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f2f6e05b-f6b9-43be-b038-8bf4416d2140-must-gather-output\") pod \"f2f6e05b-f6b9-43be-b038-8bf4416d2140\" (UID: \"f2f6e05b-f6b9-43be-b038-8bf4416d2140\") " Apr 24 22:14:07.934334 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:07.934047 
2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2f6e05b-f6b9-43be-b038-8bf4416d2140-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f2f6e05b-f6b9-43be-b038-8bf4416d2140" (UID: "f2f6e05b-f6b9-43be-b038-8bf4416d2140"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:14:07.934569 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:07.934525 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6xn2\" (UniqueName: \"kubernetes.io/projected/f2f6e05b-f6b9-43be-b038-8bf4416d2140-kube-api-access-v6xn2\") pod \"f2f6e05b-f6b9-43be-b038-8bf4416d2140\" (UID: \"f2f6e05b-f6b9-43be-b038-8bf4416d2140\") " Apr 24 22:14:07.936536 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:07.936505 2578 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f2f6e05b-f6b9-43be-b038-8bf4416d2140-must-gather-output\") on node \"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 22:14:07.945112 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:07.945070 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2f6e05b-f6b9-43be-b038-8bf4416d2140-kube-api-access-v6xn2" (OuterVolumeSpecName: "kube-api-access-v6xn2") pod "f2f6e05b-f6b9-43be-b038-8bf4416d2140" (UID: "f2f6e05b-f6b9-43be-b038-8bf4416d2140"). InnerVolumeSpecName "kube-api-access-v6xn2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:14:08.014772 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:08.014686 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gjxgj_must-gather-cm2l7_f2f6e05b-f6b9-43be-b038-8bf4416d2140/copy/0.log" Apr 24 22:14:08.018571 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:08.015281 2578 generic.go:358] "Generic (PLEG): container finished" podID="f2f6e05b-f6b9-43be-b038-8bf4416d2140" containerID="84ad019065a3afd18ae95b40907018121b50664d5ca422c51360d31027056208" exitCode=143 Apr 24 22:14:08.018571 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:08.015363 2578 scope.go:117] "RemoveContainer" containerID="84ad019065a3afd18ae95b40907018121b50664d5ca422c51360d31027056208" Apr 24 22:14:08.018571 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:08.015498 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gjxgj/must-gather-cm2l7" Apr 24 22:14:08.022672 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:08.022641 2578 status_manager.go:895] "Failed to get status for pod" podUID="f2f6e05b-f6b9-43be-b038-8bf4416d2140" pod="openshift-must-gather-gjxgj/must-gather-cm2l7" err="pods \"must-gather-cm2l7\" is forbidden: User \"system:node:ip-10-0-132-124.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-gjxgj\": no relationship found between node 'ip-10-0-132-124.ec2.internal' and this object" Apr 24 22:14:08.034060 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:08.034040 2578 scope.go:117] "RemoveContainer" containerID="97362ea2b1f6220ba80bccaa21be847a211323bf965cfbd152711b774e5345b7" Apr 24 22:14:08.037415 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:08.037393 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v6xn2\" (UniqueName: \"kubernetes.io/projected/f2f6e05b-f6b9-43be-b038-8bf4416d2140-kube-api-access-v6xn2\") on node 
\"ip-10-0-132-124.ec2.internal\" DevicePath \"\"" Apr 24 22:14:08.044106 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:08.043730 2578 status_manager.go:895] "Failed to get status for pod" podUID="f2f6e05b-f6b9-43be-b038-8bf4416d2140" pod="openshift-must-gather-gjxgj/must-gather-cm2l7" err="pods \"must-gather-cm2l7\" is forbidden: User \"system:node:ip-10-0-132-124.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-gjxgj\": no relationship found between node 'ip-10-0-132-124.ec2.internal' and this object" Apr 24 22:14:08.056516 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:08.056475 2578 scope.go:117] "RemoveContainer" containerID="84ad019065a3afd18ae95b40907018121b50664d5ca422c51360d31027056208" Apr 24 22:14:08.057292 ip-10-0-132-124 kubenswrapper[2578]: E0424 22:14:08.056910 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84ad019065a3afd18ae95b40907018121b50664d5ca422c51360d31027056208\": container with ID starting with 84ad019065a3afd18ae95b40907018121b50664d5ca422c51360d31027056208 not found: ID does not exist" containerID="84ad019065a3afd18ae95b40907018121b50664d5ca422c51360d31027056208" Apr 24 22:14:08.057292 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:08.056949 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84ad019065a3afd18ae95b40907018121b50664d5ca422c51360d31027056208"} err="failed to get container status \"84ad019065a3afd18ae95b40907018121b50664d5ca422c51360d31027056208\": rpc error: code = NotFound desc = could not find container \"84ad019065a3afd18ae95b40907018121b50664d5ca422c51360d31027056208\": container with ID starting with 84ad019065a3afd18ae95b40907018121b50664d5ca422c51360d31027056208 not found: ID does not exist" Apr 24 22:14:08.057292 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:08.056977 2578 scope.go:117] "RemoveContainer" 
containerID="97362ea2b1f6220ba80bccaa21be847a211323bf965cfbd152711b774e5345b7" Apr 24 22:14:08.057292 ip-10-0-132-124 kubenswrapper[2578]: E0424 22:14:08.057219 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97362ea2b1f6220ba80bccaa21be847a211323bf965cfbd152711b774e5345b7\": container with ID starting with 97362ea2b1f6220ba80bccaa21be847a211323bf965cfbd152711b774e5345b7 not found: ID does not exist" containerID="97362ea2b1f6220ba80bccaa21be847a211323bf965cfbd152711b774e5345b7" Apr 24 22:14:08.057292 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:08.057246 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97362ea2b1f6220ba80bccaa21be847a211323bf965cfbd152711b774e5345b7"} err="failed to get container status \"97362ea2b1f6220ba80bccaa21be847a211323bf965cfbd152711b774e5345b7\": rpc error: code = NotFound desc = could not find container \"97362ea2b1f6220ba80bccaa21be847a211323bf965cfbd152711b774e5345b7\": container with ID starting with 97362ea2b1f6220ba80bccaa21be847a211323bf965cfbd152711b774e5345b7 not found: ID does not exist" Apr 24 22:14:08.423563 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:08.423511 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2f6e05b-f6b9-43be-b038-8bf4416d2140" path="/var/lib/kubelet/pods/f2f6e05b-f6b9-43be-b038-8bf4416d2140/volumes" Apr 24 22:14:08.954857 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:08.954777 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8f2739b7-bd60-4645-80e1-15fbf600cd25/alertmanager/0.log" Apr 24 22:14:08.977632 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:08.977462 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8f2739b7-bd60-4645-80e1-15fbf600cd25/config-reloader/0.log" Apr 24 22:14:08.999866 ip-10-0-132-124 kubenswrapper[2578]: 
I0424 22:14:08.999715 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8f2739b7-bd60-4645-80e1-15fbf600cd25/kube-rbac-proxy-web/0.log" Apr 24 22:14:09.021374 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:09.021311 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8f2739b7-bd60-4645-80e1-15fbf600cd25/kube-rbac-proxy/0.log" Apr 24 22:14:09.046659 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:09.046629 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8f2739b7-bd60-4645-80e1-15fbf600cd25/kube-rbac-proxy-metric/0.log" Apr 24 22:14:09.069713 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:09.069685 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8f2739b7-bd60-4645-80e1-15fbf600cd25/prom-label-proxy/0.log" Apr 24 22:14:09.094380 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:09.094354 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8f2739b7-bd60-4645-80e1-15fbf600cd25/init-config-reloader/0.log" Apr 24 22:14:09.237137 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:09.237104 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-pqztz_95be5fef-54c5-493f-8d81-418b407f5be9/kube-state-metrics/0.log" Apr 24 22:14:09.257308 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:09.257285 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-pqztz_95be5fef-54c5-493f-8d81-418b407f5be9/kube-rbac-proxy-main/0.log" Apr 24 22:14:09.286383 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:09.286293 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-pqztz_95be5fef-54c5-493f-8d81-418b407f5be9/kube-rbac-proxy-self/0.log" Apr 
Apr 24 22:14:09.311773 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:09.311734 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-fb766d8dc-4j2hh_dabc3649-986c-417d-8a62-0996e4d2bc1c/metrics-server/0.log"
Apr 24 22:14:09.335024 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:09.334992 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-qqfdb_747794d3-8a8d-4ce6-8607-23994becf49d/monitoring-plugin/0.log"
Apr 24 22:14:09.365827 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:09.365800 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dhqs2_96626301-1303-4330-95d7-03f32a1420c6/node-exporter/0.log"
Apr 24 22:14:09.388444 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:09.388410 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dhqs2_96626301-1303-4330-95d7-03f32a1420c6/kube-rbac-proxy/0.log"
Apr 24 22:14:09.415571 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:09.413202 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dhqs2_96626301-1303-4330-95d7-03f32a1420c6/init-textfile/0.log"
Apr 24 22:14:09.629794 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:09.629762 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-wp9pw_b3482684-9774-41e9-b6e1-2fc96e50331a/kube-rbac-proxy-main/0.log"
Apr 24 22:14:09.655616 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:09.655591 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-wp9pw_b3482684-9774-41e9-b6e1-2fc96e50331a/kube-rbac-proxy-self/0.log"
Apr 24 22:14:09.680372 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:09.680336 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-wp9pw_b3482684-9774-41e9-b6e1-2fc96e50331a/openshift-state-metrics/0.log"
Apr 24 22:14:09.725028 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:09.724999 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5be176d2-7d94-46ec-82df-9f25aaa1ffd4/prometheus/0.log"
Apr 24 22:14:09.742143 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:09.742117 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5be176d2-7d94-46ec-82df-9f25aaa1ffd4/config-reloader/0.log"
Apr 24 22:14:09.762654 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:09.762624 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5be176d2-7d94-46ec-82df-9f25aaa1ffd4/thanos-sidecar/0.log"
Apr 24 22:14:09.785045 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:09.785014 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5be176d2-7d94-46ec-82df-9f25aaa1ffd4/kube-rbac-proxy-web/0.log"
Apr 24 22:14:09.804628 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:09.804599 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5be176d2-7d94-46ec-82df-9f25aaa1ffd4/kube-rbac-proxy/0.log"
Apr 24 22:14:09.826368 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:09.826337 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5be176d2-7d94-46ec-82df-9f25aaa1ffd4/kube-rbac-proxy-thanos/0.log"
Apr 24 22:14:09.849096 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:09.849073 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5be176d2-7d94-46ec-82df-9f25aaa1ffd4/init-config-reloader/0.log"
Apr 24 22:14:09.983411 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:09.983378 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5d6b7fc5bf-9lxwp_e3fdaddf-786f-413e-83cb-3bab2578a8a5/telemeter-client/0.log"
Apr 24 22:14:10.012372 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:10.012340 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5d6b7fc5bf-9lxwp_e3fdaddf-786f-413e-83cb-3bab2578a8a5/reload/0.log"
Apr 24 22:14:10.042263 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:10.042231 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5d6b7fc5bf-9lxwp_e3fdaddf-786f-413e-83cb-3bab2578a8a5/kube-rbac-proxy/0.log"
Apr 24 22:14:10.091615 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:10.091560 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6d7ffb5569-nnk78_8814da05-08f7-4703-81e5-78d626a5bcd6/thanos-query/0.log"
Apr 24 22:14:10.118379 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:10.118325 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6d7ffb5569-nnk78_8814da05-08f7-4703-81e5-78d626a5bcd6/kube-rbac-proxy-web/0.log"
Apr 24 22:14:10.140343 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:10.140310 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6d7ffb5569-nnk78_8814da05-08f7-4703-81e5-78d626a5bcd6/kube-rbac-proxy/0.log"
Apr 24 22:14:10.166440 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:10.166410 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6d7ffb5569-nnk78_8814da05-08f7-4703-81e5-78d626a5bcd6/prom-label-proxy/0.log"
Apr 24 22:14:10.201727 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:10.201374 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6d7ffb5569-nnk78_8814da05-08f7-4703-81e5-78d626a5bcd6/kube-rbac-proxy-rules/0.log"
Apr 24 22:14:10.226691 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:10.226658 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6d7ffb5569-nnk78_8814da05-08f7-4703-81e5-78d626a5bcd6/kube-rbac-proxy-metrics/0.log"
Apr 24 22:14:11.956321 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:11.956296 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-557bb87958-ps7pj_4553a899-e53e-4df4-a0c2-c6dd0e52fb68/console/0.log"
Apr 24 22:14:12.400109 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.400078 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw"]
Apr 24 22:14:12.400411 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.400398 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2f6e05b-f6b9-43be-b038-8bf4416d2140" containerName="gather"
Apr 24 22:14:12.400411 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.400411 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f6e05b-f6b9-43be-b038-8bf4416d2140" containerName="gather"
Apr 24 22:14:12.400574 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.400426 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2f6e05b-f6b9-43be-b038-8bf4416d2140" containerName="copy"
Apr 24 22:14:12.400574 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.400432 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f6e05b-f6b9-43be-b038-8bf4416d2140" containerName="copy"
Apr 24 22:14:12.400574 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.400518 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f2f6e05b-f6b9-43be-b038-8bf4416d2140" containerName="gather"
Apr 24 22:14:12.400574 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.400568 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f2f6e05b-f6b9-43be-b038-8bf4416d2140" containerName="copy"
Apr 24 22:14:12.403465 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.403443 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw"
Apr 24 22:14:12.410392 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.410367 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw"]
Apr 24 22:14:12.480337 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.480305 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/749f3f7d-e209-42fe-9ed1-c975f904eb8a-sys\") pod \"perf-node-gather-daemonset-54kcw\" (UID: \"749f3f7d-e209-42fe-9ed1-c975f904eb8a\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw"
Apr 24 22:14:12.480337 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.480340 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/749f3f7d-e209-42fe-9ed1-c975f904eb8a-podres\") pod \"perf-node-gather-daemonset-54kcw\" (UID: \"749f3f7d-e209-42fe-9ed1-c975f904eb8a\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw"
Apr 24 22:14:12.480615 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.480400 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw5z9\" (UniqueName: \"kubernetes.io/projected/749f3f7d-e209-42fe-9ed1-c975f904eb8a-kube-api-access-pw5z9\") pod \"perf-node-gather-daemonset-54kcw\" (UID: \"749f3f7d-e209-42fe-9ed1-c975f904eb8a\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw"
Apr 24 22:14:12.480615 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.480432 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/749f3f7d-e209-42fe-9ed1-c975f904eb8a-proc\") pod \"perf-node-gather-daemonset-54kcw\" (UID: \"749f3f7d-e209-42fe-9ed1-c975f904eb8a\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw"
Apr 24 22:14:12.480615 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.480515 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/749f3f7d-e209-42fe-9ed1-c975f904eb8a-lib-modules\") pod \"perf-node-gather-daemonset-54kcw\" (UID: \"749f3f7d-e209-42fe-9ed1-c975f904eb8a\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw"
Apr 24 22:14:12.581811 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.581772 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/749f3f7d-e209-42fe-9ed1-c975f904eb8a-lib-modules\") pod \"perf-node-gather-daemonset-54kcw\" (UID: \"749f3f7d-e209-42fe-9ed1-c975f904eb8a\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw"
Apr 24 22:14:12.582000 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.581867 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/749f3f7d-e209-42fe-9ed1-c975f904eb8a-sys\") pod \"perf-node-gather-daemonset-54kcw\" (UID: \"749f3f7d-e209-42fe-9ed1-c975f904eb8a\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw"
Apr 24 22:14:12.582000 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.581887 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/749f3f7d-e209-42fe-9ed1-c975f904eb8a-podres\") pod \"perf-node-gather-daemonset-54kcw\" (UID: \"749f3f7d-e209-42fe-9ed1-c975f904eb8a\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw"
Apr 24 22:14:12.582000 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.581925 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pw5z9\" (UniqueName: \"kubernetes.io/projected/749f3f7d-e209-42fe-9ed1-c975f904eb8a-kube-api-access-pw5z9\") pod \"perf-node-gather-daemonset-54kcw\" (UID: \"749f3f7d-e209-42fe-9ed1-c975f904eb8a\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw"
Apr 24 22:14:12.582000 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.581931 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/749f3f7d-e209-42fe-9ed1-c975f904eb8a-lib-modules\") pod \"perf-node-gather-daemonset-54kcw\" (UID: \"749f3f7d-e209-42fe-9ed1-c975f904eb8a\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw"
Apr 24 22:14:12.582000 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.581955 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/749f3f7d-e209-42fe-9ed1-c975f904eb8a-proc\") pod \"perf-node-gather-daemonset-54kcw\" (UID: \"749f3f7d-e209-42fe-9ed1-c975f904eb8a\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw"
Apr 24 22:14:12.582000 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.581990 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/749f3f7d-e209-42fe-9ed1-c975f904eb8a-sys\") pod \"perf-node-gather-daemonset-54kcw\" (UID: \"749f3f7d-e209-42fe-9ed1-c975f904eb8a\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw"
Apr 24 22:14:12.582307 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.582026 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/749f3f7d-e209-42fe-9ed1-c975f904eb8a-podres\") pod \"perf-node-gather-daemonset-54kcw\" (UID: \"749f3f7d-e209-42fe-9ed1-c975f904eb8a\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw"
Apr 24 22:14:12.582307 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.582031 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/749f3f7d-e209-42fe-9ed1-c975f904eb8a-proc\") pod \"perf-node-gather-daemonset-54kcw\" (UID: \"749f3f7d-e209-42fe-9ed1-c975f904eb8a\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw"
Apr 24 22:14:12.590132 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.590111 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw5z9\" (UniqueName: \"kubernetes.io/projected/749f3f7d-e209-42fe-9ed1-c975f904eb8a-kube-api-access-pw5z9\") pod \"perf-node-gather-daemonset-54kcw\" (UID: \"749f3f7d-e209-42fe-9ed1-c975f904eb8a\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw"
Apr 24 22:14:12.715687 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.715609 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw"
Apr 24 22:14:12.861855 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:12.861828 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw"]
Apr 24 22:14:12.864890 ip-10-0-132-124 kubenswrapper[2578]: W0424 22:14:12.864860 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod749f3f7d_e209_42fe_9ed1_c975f904eb8a.slice/crio-7484380b1371df8e2f2a370a825f05fa348dbf2a8b091aa3b846ccfa6dd1474d WatchSource:0}: Error finding container 7484380b1371df8e2f2a370a825f05fa348dbf2a8b091aa3b846ccfa6dd1474d: Status 404 returned error can't find the container with id 7484380b1371df8e2f2a370a825f05fa348dbf2a8b091aa3b846ccfa6dd1474d
Apr 24 22:14:13.037385 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:13.037347 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw" event={"ID":"749f3f7d-e209-42fe-9ed1-c975f904eb8a","Type":"ContainerStarted","Data":"8de1c443e87f533a41e0abadc50100413a4032352412bc97afc64dfd3fa21022"}
Apr 24 22:14:13.037818 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:13.037397 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw" event={"ID":"749f3f7d-e209-42fe-9ed1-c975f904eb8a","Type":"ContainerStarted","Data":"7484380b1371df8e2f2a370a825f05fa348dbf2a8b091aa3b846ccfa6dd1474d"}
Apr 24 22:14:13.037818 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:13.037652 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw"
Apr 24 22:14:13.054828 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:13.054784 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw" podStartSLOduration=1.054771161 podStartE2EDuration="1.054771161s" podCreationTimestamp="2026-04-24 22:14:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:14:13.053799344 +0000 UTC m=+2799.229534017" watchObservedRunningTime="2026-04-24 22:14:13.054771161 +0000 UTC m=+2799.230505834"
Apr 24 22:14:13.124819 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:13.124790 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rrzpv_cb2a7484-3606-4f41-8444-8efbab81200b/dns/0.log"
Apr 24 22:14:13.144335 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:13.144310 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rrzpv_cb2a7484-3606-4f41-8444-8efbab81200b/kube-rbac-proxy/0.log"
Apr 24 22:14:13.209103 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:13.209080 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-48sm2_0f52ff53-2325-461a-9bb4-dde9a76323fb/dns-node-resolver/0.log"
Apr 24 22:14:13.687367 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:13.687333 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5t5rz_9c64a924-1f49-45bd-870b-9fb356e61e75/node-ca/0.log"
Apr 24 22:14:14.745027 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:14.744969 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-48v2j_e329951c-d495-4bf9-8751-384a26c4a2ce/serve-healthcheck-canary/0.log"
Apr 24 22:14:15.342777 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:15.342746 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wbfvk_bdd64acb-d899-4f75-b460-f0b05adbbbab/kube-rbac-proxy/0.log"
Apr 24 22:14:15.361933 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:15.361907 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wbfvk_bdd64acb-d899-4f75-b460-f0b05adbbbab/exporter/0.log"
Apr 24 22:14:15.382596 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:15.382574 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wbfvk_bdd64acb-d899-4f75-b460-f0b05adbbbab/extractor/0.log"
Apr 24 22:14:17.835354 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:17.835315 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-szfgc_4ad9ac5d-8db6-48b3-b364-07a2d30cd0e5/s3-init/0.log"
Apr 24 22:14:19.051437 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:19.051413 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-54kcw"
Apr 24 22:14:23.091214 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:23.091178 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fztlb_05330bcd-753c-4b2c-add6-ad37ce95d4d1/kube-multus-additional-cni-plugins/0.log"
Apr 24 22:14:23.113428 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:23.113403 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fztlb_05330bcd-753c-4b2c-add6-ad37ce95d4d1/egress-router-binary-copy/0.log"
Apr 24 22:14:23.136282 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:23.136246 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fztlb_05330bcd-753c-4b2c-add6-ad37ce95d4d1/cni-plugins/0.log"
Apr 24 22:14:23.158402 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:23.158372 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fztlb_05330bcd-753c-4b2c-add6-ad37ce95d4d1/bond-cni-plugin/0.log"
Apr 24 22:14:23.181466 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:23.181440 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fztlb_05330bcd-753c-4b2c-add6-ad37ce95d4d1/routeoverride-cni/0.log"
Apr 24 22:14:23.204019 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:23.203998 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fztlb_05330bcd-753c-4b2c-add6-ad37ce95d4d1/whereabouts-cni-bincopy/0.log"
Apr 24 22:14:23.228773 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:23.228752 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fztlb_05330bcd-753c-4b2c-add6-ad37ce95d4d1/whereabouts-cni/0.log"
Apr 24 22:14:23.417666 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:23.417597 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lshqp_fd8fb7f1-8db0-4a33-b951-6a4739d1a1cd/kube-multus/0.log"
Apr 24 22:14:23.488202 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:23.488174 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6wxzd_b536a581-6c7c-4e7e-9fb3-6223e4ab90f0/network-metrics-daemon/0.log"
Apr 24 22:14:23.510951 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:23.510916 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6wxzd_b536a581-6c7c-4e7e-9fb3-6223e4ab90f0/kube-rbac-proxy/0.log"
Apr 24 22:14:24.257330 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:24.257297 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ecf62cd7-5041-4b2d-8eff-453431841db5/ovn-controller/0.log"
Apr 24 22:14:24.289630 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:24.289594 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ecf62cd7-5041-4b2d-8eff-453431841db5/ovn-acl-logging/0.log"
Apr 24 22:14:24.308503 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:24.308470 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ecf62cd7-5041-4b2d-8eff-453431841db5/kube-rbac-proxy-node/0.log"
Apr 24 22:14:24.330668 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:24.330650 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ecf62cd7-5041-4b2d-8eff-453431841db5/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 22:14:24.351556 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:24.351524 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ecf62cd7-5041-4b2d-8eff-453431841db5/northd/0.log"
Apr 24 22:14:24.373050 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:24.373026 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ecf62cd7-5041-4b2d-8eff-453431841db5/nbdb/0.log"
Apr 24 22:14:24.395895 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:24.395874 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ecf62cd7-5041-4b2d-8eff-453431841db5/sbdb/0.log"
Apr 24 22:14:24.500624 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:24.500598 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4256n_ecf62cd7-5041-4b2d-8eff-453431841db5/ovnkube-controller/0.log"
Apr 24 22:14:26.133272 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:26.133249 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-rkwhf_c2ebfd07-c3ce-4ce2-b482-4596f9db1c1e/network-check-target-container/0.log"
Apr 24 22:14:26.975105 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:26.975075 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-fdh56_6183aed2-60ab-4cae-8455-c797d1e3ebf6/iptables-alerter/0.log"
Apr 24 22:14:27.598161 ip-10-0-132-124 kubenswrapper[2578]: I0424 22:14:27.598136 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-nr5k8_007dd22d-9512-495a-ad7f-d8424286a304/tuned/0.log"