Apr 24 21:27:23.053228 ip-10-0-139-184 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:27:23.496573 ip-10-0-139-184 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:23.496573 ip-10-0-139-184 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:27:23.496573 ip-10-0-139-184 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:23.496573 ip-10-0-139-184 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:27:23.496573 ip-10-0-139-184 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:23.499423 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.499265    2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:27:23.504802 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504781    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:23.504802 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504804    2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:23.504884 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504809    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:23.504884 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504813    2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:23.504884 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504817    2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:23.504884 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504821    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:23.504884 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504824    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:23.504884 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504827    2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:23.504884 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504830    2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:23.504884 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504833    2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:23.504884 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504835    2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:23.504884 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504838    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:23.504884 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504841    2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:23.504884 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504844    2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:23.504884 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504846    2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:23.504884 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504849    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:23.504884 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504852    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:23.504884 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504855    2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:23.504884 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504857    2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:23.504884 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504860    2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:23.504884 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504863    2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:23.505357 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504868    2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:23.505357 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504871    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:23.505357 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504874    2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:23.505357 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504876    2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:23.505357 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504879    2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:23.505357 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504882    2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:23.505357 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504884    2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:23.505357 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504901    2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:23.505357 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504903    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:23.505357 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504906    2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:23.505357 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504909    2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:23.505357 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504912    2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:23.505357 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504914    2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:23.505357 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504918    2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:23.505357 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504921    2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:23.505357 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504923    2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:23.505357 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504926    2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:23.505357 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504928    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:23.505357 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504931    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:23.505357 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504934    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:23.505906 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504936    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:23.505906 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504938    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:23.505906 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504941    2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:23.505906 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504943    2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:23.505906 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504946    2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:23.505906 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504949    2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:23.505906 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504951    2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:23.505906 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504954    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:23.505906 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504956    2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:23.505906 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504959    2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:23.505906 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504962    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:23.505906 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504964    2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:23.505906 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504967    2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:23.505906 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504973    2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:23.505906 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504975    2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:23.505906 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504978    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:23.505906 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504981    2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:23.505906 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504984    2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:23.505906 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504987    2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:23.505906 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504990    2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:23.506403 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504993    2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:23.506403 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504996    2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:23.506403 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.504998    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:23.506403 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505001    2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:23.506403 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505004    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:23.506403 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505006    2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:23.506403 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505009    2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:23.506403 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505011    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:23.506403 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505015    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:23.506403 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505018    2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:23.506403 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505020    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:23.506403 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505023    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:23.506403 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505025    2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:23.506403 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505028    2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:23.506403 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505035    2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:23.506403 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505038    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:23.506403 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505041    2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:23.506403 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505043    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:23.506403 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505046    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:23.506861 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505048    2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:23.506861 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505051    2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:23.506861 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505053    2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:23.506861 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505055    2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:23.506861 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505064    2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:23.506861 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505068    2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:23.506861 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505530    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:23.506861 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505536    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:23.506861 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505540    2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:23.506861 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505544    2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:23.506861 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505546    2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:23.506861 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505550    2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:23.506861 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505553    2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:23.506861 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505556    2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:23.506861 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505559    2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:23.506861 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505562    2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:23.506861 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505564    2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:23.506861 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505567    2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:23.506861 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505570    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:23.506861 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505573    2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:23.507367 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505575    2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:23.507367 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505578    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:23.507367 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505581    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:23.507367 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505584    2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:23.507367 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505586    2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:23.507367 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505589    2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:23.507367 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505592    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:23.507367 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505595    2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:23.507367 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505597    2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:23.507367 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505600    2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:23.507367 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505603    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:23.507367 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505605    2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:23.507367 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505608    2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:23.507367 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505610    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:23.507367 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505613    2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:23.507367 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505615    2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:23.507367 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505618    2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:23.507367 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505620    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:23.507367 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505623    2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:23.507367 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505626    2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:23.507972 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505629    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:23.507972 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505632    2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:23.507972 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505635    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:23.507972 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505637    2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:23.507972 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505640    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:23.507972 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505643    2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:23.507972 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505645    2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:23.507972 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505648    2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:23.507972 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505650    2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:23.507972 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505652    2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:23.507972 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505655    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:23.507972 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505657    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:23.507972 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505660    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:23.507972 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505663    2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:23.507972 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505665    2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:23.507972 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505668    2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:23.507972 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505671    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:23.507972 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505674    2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:23.507972 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505676    2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:23.507972 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505679    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:23.508492 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505682    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:23.508492 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505685    2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:23.508492 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505687    2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:23.508492 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505690    2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:23.508492 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505692    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:23.508492 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505695    2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:23.508492 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505698    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:23.508492 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505701    2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:23.508492 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505703    2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:23.508492 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505705    2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:23.508492 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505709    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:23.508492 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505712    2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:23.508492 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505715    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:23.508492 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505717    2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:23.508492 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505720    2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:23.508492 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505722    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:23.508492 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505725    2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:23.508492 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505727    2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:23.508492 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505730    2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:23.508985 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505737    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:23.508985 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505740    2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:23.508985 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505744    2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:23.508985 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505748    2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:23.508985 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505751    2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:23.508985 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505754    2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:23.508985 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505757    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:23.508985 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505760    2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:23.508985 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505763    2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:23.508985 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505766    2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:23.508985 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505768    2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:23.508985 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505771    2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:23.508985 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.505773    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:23.508985 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505856    2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:27:23.508985 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505864    2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:27:23.508985 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505871    2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:27:23.508985 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505875    2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:27:23.508985 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505879    2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:27:23.508985 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505884    2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:27:23.508985 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505907    2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:27:23.509465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505914    2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:27:23.509465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505919    2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:27:23.509465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505924    2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:27:23.509465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505928    2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:27:23.509465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505931    2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:27:23.509465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505934    2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:27:23.509465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505938    2578 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:27:23.509465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505940    2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:27:23.509465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505944    2578 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:27:23.509465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505946    2578 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:27:23.509465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505949    2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:27:23.509465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505952    2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:27:23.509465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505957    2578 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:27:23.509465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505960    2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:27:23.509465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505964    2578 flags.go:64] FLAG: --config-dir=""
Apr 24 21:27:23.509465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505966    2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:27:23.509465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505969    2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:27:23.509465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505973    2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:27:23.509465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505977    2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:27:23.509465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505980    2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:27:23.509465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505984    2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:27:23.509465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505987    2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:27:23.509465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505990    2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:27:23.509465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505993    2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505996    2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.505999    2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506003    2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506008    2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506011    2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506014    2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506017    2578 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506020    2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506025    2578 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506028    2578 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506031    2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506035    2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506038    2578 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506042 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506045 2578 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506048 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506051 2578 flags.go:64] FLAG: --eviction-soft="" Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506054 2578 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506057 2578 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506060 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506064 2578 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506067 2578 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506069 2578 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506072 2578 flags.go:64] FLAG: --feature-gates="" Apr 24 21:27:23.510053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506076 2578 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 21:27:23.510656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506079 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 21:27:23.510656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506082 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 21:27:23.510656 ip-10-0-139-184 kubenswrapper[2578]: I0424 
21:27:23.506085 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:27:23.510656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506089 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:27:23.510656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506092 2578 flags.go:64] FLAG: --help="false" Apr 24 21:27:23.510656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506096 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-139-184.ec2.internal" Apr 24 21:27:23.510656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506099 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:27:23.510656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506102 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:27:23.510656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506105 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:27:23.510656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506108 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:27:23.510656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506112 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:27:23.510656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506115 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:27:23.510656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506119 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:27:23.510656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506122 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:27:23.510656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506125 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:27:23.510656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506128 2578 flags.go:64] FLAG: 
--kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:27:23.510656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506131 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:27:23.510656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506134 2578 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:27:23.510656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506137 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:27:23.510656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506140 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:27:23.510656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506144 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:27:23.510656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506146 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:27:23.510656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506149 2578 flags.go:64] FLAG: --lock-file="" Apr 24 21:27:23.510656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506152 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:27:23.511272 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506155 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:27:23.511272 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506158 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:27:23.511272 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506164 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:27:23.511272 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506167 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:27:23.511272 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506170 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:27:23.511272 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506173 2578 flags.go:64] FLAG: --logging-format="text" Apr 24 21:27:23.511272 ip-10-0-139-184 kubenswrapper[2578]: 
I0424 21:27:23.506176 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:27:23.511272 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506180 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:27:23.511272 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506182 2578 flags.go:64] FLAG: --manifest-url="" Apr 24 21:27:23.511272 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506185 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:27:23.511272 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506190 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:27:23.511272 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506193 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:27:23.511272 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506200 2578 flags.go:64] FLAG: --max-pods="110" Apr 24 21:27:23.511272 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506204 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:27:23.511272 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506207 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:27:23.511272 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506210 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:27:23.511272 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506213 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:27:23.511272 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506216 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:27:23.511272 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506219 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:27:23.511272 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506222 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:27:23.511272 ip-10-0-139-184 kubenswrapper[2578]: I0424 
21:27:23.506233 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:27:23.511272 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506236 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:27:23.511272 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506239 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:27:23.511272 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506242 2578 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506245 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506251 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506254 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506257 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506260 2578 flags.go:64] FLAG: --port="10250" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506263 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506266 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ad12722826e278a1" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506270 2578 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506273 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506275 2578 flags.go:64] FLAG: --register-node="true" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506278 2578 
flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506282 2578 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506285 2578 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506288 2578 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506291 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506311 2578 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506316 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506319 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506323 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506326 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506331 2578 flags.go:64] FLAG: --runonce="false" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506334 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506338 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506341 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:27:23.511857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506344 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: I0424 
21:27:23.506347 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506350 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506353 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506357 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506360 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506363 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506367 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506369 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506372 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506375 2578 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506378 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506384 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506387 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506390 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506398 2578 flags.go:64] FLAG: 
--tls-min-version="" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506400 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506404 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506407 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506410 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506413 2578 flags.go:64] FLAG: --v="2" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506418 2578 flags.go:64] FLAG: --version="false" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506425 2578 flags.go:64] FLAG: --vmodule="" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506430 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.506433 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:27:23.512489 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506527 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:27:23.513109 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506530 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:27:23.513109 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506533 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:27:23.513109 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506538 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:27:23.513109 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506540 2578 feature_gate.go:328] unrecognized feature gate: 
GCPCustomAPIEndpoints Apr 24 21:27:23.513109 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506543 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:27:23.513109 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506546 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:27:23.513109 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506548 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:27:23.513109 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506551 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:27:23.513109 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506554 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:27:23.513109 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506556 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:27:23.513109 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506559 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:27:23.513109 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506563 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:27:23.513109 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506567 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:27:23.513109 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506570 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:27:23.513109 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506573 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:27:23.513109 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506576 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:27:23.513109 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506578 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:27:23.513109 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506581 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:27:23.513109 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506584 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:27:23.513568 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506586 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:27:23.513568 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506589 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:27:23.513568 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506591 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:27:23.513568 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506594 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:23.513568 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506596 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:27:23.513568 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506599 2578 feature_gate.go:328] unrecognized 
feature gate: VSphereMultiDisk Apr 24 21:27:23.513568 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506601 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:27:23.513568 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506604 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:27:23.513568 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506606 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:27:23.513568 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506609 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:27:23.513568 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506612 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:27:23.513568 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506614 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:27:23.513568 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506616 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:27:23.513568 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506619 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:27:23.513568 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506622 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:27:23.513568 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506629 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:27:23.513568 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506631 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:27:23.513568 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506634 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:27:23.513568 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506637 2578 
feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:27:23.513568 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506639 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:27:23.514103 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506643 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:27:23.514103 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506645 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:27:23.514103 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506648 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:27:23.514103 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506650 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:27:23.514103 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506653 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:27:23.514103 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506655 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:27:23.514103 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506658 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:27:23.514103 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506661 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:27:23.514103 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506663 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:27:23.514103 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506666 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:27:23.514103 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506668 2578 feature_gate.go:328] unrecognized feature gate: 
GatewayAPIController Apr 24 21:27:23.514103 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506670 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:27:23.514103 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506673 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:27:23.514103 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506675 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:27:23.514103 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506678 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:27:23.514103 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506680 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:27:23.514103 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506683 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:27:23.514103 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506686 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:27:23.514103 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506688 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:27:23.514571 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506690 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:27:23.514571 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506693 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:27:23.514571 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506695 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:27:23.514571 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506698 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:27:23.514571 ip-10-0-139-184 kubenswrapper[2578]: W0424 
21:27:23.506701 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:27:23.514571 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506704 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:27:23.514571 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506706 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:27:23.514571 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506709 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:27:23.514571 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506713 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:27:23.514571 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506717 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:27:23.514571 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506724 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:27:23.514571 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506727 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:27:23.514571 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506730 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:27:23.514571 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506733 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:27:23.514571 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506735 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:27:23.514571 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506738 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:27:23.514571 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506741 2578 feature_gate.go:328] unrecognized feature gate: 
AWSServiceLBNetworkSecurityGroup Apr 24 21:27:23.514571 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506743 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:27:23.514571 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506746 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:27:23.514571 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506749 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:27:23.515093 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506752 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:27:23.515093 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506755 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:27:23.515093 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506758 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:27:23.515093 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506760 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:27:23.515093 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506763 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:23.515093 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506766 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:27:23.515093 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.506769 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:27:23.515093 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.507637 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true 
SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:27:23.515093 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.514483 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 21:27:23.515093 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.514503 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 21:27:23.515093 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514552 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:23.515093 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514557 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:27:23.515093 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514561 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:27:23.515093 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514564 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:27:23.515093 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514568 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:27:23.515093 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514571 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:27:23.515506 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514574 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:23.515506 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514578 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:27:23.515506 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514583 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:23.515506 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514586 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:23.515506 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514589 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:23.515506 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514592 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:23.515506 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514595 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:23.515506 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514597 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:23.515506 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514600 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:23.515506 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514603 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:23.515506 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514605 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:23.515506 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514608 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:23.515506 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514611 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:23.515506 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514614 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:23.515506 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514616 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:23.515506 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514619 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:23.515506 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514623 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:23.515506 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514625 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:23.515506 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514628 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:23.515506 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514631 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:23.516005 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514633 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:23.516005 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514636 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:23.516005 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514638 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:23.516005 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514641 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:23.516005 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514644 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:23.516005 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514647 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:23.516005 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514650 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:23.516005 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514653 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:23.516005 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514656 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:23.516005 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514659 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:23.516005 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514661 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:23.516005 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514664 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:23.516005 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514667 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:23.516005 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514669 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:23.516005 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514672 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:23.516005 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514674 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:23.516005 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514677 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:23.516005 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514679 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:23.516005 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514682 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:23.516005 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514684 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:23.516537 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514687 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:23.516537 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514689 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:23.516537 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514692 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:23.516537 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514694 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:23.516537 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514697 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:23.516537 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514699 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:23.516537 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514702 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:23.516537 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514705 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:23.516537 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514707 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:23.516537 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514710 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:23.516537 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514713 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:23.516537 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514716 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:23.516537 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514718 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:23.516537 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514721 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:23.516537 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514724 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:23.516537 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514726 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:23.516537 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514729 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:23.516537 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514733 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:23.516537 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514736 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:23.516537 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514739 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:23.517047 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514741 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:23.517047 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514744 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:23.517047 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514747 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:23.517047 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514749 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:23.517047 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514752 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:23.517047 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514755 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:23.517047 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514757 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:23.517047 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514760 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:23.517047 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514763 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:23.517047 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514765 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:23.517047 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514768 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:23.517047 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514770 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:23.517047 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514773 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:23.517047 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514775 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:23.517047 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514778 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:23.517047 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514780 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:23.517047 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514783 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:23.517047 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514785 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:23.517047 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514790 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:23.517546 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514793 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:23.517546 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.514799 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:27:23.517546 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514919 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:23.517546 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514926 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:23.517546 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514929 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:23.517546 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514932 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:23.517546 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514935 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:23.517546 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514938 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:23.517546 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514941 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:23.517546 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514944 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:23.517546 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514947 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:23.517546 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514951 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:23.517546 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514954 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:23.517546 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514957 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:23.517546 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514959 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:23.517546 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514962 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:23.517954 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514964 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:23.517954 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514967 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:23.517954 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514970 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:23.517954 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514972 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:23.517954 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514974 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:23.517954 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514977 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:23.517954 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514981 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:23.517954 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514984 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:23.517954 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514987 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:23.517954 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514990 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:23.517954 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514993 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:23.517954 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514996 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:23.517954 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.514998 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:23.517954 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515001 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:23.517954 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515004 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:23.517954 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515006 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:23.517954 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515009 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:23.517954 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515012 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:23.517954 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515014 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:23.518429 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515017 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:23.518429 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515020 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:23.518429 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515022 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:23.518429 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515025 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:23.518429 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515028 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:23.518429 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515030 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:23.518429 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515033 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:23.518429 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515036 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:23.518429 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515039 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:23.518429 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515042 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:23.518429 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515044 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:23.518429 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515047 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:23.518429 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515049 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:23.518429 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515052 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:23.518429 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515055 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:23.518429 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515057 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:23.518429 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515060 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:23.518429 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515062 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:23.518429 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515065 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:23.518429 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515067 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:23.518929 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515070 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:23.518929 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515073 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:23.518929 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515076 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:23.518929 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515079 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:23.518929 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515082 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:23.518929 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515085 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:23.518929 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515087 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:23.518929 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515090 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:23.518929 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515093 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:23.518929 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515095 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:23.518929 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515098 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:23.518929 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515101 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:23.518929 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515104 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:23.518929 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515106 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:23.518929 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515110 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:23.518929 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515112 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:23.518929 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515115 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:23.518929 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515117 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:23.518929 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515120 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:23.519396 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515123 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:23.519396 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515126 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:23.519396 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515129 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:23.519396 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515131 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:23.519396 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515134 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:23.519396 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515136 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:23.519396 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515139 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:23.519396 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515141 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:23.519396 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515144 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:23.519396 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515147 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:23.519396 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515149 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:23.519396 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515151 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:23.519396 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515154 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:23.519396 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:23.515157 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:23.519396 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.515162 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:27:23.519396 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.515811 2578 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 21:27:23.520423 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.520405 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 21:27:23.521332 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.521320 2578 server.go:1019] "Starting client certificate rotation"
Apr 24 21:27:23.521437 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.521418 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:27:23.521475 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.521462 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:27:23.543723 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.543698 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:27:23.546331 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.546305 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:27:23.560666 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.560642 2578 log.go:25] "Validated CRI v1 runtime API"
Apr 24 21:27:23.566795 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.566773 2578 log.go:25] "Validated CRI v1 image API"
Apr 24 21:27:23.570167 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.570147 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 21:27:23.576355 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.576330 2578 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 836adcf9-b8d0-419d-8e03-b926eda0b800:/dev/nvme0n1p4 9344e193-852f-442a-b365-32843e6bec31:/dev/nvme0n1p3]
Apr 24 21:27:23.576470 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.576356 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:27:23.580709 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.580685 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:27:23.582126 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.582001 2578 manager.go:217] Machine: {Timestamp:2026-04-24 21:27:23.580184977 +0000 UTC m=+0.408055104 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101555 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec22efc175ef2e861610c9e33ebf43a7 SystemUUID:ec22efc1-75ef-2e86-1610-c9e33ebf43a7 BootID:6cb8fece-dada-4117-9829-307cd10be1ab Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:46:f6:16:55:0d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:46:f6:16:55:0d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:56:1c:31:27:6c:71 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:27:23.582126 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.582120 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 21:27:23.582285 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.582248 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 24 21:27:23.584106 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.584073 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 21:27:23.584285 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.584107 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-184.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 21:27:23.584369 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.584301 2578 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 21:27:23.584369 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.584315 2578 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 21:27:23.584369 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.584333 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:27:23.585233 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.585220 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:27:23.586921 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.586908 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:27:23.587087 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.587076 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 21:27:23.589570 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.589556 2578 kubelet.go:491] "Attempting to sync node with API server" Apr 24 21:27:23.589642 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.589585 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 21:27:23.589642 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.589604 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 21:27:23.589642 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.589619 2578 kubelet.go:397] "Adding apiserver pod source" Apr 24 21:27:23.589642 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.589632 2578 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 24 21:27:23.590802 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.590787 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:27:23.590867 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.590810 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:27:23.593714 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.593697 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 21:27:23.595356 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.595342 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 21:27:23.597594 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.597572 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 21:27:23.597594 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.597593 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 21:27:23.597719 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.597599 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 21:27:23.597719 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.597609 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 21:27:23.597719 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.597615 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 21:27:23.597719 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.597621 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 21:27:23.597719 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.597627 2578 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 24 21:27:23.597719 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.597632 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:27:23.597719 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.597647 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:27:23.597719 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.597653 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:27:23.597719 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.597666 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 21:27:23.597719 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.597675 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:27:23.598520 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.598506 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:27:23.598562 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.598523 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:27:23.600144 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:23.600118 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-184.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:27:23.600203 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:23.600118 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:27:23.600703 ip-10-0-139-184 kubenswrapper[2578]: 
I0424 21:27:23.600683 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mmtc4" Apr 24 21:27:23.602564 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.602551 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:27:23.602609 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.602592 2578 server.go:1295] "Started kubelet" Apr 24 21:27:23.602744 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.602696 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 21:27:23.602783 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.602699 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:27:23.602851 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.602800 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:27:23.604456 ip-10-0-139-184 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 21:27:23.605570 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.605399 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:27:23.605852 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.605802 2578 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:27:23.608685 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.608665 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mmtc4" Apr 24 21:27:23.610709 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.610689 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-184.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:27:23.615655 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:23.615634 2578 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 21:27:23.617845 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.617826 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 21:27:23.617947 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.617877 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:27:23.618744 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.618721 2578 factory.go:55] Registering systemd factory Apr 24 21:27:23.618853 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.618774 2578 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:27:23.618853 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.618833 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:27:23.619009 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.618994 2578 factory.go:153] Registering CRI-O factory Apr 24 21:27:23.619060 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.619012 2578 factory.go:223] Registration of the crio container factory successfully Apr 24 21:27:23.619104 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.619082 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:27:23.619148 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.619110 2578 factory.go:103] Registering Raw factory Apr 24 21:27:23.619148 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.619125 2578 manager.go:1196] Started watching for new ooms in manager Apr 24 21:27:23.619148 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.619077 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:27:23.619148 ip-10-0-139-184 
kubenswrapper[2578]: I0424 21:27:23.619142 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:27:23.619344 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.619315 2578 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:27:23.619344 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.619322 2578 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:27:23.619511 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:23.619481 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-184.ec2.internal\" not found" Apr 24 21:27:23.619748 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.619736 2578 manager.go:319] Starting recovery of all containers Apr 24 21:27:23.620927 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.620907 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:23.628342 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:23.628321 2578 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-184.ec2.internal\" not found" node="ip-10-0-139-184.ec2.internal" Apr 24 21:27:23.630221 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.630051 2578 manager.go:324] Recovery completed Apr 24 21:27:23.634291 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.634278 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:23.637227 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.637209 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:23.637325 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.637247 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:23.637325 ip-10-0-139-184 kubenswrapper[2578]: 
I0424 21:27:23.637262 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:23.637819 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.637804 2578 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 21:27:23.637819 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.637815 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:27:23.637949 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.637833 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:27:23.640444 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.640432 2578 policy_none.go:49] "None policy: Start" Apr 24 21:27:23.640481 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.640447 2578 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:27:23.640481 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.640457 2578 state_mem.go:35] "Initializing new in-memory state store" Apr 24 21:27:23.679869 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.679846 2578 manager.go:341] "Starting Device Plugin manager" Apr 24 21:27:23.680176 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:23.679906 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:27:23.680176 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.679921 2578 server.go:85] "Starting device plugin registration server" Apr 24 21:27:23.694131 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.680217 2578 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:27:23.694131 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.680233 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:27:23.694131 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.680340 2578 plugin_watcher.go:51] "Plugin Watcher Start" 
path="/var/lib/kubelet/plugins_registry" Apr 24 21:27:23.694131 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.680420 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:27:23.694131 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.680429 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 21:27:23.694131 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:23.680957 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 21:27:23.694131 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:23.681001 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-184.ec2.internal\" not found" Apr 24 21:27:23.756957 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.756863 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:27:23.758078 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.758062 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 21:27:23.758150 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.758090 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:27:23.758150 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.758113 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 21:27:23.758150 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.758122 2578 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:27:23.758258 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:23.758163 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:27:23.760992 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.760967 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:23.780900 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.780863 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:23.786777 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.786759 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:23.786847 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.786795 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:23.786847 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.786807 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:23.786847 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.786830 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-184.ec2.internal" Apr 24 21:27:23.796554 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.796527 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-184.ec2.internal" Apr 24 21:27:23.796664 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:23.796560 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-184.ec2.internal\": node \"ip-10-0-139-184.ec2.internal\" not found" Apr 24 
21:27:23.811762 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:23.811731 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-184.ec2.internal\" not found" Apr 24 21:27:23.858595 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.858551 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-184.ec2.internal"] Apr 24 21:27:23.858735 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.858641 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:23.859613 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.859599 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:23.859685 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.859627 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:23.859685 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.859639 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:23.860923 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.860908 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:23.861074 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.861051 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" Apr 24 21:27:23.861132 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.861092 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:23.862357 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.862342 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:23.862357 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.862349 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:23.862489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.862370 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:23.862489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.862375 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:23.862489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.862385 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:23.862489 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.862389 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:23.863620 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.863606 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-184.ec2.internal" Apr 24 21:27:23.863671 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.863631 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:23.864315 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.864297 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:23.864414 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.864322 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:23.864414 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:23.864332 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:23.887695 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:23.887671 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-184.ec2.internal\" not found" node="ip-10-0-139-184.ec2.internal" Apr 24 21:27:23.892355 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:23.892337 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-184.ec2.internal\" not found" node="ip-10-0-139-184.ec2.internal" Apr 24 21:27:23.912580 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:23.912545 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-184.ec2.internal\" not found" Apr 24 21:27:24.013280 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:24.013194 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-184.ec2.internal\" not found" Apr 24 21:27:24.020318 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:24.020287 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/77560a24cc76a08538baa0efe4273073-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal\" (UID: \"77560a24cc76a08538baa0efe4273073\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" Apr 24 21:27:24.020420 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:24.020322 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/77560a24cc76a08538baa0efe4273073-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal\" (UID: \"77560a24cc76a08538baa0efe4273073\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" Apr 24 21:27:24.020420 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:24.020347 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e3440d423eacb4ec58cf1cc320321a41-config\") pod \"kube-apiserver-proxy-ip-10-0-139-184.ec2.internal\" (UID: \"e3440d423eacb4ec58cf1cc320321a41\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-184.ec2.internal" Apr 24 21:27:24.113737 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:24.113695 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-184.ec2.internal\" not found" Apr 24 21:27:24.121096 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:24.121063 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/77560a24cc76a08538baa0efe4273073-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal\" (UID: \"77560a24cc76a08538baa0efe4273073\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" Apr 24 
21:27:24.121190 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:24.121103 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/77560a24cc76a08538baa0efe4273073-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal\" (UID: \"77560a24cc76a08538baa0efe4273073\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" Apr 24 21:27:24.121190 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:24.121129 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e3440d423eacb4ec58cf1cc320321a41-config\") pod \"kube-apiserver-proxy-ip-10-0-139-184.ec2.internal\" (UID: \"e3440d423eacb4ec58cf1cc320321a41\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-184.ec2.internal" Apr 24 21:27:24.121190 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:24.121172 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/77560a24cc76a08538baa0efe4273073-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal\" (UID: \"77560a24cc76a08538baa0efe4273073\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" Apr 24 21:27:24.121309 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:24.121177 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/77560a24cc76a08538baa0efe4273073-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal\" (UID: \"77560a24cc76a08538baa0efe4273073\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" Apr 24 21:27:24.121309 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:24.121178 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/e3440d423eacb4ec58cf1cc320321a41-config\") pod \"kube-apiserver-proxy-ip-10-0-139-184.ec2.internal\" (UID: \"e3440d423eacb4ec58cf1cc320321a41\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-184.ec2.internal" Apr 24 21:27:24.190249 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:24.190216 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" Apr 24 21:27:24.196166 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:24.196142 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-184.ec2.internal" Apr 24 21:27:24.213971 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:24.213933 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-184.ec2.internal\" not found" Apr 24 21:27:24.314494 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:24.314400 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-184.ec2.internal\" not found" Apr 24 21:27:24.414986 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:24.414950 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-184.ec2.internal\" not found" Apr 24 21:27:24.515570 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:24.515539 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-184.ec2.internal\" not found" Apr 24 21:27:24.520869 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:24.520846 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 21:27:24.521013 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:24.520994 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very 
short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:27:24.521051 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:24.521026 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:27:24.613457 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:24.613374 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:22:23 +0000 UTC" deadline="2028-02-06 02:29:23.027621312 +0000 UTC" Apr 24 21:27:24.613457 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:24.613402 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15653h1m58.41422231s" Apr 24 21:27:24.615948 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:24.615928 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-184.ec2.internal\" not found" Apr 24 21:27:24.618471 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:24.618452 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 21:27:24.629610 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:24.629577 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:27:24.649928 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:24.649868 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-w5522" Apr 24 21:27:24.658657 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:24.658397 2578 
csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-w5522" Apr 24 21:27:24.716978 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:24.716947 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-184.ec2.internal\" not found" Apr 24 21:27:24.791852 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:24.791817 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77560a24cc76a08538baa0efe4273073.slice/crio-4fadac1560b4b0617fbe33e06ca23ddc2ea78097dc19e6e0354b14b69893ab48 WatchSource:0}: Error finding container 4fadac1560b4b0617fbe33e06ca23ddc2ea78097dc19e6e0354b14b69893ab48: Status 404 returned error can't find the container with id 4fadac1560b4b0617fbe33e06ca23ddc2ea78097dc19e6e0354b14b69893ab48 Apr 24 21:27:24.792141 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:24.792125 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3440d423eacb4ec58cf1cc320321a41.slice/crio-605a0d7e07391e4b656c79039ff908d82695d541011e92ffc714acaf6b06d17f WatchSource:0}: Error finding container 605a0d7e07391e4b656c79039ff908d82695d541011e92ffc714acaf6b06d17f: Status 404 returned error can't find the container with id 605a0d7e07391e4b656c79039ff908d82695d541011e92ffc714acaf6b06d17f Apr 24 21:27:24.796184 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:24.796169 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:27:24.817995 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:24.817959 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-184.ec2.internal\" not found" Apr 24 21:27:24.918498 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:24.918406 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"ip-10-0-139-184.ec2.internal\" not found" Apr 24 21:27:24.924191 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:24.924171 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:25.018280 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.018226 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" Apr 24 21:27:25.030230 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.030208 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:27:25.031948 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.031930 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-184.ec2.internal" Apr 24 21:27:25.041635 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.041614 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:27:25.097987 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.097960 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:25.416945 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.416851 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:25.591331 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.591299 2578 apiserver.go:52] "Watching apiserver" Apr 24 21:27:25.599652 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.599621 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 21:27:25.600108 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.600081 2578 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-w7q5f","openshift-image-registry/node-ca-jzqgz","openshift-multus/multus-xp275","openshift-network-operator/iptables-alerter-rvc97","kube-system/global-pull-secret-syncer-hxdgr","openshift-cluster-node-tuning-operator/tuned-brwgr","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal","openshift-multus/multus-additional-cni-plugins-xs9hq","openshift-multus/network-metrics-daemon-lkk5b","openshift-network-diagnostics/network-check-target-c56pf","openshift-ovn-kubernetes/ovnkube-node-krb9p","kube-system/konnectivity-agent-sprws","kube-system/kube-apiserver-proxy-ip-10-0-139-184.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7"] Apr 24 21:27:25.603338 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.603314 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:27:25.603461 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:25.603407 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b" Apr 24 21:27:25.607724 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.607649 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-sprws" Apr 24 21:27:25.607839 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.607744 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-xp275" Apr 24 21:27:25.609965 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.609941 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rvc97" Apr 24 21:27:25.610686 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.610602 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 21:27:25.610686 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.610610 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 21:27:25.610972 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.610923 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-57sgt\"" Apr 24 21:27:25.610972 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.610932 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 21:27:25.611110 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.610925 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 21:27:25.611110 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.611043 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 21:27:25.611110 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.611073 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-m5pdd\"" Apr 24 21:27:25.611302 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.611113 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 21:27:25.612371 ip-10-0-139-184 kubenswrapper[2578]: 
I0424 21:27:25.612354 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 21:27:25.612481 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.612394 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2ld2q\"" Apr 24 21:27:25.612481 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.612430 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 21:27:25.612800 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.612778 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:27:25.614796 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.614774 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-brwgr" Apr 24 21:27:25.617010 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.616991 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7" Apr 24 21:27:25.617386 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.617369 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:27:25.617468 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.617432 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-v5lqg\"" Apr 24 21:27:25.617603 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.617584 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 21:27:25.619160 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.619116 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 21:27:25.619437 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.619334 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 21:27:25.619539 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.619483 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-5jvrh\"" Apr 24 21:27:25.619638 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.619611 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 21:27:25.619784 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.619756 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:27:25.619878 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:25.619815 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707" Apr 24 21:27:25.619963 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.619932 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xs9hq" Apr 24 21:27:25.622134 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.622111 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 21:27:25.622233 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.622173 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wtq92\"" Apr 24 21:27:25.622292 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.622119 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 21:27:25.625421 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.625402 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jzqgz" Apr 24 21:27:25.625583 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.625561 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-w7q5f" Apr 24 21:27:25.629549 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.627581 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-zq2fn\"" Apr 24 21:27:25.629549 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.628497 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 21:27:25.629549 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.628683 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 21:27:25.629549 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.628762 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 21:27:25.629549 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.628819 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/beaaaa3d-069b-465d-b6fc-defae6201642-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pzgp7\" (UID: \"beaaaa3d-069b-465d-b6fc-defae6201642\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7" Apr 24 21:27:25.629549 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.628852 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 21:27:25.629549 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.628857 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-etc-sysctl-d\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " 
pod="openshift-cluster-node-tuning-operator/tuned-brwgr" Apr 24 21:27:25.629549 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.628907 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/15075605-e0df-4d3a-90f3-7c8811d07731-host-slash\") pod \"iptables-alerter-rvc97\" (UID: \"15075605-e0df-4d3a-90f3-7c8811d07731\") " pod="openshift-network-operator/iptables-alerter-rvc97" Apr 24 21:27:25.629549 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.628942 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wkwzl\"" Apr 24 21:27:25.629549 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.628951 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-host-run-k8s-cni-cncf-io\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.629549 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.628998 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 21:27:25.629549 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.629061 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-os-release\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.629549 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.629101 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-etc-kubernetes\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.629549 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.629134 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn6l6\" (UniqueName: \"kubernetes.io/projected/20a034bb-c3d2-4d05-92de-ed16d2eda707-kube-api-access-bn6l6\") pod \"network-metrics-daemon-lkk5b\" (UID: \"20a034bb-c3d2-4d05-92de-ed16d2eda707\") " pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:27:25.629549 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.629172 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-host-var-lib-cni-bin\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.629549 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.629204 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-sys\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr" Apr 24 21:27:25.629549 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.629235 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc-os-release\") pod \"multus-additional-cni-plugins-xs9hq\" (UID: \"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc\") " pod="openshift-multus/multus-additional-cni-plugins-xs9hq" Apr 24 21:27:25.629549 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.629269 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/beaaaa3d-069b-465d-b6fc-defae6201642-registration-dir\") pod \"aws-ebs-csi-driver-node-pzgp7\" (UID: \"beaaaa3d-069b-465d-b6fc-defae6201642\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7" Apr 24 21:27:25.629549 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.629294 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hqhp\" (UniqueName: \"kubernetes.io/projected/af45c5fb-e377-44ea-ad83-ad7e5bea725b-kube-api-access-4hqhp\") pod \"network-check-target-c56pf\" (UID: \"af45c5fb-e377-44ea-ad83-ad7e5bea725b\") " pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:27:25.629549 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.629322 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-cnibin\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.630621 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.629350 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-host-var-lib-cni-multus\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.630621 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.629385 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-host-run-multus-certs\") pod \"multus-xp275\" (UID: 
\"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.630621 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.629431 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/beaaaa3d-069b-465d-b6fc-defae6201642-socket-dir\") pod \"aws-ebs-csi-driver-node-pzgp7\" (UID: \"beaaaa3d-069b-465d-b6fc-defae6201642\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7" Apr 24 21:27:25.630621 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.629468 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/beaaaa3d-069b-465d-b6fc-defae6201642-etc-selinux\") pod \"aws-ebs-csi-driver-node-pzgp7\" (UID: \"beaaaa3d-069b-465d-b6fc-defae6201642\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7" Apr 24 21:27:25.630621 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.629521 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-etc-sysconfig\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr" Apr 24 21:27:25.630621 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.629569 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-etc-systemd\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr" Apr 24 21:27:25.630621 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.629618 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xs9hq\" (UID: \"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc\") " pod="openshift-multus/multus-additional-cni-plugins-xs9hq" Apr 24 21:27:25.630621 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.630166 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc-cni-binary-copy\") pod \"multus-additional-cni-plugins-xs9hq\" (UID: \"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc\") " pod="openshift-multus/multus-additional-cni-plugins-xs9hq" Apr 24 21:27:25.630621 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.630196 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xs9hq\" (UID: \"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc\") " pod="openshift-multus/multus-additional-cni-plugins-xs9hq" Apr 24 21:27:25.630621 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.630214 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/94172220-e322-483c-ae3c-254c0bface83-cni-binary-copy\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.630621 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.630237 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/beaaaa3d-069b-465d-b6fc-defae6201642-device-dir\") pod \"aws-ebs-csi-driver-node-pzgp7\" (UID: \"beaaaa3d-069b-465d-b6fc-defae6201642\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7" Apr 24 21:27:25.630621 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.630340 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljpdj\" (UniqueName: \"kubernetes.io/projected/beaaaa3d-069b-465d-b6fc-defae6201642-kube-api-access-ljpdj\") pod \"aws-ebs-csi-driver-node-pzgp7\" (UID: \"beaaaa3d-069b-465d-b6fc-defae6201642\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7" Apr 24 21:27:25.630621 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.630397 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-etc-sysctl-conf\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr" Apr 24 21:27:25.631238 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.630584 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ww2k\" (UniqueName: \"kubernetes.io/projected/6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc-kube-api-access-5ww2k\") pod \"multus-additional-cni-plugins-xs9hq\" (UID: \"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc\") " pod="openshift-multus/multus-additional-cni-plugins-xs9hq" Apr 24 21:27:25.631238 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.630684 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5kq7\" (UniqueName: \"kubernetes.io/projected/94172220-e322-483c-ae3c-254c0bface83-kube-api-access-v5kq7\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.631238 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.630715 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/beaaaa3d-069b-465d-b6fc-defae6201642-sys-fs\") pod \"aws-ebs-csi-driver-node-pzgp7\" (UID: \"beaaaa3d-069b-465d-b6fc-defae6201642\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7" Apr 24 21:27:25.631238 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.630783 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-run\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr" Apr 24 21:27:25.631238 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.630818 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc-cnibin\") pod \"multus-additional-cni-plugins-xs9hq\" (UID: \"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc\") " pod="openshift-multus/multus-additional-cni-plugins-xs9hq" Apr 24 21:27:25.631238 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.630842 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-multus-cni-dir\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.631238 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.630879 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-multus-conf-dir\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.631238 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.630948 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/26ff341c-82cd-4599-9c8e-31e2436a0419-etc-tuned\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr" Apr 24 21:27:25.631238 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.630983 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20a034bb-c3d2-4d05-92de-ed16d2eda707-metrics-certs\") pod \"network-metrics-daemon-lkk5b\" (UID: \"20a034bb-c3d2-4d05-92de-ed16d2eda707\") " pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:27:25.631238 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.631017 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/70296c6e-ab82-4b02-8f53-04c16225df28-konnectivity-ca\") pod \"konnectivity-agent-sprws\" (UID: \"70296c6e-ab82-4b02-8f53-04c16225df28\") " pod="kube-system/konnectivity-agent-sprws" Apr 24 21:27:25.631238 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.631048 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/94172220-e322-483c-ae3c-254c0bface83-multus-daemon-config\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.631238 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.631076 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-etc-kubernetes\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " 
pod="openshift-cluster-node-tuning-operator/tuned-brwgr"
Apr 24 21:27:25.631238 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.631105 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-host\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr"
Apr 24 21:27:25.631238 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.631135 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xs9hq\" (UID: \"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc\") " pod="openshift-multus/multus-additional-cni-plugins-xs9hq"
Apr 24 21:27:25.631238 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.631166 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-multus-socket-dir-parent\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275"
Apr 24 21:27:25.631238 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.631207 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc-system-cni-dir\") pod \"multus-additional-cni-plugins-xs9hq\" (UID: \"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc\") " pod="openshift-multus/multus-additional-cni-plugins-xs9hq"
Apr 24 21:27:25.632039 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.631252 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/70296c6e-ab82-4b02-8f53-04c16225df28-agent-certs\") pod \"konnectivity-agent-sprws\" (UID: \"70296c6e-ab82-4b02-8f53-04c16225df28\") " pod="kube-system/konnectivity-agent-sprws"
Apr 24 21:27:25.632039 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.631313 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26ff341c-82cd-4599-9c8e-31e2436a0419-tmp\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr"
Apr 24 21:27:25.632039 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.631347 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz6vk\" (UniqueName: \"kubernetes.io/projected/26ff341c-82cd-4599-9c8e-31e2436a0419-kube-api-access-gz6vk\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr"
Apr 24 21:27:25.632039 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.631374 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-etc-modprobe-d\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr"
Apr 24 21:27:25.632039 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.631434 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-lib-modules\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr"
Apr 24 21:27:25.632039 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.631471 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-var-lib-kubelet\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr"
Apr 24 21:27:25.632039 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.631524 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/15075605-e0df-4d3a-90f3-7c8811d07731-iptables-alerter-script\") pod \"iptables-alerter-rvc97\" (UID: \"15075605-e0df-4d3a-90f3-7c8811d07731\") " pod="openshift-network-operator/iptables-alerter-rvc97"
Apr 24 21:27:25.632039 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.631565 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-hostroot\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275"
Apr 24 21:27:25.632039 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.631615 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dztv\" (UniqueName: \"kubernetes.io/projected/15075605-e0df-4d3a-90f3-7c8811d07731-kube-api-access-9dztv\") pod \"iptables-alerter-rvc97\" (UID: \"15075605-e0df-4d3a-90f3-7c8811d07731\") " pod="openshift-network-operator/iptables-alerter-rvc97"
Apr 24 21:27:25.632039 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.631654 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-system-cni-dir\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275"
Apr 24 21:27:25.632039 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.631705 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-host-run-netns\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275"
Apr 24 21:27:25.632673 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.632648 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-host-var-lib-kubelet\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275"
Apr 24 21:27:25.633908 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.633694 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr"
Apr 24 21:27:25.633908 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.633712 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-krb9p"
Apr 24 21:27:25.633908 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:25.633770 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a"
Apr 24 21:27:25.636165 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.636142 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-m4n82\""
Apr 24 21:27:25.636392 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.636371 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 21:27:25.636644 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.636625 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 21:27:25.636744 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.636676 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 21:27:25.636744 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.636728 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 21:27:25.636744 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.636737 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 21:27:25.636922 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.636628 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 21:27:25.659106 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.659077 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:22:24 +0000 UTC" deadline="2027-11-16 03:35:49.502413586 +0000 UTC"
Apr 24 21:27:25.659106 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.659104 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13686h8m23.84331292s"
Apr 24 21:27:25.720313 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.720273 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 21:27:25.732838 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.732806 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc-cni-binary-copy\") pod \"multus-additional-cni-plugins-xs9hq\" (UID: \"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc\") " pod="openshift-multus/multus-additional-cni-plugins-xs9hq"
Apr 24 21:27:25.733030 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.732853 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xs9hq\" (UID: \"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc\") " pod="openshift-multus/multus-additional-cni-plugins-xs9hq"
Apr 24 21:27:25.733030 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.732903 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2b9j\" (UniqueName: \"kubernetes.io/projected/d0f7d715-6263-49f8-ac7b-21d48a5b4438-kube-api-access-z2b9j\") pod \"node-ca-jzqgz\" (UID: \"d0f7d715-6263-49f8-ac7b-21d48a5b4438\") " pod="openshift-image-registry/node-ca-jzqgz"
Apr 24 21:27:25.733030 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.732933 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-systemd-units\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p"
Apr 24 21:27:25.733030 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.732960 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-ovnkube-script-lib\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p"
Apr 24 21:27:25.733030 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.732988 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/94172220-e322-483c-ae3c-254c0bface83-cni-binary-copy\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275"
Apr 24 21:27:25.733030 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733014 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-etc-openvswitch\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p"
Apr 24 21:27:25.733390 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733038 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/beaaaa3d-069b-465d-b6fc-defae6201642-device-dir\") pod \"aws-ebs-csi-driver-node-pzgp7\" (UID: \"beaaaa3d-069b-465d-b6fc-defae6201642\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7"
Apr 24 21:27:25.733390 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733063 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljpdj\" (UniqueName: \"kubernetes.io/projected/beaaaa3d-069b-465d-b6fc-defae6201642-kube-api-access-ljpdj\") pod \"aws-ebs-csi-driver-node-pzgp7\" (UID: \"beaaaa3d-069b-465d-b6fc-defae6201642\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7"
Apr 24 21:27:25.733390 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733089 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-etc-sysctl-conf\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr"
Apr 24 21:27:25.733390 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733112 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ww2k\" (UniqueName: \"kubernetes.io/projected/6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc-kube-api-access-5ww2k\") pod \"multus-additional-cni-plugins-xs9hq\" (UID: \"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc\") " pod="openshift-multus/multus-additional-cni-plugins-xs9hq"
Apr 24 21:27:25.733390 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733119 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/beaaaa3d-069b-465d-b6fc-defae6201642-device-dir\") pod \"aws-ebs-csi-driver-node-pzgp7\" (UID: \"beaaaa3d-069b-465d-b6fc-defae6201642\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7"
Apr 24 21:27:25.733390 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733135 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5kq7\" (UniqueName: \"kubernetes.io/projected/94172220-e322-483c-ae3c-254c0bface83-kube-api-access-v5kq7\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275"
Apr 24 21:27:25.733390 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733186 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/beaaaa3d-069b-465d-b6fc-defae6201642-sys-fs\") pod \"aws-ebs-csi-driver-node-pzgp7\" (UID: \"beaaaa3d-069b-465d-b6fc-defae6201642\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7"
Apr 24 21:27:25.733390 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733214 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-run\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr"
Apr 24 21:27:25.733390 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733236 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc-cnibin\") pod \"multus-additional-cni-plugins-xs9hq\" (UID: \"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc\") " pod="openshift-multus/multus-additional-cni-plugins-xs9hq"
Apr 24 21:27:25.733390 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733265 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-host-run-ovn-kubernetes\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p"
Apr 24 21:27:25.733390 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733299 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-multus-cni-dir\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275"
Apr 24 21:27:25.733390 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733330 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-multus-conf-dir\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275"
Apr 24 21:27:25.733390 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733363 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/26ff341c-82cd-4599-9c8e-31e2436a0419-etc-tuned\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr"
Apr 24 21:27:25.733390 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733392 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20a034bb-c3d2-4d05-92de-ed16d2eda707-metrics-certs\") pod \"network-metrics-daemon-lkk5b\" (UID: \"20a034bb-c3d2-4d05-92de-ed16d2eda707\") " pod="openshift-multus/network-metrics-daemon-lkk5b"
Apr 24 21:27:25.734033 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733427 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-host-run-netns\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p"
Apr 24 21:27:25.734033 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733445 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-run\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr"
Apr 24 21:27:25.734033 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733458 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/70296c6e-ab82-4b02-8f53-04c16225df28-konnectivity-ca\") pod \"konnectivity-agent-sprws\" (UID: \"70296c6e-ab82-4b02-8f53-04c16225df28\") " pod="kube-system/konnectivity-agent-sprws"
Apr 24 21:27:25.734033 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733491 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc-cnibin\") pod \"multus-additional-cni-plugins-xs9hq\" (UID: \"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc\") " pod="openshift-multus/multus-additional-cni-plugins-xs9hq"
Apr 24 21:27:25.734033 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733493 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/94172220-e322-483c-ae3c-254c0bface83-multus-daemon-config\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275"
Apr 24 21:27:25.734033 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733511 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xs9hq\" (UID: \"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc\") " pod="openshift-multus/multus-additional-cni-plugins-xs9hq"
Apr 24 21:27:25.734033 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733532 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-etc-kubernetes\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr"
Apr 24 21:27:25.734033 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733560 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-host\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr"
Apr 24 21:27:25.734033 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733584 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xs9hq\" (UID: \"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc\") " pod="openshift-multus/multus-additional-cni-plugins-xs9hq"
Apr 24 21:27:25.734033 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733610 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-host-kubelet\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p"
Apr 24 21:27:25.734033 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733635 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-run-systemd\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p"
Apr 24 21:27:25.734033 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733659 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-log-socket\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p"
Apr 24 21:27:25.734033 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733668 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/94172220-e322-483c-ae3c-254c0bface83-cni-binary-copy\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275"
Apr 24 21:27:25.734033 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733687 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-multus-socket-dir-parent\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275"
Apr 24 21:27:25.734033 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733697 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-multus-cni-dir\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275"
Apr 24 21:27:25.734033 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733712 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc-system-cni-dir\") pod \"multus-additional-cni-plugins-xs9hq\" (UID: \"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc\") " pod="openshift-multus/multus-additional-cni-plugins-xs9hq"
Apr 24 21:27:25.734033 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733771 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-run-ovn\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p"
Apr 24 21:27:25.734795 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733778 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/beaaaa3d-069b-465d-b6fc-defae6201642-sys-fs\") pod \"aws-ebs-csi-driver-node-pzgp7\" (UID: \"beaaaa3d-069b-465d-b6fc-defae6201642\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7"
Apr 24 21:27:25.734795 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733799 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-ovn-node-metrics-cert\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p"
Apr 24 21:27:25.734795 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733813 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 21:27:25.734795 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733829 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcw9h\" (UniqueName: \"kubernetes.io/projected/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-kube-api-access-qcw9h\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p"
Apr 24 21:27:25.734795 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733834 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc-system-cni-dir\") pod \"multus-additional-cni-plugins-xs9hq\" (UID: \"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc\") " pod="openshift-multus/multus-additional-cni-plugins-xs9hq"
Apr 24 21:27:25.734795 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733839 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-etc-kubernetes\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr"
Apr 24 21:27:25.734795 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733840 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-etc-sysctl-conf\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr"
Apr 24 21:27:25.734795 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733854 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/70296c6e-ab82-4b02-8f53-04c16225df28-agent-certs\") pod \"konnectivity-agent-sprws\" (UID: \"70296c6e-ab82-4b02-8f53-04c16225df28\") " pod="kube-system/konnectivity-agent-sprws"
Apr 24 21:27:25.734795 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733881 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26ff341c-82cd-4599-9c8e-31e2436a0419-tmp\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr"
Apr 24 21:27:25.734795 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733916 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-multus-socket-dir-parent\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275"
Apr 24 21:27:25.734795 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733921 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gz6vk\" (UniqueName: \"kubernetes.io/projected/26ff341c-82cd-4599-9c8e-31e2436a0419-kube-api-access-gz6vk\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr"
Apr 24 21:27:25.734795 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733965 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d0f7d715-6263-49f8-ac7b-21d48a5b4438-serviceca\") pod \"node-ca-jzqgz\" (UID: \"d0f7d715-6263-49f8-ac7b-21d48a5b4438\") " pod="openshift-image-registry/node-ca-jzqgz"
Apr 24 21:27:25.734795 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:25.733972 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:25.734795 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733991 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-dbus\") pod \"global-pull-secret-syncer-hxdgr\" (UID: \"7a2b19a8-7cce-48ea-a91f-3306187c2d2a\") " pod="kube-system/global-pull-secret-syncer-hxdgr"
Apr 24 21:27:25.734795 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734017 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2556ea37-119f-46c5-bee7-7cfb12afca0f-tmp-dir\") pod \"node-resolver-w7q5f\" (UID: \"2556ea37-119f-46c5-bee7-7cfb12afca0f\") " pod="openshift-dns/node-resolver-w7q5f"
Apr 24 21:27:25.734795 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734019 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/94172220-e322-483c-ae3c-254c0bface83-multus-daemon-config\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275"
Apr 24 21:27:25.734795 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.733799 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-multus-conf-dir\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275"
Apr 24 21:27:25.735624 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:25.734084 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20a034bb-c3d2-4d05-92de-ed16d2eda707-metrics-certs podName:20a034bb-c3d2-4d05-92de-ed16d2eda707 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:26.23402853 +0000 UTC m=+3.061898646 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20a034bb-c3d2-4d05-92de-ed16d2eda707-metrics-certs") pod "network-metrics-daemon-lkk5b" (UID: "20a034bb-c3d2-4d05-92de-ed16d2eda707") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:25.735624 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734103 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-var-lib-openvswitch\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p"
Apr 24 21:27:25.735624 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734105 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-host\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr"
Apr 24 21:27:25.735624 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734124 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-host-cni-bin\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p"
Apr 24 21:27:25.735624 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734142 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-etc-modprobe-d\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr"
Apr 24 21:27:25.735624 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734159 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-lib-modules\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr"
Apr 24 21:27:25.735624 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734174 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-var-lib-kubelet\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr"
Apr 24 21:27:25.735624 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734203 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/15075605-e0df-4d3a-90f3-7c8811d07731-iptables-alerter-script\") pod \"iptables-alerter-rvc97\" (UID: \"15075605-e0df-4d3a-90f3-7c8811d07731\") " pod="openshift-network-operator/iptables-alerter-rvc97"
Apr 24 21:27:25.735624 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734230 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/70296c6e-ab82-4b02-8f53-04c16225df28-konnectivity-ca\") pod \"konnectivity-agent-sprws\" (UID: \"70296c6e-ab82-4b02-8f53-04c16225df28\") " pod="kube-system/konnectivity-agent-sprws"
Apr 24 21:27:25.735624 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734237 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-original-pull-secret\") pod \"global-pull-secret-syncer-hxdgr\" (UID: \"7a2b19a8-7cce-48ea-a91f-3306187c2d2a\") " pod="kube-system/global-pull-secret-syncer-hxdgr"
Apr 24 21:27:25.735624 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734291 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-hostroot\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275"
Apr 24 21:27:25.735624 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734313 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-var-lib-kubelet\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr"
Apr 24 21:27:25.735624 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734329 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dztv\" (UniqueName: \"kubernetes.io/projected/15075605-e0df-4d3a-90f3-7c8811d07731-kube-api-access-9dztv\") pod \"iptables-alerter-rvc97\" (UID: \"15075605-e0df-4d3a-90f3-7c8811d07731\") " pod="openshift-network-operator/iptables-alerter-rvc97"
Apr 24 21:27:25.735624 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734363 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-ovnkube-config\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p"
Apr 24 21:27:25.735624 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734369 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName:
\"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-etc-modprobe-d\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr" Apr 24 21:27:25.735624 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734429 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-lib-modules\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr" Apr 24 21:27:25.736371 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734435 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-hostroot\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.736371 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734465 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-system-cni-dir\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.736371 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734491 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-host-run-netns\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.736371 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734520 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-host-var-lib-kubelet\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.736371 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734546 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/beaaaa3d-069b-465d-b6fc-defae6201642-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pzgp7\" (UID: \"beaaaa3d-069b-465d-b6fc-defae6201642\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7" Apr 24 21:27:25.736371 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734571 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-etc-sysctl-d\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr" Apr 24 21:27:25.736371 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734595 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/15075605-e0df-4d3a-90f3-7c8811d07731-host-slash\") pod \"iptables-alerter-rvc97\" (UID: \"15075605-e0df-4d3a-90f3-7c8811d07731\") " pod="openshift-network-operator/iptables-alerter-rvc97" Apr 24 21:27:25.736371 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734627 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-kubelet-config\") pod \"global-pull-secret-syncer-hxdgr\" (UID: \"7a2b19a8-7cce-48ea-a91f-3306187c2d2a\") " pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:27:25.736371 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734655 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-env-overrides\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.736371 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734683 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-host-run-k8s-cni-cncf-io\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.736371 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734712 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-os-release\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.736371 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734718 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/beaaaa3d-069b-465d-b6fc-defae6201642-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pzgp7\" (UID: \"beaaaa3d-069b-465d-b6fc-defae6201642\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7" Apr 24 21:27:25.736371 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734724 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc-cni-binary-copy\") pod \"multus-additional-cni-plugins-xs9hq\" (UID: \"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc\") " pod="openshift-multus/multus-additional-cni-plugins-xs9hq" Apr 24 21:27:25.736371 ip-10-0-139-184 kubenswrapper[2578]: I0424 
21:27:25.734740 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-etc-kubernetes\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.736371 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734767 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bn6l6\" (UniqueName: \"kubernetes.io/projected/20a034bb-c3d2-4d05-92de-ed16d2eda707-kube-api-access-bn6l6\") pod \"network-metrics-daemon-lkk5b\" (UID: \"20a034bb-c3d2-4d05-92de-ed16d2eda707\") " pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:27:25.736371 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734774 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-system-cni-dir\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.736371 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734801 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0f7d715-6263-49f8-ac7b-21d48a5b4438-host\") pod \"node-ca-jzqgz\" (UID: \"d0f7d715-6263-49f8-ac7b-21d48a5b4438\") " pod="openshift-image-registry/node-ca-jzqgz" Apr 24 21:27:25.737155 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734827 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-host-slash\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.737155 ip-10-0-139-184 kubenswrapper[2578]: I0424 
21:27:25.734843 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-etc-sysctl-d\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr" Apr 24 21:27:25.737155 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734856 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-host-var-lib-cni-bin\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.737155 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734902 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/15075605-e0df-4d3a-90f3-7c8811d07731-host-slash\") pod \"iptables-alerter-rvc97\" (UID: \"15075605-e0df-4d3a-90f3-7c8811d07731\") " pod="openshift-network-operator/iptables-alerter-rvc97" Apr 24 21:27:25.737155 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734910 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-sys\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr" Apr 24 21:27:25.737155 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.734939 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc-os-release\") pod \"multus-additional-cni-plugins-xs9hq\" (UID: \"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc\") " pod="openshift-multus/multus-additional-cni-plugins-xs9hq" Apr 24 21:27:25.737155 ip-10-0-139-184 kubenswrapper[2578]: I0424 
21:27:25.734997 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-os-release\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.737155 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735012 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc-os-release\") pod \"multus-additional-cni-plugins-xs9hq\" (UID: \"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc\") " pod="openshift-multus/multus-additional-cni-plugins-xs9hq" Apr 24 21:27:25.737155 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735024 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-etc-kubernetes\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.737155 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735032 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-host-run-netns\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.737155 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735049 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-host-run-k8s-cni-cncf-io\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.737155 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735072 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdqp7\" (UniqueName: \"kubernetes.io/projected/2556ea37-119f-46c5-bee7-7cfb12afca0f-kube-api-access-bdqp7\") pod \"node-resolver-w7q5f\" (UID: \"2556ea37-119f-46c5-bee7-7cfb12afca0f\") " pod="openshift-dns/node-resolver-w7q5f" Apr 24 21:27:25.737155 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735082 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-host-var-lib-kubelet\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.737155 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735097 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-host-var-lib-cni-bin\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.737155 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735120 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-host-cni-netd\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.737155 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735134 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-sys\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr" Apr 24 21:27:25.737155 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735164 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/beaaaa3d-069b-465d-b6fc-defae6201642-registration-dir\") pod \"aws-ebs-csi-driver-node-pzgp7\" (UID: \"beaaaa3d-069b-465d-b6fc-defae6201642\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7" Apr 24 21:27:25.737155 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735212 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hqhp\" (UniqueName: \"kubernetes.io/projected/af45c5fb-e377-44ea-ad83-ad7e5bea725b-kube-api-access-4hqhp\") pod \"network-check-target-c56pf\" (UID: \"af45c5fb-e377-44ea-ad83-ad7e5bea725b\") " pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:27:25.737991 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735267 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/beaaaa3d-069b-465d-b6fc-defae6201642-registration-dir\") pod \"aws-ebs-csi-driver-node-pzgp7\" (UID: \"beaaaa3d-069b-465d-b6fc-defae6201642\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7" Apr 24 21:27:25.737991 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735301 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-cnibin\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.737991 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735352 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.737991 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735410 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-cnibin\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.737991 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735450 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-host-var-lib-cni-multus\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.737991 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735488 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-host-run-multus-certs\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.737991 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735487 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/15075605-e0df-4d3a-90f3-7c8811d07731-iptables-alerter-script\") pod \"iptables-alerter-rvc97\" (UID: \"15075605-e0df-4d3a-90f3-7c8811d07731\") " pod="openshift-network-operator/iptables-alerter-rvc97" Apr 24 21:27:25.737991 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735525 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-host-run-multus-certs\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " 
pod="openshift-multus/multus-xp275" Apr 24 21:27:25.737991 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735570 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/94172220-e322-483c-ae3c-254c0bface83-host-var-lib-cni-multus\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " pod="openshift-multus/multus-xp275" Apr 24 21:27:25.737991 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735591 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/beaaaa3d-069b-465d-b6fc-defae6201642-socket-dir\") pod \"aws-ebs-csi-driver-node-pzgp7\" (UID: \"beaaaa3d-069b-465d-b6fc-defae6201642\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7" Apr 24 21:27:25.737991 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735636 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/beaaaa3d-069b-465d-b6fc-defae6201642-etc-selinux\") pod \"aws-ebs-csi-driver-node-pzgp7\" (UID: \"beaaaa3d-069b-465d-b6fc-defae6201642\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7" Apr 24 21:27:25.737991 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735662 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-etc-sysconfig\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr" Apr 24 21:27:25.737991 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735688 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/beaaaa3d-069b-465d-b6fc-defae6201642-socket-dir\") pod \"aws-ebs-csi-driver-node-pzgp7\" (UID: 
\"beaaaa3d-069b-465d-b6fc-defae6201642\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7" Apr 24 21:27:25.737991 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735690 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-run-openvswitch\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.737991 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735724 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-node-log\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.737991 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735748 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-etc-sysconfig\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr" Apr 24 21:27:25.737991 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735752 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-etc-systemd\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr" Apr 24 21:27:25.738744 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735785 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xs9hq\" (UID: \"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc\") " pod="openshift-multus/multus-additional-cni-plugins-xs9hq" Apr 24 21:27:25.738744 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735790 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/26ff341c-82cd-4599-9c8e-31e2436a0419-etc-systemd\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr" Apr 24 21:27:25.738744 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735752 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xs9hq\" (UID: \"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc\") " pod="openshift-multus/multus-additional-cni-plugins-xs9hq" Apr 24 21:27:25.738744 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735813 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2556ea37-119f-46c5-bee7-7cfb12afca0f-hosts-file\") pod \"node-resolver-w7q5f\" (UID: \"2556ea37-119f-46c5-bee7-7cfb12afca0f\") " pod="openshift-dns/node-resolver-w7q5f" Apr 24 21:27:25.738744 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.735853 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/beaaaa3d-069b-465d-b6fc-defae6201642-etc-selinux\") pod \"aws-ebs-csi-driver-node-pzgp7\" (UID: \"beaaaa3d-069b-465d-b6fc-defae6201642\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7" Apr 24 21:27:25.738744 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.736603 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xs9hq\" (UID: \"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc\") " pod="openshift-multus/multus-additional-cni-plugins-xs9hq" Apr 24 21:27:25.738744 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.737394 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/26ff341c-82cd-4599-9c8e-31e2436a0419-etc-tuned\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr" Apr 24 21:27:25.738744 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.738321 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26ff341c-82cd-4599-9c8e-31e2436a0419-tmp\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr" Apr 24 21:27:25.739122 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.739070 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/70296c6e-ab82-4b02-8f53-04c16225df28-agent-certs\") pod \"konnectivity-agent-sprws\" (UID: \"70296c6e-ab82-4b02-8f53-04c16225df28\") " pod="kube-system/konnectivity-agent-sprws" Apr 24 21:27:25.752709 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:25.752677 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:25.752709 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:25.752707 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:25.752933 
ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:25.752723 2578 projected.go:194] Error preparing data for projected volume kube-api-access-4hqhp for pod openshift-network-diagnostics/network-check-target-c56pf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:25.752933 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:25.752812 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/af45c5fb-e377-44ea-ad83-ad7e5bea725b-kube-api-access-4hqhp podName:af45c5fb-e377-44ea-ad83-ad7e5bea725b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:26.25278081 +0000 UTC m=+3.080650936 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-4hqhp" (UniqueName: "kubernetes.io/projected/af45c5fb-e377-44ea-ad83-ad7e5bea725b-kube-api-access-4hqhp") pod "network-check-target-c56pf" (UID: "af45c5fb-e377-44ea-ad83-ad7e5bea725b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:25.755091 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.754987 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljpdj\" (UniqueName: \"kubernetes.io/projected/beaaaa3d-069b-465d-b6fc-defae6201642-kube-api-access-ljpdj\") pod \"aws-ebs-csi-driver-node-pzgp7\" (UID: \"beaaaa3d-069b-465d-b6fc-defae6201642\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7" Apr 24 21:27:25.755091 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.755035 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5kq7\" (UniqueName: \"kubernetes.io/projected/94172220-e322-483c-ae3c-254c0bface83-kube-api-access-v5kq7\") pod \"multus-xp275\" (UID: \"94172220-e322-483c-ae3c-254c0bface83\") " 
pod="openshift-multus/multus-xp275" Apr 24 21:27:25.755358 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.755268 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz6vk\" (UniqueName: \"kubernetes.io/projected/26ff341c-82cd-4599-9c8e-31e2436a0419-kube-api-access-gz6vk\") pod \"tuned-brwgr\" (UID: \"26ff341c-82cd-4599-9c8e-31e2436a0419\") " pod="openshift-cluster-node-tuning-operator/tuned-brwgr" Apr 24 21:27:25.755641 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.755614 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dztv\" (UniqueName: \"kubernetes.io/projected/15075605-e0df-4d3a-90f3-7c8811d07731-kube-api-access-9dztv\") pod \"iptables-alerter-rvc97\" (UID: \"15075605-e0df-4d3a-90f3-7c8811d07731\") " pod="openshift-network-operator/iptables-alerter-rvc97" Apr 24 21:27:25.755857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.755798 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ww2k\" (UniqueName: \"kubernetes.io/projected/6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc-kube-api-access-5ww2k\") pod \"multus-additional-cni-plugins-xs9hq\" (UID: \"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc\") " pod="openshift-multus/multus-additional-cni-plugins-xs9hq" Apr 24 21:27:25.755857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.755839 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn6l6\" (UniqueName: \"kubernetes.io/projected/20a034bb-c3d2-4d05-92de-ed16d2eda707-kube-api-access-bn6l6\") pod \"network-metrics-daemon-lkk5b\" (UID: \"20a034bb-c3d2-4d05-92de-ed16d2eda707\") " pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:27:25.763533 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.763478 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" 
event={"ID":"77560a24cc76a08538baa0efe4273073","Type":"ContainerStarted","Data":"4fadac1560b4b0617fbe33e06ca23ddc2ea78097dc19e6e0354b14b69893ab48"} Apr 24 21:27:25.763665 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.763644 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-184.ec2.internal" event={"ID":"e3440d423eacb4ec58cf1cc320321a41","Type":"ContainerStarted","Data":"605a0d7e07391e4b656c79039ff908d82695d541011e92ffc714acaf6b06d17f"} Apr 24 21:27:25.817624 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.817590 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:25.836972 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.836939 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-run-openvswitch\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.836972 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.836980 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-node-log\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837194 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837009 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2556ea37-119f-46c5-bee7-7cfb12afca0f-hosts-file\") pod \"node-resolver-w7q5f\" (UID: \"2556ea37-119f-46c5-bee7-7cfb12afca0f\") " pod="openshift-dns/node-resolver-w7q5f" Apr 24 21:27:25.837194 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837071 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2556ea37-119f-46c5-bee7-7cfb12afca0f-hosts-file\") pod \"node-resolver-w7q5f\" (UID: \"2556ea37-119f-46c5-bee7-7cfb12afca0f\") " pod="openshift-dns/node-resolver-w7q5f" Apr 24 21:27:25.837194 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837078 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-node-log\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837194 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837128 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-run-openvswitch\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837194 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837175 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2b9j\" (UniqueName: \"kubernetes.io/projected/d0f7d715-6263-49f8-ac7b-21d48a5b4438-kube-api-access-z2b9j\") pod \"node-ca-jzqgz\" (UID: \"d0f7d715-6263-49f8-ac7b-21d48a5b4438\") " pod="openshift-image-registry/node-ca-jzqgz" Apr 24 21:27:25.837412 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837208 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-systemd-units\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837412 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837224 2578 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-ovnkube-script-lib\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837412 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837249 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-etc-openvswitch\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837412 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837258 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-systemd-units\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837412 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837293 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-etc-openvswitch\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837412 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837324 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-host-run-ovn-kubernetes\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837412 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837362 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-host-run-netns\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837412 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837381 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-host-kubelet\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837412 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837401 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-run-systemd\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837806 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837426 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-log-socket\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837806 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837437 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-host-run-netns\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837806 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837439 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-host-run-ovn-kubernetes\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837806 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837454 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-run-ovn\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837806 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837453 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-host-kubelet\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837806 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837479 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-ovn-node-metrics-cert\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837806 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837482 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-log-socket\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837806 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837488 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-run-systemd\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837806 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837503 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qcw9h\" (UniqueName: \"kubernetes.io/projected/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-kube-api-access-qcw9h\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837806 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837534 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-run-ovn\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837806 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837537 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d0f7d715-6263-49f8-ac7b-21d48a5b4438-serviceca\") pod \"node-ca-jzqgz\" (UID: \"d0f7d715-6263-49f8-ac7b-21d48a5b4438\") " pod="openshift-image-registry/node-ca-jzqgz" Apr 24 21:27:25.837806 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837572 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-dbus\") pod \"global-pull-secret-syncer-hxdgr\" (UID: \"7a2b19a8-7cce-48ea-a91f-3306187c2d2a\") " pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:27:25.837806 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837598 2578 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2556ea37-119f-46c5-bee7-7cfb12afca0f-tmp-dir\") pod \"node-resolver-w7q5f\" (UID: \"2556ea37-119f-46c5-bee7-7cfb12afca0f\") " pod="openshift-dns/node-resolver-w7q5f" Apr 24 21:27:25.837806 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837643 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-var-lib-openvswitch\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837806 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837674 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-host-cni-bin\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837806 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837701 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-original-pull-secret\") pod \"global-pull-secret-syncer-hxdgr\" (UID: \"7a2b19a8-7cce-48ea-a91f-3306187c2d2a\") " pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:27:25.837806 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837729 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-ovnkube-config\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.837806 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837733 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-dbus\") pod \"global-pull-secret-syncer-hxdgr\" (UID: \"7a2b19a8-7cce-48ea-a91f-3306187c2d2a\") " pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:27:25.838566 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837761 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-kubelet-config\") pod \"global-pull-secret-syncer-hxdgr\" (UID: \"7a2b19a8-7cce-48ea-a91f-3306187c2d2a\") " pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:27:25.838566 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837767 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-host-cni-bin\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.838566 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837788 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-env-overrides\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.838566 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837837 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-kubelet-config\") pod \"global-pull-secret-syncer-hxdgr\" (UID: \"7a2b19a8-7cce-48ea-a91f-3306187c2d2a\") " pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:27:25.838566 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837843 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0f7d715-6263-49f8-ac7b-21d48a5b4438-host\") pod \"node-ca-jzqgz\" (UID: \"d0f7d715-6263-49f8-ac7b-21d48a5b4438\") " pod="openshift-image-registry/node-ca-jzqgz" Apr 24 21:27:25.838566 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837789 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-var-lib-openvswitch\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.838566 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:25.837865 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:25.838566 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837871 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-host-slash\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.838566 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837910 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0f7d715-6263-49f8-ac7b-21d48a5b4438-host\") pod \"node-ca-jzqgz\" (UID: \"d0f7d715-6263-49f8-ac7b-21d48a5b4438\") " pod="openshift-image-registry/node-ca-jzqgz" Apr 24 21:27:25.838566 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837920 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2556ea37-119f-46c5-bee7-7cfb12afca0f-tmp-dir\") pod \"node-resolver-w7q5f\" (UID: \"2556ea37-119f-46c5-bee7-7cfb12afca0f\") " 
pod="openshift-dns/node-resolver-w7q5f" Apr 24 21:27:25.838566 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837919 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bdqp7\" (UniqueName: \"kubernetes.io/projected/2556ea37-119f-46c5-bee7-7cfb12afca0f-kube-api-access-bdqp7\") pod \"node-resolver-w7q5f\" (UID: \"2556ea37-119f-46c5-bee7-7cfb12afca0f\") " pod="openshift-dns/node-resolver-w7q5f" Apr 24 21:27:25.838566 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:25.837948 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-original-pull-secret podName:7a2b19a8-7cce-48ea-a91f-3306187c2d2a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:26.337928723 +0000 UTC m=+3.165798853 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-original-pull-secret") pod "global-pull-secret-syncer-hxdgr" (UID: "7a2b19a8-7cce-48ea-a91f-3306187c2d2a") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:25.838566 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837949 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-ovnkube-script-lib\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.838566 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837949 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-host-slash\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.838566 ip-10-0-139-184 
kubenswrapper[2578]: I0424 21:27:25.837984 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-host-cni-netd\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.838566 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.837999 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-host-cni-netd\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.838566 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.838004 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d0f7d715-6263-49f8-ac7b-21d48a5b4438-serviceca\") pod \"node-ca-jzqgz\" (UID: \"d0f7d715-6263-49f8-ac7b-21d48a5b4438\") " pod="openshift-image-registry/node-ca-jzqgz" Apr 24 21:27:25.839373 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.838027 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.839373 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.838093 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.839373 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.838412 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-ovnkube-config\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.839373 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.838434 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-env-overrides\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.840068 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.840043 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-ovn-node-metrics-cert\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.878733 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.878704 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2b9j\" (UniqueName: \"kubernetes.io/projected/d0f7d715-6263-49f8-ac7b-21d48a5b4438-kube-api-access-z2b9j\") pod \"node-ca-jzqgz\" (UID: \"d0f7d715-6263-49f8-ac7b-21d48a5b4438\") " pod="openshift-image-registry/node-ca-jzqgz" Apr 24 21:27:25.878870 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.878709 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdqp7\" (UniqueName: \"kubernetes.io/projected/2556ea37-119f-46c5-bee7-7cfb12afca0f-kube-api-access-bdqp7\") pod \"node-resolver-w7q5f\" (UID: 
\"2556ea37-119f-46c5-bee7-7cfb12afca0f\") " pod="openshift-dns/node-resolver-w7q5f" Apr 24 21:27:25.884741 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.884712 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcw9h\" (UniqueName: \"kubernetes.io/projected/cde5bc87-530f-4ee7-8f38-39b875bbd4e6-kube-api-access-qcw9h\") pod \"ovnkube-node-krb9p\" (UID: \"cde5bc87-530f-4ee7-8f38-39b875bbd4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:25.920220 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.920186 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-sprws" Apr 24 21:27:25.927687 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.927627 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xp275" Apr 24 21:27:25.941957 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.941931 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rvc97" Apr 24 21:27:25.954605 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.954578 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-brwgr" Apr 24 21:27:25.962328 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.962300 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7" Apr 24 21:27:25.970862 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.970837 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xs9hq" Apr 24 21:27:25.978576 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.978548 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-jzqgz" Apr 24 21:27:25.985367 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.985338 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w7q5f" Apr 24 21:27:25.991090 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:25.991067 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:26.241308 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:26.241219 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20a034bb-c3d2-4d05-92de-ed16d2eda707-metrics-certs\") pod \"network-metrics-daemon-lkk5b\" (UID: \"20a034bb-c3d2-4d05-92de-ed16d2eda707\") " pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:27:26.241483 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:26.241394 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:26.241483 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:26.241474 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20a034bb-c3d2-4d05-92de-ed16d2eda707-metrics-certs podName:20a034bb-c3d2-4d05-92de-ed16d2eda707 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:27.241454859 +0000 UTC m=+4.069324968 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20a034bb-c3d2-4d05-92de-ed16d2eda707-metrics-certs") pod "network-metrics-daemon-lkk5b" (UID: "20a034bb-c3d2-4d05-92de-ed16d2eda707") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:26.342651 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:26.342614 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-original-pull-secret\") pod \"global-pull-secret-syncer-hxdgr\" (UID: \"7a2b19a8-7cce-48ea-a91f-3306187c2d2a\") " pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:27:26.342852 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:26.342674 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hqhp\" (UniqueName: \"kubernetes.io/projected/af45c5fb-e377-44ea-ad83-ad7e5bea725b-kube-api-access-4hqhp\") pod \"network-check-target-c56pf\" (UID: \"af45c5fb-e377-44ea-ad83-ad7e5bea725b\") " pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:27:26.342852 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:26.342791 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:26.342994 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:26.342879 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-original-pull-secret podName:7a2b19a8-7cce-48ea-a91f-3306187c2d2a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:27.3428554 +0000 UTC m=+4.170725515 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-original-pull-secret") pod "global-pull-secret-syncer-hxdgr" (UID: "7a2b19a8-7cce-48ea-a91f-3306187c2d2a") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:26.342994 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:26.342795 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:26.342994 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:26.342939 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:26.342994 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:26.342951 2578 projected.go:194] Error preparing data for projected volume kube-api-access-4hqhp for pod openshift-network-diagnostics/network-check-target-c56pf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:26.343145 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:26.343006 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/af45c5fb-e377-44ea-ad83-ad7e5bea725b-kube-api-access-4hqhp podName:af45c5fb-e377-44ea-ad83-ad7e5bea725b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:27.342993502 +0000 UTC m=+4.170863613 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4hqhp" (UniqueName: "kubernetes.io/projected/af45c5fb-e377-44ea-ad83-ad7e5bea725b-kube-api-access-4hqhp") pod "network-check-target-c56pf" (UID: "af45c5fb-e377-44ea-ad83-ad7e5bea725b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:26.523065 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:26.523038 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94172220_e322_483c_ae3c_254c0bface83.slice/crio-2e6c2fff3bc30d10867ac1bc2e207b0ba6bb9db328794e2953f2f20c777c7984 WatchSource:0}: Error finding container 2e6c2fff3bc30d10867ac1bc2e207b0ba6bb9db328794e2953f2f20c777c7984: Status 404 returned error can't find the container with id 2e6c2fff3bc30d10867ac1bc2e207b0ba6bb9db328794e2953f2f20c777c7984 Apr 24 21:27:26.524582 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:26.524478 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26ff341c_82cd_4599_9c8e_31e2436a0419.slice/crio-b19a69fa5959c6cdd0679a0cb1e15f3434a6957a3b5c66a870554bf842478e83 WatchSource:0}: Error finding container b19a69fa5959c6cdd0679a0cb1e15f3434a6957a3b5c66a870554bf842478e83: Status 404 returned error can't find the container with id b19a69fa5959c6cdd0679a0cb1e15f3434a6957a3b5c66a870554bf842478e83 Apr 24 21:27:26.525816 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:26.525791 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbeaaaa3d_069b_465d_b6fc_defae6201642.slice/crio-486d08e1a21bfb436b2f865f90e48060e7bf3497d3912ab4c07edf4dcd048851 WatchSource:0}: Error finding container 486d08e1a21bfb436b2f865f90e48060e7bf3497d3912ab4c07edf4dcd048851: Status 404 returned error can't find the 
container with id 486d08e1a21bfb436b2f865f90e48060e7bf3497d3912ab4c07edf4dcd048851
Apr 24 21:27:26.529490 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:26.529467 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e5f4710_6ba6_44f1_ac71_c7d39ea48ffc.slice/crio-f7dd1dd5ba4764e50a96f612863de57c9e559ad07d45bcaa0f2aa59806fda936 WatchSource:0}: Error finding container f7dd1dd5ba4764e50a96f612863de57c9e559ad07d45bcaa0f2aa59806fda936: Status 404 returned error can't find the container with id f7dd1dd5ba4764e50a96f612863de57c9e559ad07d45bcaa0f2aa59806fda936
Apr 24 21:27:26.530557 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:26.530532 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70296c6e_ab82_4b02_8f53_04c16225df28.slice/crio-eb3301d251f54328d93ccab1fe7d4218c32294c0cde9d817ebd6bb45e4eda863 WatchSource:0}: Error finding container eb3301d251f54328d93ccab1fe7d4218c32294c0cde9d817ebd6bb45e4eda863: Status 404 returned error can't find the container with id eb3301d251f54328d93ccab1fe7d4218c32294c0cde9d817ebd6bb45e4eda863
Apr 24 21:27:26.532050 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:26.531835 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0f7d715_6263_49f8_ac7b_21d48a5b4438.slice/crio-3f661ee6f96b5848d501ac4e0873a3a4b406b4cf8ab66bc88b21460ef9d7cf48 WatchSource:0}: Error finding container 3f661ee6f96b5848d501ac4e0873a3a4b406b4cf8ab66bc88b21460ef9d7cf48: Status 404 returned error can't find the container with id 3f661ee6f96b5848d501ac4e0873a3a4b406b4cf8ab66bc88b21460ef9d7cf48
Apr 24 21:27:26.532277 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:26.532149 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2556ea37_119f_46c5_bee7_7cfb12afca0f.slice/crio-33926b5d7578f8001cda5a0944ad0aafe63fb06d1be361030f455aea19c3a8b0 WatchSource:0}: Error finding container 33926b5d7578f8001cda5a0944ad0aafe63fb06d1be361030f455aea19c3a8b0: Status 404 returned error can't find the container with id 33926b5d7578f8001cda5a0944ad0aafe63fb06d1be361030f455aea19c3a8b0
Apr 24 21:27:26.533273 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:26.533252 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcde5bc87_530f_4ee7_8f38_39b875bbd4e6.slice/crio-286630f3b4f07aa6d1a69a43583f234f8fb7d31b4085b5dba4714fa0cc1b145d WatchSource:0}: Error finding container 286630f3b4f07aa6d1a69a43583f234f8fb7d31b4085b5dba4714fa0cc1b145d: Status 404 returned error can't find the container with id 286630f3b4f07aa6d1a69a43583f234f8fb7d31b4085b5dba4714fa0cc1b145d
Apr 24 21:27:26.536923 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:27:26.535424 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15075605_e0df_4d3a_90f3_7c8811d07731.slice/crio-736d0e64563d958f735d60d9ee61a70da3d66b3e2ee4d6242f7c2f0b403d827b WatchSource:0}: Error finding container 736d0e64563d958f735d60d9ee61a70da3d66b3e2ee4d6242f7c2f0b403d827b: Status 404 returned error can't find the container with id 736d0e64563d958f735d60d9ee61a70da3d66b3e2ee4d6242f7c2f0b403d827b
Apr 24 21:27:26.660186 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:26.659993 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:22:24 +0000 UTC" deadline="2027-10-13 08:23:51.066940744 +0000 UTC"
Apr 24 21:27:26.660186 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:26.660178 2578 certificate_manager.go:431] "Waiting for next certificate rotation"
logger="kubernetes.io/kubelet-serving" sleep="12874h56m24.406765762s"
Apr 24 21:27:26.758833 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:26.758727 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr"
Apr 24 21:27:26.758833 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:26.758731 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b"
Apr 24 21:27:26.759017 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:26.758953 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707"
Apr 24 21:27:26.759017 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:26.758831 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a"
Apr 24 21:27:26.765718 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:26.765691 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rvc97" event={"ID":"15075605-e0df-4d3a-90f3-7c8811d07731","Type":"ContainerStarted","Data":"736d0e64563d958f735d60d9ee61a70da3d66b3e2ee4d6242f7c2f0b403d827b"}
Apr 24 21:27:26.766590 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:26.766569 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" event={"ID":"cde5bc87-530f-4ee7-8f38-39b875bbd4e6","Type":"ContainerStarted","Data":"286630f3b4f07aa6d1a69a43583f234f8fb7d31b4085b5dba4714fa0cc1b145d"}
Apr 24 21:27:26.767544 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:26.767526 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jzqgz" event={"ID":"d0f7d715-6263-49f8-ac7b-21d48a5b4438","Type":"ContainerStarted","Data":"3f661ee6f96b5848d501ac4e0873a3a4b406b4cf8ab66bc88b21460ef9d7cf48"}
Apr 24 21:27:26.769503 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:26.769483 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xs9hq" event={"ID":"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc","Type":"ContainerStarted","Data":"f7dd1dd5ba4764e50a96f612863de57c9e559ad07d45bcaa0f2aa59806fda936"}
Apr 24 21:27:26.771406 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:26.771377 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7" event={"ID":"beaaaa3d-069b-465d-b6fc-defae6201642","Type":"ContainerStarted","Data":"486d08e1a21bfb436b2f865f90e48060e7bf3497d3912ab4c07edf4dcd048851"}
Apr 24 21:27:26.772916 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:26.772879 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="kube-system/kube-apiserver-proxy-ip-10-0-139-184.ec2.internal" event={"ID":"e3440d423eacb4ec58cf1cc320321a41","Type":"ContainerStarted","Data":"0ea3b91108b6ecb34dba6162350a6d1aec4c0bd0993f25625ddacf9ed8d66c1d"}
Apr 24 21:27:26.773826 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:26.773798 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w7q5f" event={"ID":"2556ea37-119f-46c5-bee7-7cfb12afca0f","Type":"ContainerStarted","Data":"33926b5d7578f8001cda5a0944ad0aafe63fb06d1be361030f455aea19c3a8b0"}
Apr 24 21:27:26.777023 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:26.777000 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-sprws" event={"ID":"70296c6e-ab82-4b02-8f53-04c16225df28","Type":"ContainerStarted","Data":"eb3301d251f54328d93ccab1fe7d4218c32294c0cde9d817ebd6bb45e4eda863"}
Apr 24 21:27:26.778510 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:26.778485 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-brwgr" event={"ID":"26ff341c-82cd-4599-9c8e-31e2436a0419","Type":"ContainerStarted","Data":"b19a69fa5959c6cdd0679a0cb1e15f3434a6957a3b5c66a870554bf842478e83"}
Apr 24 21:27:26.780827 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:26.780802 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xp275" event={"ID":"94172220-e322-483c-ae3c-254c0bface83","Type":"ContainerStarted","Data":"2e6c2fff3bc30d10867ac1bc2e207b0ba6bb9db328794e2953f2f20c777c7984"}
Apr 24 21:27:26.788088 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:26.788043 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-184.ec2.internal" podStartSLOduration=1.7880294650000002 podStartE2EDuration="1.788029465s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:26.787694845 +0000 UTC m=+3.615564977" watchObservedRunningTime="2026-04-24 21:27:26.788029465 +0000 UTC m=+3.615899597"
Apr 24 21:27:27.250863 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:27.250765 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20a034bb-c3d2-4d05-92de-ed16d2eda707-metrics-certs\") pod \"network-metrics-daemon-lkk5b\" (UID: \"20a034bb-c3d2-4d05-92de-ed16d2eda707\") " pod="openshift-multus/network-metrics-daemon-lkk5b"
Apr 24 21:27:27.251044 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:27.250951 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:27.251044 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:27.251019 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20a034bb-c3d2-4d05-92de-ed16d2eda707-metrics-certs podName:20a034bb-c3d2-4d05-92de-ed16d2eda707 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:29.250998786 +0000 UTC m=+6.078868899 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20a034bb-c3d2-4d05-92de-ed16d2eda707-metrics-certs") pod "network-metrics-daemon-lkk5b" (UID: "20a034bb-c3d2-4d05-92de-ed16d2eda707") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:27.351723 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:27.351687 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-original-pull-secret\") pod \"global-pull-secret-syncer-hxdgr\" (UID: \"7a2b19a8-7cce-48ea-a91f-3306187c2d2a\") " pod="kube-system/global-pull-secret-syncer-hxdgr"
Apr 24 21:27:27.351910 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:27.351753 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hqhp\" (UniqueName: \"kubernetes.io/projected/af45c5fb-e377-44ea-ad83-ad7e5bea725b-kube-api-access-4hqhp\") pod \"network-check-target-c56pf\" (UID: \"af45c5fb-e377-44ea-ad83-ad7e5bea725b\") " pod="openshift-network-diagnostics/network-check-target-c56pf"
Apr 24 21:27:27.351910 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:27.351858 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:27.351910 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:27.351902 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:27.352076 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:27.351924 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:27.352076 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:27.351938 2578 projected.go:194] Error preparing data for projected volume kube-api-access-4hqhp for pod openshift-network-diagnostics/network-check-target-c56pf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:27.352076 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:27.351951 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-original-pull-secret podName:7a2b19a8-7cce-48ea-a91f-3306187c2d2a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:29.351930129 +0000 UTC m=+6.179800242 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-original-pull-secret") pod "global-pull-secret-syncer-hxdgr" (UID: "7a2b19a8-7cce-48ea-a91f-3306187c2d2a") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:27.352076 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:27.351989 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/af45c5fb-e377-44ea-ad83-ad7e5bea725b-kube-api-access-4hqhp podName:af45c5fb-e377-44ea-ad83-ad7e5bea725b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:29.351973297 +0000 UTC m=+6.179843427 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-4hqhp" (UniqueName: "kubernetes.io/projected/af45c5fb-e377-44ea-ad83-ad7e5bea725b-kube-api-access-4hqhp") pod "network-check-target-c56pf" (UID: "af45c5fb-e377-44ea-ad83-ad7e5bea725b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:27.760662 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:27.760631 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf"
Apr 24 21:27:27.761108 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:27.760747 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b"
Apr 24 21:27:27.819426 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:27.819325 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" event={"ID":"77560a24cc76a08538baa0efe4273073","Type":"ContainerStarted","Data":"570f7a1718189cc31f8cc7b55a6203b623bf7c0ea825a24665ed23bbfbb4434b"}
Apr 24 21:27:28.758920 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:28.758273 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr"
Apr 24 21:27:28.758920 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:28.758397 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a"
Apr 24 21:27:28.758920 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:28.758784 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b"
Apr 24 21:27:28.759310 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:28.759256 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707"
Apr 24 21:27:28.846948 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:28.846521 2578 generic.go:358] "Generic (PLEG): container finished" podID="77560a24cc76a08538baa0efe4273073" containerID="570f7a1718189cc31f8cc7b55a6203b623bf7c0ea825a24665ed23bbfbb4434b" exitCode=0
Apr 24 21:27:28.846948 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:28.846578 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" event={"ID":"77560a24cc76a08538baa0efe4273073","Type":"ContainerDied","Data":"570f7a1718189cc31f8cc7b55a6203b623bf7c0ea825a24665ed23bbfbb4434b"}
Apr 24 21:27:29.268621 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:29.268492 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20a034bb-c3d2-4d05-92de-ed16d2eda707-metrics-certs\") pod \"network-metrics-daemon-lkk5b\" (UID: \"20a034bb-c3d2-4d05-92de-ed16d2eda707\") " pod="openshift-multus/network-metrics-daemon-lkk5b"
Apr 24 21:27:29.268799 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:29.268673 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:29.268799 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:29.268746 2578 nestedpendingoperations.go:348] Operation for
"{volumeName:kubernetes.io/secret/20a034bb-c3d2-4d05-92de-ed16d2eda707-metrics-certs podName:20a034bb-c3d2-4d05-92de-ed16d2eda707 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:33.268726954 +0000 UTC m=+10.096597071 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20a034bb-c3d2-4d05-92de-ed16d2eda707-metrics-certs") pod "network-metrics-daemon-lkk5b" (UID: "20a034bb-c3d2-4d05-92de-ed16d2eda707") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:29.369175 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:29.369081 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-original-pull-secret\") pod \"global-pull-secret-syncer-hxdgr\" (UID: \"7a2b19a8-7cce-48ea-a91f-3306187c2d2a\") " pod="kube-system/global-pull-secret-syncer-hxdgr"
Apr 24 21:27:29.369175 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:29.369147 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hqhp\" (UniqueName: \"kubernetes.io/projected/af45c5fb-e377-44ea-ad83-ad7e5bea725b-kube-api-access-4hqhp\") pod \"network-check-target-c56pf\" (UID: \"af45c5fb-e377-44ea-ad83-ad7e5bea725b\") " pod="openshift-network-diagnostics/network-check-target-c56pf"
Apr 24 21:27:29.369360 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:29.369309 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:29.369360 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:29.369326 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:29.369360 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:29.369339 2578 projected.go:194] Error preparing data for projected volume kube-api-access-4hqhp for pod openshift-network-diagnostics/network-check-target-c56pf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:29.369488 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:29.369397 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/af45c5fb-e377-44ea-ad83-ad7e5bea725b-kube-api-access-4hqhp podName:af45c5fb-e377-44ea-ad83-ad7e5bea725b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:33.36937919 +0000 UTC m=+10.197249304 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-4hqhp" (UniqueName: "kubernetes.io/projected/af45c5fb-e377-44ea-ad83-ad7e5bea725b-kube-api-access-4hqhp") pod "network-check-target-c56pf" (UID: "af45c5fb-e377-44ea-ad83-ad7e5bea725b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:29.369871 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:29.369789 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:29.369871 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:29.369840 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-original-pull-secret podName:7a2b19a8-7cce-48ea-a91f-3306187c2d2a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:33.369824663 +0000 UTC m=+10.197694779 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-original-pull-secret") pod "global-pull-secret-syncer-hxdgr" (UID: "7a2b19a8-7cce-48ea-a91f-3306187c2d2a") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:29.759531 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:29.758935 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf"
Apr 24 21:27:29.759531 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:29.759068 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b"
Apr 24 21:27:30.759190 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:30.759146 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b"
Apr 24 21:27:30.759655 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:30.759370 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707"
Apr 24 21:27:30.759824 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:30.759146 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr"
Apr 24 21:27:30.759963 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:30.759830 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a"
Apr 24 21:27:31.759157 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:31.759121 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf"
Apr 24 21:27:31.759346 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:31.759274 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b"
Apr 24 21:27:32.758993 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:32.758958 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr"
Apr 24 21:27:32.759166 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:32.758971 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b"
Apr 24 21:27:32.759166 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:32.759089 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a"
Apr 24 21:27:32.759166 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:32.759143 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707"
Apr 24 21:27:33.302304 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:33.302264 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20a034bb-c3d2-4d05-92de-ed16d2eda707-metrics-certs\") pod \"network-metrics-daemon-lkk5b\" (UID: \"20a034bb-c3d2-4d05-92de-ed16d2eda707\") " pod="openshift-multus/network-metrics-daemon-lkk5b"
Apr 24 21:27:33.302755 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:33.302422 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:33.302755 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:33.302515 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20a034bb-c3d2-4d05-92de-ed16d2eda707-metrics-certs podName:20a034bb-c3d2-4d05-92de-ed16d2eda707 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:41.30249306 +0000 UTC m=+18.130363189 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20a034bb-c3d2-4d05-92de-ed16d2eda707-metrics-certs") pod "network-metrics-daemon-lkk5b" (UID: "20a034bb-c3d2-4d05-92de-ed16d2eda707") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:33.403418 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:33.403375 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-original-pull-secret\") pod \"global-pull-secret-syncer-hxdgr\" (UID: \"7a2b19a8-7cce-48ea-a91f-3306187c2d2a\") " pod="kube-system/global-pull-secret-syncer-hxdgr"
Apr 24 21:27:33.403594 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:33.403448 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hqhp\" (UniqueName: \"kubernetes.io/projected/af45c5fb-e377-44ea-ad83-ad7e5bea725b-kube-api-access-4hqhp\") pod \"network-check-target-c56pf\" (UID: \"af45c5fb-e377-44ea-ad83-ad7e5bea725b\") " pod="openshift-network-diagnostics/network-check-target-c56pf"
Apr 24 21:27:33.403594 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:33.403540 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:33.403594 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:33.403555 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:33.403594 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:33.403575 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:33.403594 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:33.403588 2578 projected.go:194] Error preparing data for projected volume kube-api-access-4hqhp for pod openshift-network-diagnostics/network-check-target-c56pf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:33.403846 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:33.403614 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-original-pull-secret podName:7a2b19a8-7cce-48ea-a91f-3306187c2d2a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:41.403593604 +0000 UTC m=+18.231463720 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-original-pull-secret") pod "global-pull-secret-syncer-hxdgr" (UID: "7a2b19a8-7cce-48ea-a91f-3306187c2d2a") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:33.403846 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:33.403638 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/af45c5fb-e377-44ea-ad83-ad7e5bea725b-kube-api-access-4hqhp podName:af45c5fb-e377-44ea-ad83-ad7e5bea725b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:41.403622492 +0000 UTC m=+18.231492603 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "kube-api-access-4hqhp" (UniqueName: "kubernetes.io/projected/af45c5fb-e377-44ea-ad83-ad7e5bea725b-kube-api-access-4hqhp") pod "network-check-target-c56pf" (UID: "af45c5fb-e377-44ea-ad83-ad7e5bea725b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:33.759636 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:33.759115 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf"
Apr 24 21:27:33.759636 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:33.759233 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b"
Apr 24 21:27:34.758904 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:34.758857 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b"
Apr 24 21:27:34.759401 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:34.758863 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr"
Apr 24 21:27:34.759401 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:34.759032 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707"
Apr 24 21:27:34.759401 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:34.759096 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a"
Apr 24 21:27:35.759316 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:35.759233 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf"
Apr 24 21:27:35.759736 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:35.759373 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b"
Apr 24 21:27:36.759396 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:36.759354 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr"
Apr 24 21:27:36.759865 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:36.759363 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b"
Apr 24 21:27:36.759865 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:36.759498 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a"
Apr 24 21:27:36.759865 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:36.759608 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707"
Apr 24 21:27:37.758986 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:37.758950 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf"
Apr 24 21:27:37.759180 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:37.759064 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b"
Apr 24 21:27:38.759265 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:38.759235 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:27:38.759265 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:38.759261 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:27:38.759714 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:38.759370 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707" Apr 24 21:27:38.759714 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:38.759488 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a" Apr 24 21:27:39.759366 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:39.759337 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:27:39.759775 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:39.759442 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b" Apr 24 21:27:40.758798 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:40.758755 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:27:40.758987 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:40.758760 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:27:40.758987 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:40.758918 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a" Apr 24 21:27:40.759081 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:40.758980 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707" Apr 24 21:27:41.361695 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:41.361652 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20a034bb-c3d2-4d05-92de-ed16d2eda707-metrics-certs\") pod \"network-metrics-daemon-lkk5b\" (UID: \"20a034bb-c3d2-4d05-92de-ed16d2eda707\") " pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:27:41.362186 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:41.361828 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:41.362186 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:41.361932 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20a034bb-c3d2-4d05-92de-ed16d2eda707-metrics-certs podName:20a034bb-c3d2-4d05-92de-ed16d2eda707 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:57.361908621 +0000 UTC m=+34.189778742 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20a034bb-c3d2-4d05-92de-ed16d2eda707-metrics-certs") pod "network-metrics-daemon-lkk5b" (UID: "20a034bb-c3d2-4d05-92de-ed16d2eda707") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:41.462081 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:41.462043 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-original-pull-secret\") pod \"global-pull-secret-syncer-hxdgr\" (UID: \"7a2b19a8-7cce-48ea-a91f-3306187c2d2a\") " pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:27:41.462279 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:41.462102 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hqhp\" (UniqueName: \"kubernetes.io/projected/af45c5fb-e377-44ea-ad83-ad7e5bea725b-kube-api-access-4hqhp\") pod \"network-check-target-c56pf\" (UID: \"af45c5fb-e377-44ea-ad83-ad7e5bea725b\") " pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:27:41.462279 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:41.462220 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:41.462279 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:41.462235 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:41.462279 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:41.462248 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:41.462279 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:41.462262 2578 projected.go:194] Error 
preparing data for projected volume kube-api-access-4hqhp for pod openshift-network-diagnostics/network-check-target-c56pf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:41.462474 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:41.462300 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-original-pull-secret podName:7a2b19a8-7cce-48ea-a91f-3306187c2d2a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:57.462278185 +0000 UTC m=+34.290148299 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-original-pull-secret") pod "global-pull-secret-syncer-hxdgr" (UID: "7a2b19a8-7cce-48ea-a91f-3306187c2d2a") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:41.462474 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:41.462316 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/af45c5fb-e377-44ea-ad83-ad7e5bea725b-kube-api-access-4hqhp podName:af45c5fb-e377-44ea-ad83-ad7e5bea725b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:57.462309639 +0000 UTC m=+34.290179749 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-4hqhp" (UniqueName: "kubernetes.io/projected/af45c5fb-e377-44ea-ad83-ad7e5bea725b-kube-api-access-4hqhp") pod "network-check-target-c56pf" (UID: "af45c5fb-e377-44ea-ad83-ad7e5bea725b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:41.759079 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:41.758996 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:27:41.759236 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:41.759141 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b" Apr 24 21:27:42.758606 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:42.758565 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:27:42.759066 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:42.758565 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:27:42.759066 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:42.758707 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707" Apr 24 21:27:42.759066 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:42.758756 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a" Apr 24 21:27:43.759562 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:43.759538 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:27:43.759901 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:43.759673 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b" Apr 24 21:27:43.872289 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:43.872253 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7" event={"ID":"beaaaa3d-069b-465d-b6fc-defae6201642","Type":"ContainerStarted","Data":"9ae653eabd6fb6d330c4d63c8b72c37ebf3b77a499d18aeab0ffb5ce5bb7ab75"} Apr 24 21:27:43.873818 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:43.873786 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" event={"ID":"77560a24cc76a08538baa0efe4273073","Type":"ContainerStarted","Data":"fb10a57ef78113ba08d54176db6da3802eea7d4fab6e70500b024b6f2bea6d4e"} Apr 24 21:27:43.874902 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:43.874866 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w7q5f" event={"ID":"2556ea37-119f-46c5-bee7-7cfb12afca0f","Type":"ContainerStarted","Data":"4a4039c553e433a88e557ef97810b53ceb8c649634bf9973fde13d6d94342c06"} Apr 24 21:27:43.875961 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:43.875939 2578 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kube-system/konnectivity-agent-sprws" event={"ID":"70296c6e-ab82-4b02-8f53-04c16225df28","Type":"ContainerStarted","Data":"e945951ec65a94b97c68283ec5df87f6b77858a540e3dc68c429179da0143149"} Apr 24 21:27:43.876910 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:43.876874 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-brwgr" event={"ID":"26ff341c-82cd-4599-9c8e-31e2436a0419","Type":"ContainerStarted","Data":"6e0b40cdc87fad2057914a570aa1d2eac2d79efb97b2feee3ede8487c6bc76b8"} Apr 24 21:27:43.877834 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:43.877811 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xp275" event={"ID":"94172220-e322-483c-ae3c-254c0bface83","Type":"ContainerStarted","Data":"d1a32624b3f2ae62eaa359cd1368176081e6db86c024df722cfd787c300d7bcb"} Apr 24 21:27:43.878954 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:43.878936 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" event={"ID":"cde5bc87-530f-4ee7-8f38-39b875bbd4e6","Type":"ContainerStarted","Data":"4a0a46112fdb2729a44752620083e642c234ad6da6aa73b04fb781a69ccbc7ff"} Apr 24 21:27:43.880084 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:43.880058 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jzqgz" event={"ID":"d0f7d715-6263-49f8-ac7b-21d48a5b4438","Type":"ContainerStarted","Data":"935f3584bd28e896db1f8d848ea519c7e917708ef83783b0350f77f897236a96"} Apr 24 21:27:43.881130 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:43.881111 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xs9hq" event={"ID":"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc","Type":"ContainerStarted","Data":"69c1e9b4e26edea7f0d83e7eef50fac97e9d9b215a7b573aa469a21a7790fa6e"} Apr 24 21:27:43.920307 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:43.920260 2578 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-sprws" podStartSLOduration=4.035056034 podStartE2EDuration="20.920245816s" podCreationTimestamp="2026-04-24 21:27:23 +0000 UTC" firstStartedPulling="2026-04-24 21:27:26.533035362 +0000 UTC m=+3.360905485" lastFinishedPulling="2026-04-24 21:27:43.418225156 +0000 UTC m=+20.246095267" observedRunningTime="2026-04-24 21:27:43.919931442 +0000 UTC m=+20.747801576" watchObservedRunningTime="2026-04-24 21:27:43.920245816 +0000 UTC m=+20.748115947" Apr 24 21:27:43.937427 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:43.937380 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xp275" podStartSLOduration=3.977011526 podStartE2EDuration="20.937363886s" podCreationTimestamp="2026-04-24 21:27:23 +0000 UTC" firstStartedPulling="2026-04-24 21:27:26.525091373 +0000 UTC m=+3.352961498" lastFinishedPulling="2026-04-24 21:27:43.485443739 +0000 UTC m=+20.313313858" observedRunningTime="2026-04-24 21:27:43.93677108 +0000 UTC m=+20.764641224" watchObservedRunningTime="2026-04-24 21:27:43.937363886 +0000 UTC m=+20.765234018" Apr 24 21:27:43.952282 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:43.952234 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-brwgr" podStartSLOduration=4.060924755 podStartE2EDuration="20.9522202s" podCreationTimestamp="2026-04-24 21:27:23 +0000 UTC" firstStartedPulling="2026-04-24 21:27:26.527047962 +0000 UTC m=+3.354918072" lastFinishedPulling="2026-04-24 21:27:43.418343393 +0000 UTC m=+20.246213517" observedRunningTime="2026-04-24 21:27:43.952021338 +0000 UTC m=+20.779891492" watchObservedRunningTime="2026-04-24 21:27:43.9522202 +0000 UTC m=+20.780090331" Apr 24 21:27:43.967974 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:43.967932 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/node-ca-jzqgz" podStartSLOduration=4.085548872 podStartE2EDuration="20.967916199s" podCreationTimestamp="2026-04-24 21:27:23 +0000 UTC" firstStartedPulling="2026-04-24 21:27:26.536004459 +0000 UTC m=+3.363874576" lastFinishedPulling="2026-04-24 21:27:43.418371778 +0000 UTC m=+20.246241903" observedRunningTime="2026-04-24 21:27:43.967667206 +0000 UTC m=+20.795537338" watchObservedRunningTime="2026-04-24 21:27:43.967916199 +0000 UTC m=+20.795786330" Apr 24 21:27:43.985389 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:43.985341 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w7q5f" podStartSLOduration=4.060443637 podStartE2EDuration="20.985325324s" podCreationTimestamp="2026-04-24 21:27:23 +0000 UTC" firstStartedPulling="2026-04-24 21:27:26.535302591 +0000 UTC m=+3.363172702" lastFinishedPulling="2026-04-24 21:27:43.460184266 +0000 UTC m=+20.288054389" observedRunningTime="2026-04-24 21:27:43.985154629 +0000 UTC m=+20.813024762" watchObservedRunningTime="2026-04-24 21:27:43.985325324 +0000 UTC m=+20.813195460" Apr 24 21:27:44.005942 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:44.005875 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" podStartSLOduration=19.005860449 podStartE2EDuration="19.005860449s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:44.005334173 +0000 UTC m=+20.833204304" watchObservedRunningTime="2026-04-24 21:27:44.005860449 +0000 UTC m=+20.833730581" Apr 24 21:27:44.759447 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:44.759254 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:27:44.759618 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:44.759279 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:27:44.759618 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:44.759513 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a" Apr 24 21:27:44.760256 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:44.759615 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707" Apr 24 21:27:44.886401 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:44.886373 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krb9p_cde5bc87-530f-4ee7-8f38-39b875bbd4e6/ovn-acl-logging/0.log" Apr 24 21:27:44.887089 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:44.887058 2578 generic.go:358] "Generic (PLEG): container finished" podID="cde5bc87-530f-4ee7-8f38-39b875bbd4e6" containerID="b114224d5a7a6f27c3977b1e5c3a0ed3526061d1276c304d4b5e5ee61917cbe8" exitCode=1 Apr 24 21:27:44.887225 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:44.887125 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" event={"ID":"cde5bc87-530f-4ee7-8f38-39b875bbd4e6","Type":"ContainerStarted","Data":"1e41f323fd0039680c6ee8273182ba0d63de8a978b195ab2a7e1b6e04d9b297c"} Apr 24 21:27:44.887225 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:44.887177 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" event={"ID":"cde5bc87-530f-4ee7-8f38-39b875bbd4e6","Type":"ContainerStarted","Data":"4d2ecbfb7efad07007d3611ab994badd2ecff149f76f7e8744a2ed6ba2e171bb"} Apr 24 21:27:44.887225 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:44.887192 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" event={"ID":"cde5bc87-530f-4ee7-8f38-39b875bbd4e6","Type":"ContainerStarted","Data":"eb54adca09b7e25d4885f39d48945abd2427f60e108c3fea9e19a0548aacbb54"} Apr 24 21:27:44.887225 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:44.887205 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" event={"ID":"cde5bc87-530f-4ee7-8f38-39b875bbd4e6","Type":"ContainerStarted","Data":"4a4384d40e8f874c7dcd8c4c83e0309b2ff59b656ad2741f28a40722cf972b3c"} Apr 24 
21:27:44.887225 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:44.887217 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" event={"ID":"cde5bc87-530f-4ee7-8f38-39b875bbd4e6","Type":"ContainerDied","Data":"b114224d5a7a6f27c3977b1e5c3a0ed3526061d1276c304d4b5e5ee61917cbe8"} Apr 24 21:27:44.888638 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:44.888613 2578 generic.go:358] "Generic (PLEG): container finished" podID="6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc" containerID="69c1e9b4e26edea7f0d83e7eef50fac97e9d9b215a7b573aa469a21a7790fa6e" exitCode=0 Apr 24 21:27:44.888759 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:44.888695 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xs9hq" event={"ID":"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc","Type":"ContainerDied","Data":"69c1e9b4e26edea7f0d83e7eef50fac97e9d9b215a7b573aa469a21a7790fa6e"} Apr 24 21:27:45.099243 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:45.099209 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 21:27:45.692080 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:45.691918 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:27:45.099234916Z","UUID":"006f8179-9c1b-42fc-a31a-9566e602f3a0","Handler":null,"Name":"","Endpoint":""} Apr 24 21:27:45.695280 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:45.695254 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 21:27:45.695431 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:45.695301 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at 
endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 21:27:45.759222 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:45.758738 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:27:45.759222 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:45.758871 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b" Apr 24 21:27:45.892144 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:45.892111 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rvc97" event={"ID":"15075605-e0df-4d3a-90f3-7c8811d07731","Type":"ContainerStarted","Data":"90826c290a7d8dbba99fab0e968d53c84e68df396cdd1c840950d370c4c80dcf"} Apr 24 21:27:45.893876 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:45.893851 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7" event={"ID":"beaaaa3d-069b-465d-b6fc-defae6201642","Type":"ContainerStarted","Data":"f1823ab4d09bfc484bf9d7ad480bf8fb2746a18709c5d1417d215efee5fd4195"} Apr 24 21:27:46.440188 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:46.440153 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-sprws" Apr 24 21:27:46.441136 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:46.441109 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-sprws" Apr 24 21:27:46.453351 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:46.453287 2578 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-sprws" Apr 24 21:27:46.453822 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:46.453797 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-sprws" Apr 24 21:27:46.458318 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:46.458272 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rvc97" podStartSLOduration=6.536933754 podStartE2EDuration="23.458255735s" podCreationTimestamp="2026-04-24 21:27:23 +0000 UTC" firstStartedPulling="2026-04-24 21:27:26.538613051 +0000 UTC m=+3.366483168" lastFinishedPulling="2026-04-24 21:27:43.459935033 +0000 UTC m=+20.287805149" observedRunningTime="2026-04-24 21:27:45.91126753 +0000 UTC m=+22.739137664" watchObservedRunningTime="2026-04-24 21:27:46.458255735 +0000 UTC m=+23.286125866" Apr 24 21:27:46.759483 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:46.759235 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:27:46.759642 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:46.759235 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:27:46.759642 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:46.759550 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707" Apr 24 21:27:46.759746 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:46.759634 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a" Apr 24 21:27:46.898881 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:46.898838 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krb9p_cde5bc87-530f-4ee7-8f38-39b875bbd4e6/ovn-acl-logging/0.log" Apr 24 21:27:46.899381 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:46.899356 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" event={"ID":"cde5bc87-530f-4ee7-8f38-39b875bbd4e6","Type":"ContainerStarted","Data":"0a80665cc947b349d5ac24193cef1c974dc30655b8021239e8c8f05d4158d2c6"} Apr 24 21:27:46.901319 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:46.901290 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7" event={"ID":"beaaaa3d-069b-465d-b6fc-defae6201642","Type":"ContainerStarted","Data":"edf781f8059e5ff11b9878dc6bb0a6e6fd99c401bd58c7e634e8a5786f532a2d"} Apr 24 21:27:47.758961 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:47.758922 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:27:47.759182 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:47.759060 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b" Apr 24 21:27:48.758781 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:48.758746 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:27:48.759366 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:48.758746 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:27:48.759366 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:48.758854 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a" Apr 24 21:27:48.759366 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:48.758978 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707" Apr 24 21:27:49.759069 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:49.758840 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:27:49.759679 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:49.759175 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b" Apr 24 21:27:49.911462 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:49.911429 2578 generic.go:358] "Generic (PLEG): container finished" podID="6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc" containerID="eb6217bc78bf31032bea6ed98427040ecb1d78ae5523f45164f1ced75d434c7c" exitCode=0 Apr 24 21:27:49.911620 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:49.911512 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xs9hq" event={"ID":"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc","Type":"ContainerDied","Data":"eb6217bc78bf31032bea6ed98427040ecb1d78ae5523f45164f1ced75d434c7c"} Apr 24 21:27:49.914536 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:49.914517 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krb9p_cde5bc87-530f-4ee7-8f38-39b875bbd4e6/ovn-acl-logging/0.log" Apr 24 21:27:49.914851 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:49.914834 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" 
event={"ID":"cde5bc87-530f-4ee7-8f38-39b875bbd4e6","Type":"ContainerStarted","Data":"92be4b84c6ae5cd17c2321a0c33688bb6020cb3d1c1c5625c21102afe3bc81f7"} Apr 24 21:27:49.915251 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:49.915233 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:49.915311 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:49.915260 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:49.915362 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:49.915346 2578 scope.go:117] "RemoveContainer" containerID="b114224d5a7a6f27c3977b1e5c3a0ed3526061d1276c304d4b5e5ee61917cbe8" Apr 24 21:27:49.933534 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:49.933510 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:49.933619 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:49.933600 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:49.936463 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:49.936420 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pzgp7" podStartSLOduration=7.315068952 podStartE2EDuration="26.936406094s" podCreationTimestamp="2026-04-24 21:27:23 +0000 UTC" firstStartedPulling="2026-04-24 21:27:26.528466533 +0000 UTC m=+3.356336646" lastFinishedPulling="2026-04-24 21:27:46.149803671 +0000 UTC m=+22.977673788" observedRunningTime="2026-04-24 21:27:46.921258778 +0000 UTC m=+23.749128913" watchObservedRunningTime="2026-04-24 21:27:49.936406094 +0000 UTC m=+26.764276285" Apr 24 21:27:50.084982 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:50.084867 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" Apr 24 21:27:50.759103 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:50.759065 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:27:50.759103 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:50.759093 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:27:50.759621 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:50.759211 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707" Apr 24 21:27:50.759621 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:50.759323 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a" Apr 24 21:27:50.860479 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:50.860397 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lkk5b"] Apr 24 21:27:50.864527 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:50.864495 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-c56pf"] Apr 24 21:27:50.864665 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:50.864643 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:27:50.864755 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:50.864732 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b" Apr 24 21:27:50.873776 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:50.873745 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hxdgr"] Apr 24 21:27:50.920243 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:50.920220 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krb9p_cde5bc87-530f-4ee7-8f38-39b875bbd4e6/ovn-acl-logging/0.log" Apr 24 21:27:50.920567 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:50.920546 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" event={"ID":"cde5bc87-530f-4ee7-8f38-39b875bbd4e6","Type":"ContainerStarted","Data":"7f287a8d243072c729a8582b97e527465275a86829b426941c6edd739060f701"} Apr 24 21:27:50.922365 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:50.922342 2578 generic.go:358] "Generic (PLEG): container finished" podID="6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc" containerID="e6d933adbe0639bfc29632b3119c7e7e4b252d908b3df806b403b8613cef8a1f" exitCode=0 Apr 24 21:27:50.922467 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:50.922416 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xs9hq" event={"ID":"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc","Type":"ContainerDied","Data":"e6d933adbe0639bfc29632b3119c7e7e4b252d908b3df806b403b8613cef8a1f"} Apr 24 21:27:50.922507 
ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:50.922471 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:27:50.922575 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:50.922556 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a" Apr 24 21:27:50.922615 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:50.922596 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:27:50.922687 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:50.922672 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707" Apr 24 21:27:50.953453 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:50.953394 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-krb9p" podStartSLOduration=10.756645565 podStartE2EDuration="27.953378074s" podCreationTimestamp="2026-04-24 21:27:23 +0000 UTC" firstStartedPulling="2026-04-24 21:27:26.538248803 +0000 UTC m=+3.366118912" lastFinishedPulling="2026-04-24 21:27:43.734981297 +0000 UTC m=+20.562851421" observedRunningTime="2026-04-24 21:27:50.95325234 +0000 UTC m=+27.781122475" watchObservedRunningTime="2026-04-24 21:27:50.953378074 +0000 UTC m=+27.781248202" Apr 24 21:27:51.926470 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:51.926262 2578 generic.go:358] "Generic (PLEG): container finished" podID="6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc" containerID="33da2f21484a320079989f856cdaf79cee9496acda3cea5133dad7fbb9b1a244" exitCode=0 Apr 24 21:27:51.926838 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:51.926347 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xs9hq" event={"ID":"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc","Type":"ContainerDied","Data":"33da2f21484a320079989f856cdaf79cee9496acda3cea5133dad7fbb9b1a244"} Apr 24 21:27:52.759349 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:52.759312 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:27:52.759529 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:52.759379 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:27:52.759529 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:52.759483 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:27:52.759529 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:52.759480 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b" Apr 24 21:27:52.759702 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:52.759579 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707" Apr 24 21:27:52.759702 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:52.759652 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a" Apr 24 21:27:53.305455 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:53.305429 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w7q5f_2556ea37-119f-46c5-bee7-7cfb12afca0f/dns-node-resolver/0.log" Apr 24 21:27:54.288533 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:54.288502 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jzqgz_d0f7d715-6263-49f8-ac7b-21d48a5b4438/node-ca/0.log" Apr 24 21:27:54.758955 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:54.758916 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:27:54.759509 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:54.758922 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:27:54.759509 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:54.759043 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:27:54.759509 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:54.759037 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b" Apr 24 21:27:54.759509 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:54.759158 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a" Apr 24 21:27:54.759509 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:54.759230 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707" Apr 24 21:27:56.759056 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:56.759025 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:27:56.759686 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:56.759025 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:27:56.759686 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:56.759149 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a" Apr 24 21:27:56.759686 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:56.759032 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:27:56.759686 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:56.759238 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b" Apr 24 21:27:56.759686 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:56.759326 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707" Apr 24 21:27:57.380348 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:57.380315 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20a034bb-c3d2-4d05-92de-ed16d2eda707-metrics-certs\") pod \"network-metrics-daemon-lkk5b\" (UID: \"20a034bb-c3d2-4d05-92de-ed16d2eda707\") " pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:27:57.380500 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:57.380483 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:57.380558 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:57.380549 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20a034bb-c3d2-4d05-92de-ed16d2eda707-metrics-certs podName:20a034bb-c3d2-4d05-92de-ed16d2eda707 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:29.380533071 +0000 UTC m=+66.208403185 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20a034bb-c3d2-4d05-92de-ed16d2eda707-metrics-certs") pod "network-metrics-daemon-lkk5b" (UID: "20a034bb-c3d2-4d05-92de-ed16d2eda707") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:57.481490 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:57.481458 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-original-pull-secret\") pod \"global-pull-secret-syncer-hxdgr\" (UID: \"7a2b19a8-7cce-48ea-a91f-3306187c2d2a\") " pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:27:57.481637 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:57.481499 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hqhp\" (UniqueName: \"kubernetes.io/projected/af45c5fb-e377-44ea-ad83-ad7e5bea725b-kube-api-access-4hqhp\") pod \"network-check-target-c56pf\" (UID: \"af45c5fb-e377-44ea-ad83-ad7e5bea725b\") " pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:27:57.481637 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:57.481597 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:57.481714 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:57.481652 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-original-pull-secret podName:7a2b19a8-7cce-48ea-a91f-3306187c2d2a nodeName:}" failed. No retries permitted until 2026-04-24 21:28:29.481638608 +0000 UTC m=+66.309508718 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-original-pull-secret") pod "global-pull-secret-syncer-hxdgr" (UID: "7a2b19a8-7cce-48ea-a91f-3306187c2d2a") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:57.481714 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:57.481600 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:57.481714 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:57.481697 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:57.481714 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:57.481706 2578 projected.go:194] Error preparing data for projected volume kube-api-access-4hqhp for pod openshift-network-diagnostics/network-check-target-c56pf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:57.481841 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:57.481748 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/af45c5fb-e377-44ea-ad83-ad7e5bea725b-kube-api-access-4hqhp podName:af45c5fb-e377-44ea-ad83-ad7e5bea725b nodeName:}" failed. No retries permitted until 2026-04-24 21:28:29.481734976 +0000 UTC m=+66.309605086 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4hqhp" (UniqueName: "kubernetes.io/projected/af45c5fb-e377-44ea-ad83-ad7e5bea725b-kube-api-access-4hqhp") pod "network-check-target-c56pf" (UID: "af45c5fb-e377-44ea-ad83-ad7e5bea725b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:58.759298 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:58.759266 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:27:58.759298 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:58.759281 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:27:58.759832 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:58.759406 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:27:58.759832 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:58.759416 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707" Apr 24 21:27:58.759832 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:58.759489 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b" Apr 24 21:27:58.759832 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:27:58.759579 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a" Apr 24 21:27:58.945990 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:58.945943 2578 generic.go:358] "Generic (PLEG): container finished" podID="6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc" containerID="587e7e28d599d8f306c37031fd2efa07cc88980bbcee35ec05ea044f93482f24" exitCode=0 Apr 24 21:27:58.946224 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:58.946204 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xs9hq" event={"ID":"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc","Type":"ContainerDied","Data":"587e7e28d599d8f306c37031fd2efa07cc88980bbcee35ec05ea044f93482f24"} Apr 24 21:27:59.950511 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:59.950481 2578 generic.go:358] "Generic (PLEG): container finished" podID="6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc" containerID="3d30820e77574cb241823f53efab56b346423e04e21bd53bbd671a377ab7c289" exitCode=0 Apr 24 21:27:59.950927 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:27:59.950521 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xs9hq" event={"ID":"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc","Type":"ContainerDied","Data":"3d30820e77574cb241823f53efab56b346423e04e21bd53bbd671a377ab7c289"} Apr 24 21:28:00.759048 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:00.759012 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:28:00.759229 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:00.759118 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:28:00.759229 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:00.759141 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b" Apr 24 21:28:00.759229 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:00.759202 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a" Apr 24 21:28:00.759351 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:00.759243 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:28:00.759351 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:00.759324 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707" Apr 24 21:28:00.955516 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:00.955476 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xs9hq" event={"ID":"6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc","Type":"ContainerStarted","Data":"dc8d5459c206be46162f81cc674f8a88e0e809553481da0f6ecb26a414e2e093"} Apr 24 21:28:02.759097 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:02.758884 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:28:02.759498 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:02.758912 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:28:02.759498 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:02.759183 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b" Apr 24 21:28:02.759498 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:02.758938 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:28:02.759498 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:02.759257 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a" Apr 24 21:28:02.759498 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:02.759328 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707" Apr 24 21:28:04.758940 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:04.758910 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:28:04.759314 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:04.758916 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:28:04.759314 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:04.759000 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b" Apr 24 21:28:04.759314 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:04.759113 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707" Apr 24 21:28:04.759314 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:04.758916 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:28:04.759314 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:04.759231 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a" Apr 24 21:28:06.758800 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:06.758763 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:28:06.759204 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:06.758771 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:28:06.759204 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:06.758874 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707" Apr 24 21:28:06.759204 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:06.758771 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:28:06.759204 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:06.758957 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b" Apr 24 21:28:06.759204 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:06.759021 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a" Apr 24 21:28:08.759126 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:08.759083 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:28:08.759537 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:08.759144 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:28:08.759537 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:08.759223 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b" Apr 24 21:28:08.759537 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:08.759260 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a" Apr 24 21:28:08.759537 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:08.759287 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:28:08.759537 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:08.759352 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707" Apr 24 21:28:10.758947 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:10.758909 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:28:10.759342 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:10.758961 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:28:10.759342 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:10.759026 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a" Apr 24 21:28:10.759342 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:10.759076 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707" Apr 24 21:28:10.759342 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:10.759097 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:28:10.759342 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:10.759182 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b" Apr 24 21:28:12.758604 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:12.758569 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:28:12.759166 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:12.758580 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:28:12.759166 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:12.758694 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a" Apr 24 21:28:12.759166 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:12.758781 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b" Apr 24 21:28:12.759166 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:12.758580 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:28:12.759166 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:12.758910 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707" Apr 24 21:28:14.759000 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:14.758972 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:28:14.759427 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:14.759008 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:28:14.759427 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:14.759042 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:28:14.759427 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:14.759139 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707" Apr 24 21:28:14.759427 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:14.759242 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b" Apr 24 21:28:14.759427 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:14.759319 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a" Apr 24 21:28:16.758789 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:16.758755 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:28:16.759195 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:16.758759 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:28:16.759195 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:16.758854 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-c56pf" podUID="af45c5fb-e377-44ea-ad83-ad7e5bea725b" Apr 24 21:28:16.759195 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:16.758952 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lkk5b" podUID="20a034bb-c3d2-4d05-92de-ed16d2eda707" Apr 24 21:28:16.759195 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:16.758759 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:28:16.759195 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:16.759035 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxdgr" podUID="7a2b19a8-7cce-48ea-a91f-3306187c2d2a" Apr 24 21:28:17.529575 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.529550 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeReady" Apr 24 21:28:17.529741 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.529664 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 21:28:17.575863 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.575777 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xs9hq" podStartSLOduration=23.311547721 podStartE2EDuration="54.575750305s" podCreationTimestamp="2026-04-24 21:27:23 +0000 UTC" firstStartedPulling="2026-04-24 21:27:26.532049382 +0000 UTC m=+3.359919493" lastFinishedPulling="2026-04-24 21:27:57.796251964 +0000 UTC m=+34.624122077" observedRunningTime="2026-04-24 21:28:00.991271921 +0000 UTC m=+37.819142053" watchObservedRunningTime="2026-04-24 21:28:17.575750305 +0000 UTC m=+54.403620431" Apr 24 21:28:17.576477 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.576460 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-thx4s"] Apr 24 
21:28:17.601531 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.601507 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-hdm54"] Apr 24 21:28:17.601710 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.601692 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-thx4s" Apr 24 21:28:17.604348 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.604328 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:28:17.604508 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.604491 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:28:17.604779 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.604587 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-m29p7\"" Apr 24 21:28:17.616767 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.616745 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-thx4s"] Apr 24 21:28:17.616767 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.616774 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hdm54"] Apr 24 21:28:17.616946 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.616788 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9nl9q"] Apr 24 21:28:17.617215 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.617190 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hdm54" Apr 24 21:28:17.619776 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.619759 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 21:28:17.619872 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.619782 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-hhbdt\"" Apr 24 21:28:17.619872 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.619811 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 21:28:17.619872 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.619822 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 21:28:17.620204 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.620177 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 21:28:17.627264 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.627245 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/af5e838b-1b18-4b92-ba27-f6f304af2d94-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hdm54\" (UID: \"af5e838b-1b18-4b92-ba27-f6f304af2d94\") " pod="openshift-insights/insights-runtime-extractor-hdm54" Apr 24 21:28:17.627342 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.627274 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/af5e838b-1b18-4b92-ba27-f6f304af2d94-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hdm54\" (UID: 
\"af5e838b-1b18-4b92-ba27-f6f304af2d94\") " pod="openshift-insights/insights-runtime-extractor-hdm54" Apr 24 21:28:17.627342 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.627291 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b99tp\" (UniqueName: \"kubernetes.io/projected/af5e838b-1b18-4b92-ba27-f6f304af2d94-kube-api-access-b99tp\") pod \"insights-runtime-extractor-hdm54\" (UID: \"af5e838b-1b18-4b92-ba27-f6f304af2d94\") " pod="openshift-insights/insights-runtime-extractor-hdm54" Apr 24 21:28:17.627342 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.627305 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/632b9ab1-38e5-4787-8970-d57a96875bdb-metrics-tls\") pod \"dns-default-thx4s\" (UID: \"632b9ab1-38e5-4787-8970-d57a96875bdb\") " pod="openshift-dns/dns-default-thx4s" Apr 24 21:28:17.627443 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.627365 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djghs\" (UniqueName: \"kubernetes.io/projected/632b9ab1-38e5-4787-8970-d57a96875bdb-kube-api-access-djghs\") pod \"dns-default-thx4s\" (UID: \"632b9ab1-38e5-4787-8970-d57a96875bdb\") " pod="openshift-dns/dns-default-thx4s" Apr 24 21:28:17.627443 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.627419 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/af5e838b-1b18-4b92-ba27-f6f304af2d94-data-volume\") pod \"insights-runtime-extractor-hdm54\" (UID: \"af5e838b-1b18-4b92-ba27-f6f304af2d94\") " pod="openshift-insights/insights-runtime-extractor-hdm54" Apr 24 21:28:17.627503 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.627442 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/632b9ab1-38e5-4787-8970-d57a96875bdb-config-volume\") pod \"dns-default-thx4s\" (UID: \"632b9ab1-38e5-4787-8970-d57a96875bdb\") " pod="openshift-dns/dns-default-thx4s" Apr 24 21:28:17.627503 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.627478 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/af5e838b-1b18-4b92-ba27-f6f304af2d94-crio-socket\") pod \"insights-runtime-extractor-hdm54\" (UID: \"af5e838b-1b18-4b92-ba27-f6f304af2d94\") " pod="openshift-insights/insights-runtime-extractor-hdm54" Apr 24 21:28:17.627503 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.627500 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/632b9ab1-38e5-4787-8970-d57a96875bdb-tmp-dir\") pod \"dns-default-thx4s\" (UID: \"632b9ab1-38e5-4787-8970-d57a96875bdb\") " pod="openshift-dns/dns-default-thx4s" Apr 24 21:28:17.628285 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.628262 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9nl9q"] Apr 24 21:28:17.628378 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.628367 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9nl9q" Apr 24 21:28:17.630940 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.630909 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-j9cg5\"" Apr 24 21:28:17.631053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.630949 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:28:17.631053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.630965 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:28:17.631053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.630919 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:28:17.727829 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.727798 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/917804a7-0eca-4b76-8504-d8f2e2dc5b74-cert\") pod \"ingress-canary-9nl9q\" (UID: \"917804a7-0eca-4b76-8504-d8f2e2dc5b74\") " pod="openshift-ingress-canary/ingress-canary-9nl9q" Apr 24 21:28:17.727829 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.727833 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/af5e838b-1b18-4b92-ba27-f6f304af2d94-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hdm54\" (UID: \"af5e838b-1b18-4b92-ba27-f6f304af2d94\") " pod="openshift-insights/insights-runtime-extractor-hdm54" Apr 24 21:28:17.728069 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.727851 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/af5e838b-1b18-4b92-ba27-f6f304af2d94-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hdm54\" (UID: \"af5e838b-1b18-4b92-ba27-f6f304af2d94\") " pod="openshift-insights/insights-runtime-extractor-hdm54" Apr 24 21:28:17.728069 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.727868 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b99tp\" (UniqueName: \"kubernetes.io/projected/af5e838b-1b18-4b92-ba27-f6f304af2d94-kube-api-access-b99tp\") pod \"insights-runtime-extractor-hdm54\" (UID: \"af5e838b-1b18-4b92-ba27-f6f304af2d94\") " pod="openshift-insights/insights-runtime-extractor-hdm54" Apr 24 21:28:17.728069 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.727883 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/632b9ab1-38e5-4787-8970-d57a96875bdb-metrics-tls\") pod \"dns-default-thx4s\" (UID: \"632b9ab1-38e5-4787-8970-d57a96875bdb\") " pod="openshift-dns/dns-default-thx4s" Apr 24 21:28:17.728069 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.727932 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djghs\" (UniqueName: \"kubernetes.io/projected/632b9ab1-38e5-4787-8970-d57a96875bdb-kube-api-access-djghs\") pod \"dns-default-thx4s\" (UID: \"632b9ab1-38e5-4787-8970-d57a96875bdb\") " pod="openshift-dns/dns-default-thx4s" Apr 24 21:28:17.728069 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.727959 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/af5e838b-1b18-4b92-ba27-f6f304af2d94-data-volume\") pod \"insights-runtime-extractor-hdm54\" (UID: \"af5e838b-1b18-4b92-ba27-f6f304af2d94\") " pod="openshift-insights/insights-runtime-extractor-hdm54" Apr 24 21:28:17.728069 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.727986 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/632b9ab1-38e5-4787-8970-d57a96875bdb-config-volume\") pod \"dns-default-thx4s\" (UID: \"632b9ab1-38e5-4787-8970-d57a96875bdb\") " pod="openshift-dns/dns-default-thx4s" Apr 24 21:28:17.728318 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.728194 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8cxb\" (UniqueName: \"kubernetes.io/projected/917804a7-0eca-4b76-8504-d8f2e2dc5b74-kube-api-access-h8cxb\") pod \"ingress-canary-9nl9q\" (UID: \"917804a7-0eca-4b76-8504-d8f2e2dc5b74\") " pod="openshift-ingress-canary/ingress-canary-9nl9q" Apr 24 21:28:17.728318 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.728248 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/af5e838b-1b18-4b92-ba27-f6f304af2d94-crio-socket\") pod \"insights-runtime-extractor-hdm54\" (UID: \"af5e838b-1b18-4b92-ba27-f6f304af2d94\") " pod="openshift-insights/insights-runtime-extractor-hdm54" Apr 24 21:28:17.728318 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.728279 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/632b9ab1-38e5-4787-8970-d57a96875bdb-tmp-dir\") pod \"dns-default-thx4s\" (UID: \"632b9ab1-38e5-4787-8970-d57a96875bdb\") " pod="openshift-dns/dns-default-thx4s" Apr 24 21:28:17.728417 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.728323 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/af5e838b-1b18-4b92-ba27-f6f304af2d94-crio-socket\") pod \"insights-runtime-extractor-hdm54\" (UID: \"af5e838b-1b18-4b92-ba27-f6f304af2d94\") " pod="openshift-insights/insights-runtime-extractor-hdm54" Apr 24 21:28:17.728518 ip-10-0-139-184 kubenswrapper[2578]: I0424 
21:28:17.728496 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/632b9ab1-38e5-4787-8970-d57a96875bdb-tmp-dir\") pod \"dns-default-thx4s\" (UID: \"632b9ab1-38e5-4787-8970-d57a96875bdb\") " pod="openshift-dns/dns-default-thx4s" Apr 24 21:28:17.728558 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.728536 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/632b9ab1-38e5-4787-8970-d57a96875bdb-config-volume\") pod \"dns-default-thx4s\" (UID: \"632b9ab1-38e5-4787-8970-d57a96875bdb\") " pod="openshift-dns/dns-default-thx4s" Apr 24 21:28:17.730702 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.730679 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/af5e838b-1b18-4b92-ba27-f6f304af2d94-data-volume\") pod \"insights-runtime-extractor-hdm54\" (UID: \"af5e838b-1b18-4b92-ba27-f6f304af2d94\") " pod="openshift-insights/insights-runtime-extractor-hdm54" Apr 24 21:28:17.730840 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.730821 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/af5e838b-1b18-4b92-ba27-f6f304af2d94-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hdm54\" (UID: \"af5e838b-1b18-4b92-ba27-f6f304af2d94\") " pod="openshift-insights/insights-runtime-extractor-hdm54" Apr 24 21:28:17.732140 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.732117 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/af5e838b-1b18-4b92-ba27-f6f304af2d94-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hdm54\" (UID: \"af5e838b-1b18-4b92-ba27-f6f304af2d94\") " pod="openshift-insights/insights-runtime-extractor-hdm54" Apr 24 21:28:17.732218 
ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.732143 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/632b9ab1-38e5-4787-8970-d57a96875bdb-metrics-tls\") pod \"dns-default-thx4s\" (UID: \"632b9ab1-38e5-4787-8970-d57a96875bdb\") " pod="openshift-dns/dns-default-thx4s" Apr 24 21:28:17.738881 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.738857 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b99tp\" (UniqueName: \"kubernetes.io/projected/af5e838b-1b18-4b92-ba27-f6f304af2d94-kube-api-access-b99tp\") pod \"insights-runtime-extractor-hdm54\" (UID: \"af5e838b-1b18-4b92-ba27-f6f304af2d94\") " pod="openshift-insights/insights-runtime-extractor-hdm54" Apr 24 21:28:17.740476 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.740452 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djghs\" (UniqueName: \"kubernetes.io/projected/632b9ab1-38e5-4787-8970-d57a96875bdb-kube-api-access-djghs\") pod \"dns-default-thx4s\" (UID: \"632b9ab1-38e5-4787-8970-d57a96875bdb\") " pod="openshift-dns/dns-default-thx4s" Apr 24 21:28:17.828637 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.828556 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8cxb\" (UniqueName: \"kubernetes.io/projected/917804a7-0eca-4b76-8504-d8f2e2dc5b74-kube-api-access-h8cxb\") pod \"ingress-canary-9nl9q\" (UID: \"917804a7-0eca-4b76-8504-d8f2e2dc5b74\") " pod="openshift-ingress-canary/ingress-canary-9nl9q" Apr 24 21:28:17.828637 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.828609 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/917804a7-0eca-4b76-8504-d8f2e2dc5b74-cert\") pod \"ingress-canary-9nl9q\" (UID: \"917804a7-0eca-4b76-8504-d8f2e2dc5b74\") " pod="openshift-ingress-canary/ingress-canary-9nl9q" Apr 24 
21:28:17.831008 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.830988 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/917804a7-0eca-4b76-8504-d8f2e2dc5b74-cert\") pod \"ingress-canary-9nl9q\" (UID: \"917804a7-0eca-4b76-8504-d8f2e2dc5b74\") " pod="openshift-ingress-canary/ingress-canary-9nl9q"
Apr 24 21:28:17.842290 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.842268 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8cxb\" (UniqueName: \"kubernetes.io/projected/917804a7-0eca-4b76-8504-d8f2e2dc5b74-kube-api-access-h8cxb\") pod \"ingress-canary-9nl9q\" (UID: \"917804a7-0eca-4b76-8504-d8f2e2dc5b74\") " pod="openshift-ingress-canary/ingress-canary-9nl9q"
Apr 24 21:28:17.911207 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.911171 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-thx4s"
Apr 24 21:28:17.926035 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.926008 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hdm54"
Apr 24 21:28:17.942053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:17.941918 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9nl9q"
Apr 24 21:28:18.089610 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:18.089528 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-thx4s"]
Apr 24 21:28:18.094936 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:28:18.094900 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod632b9ab1_38e5_4787_8970_d57a96875bdb.slice/crio-1330b0bd662ef3e19f534df4a40a5b3acd49fa1c64b7121fa65da5031e53157e WatchSource:0}: Error finding container 1330b0bd662ef3e19f534df4a40a5b3acd49fa1c64b7121fa65da5031e53157e: Status 404 returned error can't find the container with id 1330b0bd662ef3e19f534df4a40a5b3acd49fa1c64b7121fa65da5031e53157e
Apr 24 21:28:18.106050 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:18.106017 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hdm54"]
Apr 24 21:28:18.119884 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:18.119863 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9nl9q"]
Apr 24 21:28:18.122567 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:28:18.122543 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod917804a7_0eca_4b76_8504_d8f2e2dc5b74.slice/crio-1dad9153f8b05b79eda4c001840da120582c0ef1051207e0c0cb18b2025f4c81 WatchSource:0}: Error finding container 1dad9153f8b05b79eda4c001840da120582c0ef1051207e0c0cb18b2025f4c81: Status 404 returned error can't find the container with id 1dad9153f8b05b79eda4c001840da120582c0ef1051207e0c0cb18b2025f4c81
Apr 24 21:28:18.759373 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:18.759333 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr"
Apr 24 21:28:18.759546 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:18.759384 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b"
Apr 24 21:28:18.759546 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:18.759362 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf"
Apr 24 21:28:18.763058 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:18.763033 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 21:28:18.763198 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:18.763109 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xwzc6\""
Apr 24 21:28:18.763698 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:18.763347 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tshq6\""
Apr 24 21:28:18.763698 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:18.763545 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 21:28:18.763698 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:18.763580 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 21:28:18.763992 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:18.763788 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 21:28:18.994111 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:18.994069 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-thx4s" event={"ID":"632b9ab1-38e5-4787-8970-d57a96875bdb","Type":"ContainerStarted","Data":"1330b0bd662ef3e19f534df4a40a5b3acd49fa1c64b7121fa65da5031e53157e"}
Apr 24 21:28:18.995366 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:18.995326 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9nl9q" event={"ID":"917804a7-0eca-4b76-8504-d8f2e2dc5b74","Type":"ContainerStarted","Data":"1dad9153f8b05b79eda4c001840da120582c0ef1051207e0c0cb18b2025f4c81"}
Apr 24 21:28:18.996875 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:18.996852 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hdm54" event={"ID":"af5e838b-1b18-4b92-ba27-f6f304af2d94","Type":"ContainerStarted","Data":"6e1c72e3dc1be19140e48059b6f15663951ae1fe604f9de90854427f88f1b7a3"}
Apr 24 21:28:18.997016 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:18.996899 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hdm54" event={"ID":"af5e838b-1b18-4b92-ba27-f6f304af2d94","Type":"ContainerStarted","Data":"af38937f41b5ae588013d01dd884157634092f673f629147945b64572b3ae0a6"}
Apr 24 21:28:19.873756 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:19.873720 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-zjcvd"]
Apr 24 21:28:19.876944 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:19.876921 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-zjcvd"
Apr 24 21:28:19.881104 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:19.881070 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 21:28:19.881104 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:19.881089 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-6nx5p\""
Apr 24 21:28:19.881300 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:19.881074 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 21:28:19.881300 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:19.881148 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 21:28:19.881300 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:19.881077 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 24 21:28:19.881300 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:19.881074 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 24 21:28:19.885085 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:19.885052 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-zjcvd"]
Apr 24 21:28:19.941946 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:19.941907 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9m2l\" (UniqueName: \"kubernetes.io/projected/f5b38541-c21b-4c73-a0c6-d8a6907a689c-kube-api-access-g9m2l\") pod \"prometheus-operator-5676c8c784-zjcvd\" (UID: \"f5b38541-c21b-4c73-a0c6-d8a6907a689c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjcvd"
Apr 24 21:28:19.942132 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:19.941965 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f5b38541-c21b-4c73-a0c6-d8a6907a689c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-zjcvd\" (UID: \"f5b38541-c21b-4c73-a0c6-d8a6907a689c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjcvd"
Apr 24 21:28:19.942132 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:19.942029 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f5b38541-c21b-4c73-a0c6-d8a6907a689c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-zjcvd\" (UID: \"f5b38541-c21b-4c73-a0c6-d8a6907a689c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjcvd"
Apr 24 21:28:19.942132 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:19.942063 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f5b38541-c21b-4c73-a0c6-d8a6907a689c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-zjcvd\" (UID: \"f5b38541-c21b-4c73-a0c6-d8a6907a689c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjcvd"
Apr 24 21:28:20.043021 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:20.042922 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9m2l\" (UniqueName: \"kubernetes.io/projected/f5b38541-c21b-4c73-a0c6-d8a6907a689c-kube-api-access-g9m2l\") pod \"prometheus-operator-5676c8c784-zjcvd\" (UID: \"f5b38541-c21b-4c73-a0c6-d8a6907a689c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjcvd"
Apr 24 21:28:20.043021 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:20.043002 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f5b38541-c21b-4c73-a0c6-d8a6907a689c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-zjcvd\" (UID: \"f5b38541-c21b-4c73-a0c6-d8a6907a689c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjcvd"
Apr 24 21:28:20.043507 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:20.043059 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f5b38541-c21b-4c73-a0c6-d8a6907a689c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-zjcvd\" (UID: \"f5b38541-c21b-4c73-a0c6-d8a6907a689c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjcvd"
Apr 24 21:28:20.043507 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:20.043087 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f5b38541-c21b-4c73-a0c6-d8a6907a689c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-zjcvd\" (UID: \"f5b38541-c21b-4c73-a0c6-d8a6907a689c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjcvd"
Apr 24 21:28:20.043507 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:20.043202 2578 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 24 21:28:20.043507 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:20.043269 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5b38541-c21b-4c73-a0c6-d8a6907a689c-prometheus-operator-tls podName:f5b38541-c21b-4c73-a0c6-d8a6907a689c nodeName:}" failed. No retries permitted until 2026-04-24 21:28:20.543249562 +0000 UTC m=+57.371119673 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/f5b38541-c21b-4c73-a0c6-d8a6907a689c-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-zjcvd" (UID: "f5b38541-c21b-4c73-a0c6-d8a6907a689c") : secret "prometheus-operator-tls" not found
Apr 24 21:28:20.043906 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:20.043868 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f5b38541-c21b-4c73-a0c6-d8a6907a689c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-zjcvd\" (UID: \"f5b38541-c21b-4c73-a0c6-d8a6907a689c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjcvd"
Apr 24 21:28:20.045778 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:20.045751 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f5b38541-c21b-4c73-a0c6-d8a6907a689c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-zjcvd\" (UID: \"f5b38541-c21b-4c73-a0c6-d8a6907a689c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjcvd"
Apr 24 21:28:20.054456 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:20.054433 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9m2l\" (UniqueName: \"kubernetes.io/projected/f5b38541-c21b-4c73-a0c6-d8a6907a689c-kube-api-access-g9m2l\") pod \"prometheus-operator-5676c8c784-zjcvd\" (UID: \"f5b38541-c21b-4c73-a0c6-d8a6907a689c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjcvd"
Apr 24 21:28:20.547167 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:20.547088 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f5b38541-c21b-4c73-a0c6-d8a6907a689c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-zjcvd\" (UID: \"f5b38541-c21b-4c73-a0c6-d8a6907a689c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjcvd"
Apr 24 21:28:20.549370 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:20.549342 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f5b38541-c21b-4c73-a0c6-d8a6907a689c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-zjcvd\" (UID: \"f5b38541-c21b-4c73-a0c6-d8a6907a689c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zjcvd"
Apr 24 21:28:20.788108 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:20.788063 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-zjcvd"
Apr 24 21:28:20.921305 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:20.921268 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-zjcvd"]
Apr 24 21:28:20.924576 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:28:20.924546 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5b38541_c21b_4c73_a0c6_d8a6907a689c.slice/crio-5066bfd778766909dd5a29ea55286ee4f85829dbb5217f8e861f64f64cb1e83d WatchSource:0}: Error finding container 5066bfd778766909dd5a29ea55286ee4f85829dbb5217f8e861f64f64cb1e83d: Status 404 returned error can't find the container with id 5066bfd778766909dd5a29ea55286ee4f85829dbb5217f8e861f64f64cb1e83d
Apr 24 21:28:21.004694 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:21.004652 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-thx4s" event={"ID":"632b9ab1-38e5-4787-8970-d57a96875bdb","Type":"ContainerStarted","Data":"74f734d937f02fdecfbd4f670b950e3cab046eb857c46c867a8bd86ca9a89c33"}
Apr 24 21:28:21.004694 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:21.004699 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-thx4s" event={"ID":"632b9ab1-38e5-4787-8970-d57a96875bdb","Type":"ContainerStarted","Data":"f40c04432ab40d2498403339bad14eb07344cad7d496c3a26f8345e3041b350c"}
Apr 24 21:28:21.004965 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:21.004789 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-thx4s"
Apr 24 21:28:21.006290 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:21.006248 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9nl9q" event={"ID":"917804a7-0eca-4b76-8504-d8f2e2dc5b74","Type":"ContainerStarted","Data":"afbdffd6c6d5c331682fc061a28b7d184e7c130fe1ee31f576eba3e52834193a"}
Apr 24 21:28:21.008015 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:21.007987 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hdm54" event={"ID":"af5e838b-1b18-4b92-ba27-f6f304af2d94","Type":"ContainerStarted","Data":"46f73dc3a5f50d9f750c602bfaafda025e8406007bf0a1515c9deb440428fb0c"}
Apr 24 21:28:21.009140 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:21.009106 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-zjcvd" event={"ID":"f5b38541-c21b-4c73-a0c6-d8a6907a689c","Type":"ContainerStarted","Data":"5066bfd778766909dd5a29ea55286ee4f85829dbb5217f8e861f64f64cb1e83d"}
Apr 24 21:28:21.022596 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:21.022528 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-thx4s" podStartSLOduration=1.819479437 podStartE2EDuration="4.022508248s" podCreationTimestamp="2026-04-24 21:28:17 +0000 UTC" firstStartedPulling="2026-04-24 21:28:18.096660454 +0000 UTC m=+54.924530568" lastFinishedPulling="2026-04-24 21:28:20.299689265 +0000 UTC m=+57.127559379" observedRunningTime="2026-04-24 21:28:21.022258943 +0000 UTC m=+57.850129101" watchObservedRunningTime="2026-04-24 21:28:21.022508248 +0000 UTC m=+57.850378378"
Apr 24 21:28:21.038584 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:21.038533 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9nl9q" podStartSLOduration=1.8597798380000001 podStartE2EDuration="4.03850837s" podCreationTimestamp="2026-04-24 21:28:17 +0000 UTC" firstStartedPulling="2026-04-24 21:28:18.12426045 +0000 UTC m=+54.952130559" lastFinishedPulling="2026-04-24 21:28:20.302988977 +0000 UTC m=+57.130859091" observedRunningTime="2026-04-24 21:28:21.037731587 +0000 UTC m=+57.865601720" watchObservedRunningTime="2026-04-24 21:28:21.03850837 +0000 UTC m=+57.866378503"
Apr 24 21:28:21.938200 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:21.938083 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-krb9p"
Apr 24 21:28:22.013946 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:22.013900 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hdm54" event={"ID":"af5e838b-1b18-4b92-ba27-f6f304af2d94","Type":"ContainerStarted","Data":"ea612b607cf43c3d82eb5175782dba018afe1e24673d465bc023444e99156432"}
Apr 24 21:28:22.031360 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:22.031302 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-hdm54" podStartSLOduration=1.6399393629999999 podStartE2EDuration="5.03128625s" podCreationTimestamp="2026-04-24 21:28:17 +0000 UTC" firstStartedPulling="2026-04-24 21:28:18.252643547 +0000 UTC m=+55.080513660" lastFinishedPulling="2026-04-24 21:28:21.643990437 +0000 UTC m=+58.471860547" observedRunningTime="2026-04-24 21:28:22.0308805 +0000 UTC m=+58.858750633" watchObservedRunningTime="2026-04-24 21:28:22.03128625 +0000 UTC m=+58.859156388"
Apr 24 21:28:23.019204 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:23.019161 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-zjcvd" event={"ID":"f5b38541-c21b-4c73-a0c6-d8a6907a689c","Type":"ContainerStarted","Data":"12a8bff093e46f31ffef9e2d9f8217953ef46c64b413e39c42d35867e35f367f"}
Apr 24 21:28:23.019204 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:23.019204 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-zjcvd" event={"ID":"f5b38541-c21b-4c73-a0c6-d8a6907a689c","Type":"ContainerStarted","Data":"de520d47f100fab9d472d672f772482135328c51f7d3ad02b9570eca451ef0e3"}
Apr 24 21:28:23.037129 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:23.037080 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-zjcvd" podStartSLOduration=2.620682253 podStartE2EDuration="4.037066074s" podCreationTimestamp="2026-04-24 21:28:19 +0000 UTC" firstStartedPulling="2026-04-24 21:28:20.926867854 +0000 UTC m=+57.754737964" lastFinishedPulling="2026-04-24 21:28:22.343251659 +0000 UTC m=+59.171121785" observedRunningTime="2026-04-24 21:28:23.035805028 +0000 UTC m=+59.863675182" watchObservedRunningTime="2026-04-24 21:28:23.037066074 +0000 UTC m=+59.864936206"
Apr 24 21:28:25.395620 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.395586 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-4nrh7"]
Apr 24 21:28:25.399879 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.399857 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7"
Apr 24 21:28:25.402816 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.402794 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 24 21:28:25.403034 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.403019 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 24 21:28:25.403199 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.403180 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-qfggp\""
Apr 24 21:28:25.403452 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.403439 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 24 21:28:25.411832 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.411805 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-4nrh7"]
Apr 24 21:28:25.441129 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.441089 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-88w6j"]
Apr 24 21:28:25.444149 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.444129 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-88w6j"
Apr 24 21:28:25.446691 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.446669 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 21:28:25.446878 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.446722 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 21:28:25.446878 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.446745 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 21:28:25.446878 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.446769 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-5pldf\""
Apr 24 21:28:25.481712 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.481679 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b016b4cc-60cb-4132-b4b1-86f46bcd2620-node-exporter-textfile\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j"
Apr 24 21:28:25.481712 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.481718 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/21d7eb0f-d8da-459c-b217-8af29243babb-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-4nrh7\" (UID: \"21d7eb0f-d8da-459c-b217-8af29243babb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7"
Apr 24 21:28:25.481958 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.481746 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/21d7eb0f-d8da-459c-b217-8af29243babb-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-4nrh7\" (UID: \"21d7eb0f-d8da-459c-b217-8af29243babb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7"
Apr 24 21:28:25.481958 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.481763 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ppwp\" (UniqueName: \"kubernetes.io/projected/21d7eb0f-d8da-459c-b217-8af29243babb-kube-api-access-5ppwp\") pod \"kube-state-metrics-69db897b98-4nrh7\" (UID: \"21d7eb0f-d8da-459c-b217-8af29243babb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7"
Apr 24 21:28:25.481958 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.481824 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b016b4cc-60cb-4132-b4b1-86f46bcd2620-sys\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j"
Apr 24 21:28:25.481958 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.481873 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/21d7eb0f-d8da-459c-b217-8af29243babb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-4nrh7\" (UID: \"21d7eb0f-d8da-459c-b217-8af29243babb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7"
Apr 24 21:28:25.481958 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.481927 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b016b4cc-60cb-4132-b4b1-86f46bcd2620-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j"
Apr 24 21:28:25.481958 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.481957 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b016b4cc-60cb-4132-b4b1-86f46bcd2620-metrics-client-ca\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j"
Apr 24 21:28:25.482156 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.481973 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgjzj\" (UniqueName: \"kubernetes.io/projected/b016b4cc-60cb-4132-b4b1-86f46bcd2620-kube-api-access-qgjzj\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j"
Apr 24 21:28:25.482156 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.481992 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b016b4cc-60cb-4132-b4b1-86f46bcd2620-root\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j"
Apr 24 21:28:25.482156 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.482008 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b016b4cc-60cb-4132-b4b1-86f46bcd2620-node-exporter-accelerators-collector-config\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j"
Apr 24 21:28:25.482156 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.482070 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/21d7eb0f-d8da-459c-b217-8af29243babb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-4nrh7\" (UID: \"21d7eb0f-d8da-459c-b217-8af29243babb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7"
Apr 24 21:28:25.482156 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.482127 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b016b4cc-60cb-4132-b4b1-86f46bcd2620-node-exporter-wtmp\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j"
Apr 24 21:28:25.482298 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.482158 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b016b4cc-60cb-4132-b4b1-86f46bcd2620-node-exporter-tls\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j"
Apr 24 21:28:25.482298 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.482184 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/21d7eb0f-d8da-459c-b217-8af29243babb-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-4nrh7\" (UID: \"21d7eb0f-d8da-459c-b217-8af29243babb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7"
Apr 24 21:28:25.583103 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.583066 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/21d7eb0f-d8da-459c-b217-8af29243babb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-4nrh7\" (UID: \"21d7eb0f-d8da-459c-b217-8af29243babb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7"
Apr 24 21:28:25.583294 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.583119 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b016b4cc-60cb-4132-b4b1-86f46bcd2620-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j"
Apr 24 21:28:25.583294 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.583163 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b016b4cc-60cb-4132-b4b1-86f46bcd2620-metrics-client-ca\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j"
Apr 24 21:28:25.583294 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.583183 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgjzj\" (UniqueName: \"kubernetes.io/projected/b016b4cc-60cb-4132-b4b1-86f46bcd2620-kube-api-access-qgjzj\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j"
Apr 24 21:28:25.583294 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.583209 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b016b4cc-60cb-4132-b4b1-86f46bcd2620-root\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j"
Apr 24 21:28:25.583294 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.583270 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b016b4cc-60cb-4132-b4b1-86f46bcd2620-root\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j"
Apr 24 21:28:25.583511 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.583435 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b016b4cc-60cb-4132-b4b1-86f46bcd2620-node-exporter-accelerators-collector-config\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j"
Apr 24 21:28:25.583511 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.583462 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/21d7eb0f-d8da-459c-b217-8af29243babb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-4nrh7\" (UID: \"21d7eb0f-d8da-459c-b217-8af29243babb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7"
Apr 24 21:28:25.583511 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.583480 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b016b4cc-60cb-4132-b4b1-86f46bcd2620-node-exporter-wtmp\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j"
Apr 24 21:28:25.583659 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.583525 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b016b4cc-60cb-4132-b4b1-86f46bcd2620-node-exporter-tls\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j"
Apr 24 21:28:25.583659 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.583559 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/21d7eb0f-d8da-459c-b217-8af29243babb-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-4nrh7\" (UID: \"21d7eb0f-d8da-459c-b217-8af29243babb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7"
Apr 24 21:28:25.583659 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.583599 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b016b4cc-60cb-4132-b4b1-86f46bcd2620-node-exporter-textfile\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j"
Apr 24 21:28:25.583659 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.583613 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b016b4cc-60cb-4132-b4b1-86f46bcd2620-node-exporter-wtmp\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j"
Apr 24 21:28:25.583659 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.583640 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/21d7eb0f-d8da-459c-b217-8af29243babb-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-4nrh7\" (UID: \"21d7eb0f-d8da-459c-b217-8af29243babb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7"
Apr 24 21:28:25.583937 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.583679 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/21d7eb0f-d8da-459c-b217-8af29243babb-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-4nrh7\" (UID: \"21d7eb0f-d8da-459c-b217-8af29243babb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7"
Apr 24 21:28:25.583937 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.583704 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ppwp\" (UniqueName: \"kubernetes.io/projected/21d7eb0f-d8da-459c-b217-8af29243babb-kube-api-access-5ppwp\") pod \"kube-state-metrics-69db897b98-4nrh7\" (UID: \"21d7eb0f-d8da-459c-b217-8af29243babb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7"
Apr 24 21:28:25.583937 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:25.583729 2578 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 24 21:28:25.583937 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.583772 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b016b4cc-60cb-4132-b4b1-86f46bcd2620-sys\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j"
Apr 24 21:28:25.583937 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.583870 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/21d7eb0f-d8da-459c-b217-8af29243babb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-4nrh7\" (UID: \"21d7eb0f-d8da-459c-b217-8af29243babb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7"
Apr 24 21:28:25.583937 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.583928 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName:
\"kubernetes.io/configmap/b016b4cc-60cb-4132-b4b1-86f46bcd2620-metrics-client-ca\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j" Apr 24 21:28:25.584186 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.584056 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b016b4cc-60cb-4132-b4b1-86f46bcd2620-node-exporter-textfile\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j" Apr 24 21:28:25.584186 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.584155 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b016b4cc-60cb-4132-b4b1-86f46bcd2620-node-exporter-accelerators-collector-config\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j" Apr 24 21:28:25.584274 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.583734 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b016b4cc-60cb-4132-b4b1-86f46bcd2620-sys\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j" Apr 24 21:28:25.584388 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:25.584375 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21d7eb0f-d8da-459c-b217-8af29243babb-kube-state-metrics-tls podName:21d7eb0f-d8da-459c-b217-8af29243babb nodeName:}" failed. No retries permitted until 2026-04-24 21:28:26.084352741 +0000 UTC m=+62.912222865 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/21d7eb0f-d8da-459c-b217-8af29243babb-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-4nrh7" (UID: "21d7eb0f-d8da-459c-b217-8af29243babb") : secret "kube-state-metrics-tls" not found Apr 24 21:28:25.584494 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.584473 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/21d7eb0f-d8da-459c-b217-8af29243babb-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-4nrh7\" (UID: \"21d7eb0f-d8da-459c-b217-8af29243babb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7" Apr 24 21:28:25.584685 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.584662 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/21d7eb0f-d8da-459c-b217-8af29243babb-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-4nrh7\" (UID: \"21d7eb0f-d8da-459c-b217-8af29243babb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7" Apr 24 21:28:25.585800 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.585766 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b016b4cc-60cb-4132-b4b1-86f46bcd2620-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j" Apr 24 21:28:25.585970 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.585954 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b016b4cc-60cb-4132-b4b1-86f46bcd2620-node-exporter-tls\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " 
pod="openshift-monitoring/node-exporter-88w6j" Apr 24 21:28:25.586333 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.586315 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/21d7eb0f-d8da-459c-b217-8af29243babb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-4nrh7\" (UID: \"21d7eb0f-d8da-459c-b217-8af29243babb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7" Apr 24 21:28:25.595722 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.595697 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ppwp\" (UniqueName: \"kubernetes.io/projected/21d7eb0f-d8da-459c-b217-8af29243babb-kube-api-access-5ppwp\") pod \"kube-state-metrics-69db897b98-4nrh7\" (UID: \"21d7eb0f-d8da-459c-b217-8af29243babb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7" Apr 24 21:28:25.595722 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.595708 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgjzj\" (UniqueName: \"kubernetes.io/projected/b016b4cc-60cb-4132-b4b1-86f46bcd2620-kube-api-access-qgjzj\") pod \"node-exporter-88w6j\" (UID: \"b016b4cc-60cb-4132-b4b1-86f46bcd2620\") " pod="openshift-monitoring/node-exporter-88w6j" Apr 24 21:28:25.752763 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:25.752725 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-88w6j" Apr 24 21:28:25.761153 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:28:25.761120 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb016b4cc_60cb_4132_b4b1_86f46bcd2620.slice/crio-2b53be2dcc85a3232cd246d1f0c26c5b560b9cdd98a757c136b34f4c583ebaad WatchSource:0}: Error finding container 2b53be2dcc85a3232cd246d1f0c26c5b560b9cdd98a757c136b34f4c583ebaad: Status 404 returned error can't find the container with id 2b53be2dcc85a3232cd246d1f0c26c5b560b9cdd98a757c136b34f4c583ebaad Apr 24 21:28:26.028527 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.028430 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-88w6j" event={"ID":"b016b4cc-60cb-4132-b4b1-86f46bcd2620","Type":"ContainerStarted","Data":"2b53be2dcc85a3232cd246d1f0c26c5b560b9cdd98a757c136b34f4c583ebaad"} Apr 24 21:28:26.088600 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.088563 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/21d7eb0f-d8da-459c-b217-8af29243babb-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-4nrh7\" (UID: \"21d7eb0f-d8da-459c-b217-8af29243babb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7" Apr 24 21:28:26.090998 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.090972 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/21d7eb0f-d8da-459c-b217-8af29243babb-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-4nrh7\" (UID: \"21d7eb0f-d8da-459c-b217-8af29243babb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7" Apr 24 21:28:26.308810 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.308715 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7" Apr 24 21:28:26.394077 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.394037 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:28:26.399014 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.398982 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.404018 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.403986 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 21:28:26.404222 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.403985 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 21:28:26.404222 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.404056 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 21:28:26.404222 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.404056 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-wjtxz\"" Apr 24 21:28:26.404436 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.404348 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 21:28:26.404573 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.404557 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 21:28:26.404779 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.404637 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 21:28:26.404779 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.404650 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 21:28:26.404779 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.404647 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 21:28:26.404779 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.404644 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 21:28:26.420225 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.420189 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:28:26.491718 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.491680 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.491718 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.491724 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.491970 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.491750 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-config-volume\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.491970 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.491771 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.491970 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.491793 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.491970 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.491812 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-web-config\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.491970 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.491877 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-config-out\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.491970 ip-10-0-139-184 
kubenswrapper[2578]: I0424 21:28:26.491953 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.492236 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.491984 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.492236 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.492030 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.492236 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.492060 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.492236 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.492096 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.492236 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.492126 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cxn7\" (UniqueName: \"kubernetes.io/projected/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-kube-api-access-9cxn7\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.591939 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.591910 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-4nrh7"] Apr 24 21:28:26.593382 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.593364 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.593462 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.593394 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.593462 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.593416 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.593462 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.593434 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cxn7\" (UniqueName: \"kubernetes.io/projected/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-kube-api-access-9cxn7\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.593561 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.593499 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.593561 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.593526 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.593653 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.593562 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-config-volume\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.593653 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.593591 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.593653 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.593620 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.593653 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.593647 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-web-config\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.593833 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.593674 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-config-out\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.593833 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.593697 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.593833 ip-10-0-139-184 
kubenswrapper[2578]: I0424 21:28:26.593720 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.594007 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:26.593977 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-alertmanager-trusted-ca-bundle podName:ebd75cff-1e6f-42c9-825f-055e9c04ab8e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:27.093952356 +0000 UTC m=+63.921822485 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "ebd75cff-1e6f-42c9-825f-055e9c04ab8e") : configmap references non-existent config key: ca-bundle.crt Apr 24 21:28:26.597048 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.597022 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.597580 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.597175 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.597580 ip-10-0-139-184 kubenswrapper[2578]: 
I0424 21:28:26.597429 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.598244 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:28:26.598214 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21d7eb0f_d8da_459c_b217_8af29243babb.slice/crio-8583d3cd3adaa0d3f62619d547b1857a0193ccff7c95ab6763f1b75fa657fe45 WatchSource:0}: Error finding container 8583d3cd3adaa0d3f62619d547b1857a0193ccff7c95ab6763f1b75fa657fe45: Status 404 returned error can't find the container with id 8583d3cd3adaa0d3f62619d547b1857a0193ccff7c95ab6763f1b75fa657fe45 Apr 24 21:28:26.598968 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.598949 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-web-config\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.599194 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.599169 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.599353 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.599323 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.599481 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.599460 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.600092 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.600018 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-config-out\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.600296 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.600142 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.600296 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.600178 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.601086 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.600998 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-config-volume\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.604248 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.604225 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cxn7\" (UniqueName: \"kubernetes.io/projected/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-kube-api-access-9cxn7\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:26.837970 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.837873 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-9fb6f6b58-xtm46"] Apr 24 21:28:26.840772 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.840748 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:28:26.843755 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.843717 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 21:28:26.843755 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.843733 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-nqhcp\"" Apr 24 21:28:26.843961 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.843733 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 21:28:26.843961 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.843829 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 21:28:26.843961 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.843734 2578 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 21:28:26.843961 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.843921 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 21:28:26.844097 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.844083 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 21:28:26.844280 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.844258 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 21:28:26.852036 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.852008 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9fb6f6b58-xtm46"] Apr 24 21:28:26.896711 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.896670 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/48670ceb-a251-45ea-8e95-b12bab903ddd-console-oauth-config\") pod \"console-9fb6f6b58-xtm46\" (UID: \"48670ceb-a251-45ea-8e95-b12bab903ddd\") " pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:28:26.896926 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.896722 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/48670ceb-a251-45ea-8e95-b12bab903ddd-console-serving-cert\") pod \"console-9fb6f6b58-xtm46\" (UID: \"48670ceb-a251-45ea-8e95-b12bab903ddd\") " pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:28:26.896926 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.896750 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/48670ceb-a251-45ea-8e95-b12bab903ddd-service-ca\") pod \"console-9fb6f6b58-xtm46\" (UID: \"48670ceb-a251-45ea-8e95-b12bab903ddd\") " pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:28:26.896926 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.896787 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/48670ceb-a251-45ea-8e95-b12bab903ddd-console-config\") pod \"console-9fb6f6b58-xtm46\" (UID: \"48670ceb-a251-45ea-8e95-b12bab903ddd\") " pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:28:26.896926 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.896819 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/48670ceb-a251-45ea-8e95-b12bab903ddd-oauth-serving-cert\") pod \"console-9fb6f6b58-xtm46\" (UID: \"48670ceb-a251-45ea-8e95-b12bab903ddd\") " pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:28:26.896926 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.896882 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj52l\" (UniqueName: \"kubernetes.io/projected/48670ceb-a251-45ea-8e95-b12bab903ddd-kube-api-access-xj52l\") pod \"console-9fb6f6b58-xtm46\" (UID: \"48670ceb-a251-45ea-8e95-b12bab903ddd\") " pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:28:26.998093 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.998047 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/48670ceb-a251-45ea-8e95-b12bab903ddd-console-serving-cert\") pod \"console-9fb6f6b58-xtm46\" (UID: \"48670ceb-a251-45ea-8e95-b12bab903ddd\") " pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:28:26.998288 ip-10-0-139-184 kubenswrapper[2578]: I0424 
21:28:26.998109 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48670ceb-a251-45ea-8e95-b12bab903ddd-service-ca\") pod \"console-9fb6f6b58-xtm46\" (UID: \"48670ceb-a251-45ea-8e95-b12bab903ddd\") " pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:28:26.998288 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.998178 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/48670ceb-a251-45ea-8e95-b12bab903ddd-console-config\") pod \"console-9fb6f6b58-xtm46\" (UID: \"48670ceb-a251-45ea-8e95-b12bab903ddd\") " pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:28:26.998288 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.998226 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/48670ceb-a251-45ea-8e95-b12bab903ddd-oauth-serving-cert\") pod \"console-9fb6f6b58-xtm46\" (UID: \"48670ceb-a251-45ea-8e95-b12bab903ddd\") " pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:28:26.998439 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.998341 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xj52l\" (UniqueName: \"kubernetes.io/projected/48670ceb-a251-45ea-8e95-b12bab903ddd-kube-api-access-xj52l\") pod \"console-9fb6f6b58-xtm46\" (UID: \"48670ceb-a251-45ea-8e95-b12bab903ddd\") " pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:28:26.999089 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.998569 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/48670ceb-a251-45ea-8e95-b12bab903ddd-console-oauth-config\") pod \"console-9fb6f6b58-xtm46\" (UID: \"48670ceb-a251-45ea-8e95-b12bab903ddd\") " pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 
21:28:26.999089 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.998957 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48670ceb-a251-45ea-8e95-b12bab903ddd-service-ca\") pod \"console-9fb6f6b58-xtm46\" (UID: \"48670ceb-a251-45ea-8e95-b12bab903ddd\") " pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:28:26.999089 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.999012 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/48670ceb-a251-45ea-8e95-b12bab903ddd-console-config\") pod \"console-9fb6f6b58-xtm46\" (UID: \"48670ceb-a251-45ea-8e95-b12bab903ddd\") " pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:28:26.999089 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:26.999080 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/48670ceb-a251-45ea-8e95-b12bab903ddd-oauth-serving-cert\") pod \"console-9fb6f6b58-xtm46\" (UID: \"48670ceb-a251-45ea-8e95-b12bab903ddd\") " pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:28:27.001571 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:27.001547 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/48670ceb-a251-45ea-8e95-b12bab903ddd-console-serving-cert\") pod \"console-9fb6f6b58-xtm46\" (UID: \"48670ceb-a251-45ea-8e95-b12bab903ddd\") " pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:28:27.001679 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:27.001600 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/48670ceb-a251-45ea-8e95-b12bab903ddd-console-oauth-config\") pod \"console-9fb6f6b58-xtm46\" (UID: \"48670ceb-a251-45ea-8e95-b12bab903ddd\") " 
pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:28:27.009751 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:27.009725 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj52l\" (UniqueName: \"kubernetes.io/projected/48670ceb-a251-45ea-8e95-b12bab903ddd-kube-api-access-xj52l\") pod \"console-9fb6f6b58-xtm46\" (UID: \"48670ceb-a251-45ea-8e95-b12bab903ddd\") " pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:28:27.033758 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:27.033719 2578 generic.go:358] "Generic (PLEG): container finished" podID="b016b4cc-60cb-4132-b4b1-86f46bcd2620" containerID="52c2ee0deabbfa280d44f1fa82a3dbc8a73b5e700550261031bed0928bf93184" exitCode=0 Apr 24 21:28:27.033954 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:27.033814 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-88w6j" event={"ID":"b016b4cc-60cb-4132-b4b1-86f46bcd2620","Type":"ContainerDied","Data":"52c2ee0deabbfa280d44f1fa82a3dbc8a73b5e700550261031bed0928bf93184"} Apr 24 21:28:27.037461 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:27.037396 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7" event={"ID":"21d7eb0f-d8da-459c-b217-8af29243babb","Type":"ContainerStarted","Data":"8583d3cd3adaa0d3f62619d547b1857a0193ccff7c95ab6763f1b75fa657fe45"} Apr 24 21:28:27.099546 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:27.099303 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:27.100458 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:27.100428 2578 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:27.150053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:27.149834 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:28:27.297148 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:27.297110 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9fb6f6b58-xtm46"] Apr 24 21:28:27.301187 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:28:27.301153 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48670ceb_a251_45ea_8e95_b12bab903ddd.slice/crio-be3a4dc69c203e19fe29d11de33a11142ca1be4c9abde967603197d27f878abf WatchSource:0}: Error finding container be3a4dc69c203e19fe29d11de33a11142ca1be4c9abde967603197d27f878abf: Status 404 returned error can't find the container with id be3a4dc69c203e19fe29d11de33a11142ca1be4c9abde967603197d27f878abf Apr 24 21:28:27.310804 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:27.310774 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:27.449015 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:27.448959 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:28:27.641448 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:28:27.641361 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebd75cff_1e6f_42c9_825f_055e9c04ab8e.slice/crio-727948ca1daa43cdca172c4612dd67174ad2b90fec3c0d7acb03ffd095203fa4 WatchSource:0}: Error finding container 727948ca1daa43cdca172c4612dd67174ad2b90fec3c0d7acb03ffd095203fa4: Status 404 returned error can't find the container with id 727948ca1daa43cdca172c4612dd67174ad2b90fec3c0d7acb03ffd095203fa4 Apr 24 21:28:28.042970 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:28.042877 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-88w6j" event={"ID":"b016b4cc-60cb-4132-b4b1-86f46bcd2620","Type":"ContainerStarted","Data":"eba66a0063b4df11c591418574aa515b7bdec02bc746dab8b611c92c7da95ce8"} Apr 24 21:28:28.042970 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:28.042943 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-88w6j" event={"ID":"b016b4cc-60cb-4132-b4b1-86f46bcd2620","Type":"ContainerStarted","Data":"1d0693c96268f8f8775f8ff4211adacd736a75a9fbdcd74897d605451386c865"} Apr 24 21:28:28.045222 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:28.045184 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7" event={"ID":"21d7eb0f-d8da-459c-b217-8af29243babb","Type":"ContainerStarted","Data":"83699c6ae2cb547eeb6c504019ac0eb2c4a61e7299938029a0a0d40370c1f9ea"} Apr 24 21:28:28.045222 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:28.045218 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7" event={"ID":"21d7eb0f-d8da-459c-b217-8af29243babb","Type":"ContainerStarted","Data":"5facdb06f376bb8a9dc273ea99062d2ef41a4d60bcaf505c5d405c9ad2ca50f1"} Apr 24 21:28:28.045425 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:28.045232 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7" event={"ID":"21d7eb0f-d8da-459c-b217-8af29243babb","Type":"ContainerStarted","Data":"d6fa1cc6499891e2f1db19398fef42395c51b506a11a73a5376fca1a6697e0a1"} Apr 24 21:28:28.050401 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:28.047123 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ebd75cff-1e6f-42c9-825f-055e9c04ab8e","Type":"ContainerStarted","Data":"727948ca1daa43cdca172c4612dd67174ad2b90fec3c0d7acb03ffd095203fa4"} Apr 24 21:28:28.052287 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:28.052255 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9fb6f6b58-xtm46" event={"ID":"48670ceb-a251-45ea-8e95-b12bab903ddd","Type":"ContainerStarted","Data":"be3a4dc69c203e19fe29d11de33a11142ca1be4c9abde967603197d27f878abf"} Apr 24 21:28:28.066307 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:28.066247 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-88w6j" podStartSLOduration=2.320320349 podStartE2EDuration="3.066228214s" podCreationTimestamp="2026-04-24 21:28:25 +0000 UTC" firstStartedPulling="2026-04-24 21:28:25.762842153 +0000 UTC m=+62.590712263" lastFinishedPulling="2026-04-24 21:28:26.508750007 +0000 UTC m=+63.336620128" observedRunningTime="2026-04-24 21:28:28.064881743 +0000 UTC m=+64.892751913" watchObservedRunningTime="2026-04-24 21:28:28.066228214 +0000 UTC m=+64.894098342" Apr 24 21:28:28.083998 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:28.083945 2578 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-4nrh7" podStartSLOduration=1.9989165230000001 podStartE2EDuration="3.083926317s" podCreationTimestamp="2026-04-24 21:28:25 +0000 UTC" firstStartedPulling="2026-04-24 21:28:26.600708275 +0000 UTC m=+63.428578385" lastFinishedPulling="2026-04-24 21:28:27.685718065 +0000 UTC m=+64.513588179" observedRunningTime="2026-04-24 21:28:28.083726644 +0000 UTC m=+64.911596773" watchObservedRunningTime="2026-04-24 21:28:28.083926317 +0000 UTC m=+64.911796450" Apr 24 21:28:29.056927 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:29.056827 2578 generic.go:358] "Generic (PLEG): container finished" podID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerID="7b71c124071bf0f362b1af67cf1935edcc40f62f8d6d28b1e27aa6d063fb44d0" exitCode=0 Apr 24 21:28:29.056927 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:29.056904 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ebd75cff-1e6f-42c9-825f-055e9c04ab8e","Type":"ContainerDied","Data":"7b71c124071bf0f362b1af67cf1935edcc40f62f8d6d28b1e27aa6d063fb44d0"} Apr 24 21:28:29.420112 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:29.420019 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20a034bb-c3d2-4d05-92de-ed16d2eda707-metrics-certs\") pod \"network-metrics-daemon-lkk5b\" (UID: \"20a034bb-c3d2-4d05-92de-ed16d2eda707\") " pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:28:29.422732 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:29.422702 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:28:29.433498 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:29.433465 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/20a034bb-c3d2-4d05-92de-ed16d2eda707-metrics-certs\") pod \"network-metrics-daemon-lkk5b\" (UID: \"20a034bb-c3d2-4d05-92de-ed16d2eda707\") " pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:28:29.520683 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:29.520635 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-original-pull-secret\") pod \"global-pull-secret-syncer-hxdgr\" (UID: \"7a2b19a8-7cce-48ea-a91f-3306187c2d2a\") " pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:28:29.520877 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:29.520752 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hqhp\" (UniqueName: \"kubernetes.io/projected/af45c5fb-e377-44ea-ad83-ad7e5bea725b-kube-api-access-4hqhp\") pod \"network-check-target-c56pf\" (UID: \"af45c5fb-e377-44ea-ad83-ad7e5bea725b\") " pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:28:29.524366 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:29.524337 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 21:28:29.524527 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:29.524335 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:28:29.534358 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:29.534316 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7a2b19a8-7cce-48ea-a91f-3306187c2d2a-original-pull-secret\") pod \"global-pull-secret-syncer-hxdgr\" (UID: \"7a2b19a8-7cce-48ea-a91f-3306187c2d2a\") " pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:28:29.534531 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:29.534472 
2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:28:29.544622 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:29.544583 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hqhp\" (UniqueName: \"kubernetes.io/projected/af45c5fb-e377-44ea-ad83-ad7e5bea725b-kube-api-access-4hqhp\") pod \"network-check-target-c56pf\" (UID: \"af45c5fb-e377-44ea-ad83-ad7e5bea725b\") " pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:28:29.574421 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:29.574379 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxdgr" Apr 24 21:28:29.585826 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:29.585793 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xwzc6\"" Apr 24 21:28:29.591642 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:29.591610 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tshq6\"" Apr 24 21:28:29.593630 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:29.593607 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:28:29.599441 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:29.599414 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lkk5b" Apr 24 21:28:30.034761 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:30.034721 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-jq98m"] Apr 24 21:28:30.050283 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:30.050248 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-jq98m"] Apr 24 21:28:30.050463 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:30.050393 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jq98m" Apr 24 21:28:30.052848 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:30.052789 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 24 21:28:30.053019 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:30.052960 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-lfbbt\"" Apr 24 21:28:30.126730 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:30.126574 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c27bccf5-72a8-41fb-9c10-af53b192d121-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jq98m\" (UID: \"c27bccf5-72a8-41fb-9c10-af53b192d121\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jq98m" Apr 24 21:28:30.207274 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:30.207205 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-c56pf"] Apr 24 21:28:30.210529 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:28:30.210508 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf45c5fb_e377_44ea_ad83_ad7e5bea725b.slice/crio-87914fdddcc92adc343049517a3614279f2ed09af328b76b235b02e8616a0f98 WatchSource:0}: Error finding container 87914fdddcc92adc343049517a3614279f2ed09af328b76b235b02e8616a0f98: Status 404 returned error can't find the container with id 87914fdddcc92adc343049517a3614279f2ed09af328b76b235b02e8616a0f98 Apr 24 21:28:30.227733 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:30.227699 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c27bccf5-72a8-41fb-9c10-af53b192d121-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jq98m\" (UID: \"c27bccf5-72a8-41fb-9c10-af53b192d121\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jq98m" Apr 24 21:28:30.227986 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:30.227911 2578 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 24 21:28:30.227986 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:28:30.227980 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c27bccf5-72a8-41fb-9c10-af53b192d121-monitoring-plugin-cert podName:c27bccf5-72a8-41fb-9c10-af53b192d121 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:30.727963628 +0000 UTC m=+67.555833742 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/c27bccf5-72a8-41fb-9c10-af53b192d121-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-jq98m" (UID: "c27bccf5-72a8-41fb-9c10-af53b192d121") : secret "monitoring-plugin-cert" not found Apr 24 21:28:30.437279 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:28:30.436152 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a2b19a8_7cce_48ea_a91f_3306187c2d2a.slice/crio-f1cdb1e5bd8ea96730c807593bcfa5108c3b5c68d5c9c10f65fafcd13673f89b WatchSource:0}: Error finding container f1cdb1e5bd8ea96730c807593bcfa5108c3b5c68d5c9c10f65fafcd13673f89b: Status 404 returned error can't find the container with id f1cdb1e5bd8ea96730c807593bcfa5108c3b5c68d5c9c10f65fafcd13673f89b Apr 24 21:28:30.437279 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:30.437037 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hxdgr"] Apr 24 21:28:30.438648 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:30.438612 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lkk5b"] Apr 24 21:28:30.440987 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:28:30.440961 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20a034bb_c3d2_4d05_92de_ed16d2eda707.slice/crio-bea30e5355b35bdd0121f6ce77e1643b519bc7019b27aca535594c458e71b011 WatchSource:0}: Error finding container bea30e5355b35bdd0121f6ce77e1643b519bc7019b27aca535594c458e71b011: Status 404 returned error can't find the container with id bea30e5355b35bdd0121f6ce77e1643b519bc7019b27aca535594c458e71b011 Apr 24 21:28:30.733718 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:30.733539 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/c27bccf5-72a8-41fb-9c10-af53b192d121-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jq98m\" (UID: \"c27bccf5-72a8-41fb-9c10-af53b192d121\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jq98m" Apr 24 21:28:30.738676 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:30.738642 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c27bccf5-72a8-41fb-9c10-af53b192d121-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jq98m\" (UID: \"c27bccf5-72a8-41fb-9c10-af53b192d121\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jq98m" Apr 24 21:28:30.962394 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:30.962353 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jq98m" Apr 24 21:28:31.017192 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.016997 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-thx4s" Apr 24 21:28:31.065078 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.065036 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9fb6f6b58-xtm46" event={"ID":"48670ceb-a251-45ea-8e95-b12bab903ddd","Type":"ContainerStarted","Data":"98f1b0e046123f2cca53ac321d3229d85f95602d0e1ebc299d60edaba67191c8"} Apr 24 21:28:31.066670 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.066635 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hxdgr" event={"ID":"7a2b19a8-7cce-48ea-a91f-3306187c2d2a","Type":"ContainerStarted","Data":"f1cdb1e5bd8ea96730c807593bcfa5108c3b5c68d5c9c10f65fafcd13673f89b"} Apr 24 21:28:31.068011 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.067962 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lkk5b" 
event={"ID":"20a034bb-c3d2-4d05-92de-ed16d2eda707","Type":"ContainerStarted","Data":"bea30e5355b35bdd0121f6ce77e1643b519bc7019b27aca535594c458e71b011"} Apr 24 21:28:31.069875 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.069834 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-c56pf" event={"ID":"af45c5fb-e377-44ea-ad83-ad7e5bea725b","Type":"ContainerStarted","Data":"87914fdddcc92adc343049517a3614279f2ed09af328b76b235b02e8616a0f98"} Apr 24 21:28:31.085495 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.085432 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9fb6f6b58-xtm46" podStartSLOduration=2.274439046 podStartE2EDuration="5.085412214s" podCreationTimestamp="2026-04-24 21:28:26 +0000 UTC" firstStartedPulling="2026-04-24 21:28:27.30519888 +0000 UTC m=+64.133068992" lastFinishedPulling="2026-04-24 21:28:30.116172049 +0000 UTC m=+66.944042160" observedRunningTime="2026-04-24 21:28:31.084546094 +0000 UTC m=+67.912416242" watchObservedRunningTime="2026-04-24 21:28:31.085412214 +0000 UTC m=+67.913282347" Apr 24 21:28:31.354952 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.354912 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-jq98m"] Apr 24 21:28:31.520556 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:28:31.520516 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc27bccf5_72a8_41fb_9c10_af53b192d121.slice/crio-b8ea8c73a04614afa64ef5ba61e066c59b0d3662dcb557f68f5e806fd217dc61 WatchSource:0}: Error finding container b8ea8c73a04614afa64ef5ba61e066c59b0d3662dcb557f68f5e806fd217dc61: Status 404 returned error can't find the container with id b8ea8c73a04614afa64ef5ba61e066c59b0d3662dcb557f68f5e806fd217dc61 Apr 24 21:28:31.571244 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.570829 2578 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:28:31.603978 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.603088 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:28:31.603978 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.603296 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.607570 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.606295 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 21:28:31.607570 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.606548 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 21:28:31.607570 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.606742 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-gs4tm\"" Apr 24 21:28:31.607570 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.606941 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 21:28:31.607570 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.607176 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 21:28:31.607570 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.607294 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 21:28:31.607570 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.607497 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 21:28:31.608027 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.607615 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 21:28:31.608027 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.607732 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 21:28:31.608027 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.607800 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 21:28:31.608027 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.607856 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 21:28:31.610372 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.608964 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 21:28:31.610519 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.610509 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 21:28:31.610875 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.610513 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 21:28:31.610875 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.610796 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5ir6j5tecvjda\"" Apr 24 21:28:31.642923 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.642558 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.642923 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.642592 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75481aac-d973-499e-8800-c6a68ac5be43-config-out\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.642923 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.642611 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.642923 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.642633 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.642923 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.642650 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.642923 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.642668 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75481aac-d973-499e-8800-c6a68ac5be43-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.642923 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.642683 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.642923 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.642701 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.642923 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.642716 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75481aac-d973-499e-8800-c6a68ac5be43-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.642923 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.642735 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.642923 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.642755 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.642923 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.642770 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.642923 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.642788 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn95z\" (UniqueName: \"kubernetes.io/projected/75481aac-d973-499e-8800-c6a68ac5be43-kube-api-access-nn95z\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.642923 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.642807 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.642923 
ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.642826 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-config\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.642923 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.642854 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-web-config\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.643559 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.642884 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.643559 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.642959 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.743734 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.743641 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: 
\"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.743734 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.743692 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.743734 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.743729 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.743916 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.743760 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75481aac-d973-499e-8800-c6a68ac5be43-config-out\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.743916 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.743788 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.743916 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.743820 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.743916 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.743845 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.743916 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.743873 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75481aac-d973-499e-8800-c6a68ac5be43-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.744172 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.743918 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.744172 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.743951 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.744172 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.743976 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75481aac-d973-499e-8800-c6a68ac5be43-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.745050 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.744762 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.745339 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.745313 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.745460 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.745371 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.745460 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.745405 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 24 21:28:31.745460 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.745441 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nn95z\" (UniqueName: \"kubernetes.io/projected/75481aac-d973-499e-8800-c6a68ac5be43-kube-api-access-nn95z\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.745614 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.745475 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.745614 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.745507 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-config\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.745614 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.745544 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-web-config\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.747988 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.745952 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 
21:28:31.749210 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.748880 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.749210 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.749160 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75481aac-d973-499e-8800-c6a68ac5be43-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.750715 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.749947 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.750715 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.749960 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-web-config\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.750715 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.750545 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.752072 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.752046 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.756126 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.756068 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-config\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.758949 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.758847 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75481aac-d973-499e-8800-c6a68ac5be43-config-out\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.759599 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.759556 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.760580 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.760541 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.761103 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.761057 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75481aac-d973-499e-8800-c6a68ac5be43-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.764063 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.764021 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.764838 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.764454 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn95z\" (UniqueName: \"kubernetes.io/projected/75481aac-d973-499e-8800-c6a68ac5be43-kube-api-access-nn95z\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.764838 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.764778 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.767719 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.765156 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: 
\"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.770751 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.770649 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:31.920451 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:31.919967 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:32.077766 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:32.077598 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ebd75cff-1e6f-42c9-825f-055e9c04ab8e","Type":"ContainerStarted","Data":"2279ea933f2d75faebc1f6ab23eed3b49c2a64bd874e5e109e6cd76e2ebf9073"} Apr 24 21:28:32.077766 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:32.077647 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ebd75cff-1e6f-42c9-825f-055e9c04ab8e","Type":"ContainerStarted","Data":"e04bf3bc19c68717255762bf051d0cc1c22662b75f0b0b181a0ba82b31c6e707"} Apr 24 21:28:32.081391 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:32.081339 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jq98m" event={"ID":"c27bccf5-72a8-41fb-9c10-af53b192d121","Type":"ContainerStarted","Data":"b8ea8c73a04614afa64ef5ba61e066c59b0d3662dcb557f68f5e806fd217dc61"} Apr 24 21:28:32.109938 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:32.108972 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:28:33.086142 ip-10-0-139-184 kubenswrapper[2578]: I0424 
21:28:33.086052 2578 generic.go:358] "Generic (PLEG): container finished" podID="75481aac-d973-499e-8800-c6a68ac5be43" containerID="8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b" exitCode=0 Apr 24 21:28:33.086614 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:33.086147 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75481aac-d973-499e-8800-c6a68ac5be43","Type":"ContainerDied","Data":"8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b"} Apr 24 21:28:33.086614 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:33.086187 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75481aac-d973-499e-8800-c6a68ac5be43","Type":"ContainerStarted","Data":"3ae6e01462843d78822f0dcaf0997b1f7d71c190ce513173b8430096b47710bb"} Apr 24 21:28:33.089578 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:33.089462 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ebd75cff-1e6f-42c9-825f-055e9c04ab8e","Type":"ContainerStarted","Data":"b87006557774619b58cd8f29570688698f88f69aa38b90fce3c1bf83457e7b6e"} Apr 24 21:28:33.089578 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:33.089496 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ebd75cff-1e6f-42c9-825f-055e9c04ab8e","Type":"ContainerStarted","Data":"f1a8a9475caff9db2d6d0da98d444ca60c79fd8711bc7615321439b47b5eb5ef"} Apr 24 21:28:33.089578 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:33.089510 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ebd75cff-1e6f-42c9-825f-055e9c04ab8e","Type":"ContainerStarted","Data":"ef143715f35d87c57c1c4a73b7f46eb79e46948a77a971f771167ddfdd9a6ecc"} Apr 24 21:28:37.128615 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:37.128491 2578 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jq98m" Apr 24 21:28:37.135041 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:37.135011 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jq98m" Apr 24 21:28:37.149268 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:37.149216 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jq98m" podStartSLOduration=1.716553523 podStartE2EDuration="7.149195996s" podCreationTimestamp="2026-04-24 21:28:30 +0000 UTC" firstStartedPulling="2026-04-24 21:28:31.545451901 +0000 UTC m=+68.373322042" lastFinishedPulling="2026-04-24 21:28:36.978094401 +0000 UTC m=+73.805964515" observedRunningTime="2026-04-24 21:28:37.147222078 +0000 UTC m=+73.975092193" watchObservedRunningTime="2026-04-24 21:28:37.149195996 +0000 UTC m=+73.977066129" Apr 24 21:28:37.150255 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:37.149997 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:28:37.150255 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:37.150030 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:28:37.159165 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:37.159139 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:28:38.132688 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:38.132627 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-c56pf" event={"ID":"af45c5fb-e377-44ea-ad83-ad7e5bea725b","Type":"ContainerStarted","Data":"9f512b6a05e256808415a71a8cfe101334a10566681659b599ba1f4a63cc3d76"} Apr 24 21:28:38.133204 ip-10-0-139-184 
kubenswrapper[2578]: I0424 21:28:38.132873 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:28:38.135300 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:38.135271 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75481aac-d973-499e-8800-c6a68ac5be43","Type":"ContainerStarted","Data":"e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755"} Apr 24 21:28:38.135440 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:38.135305 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75481aac-d973-499e-8800-c6a68ac5be43","Type":"ContainerStarted","Data":"e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686"} Apr 24 21:28:38.140058 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:38.140022 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ebd75cff-1e6f-42c9-825f-055e9c04ab8e","Type":"ContainerStarted","Data":"61bf95adf0df8a2fc0942b6d83a0d166480c1f7a4ebab6d433fbd1c42475b941"} Apr 24 21:28:38.147507 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:38.147452 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hxdgr" event={"ID":"7a2b19a8-7cce-48ea-a91f-3306187c2d2a","Type":"ContainerStarted","Data":"ecf690a03f6f8d8144299aef6039f8484a8116515df6e5c9b0be07bce60d6cc0"} Apr 24 21:28:38.149404 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:38.149380 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lkk5b" event={"ID":"20a034bb-c3d2-4d05-92de-ed16d2eda707","Type":"ContainerStarted","Data":"1952573c349881cd62d1afdf8d21e90f80d3f29ac2dfa7c6ed5f0bed99dbf06c"} Apr 24 21:28:38.149404 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:38.149408 2578 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-multus/network-metrics-daemon-lkk5b" event={"ID":"20a034bb-c3d2-4d05-92de-ed16d2eda707","Type":"ContainerStarted","Data":"7c8e87f80810a432a0e66f3fe9389f31c1ffa895cfa96d6e474b996d1d88d71d"} Apr 24 21:28:38.151327 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:38.151260 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-c56pf" podStartSLOduration=68.434199219 podStartE2EDuration="1m15.151245582s" podCreationTimestamp="2026-04-24 21:27:23 +0000 UTC" firstStartedPulling="2026-04-24 21:28:30.213227159 +0000 UTC m=+67.041097271" lastFinishedPulling="2026-04-24 21:28:36.93027352 +0000 UTC m=+73.758143634" observedRunningTime="2026-04-24 21:28:38.149588375 +0000 UTC m=+74.977458520" watchObservedRunningTime="2026-04-24 21:28:38.151245582 +0000 UTC m=+74.979115716" Apr 24 21:28:38.153086 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:38.152779 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jq98m" event={"ID":"c27bccf5-72a8-41fb-9c10-af53b192d121","Type":"ContainerStarted","Data":"c00a809477bf875da88d0dee550205d2cbdeadbdf076bfca9500c40e459a9666"} Apr 24 21:28:38.158005 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:38.157965 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:28:38.209361 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:38.209249 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.869301512 podStartE2EDuration="12.209229899s" podCreationTimestamp="2026-04-24 21:28:26 +0000 UTC" firstStartedPulling="2026-04-24 21:28:27.643299875 +0000 UTC m=+64.471169985" lastFinishedPulling="2026-04-24 21:28:36.98322826 +0000 UTC m=+73.811098372" observedRunningTime="2026-04-24 21:28:38.207822371 +0000 UTC m=+75.035692503" 
watchObservedRunningTime="2026-04-24 21:28:38.209229899 +0000 UTC m=+75.037100033" Apr 24 21:28:38.211606 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:38.211114 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hxdgr" podStartSLOduration=66.671757846 podStartE2EDuration="1m13.211097314s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="2026-04-24 21:28:30.438753342 +0000 UTC m=+67.266623451" lastFinishedPulling="2026-04-24 21:28:36.9780928 +0000 UTC m=+73.805962919" observedRunningTime="2026-04-24 21:28:38.167259181 +0000 UTC m=+74.995129347" watchObservedRunningTime="2026-04-24 21:28:38.211097314 +0000 UTC m=+75.038967446" Apr 24 21:28:38.233522 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:38.233465 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-lkk5b" podStartSLOduration=68.698633081 podStartE2EDuration="1m15.23344929s" podCreationTimestamp="2026-04-24 21:27:23 +0000 UTC" firstStartedPulling="2026-04-24 21:28:30.443377752 +0000 UTC m=+67.271247866" lastFinishedPulling="2026-04-24 21:28:36.978193958 +0000 UTC m=+73.806064075" observedRunningTime="2026-04-24 21:28:38.233029342 +0000 UTC m=+75.060899475" watchObservedRunningTime="2026-04-24 21:28:38.23344929 +0000 UTC m=+75.061319449" Apr 24 21:28:39.158524 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:39.158405 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75481aac-d973-499e-8800-c6a68ac5be43","Type":"ContainerStarted","Data":"b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d"} Apr 24 21:28:39.158524 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:39.158455 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"75481aac-d973-499e-8800-c6a68ac5be43","Type":"ContainerStarted","Data":"5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95"} Apr 24 21:28:39.158524 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:39.158473 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75481aac-d973-499e-8800-c6a68ac5be43","Type":"ContainerStarted","Data":"250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d"} Apr 24 21:28:39.158524 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:39.158483 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75481aac-d973-499e-8800-c6a68ac5be43","Type":"ContainerStarted","Data":"a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3"} Apr 24 21:28:39.192300 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:39.192237 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.389982711 podStartE2EDuration="8.192218489s" podCreationTimestamp="2026-04-24 21:28:31 +0000 UTC" firstStartedPulling="2026-04-24 21:28:33.087740964 +0000 UTC m=+69.915611077" lastFinishedPulling="2026-04-24 21:28:38.889976731 +0000 UTC m=+75.717846855" observedRunningTime="2026-04-24 21:28:39.189848482 +0000 UTC m=+76.017718613" watchObservedRunningTime="2026-04-24 21:28:39.192218489 +0000 UTC m=+76.020088622" Apr 24 21:28:41.920869 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:41.920830 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:49.820820 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:28:49.820780 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9fb6f6b58-xtm46"] Apr 24 21:29:09.162078 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:09.162044 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-network-diagnostics/network-check-target-c56pf" Apr 24 21:29:14.847544 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:14.847295 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-9fb6f6b58-xtm46" podUID="48670ceb-a251-45ea-8e95-b12bab903ddd" containerName="console" containerID="cri-o://98f1b0e046123f2cca53ac321d3229d85f95602d0e1ebc299d60edaba67191c8" gracePeriod=15 Apr 24 21:29:15.112128 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.112104 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9fb6f6b58-xtm46_48670ceb-a251-45ea-8e95-b12bab903ddd/console/0.log" Apr 24 21:29:15.112269 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.112176 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:29:15.141973 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.141770 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/48670ceb-a251-45ea-8e95-b12bab903ddd-console-oauth-config\") pod \"48670ceb-a251-45ea-8e95-b12bab903ddd\" (UID: \"48670ceb-a251-45ea-8e95-b12bab903ddd\") " Apr 24 21:29:15.141973 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.141835 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/48670ceb-a251-45ea-8e95-b12bab903ddd-oauth-serving-cert\") pod \"48670ceb-a251-45ea-8e95-b12bab903ddd\" (UID: \"48670ceb-a251-45ea-8e95-b12bab903ddd\") " Apr 24 21:29:15.141973 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.141871 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj52l\" (UniqueName: \"kubernetes.io/projected/48670ceb-a251-45ea-8e95-b12bab903ddd-kube-api-access-xj52l\") pod \"48670ceb-a251-45ea-8e95-b12bab903ddd\" (UID: 
\"48670ceb-a251-45ea-8e95-b12bab903ddd\") " Apr 24 21:29:15.141973 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.141953 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/48670ceb-a251-45ea-8e95-b12bab903ddd-console-serving-cert\") pod \"48670ceb-a251-45ea-8e95-b12bab903ddd\" (UID: \"48670ceb-a251-45ea-8e95-b12bab903ddd\") " Apr 24 21:29:15.142249 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.142002 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48670ceb-a251-45ea-8e95-b12bab903ddd-service-ca\") pod \"48670ceb-a251-45ea-8e95-b12bab903ddd\" (UID: \"48670ceb-a251-45ea-8e95-b12bab903ddd\") " Apr 24 21:29:15.142295 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.142261 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48670ceb-a251-45ea-8e95-b12bab903ddd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "48670ceb-a251-45ea-8e95-b12bab903ddd" (UID: "48670ceb-a251-45ea-8e95-b12bab903ddd"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:15.142446 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.142423 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48670ceb-a251-45ea-8e95-b12bab903ddd-service-ca" (OuterVolumeSpecName: "service-ca") pod "48670ceb-a251-45ea-8e95-b12bab903ddd" (UID: "48670ceb-a251-45ea-8e95-b12bab903ddd"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:15.144483 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.144452 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48670ceb-a251-45ea-8e95-b12bab903ddd-kube-api-access-xj52l" (OuterVolumeSpecName: "kube-api-access-xj52l") pod "48670ceb-a251-45ea-8e95-b12bab903ddd" (UID: "48670ceb-a251-45ea-8e95-b12bab903ddd"). InnerVolumeSpecName "kube-api-access-xj52l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:29:15.144594 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.144486 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48670ceb-a251-45ea-8e95-b12bab903ddd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "48670ceb-a251-45ea-8e95-b12bab903ddd" (UID: "48670ceb-a251-45ea-8e95-b12bab903ddd"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:15.144594 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.144541 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48670ceb-a251-45ea-8e95-b12bab903ddd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "48670ceb-a251-45ea-8e95-b12bab903ddd" (UID: "48670ceb-a251-45ea-8e95-b12bab903ddd"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:15.242777 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.242747 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/48670ceb-a251-45ea-8e95-b12bab903ddd-console-config\") pod \"48670ceb-a251-45ea-8e95-b12bab903ddd\" (UID: \"48670ceb-a251-45ea-8e95-b12bab903ddd\") " Apr 24 21:29:15.242950 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.242868 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/48670ceb-a251-45ea-8e95-b12bab903ddd-console-oauth-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:29:15.242950 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.242879 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/48670ceb-a251-45ea-8e95-b12bab903ddd-oauth-serving-cert\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:29:15.242950 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.242907 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xj52l\" (UniqueName: \"kubernetes.io/projected/48670ceb-a251-45ea-8e95-b12bab903ddd-kube-api-access-xj52l\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:29:15.242950 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.242917 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/48670ceb-a251-45ea-8e95-b12bab903ddd-console-serving-cert\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:29:15.242950 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.242926 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48670ceb-a251-45ea-8e95-b12bab903ddd-service-ca\") on node 
\"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:29:15.243123 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.243054 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48670ceb-a251-45ea-8e95-b12bab903ddd-console-config" (OuterVolumeSpecName: "console-config") pod "48670ceb-a251-45ea-8e95-b12bab903ddd" (UID: "48670ceb-a251-45ea-8e95-b12bab903ddd"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:15.268118 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.268094 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9fb6f6b58-xtm46_48670ceb-a251-45ea-8e95-b12bab903ddd/console/0.log" Apr 24 21:29:15.268282 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.268136 2578 generic.go:358] "Generic (PLEG): container finished" podID="48670ceb-a251-45ea-8e95-b12bab903ddd" containerID="98f1b0e046123f2cca53ac321d3229d85f95602d0e1ebc299d60edaba67191c8" exitCode=2 Apr 24 21:29:15.268282 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.268199 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9fb6f6b58-xtm46" Apr 24 21:29:15.268393 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.268199 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9fb6f6b58-xtm46" event={"ID":"48670ceb-a251-45ea-8e95-b12bab903ddd","Type":"ContainerDied","Data":"98f1b0e046123f2cca53ac321d3229d85f95602d0e1ebc299d60edaba67191c8"} Apr 24 21:29:15.268393 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.268311 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9fb6f6b58-xtm46" event={"ID":"48670ceb-a251-45ea-8e95-b12bab903ddd","Type":"ContainerDied","Data":"be3a4dc69c203e19fe29d11de33a11142ca1be4c9abde967603197d27f878abf"} Apr 24 21:29:15.268393 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.268338 2578 scope.go:117] "RemoveContainer" containerID="98f1b0e046123f2cca53ac321d3229d85f95602d0e1ebc299d60edaba67191c8" Apr 24 21:29:15.279828 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.279805 2578 scope.go:117] "RemoveContainer" containerID="98f1b0e046123f2cca53ac321d3229d85f95602d0e1ebc299d60edaba67191c8" Apr 24 21:29:15.280426 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:29:15.280394 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98f1b0e046123f2cca53ac321d3229d85f95602d0e1ebc299d60edaba67191c8\": container with ID starting with 98f1b0e046123f2cca53ac321d3229d85f95602d0e1ebc299d60edaba67191c8 not found: ID does not exist" containerID="98f1b0e046123f2cca53ac321d3229d85f95602d0e1ebc299d60edaba67191c8" Apr 24 21:29:15.280512 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.280436 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98f1b0e046123f2cca53ac321d3229d85f95602d0e1ebc299d60edaba67191c8"} err="failed to get container status \"98f1b0e046123f2cca53ac321d3229d85f95602d0e1ebc299d60edaba67191c8\": rpc error: code = 
NotFound desc = could not find container \"98f1b0e046123f2cca53ac321d3229d85f95602d0e1ebc299d60edaba67191c8\": container with ID starting with 98f1b0e046123f2cca53ac321d3229d85f95602d0e1ebc299d60edaba67191c8 not found: ID does not exist" Apr 24 21:29:15.309565 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.309540 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9fb6f6b58-xtm46"] Apr 24 21:29:15.314707 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.314684 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-9fb6f6b58-xtm46"] Apr 24 21:29:15.344059 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.344039 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/48670ceb-a251-45ea-8e95-b12bab903ddd-console-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:29:15.763352 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:15.763316 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48670ceb-a251-45ea-8e95-b12bab903ddd" path="/var/lib/kubelet/pods/48670ceb-a251-45ea-8e95-b12bab903ddd/volumes" Apr 24 21:29:31.920787 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:31.920733 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:31.939420 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:31.939395 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:32.330085 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:32.330060 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:45.887670 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:45.887633 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 
21:29:45.888134 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:45.888109 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="alertmanager" containerID="cri-o://e04bf3bc19c68717255762bf051d0cc1c22662b75f0b0b181a0ba82b31c6e707" gracePeriod=120 Apr 24 21:29:45.888232 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:45.888185 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="kube-rbac-proxy-metric" containerID="cri-o://b87006557774619b58cd8f29570688698f88f69aa38b90fce3c1bf83457e7b6e" gracePeriod=120 Apr 24 21:29:45.888440 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:45.888191 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="kube-rbac-proxy-web" containerID="cri-o://ef143715f35d87c57c1c4a73b7f46eb79e46948a77a971f771167ddfdd9a6ecc" gracePeriod=120 Apr 24 21:29:45.888563 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:45.888203 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="kube-rbac-proxy" containerID="cri-o://f1a8a9475caff9db2d6d0da98d444ca60c79fd8711bc7615321439b47b5eb5ef" gracePeriod=120 Apr 24 21:29:45.888563 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:45.888235 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="config-reloader" containerID="cri-o://2279ea933f2d75faebc1f6ab23eed3b49c2a64bd874e5e109e6cd76e2ebf9073" gracePeriod=120 Apr 24 21:29:45.888563 ip-10-0-139-184 kubenswrapper[2578]: I0424 
21:29:45.888254 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="prom-label-proxy" containerID="cri-o://61bf95adf0df8a2fc0942b6d83a0d166480c1f7a4ebab6d433fbd1c42475b941" gracePeriod=120 Apr 24 21:29:46.355470 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:46.355438 2578 generic.go:358] "Generic (PLEG): container finished" podID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerID="61bf95adf0df8a2fc0942b6d83a0d166480c1f7a4ebab6d433fbd1c42475b941" exitCode=0 Apr 24 21:29:46.355470 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:46.355464 2578 generic.go:358] "Generic (PLEG): container finished" podID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerID="b87006557774619b58cd8f29570688698f88f69aa38b90fce3c1bf83457e7b6e" exitCode=0 Apr 24 21:29:46.355470 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:46.355470 2578 generic.go:358] "Generic (PLEG): container finished" podID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerID="f1a8a9475caff9db2d6d0da98d444ca60c79fd8711bc7615321439b47b5eb5ef" exitCode=0 Apr 24 21:29:46.355470 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:46.355476 2578 generic.go:358] "Generic (PLEG): container finished" podID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerID="2279ea933f2d75faebc1f6ab23eed3b49c2a64bd874e5e109e6cd76e2ebf9073" exitCode=0 Apr 24 21:29:46.355470 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:46.355482 2578 generic.go:358] "Generic (PLEG): container finished" podID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerID="e04bf3bc19c68717255762bf051d0cc1c22662b75f0b0b181a0ba82b31c6e707" exitCode=0 Apr 24 21:29:46.355756 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:46.355500 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"ebd75cff-1e6f-42c9-825f-055e9c04ab8e","Type":"ContainerDied","Data":"61bf95adf0df8a2fc0942b6d83a0d166480c1f7a4ebab6d433fbd1c42475b941"} Apr 24 21:29:46.355756 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:46.355532 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ebd75cff-1e6f-42c9-825f-055e9c04ab8e","Type":"ContainerDied","Data":"b87006557774619b58cd8f29570688698f88f69aa38b90fce3c1bf83457e7b6e"} Apr 24 21:29:46.355756 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:46.355542 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ebd75cff-1e6f-42c9-825f-055e9c04ab8e","Type":"ContainerDied","Data":"f1a8a9475caff9db2d6d0da98d444ca60c79fd8711bc7615321439b47b5eb5ef"} Apr 24 21:29:46.355756 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:46.355551 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ebd75cff-1e6f-42c9-825f-055e9c04ab8e","Type":"ContainerDied","Data":"2279ea933f2d75faebc1f6ab23eed3b49c2a64bd874e5e109e6cd76e2ebf9073"} Apr 24 21:29:46.355756 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:46.355560 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ebd75cff-1e6f-42c9-825f-055e9c04ab8e","Type":"ContainerDied","Data":"e04bf3bc19c68717255762bf051d0cc1c22662b75f0b0b181a0ba82b31c6e707"} Apr 24 21:29:47.135246 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.135222 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:47.171763 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.171309 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-config-volume\") pod \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " Apr 24 21:29:47.171763 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.171358 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " Apr 24 21:29:47.171763 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.171387 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-config-out\") pod \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " Apr 24 21:29:47.171763 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.171418 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-web-config\") pod \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " Apr 24 21:29:47.171763 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.171442 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-alertmanager-main-db\") pod \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " Apr 24 
21:29:47.171763 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.171495 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-tls-assets\") pod \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " Apr 24 21:29:47.171763 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.171531 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-secret-alertmanager-kube-rbac-proxy\") pod \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " Apr 24 21:29:47.171763 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.171555 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cxn7\" (UniqueName: \"kubernetes.io/projected/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-kube-api-access-9cxn7\") pod \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " Apr 24 21:29:47.171763 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.171600 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-secret-alertmanager-main-tls\") pod \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " Apr 24 21:29:47.171763 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.171633 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-secret-alertmanager-kube-rbac-proxy-web\") pod \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " Apr 24 21:29:47.171763 
ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.171656 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-alertmanager-trusted-ca-bundle\") pod \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " Apr 24 21:29:47.171763 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.171683 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-metrics-client-ca\") pod \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " Apr 24 21:29:47.171763 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.171705 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-cluster-tls-config\") pod \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\" (UID: \"ebd75cff-1e6f-42c9-825f-055e9c04ab8e\") " Apr 24 21:29:47.174530 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.173590 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "ebd75cff-1e6f-42c9-825f-055e9c04ab8e" (UID: "ebd75cff-1e6f-42c9-825f-055e9c04ab8e"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:29:47.174530 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.173741 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "ebd75cff-1e6f-42c9-825f-055e9c04ab8e" (UID: "ebd75cff-1e6f-42c9-825f-055e9c04ab8e"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:47.175099 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.175017 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "ebd75cff-1e6f-42c9-825f-055e9c04ab8e" (UID: "ebd75cff-1e6f-42c9-825f-055e9c04ab8e"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:47.175204 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.175166 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "ebd75cff-1e6f-42c9-825f-055e9c04ab8e" (UID: "ebd75cff-1e6f-42c9-825f-055e9c04ab8e"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:47.176780 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.176723 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "ebd75cff-1e6f-42c9-825f-055e9c04ab8e" (UID: "ebd75cff-1e6f-42c9-825f-055e9c04ab8e"). 
InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:47.177272 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.177230 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-config-out" (OuterVolumeSpecName: "config-out") pod "ebd75cff-1e6f-42c9-825f-055e9c04ab8e" (UID: "ebd75cff-1e6f-42c9-825f-055e9c04ab8e"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:29:47.177565 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.177541 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "ebd75cff-1e6f-42c9-825f-055e9c04ab8e" (UID: "ebd75cff-1e6f-42c9-825f-055e9c04ab8e"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:47.177736 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.177710 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "ebd75cff-1e6f-42c9-825f-055e9c04ab8e" (UID: "ebd75cff-1e6f-42c9-825f-055e9c04ab8e"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:47.177821 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.177711 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-kube-api-access-9cxn7" (OuterVolumeSpecName: "kube-api-access-9cxn7") pod "ebd75cff-1e6f-42c9-825f-055e9c04ab8e" (UID: "ebd75cff-1e6f-42c9-825f-055e9c04ab8e"). 
InnerVolumeSpecName "kube-api-access-9cxn7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:29:47.177821 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.177783 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "ebd75cff-1e6f-42c9-825f-055e9c04ab8e" (UID: "ebd75cff-1e6f-42c9-825f-055e9c04ab8e"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:29:47.178414 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.178372 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-config-volume" (OuterVolumeSpecName: "config-volume") pod "ebd75cff-1e6f-42c9-825f-055e9c04ab8e" (UID: "ebd75cff-1e6f-42c9-825f-055e9c04ab8e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:47.182951 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.182590 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "ebd75cff-1e6f-42c9-825f-055e9c04ab8e" (UID: "ebd75cff-1e6f-42c9-825f-055e9c04ab8e"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:47.189193 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.189168 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-web-config" (OuterVolumeSpecName: "web-config") pod "ebd75cff-1e6f-42c9-825f-055e9c04ab8e" (UID: "ebd75cff-1e6f-42c9-825f-055e9c04ab8e"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:47.272640 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.272594 2578 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-config-volume\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:29:47.272640 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.272639 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:29:47.272640 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.272649 2578 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-config-out\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:29:47.272640 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.272659 2578 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-web-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:29:47.272930 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.272668 2578 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-alertmanager-main-db\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:29:47.272930 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.272676 2578 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-tls-assets\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:29:47.272930 ip-10-0-139-184 
kubenswrapper[2578]: I0424 21:29:47.272686 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:29:47.272930 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.272695 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9cxn7\" (UniqueName: \"kubernetes.io/projected/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-kube-api-access-9cxn7\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:29:47.272930 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.272703 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-secret-alertmanager-main-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:29:47.272930 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.272712 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:29:47.272930 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.272721 2578 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:29:47.272930 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.272729 2578 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-metrics-client-ca\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 
24 21:29:47.272930 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.272737 2578 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ebd75cff-1e6f-42c9-825f-055e9c04ab8e-cluster-tls-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:29:47.361798 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.361764 2578 generic.go:358] "Generic (PLEG): container finished" podID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerID="ef143715f35d87c57c1c4a73b7f46eb79e46948a77a971f771167ddfdd9a6ecc" exitCode=0 Apr 24 21:29:47.361988 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.361851 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ebd75cff-1e6f-42c9-825f-055e9c04ab8e","Type":"ContainerDied","Data":"ef143715f35d87c57c1c4a73b7f46eb79e46948a77a971f771167ddfdd9a6ecc"} Apr 24 21:29:47.361988 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.361882 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:47.361988 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.361914 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ebd75cff-1e6f-42c9-825f-055e9c04ab8e","Type":"ContainerDied","Data":"727948ca1daa43cdca172c4612dd67174ad2b90fec3c0d7acb03ffd095203fa4"} Apr 24 21:29:47.361988 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.361938 2578 scope.go:117] "RemoveContainer" containerID="61bf95adf0df8a2fc0942b6d83a0d166480c1f7a4ebab6d433fbd1c42475b941" Apr 24 21:29:47.370234 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.370218 2578 scope.go:117] "RemoveContainer" containerID="b87006557774619b58cd8f29570688698f88f69aa38b90fce3c1bf83457e7b6e" Apr 24 21:29:47.376456 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.376441 2578 scope.go:117] "RemoveContainer" containerID="f1a8a9475caff9db2d6d0da98d444ca60c79fd8711bc7615321439b47b5eb5ef" Apr 24 21:29:47.382166 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.382146 2578 scope.go:117] "RemoveContainer" containerID="ef143715f35d87c57c1c4a73b7f46eb79e46948a77a971f771167ddfdd9a6ecc" Apr 24 21:29:47.388668 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.388636 2578 scope.go:117] "RemoveContainer" containerID="2279ea933f2d75faebc1f6ab23eed3b49c2a64bd874e5e109e6cd76e2ebf9073" Apr 24 21:29:47.389200 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.389179 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:29:47.395620 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.395601 2578 scope.go:117] "RemoveContainer" containerID="e04bf3bc19c68717255762bf051d0cc1c22662b75f0b0b181a0ba82b31c6e707" Apr 24 21:29:47.395705 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.395666 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:29:47.401771 
ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.401755 2578 scope.go:117] "RemoveContainer" containerID="7b71c124071bf0f362b1af67cf1935edcc40f62f8d6d28b1e27aa6d063fb44d0" Apr 24 21:29:47.407732 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.407714 2578 scope.go:117] "RemoveContainer" containerID="61bf95adf0df8a2fc0942b6d83a0d166480c1f7a4ebab6d433fbd1c42475b941" Apr 24 21:29:47.407979 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:29:47.407961 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61bf95adf0df8a2fc0942b6d83a0d166480c1f7a4ebab6d433fbd1c42475b941\": container with ID starting with 61bf95adf0df8a2fc0942b6d83a0d166480c1f7a4ebab6d433fbd1c42475b941 not found: ID does not exist" containerID="61bf95adf0df8a2fc0942b6d83a0d166480c1f7a4ebab6d433fbd1c42475b941" Apr 24 21:29:47.408034 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.407989 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61bf95adf0df8a2fc0942b6d83a0d166480c1f7a4ebab6d433fbd1c42475b941"} err="failed to get container status \"61bf95adf0df8a2fc0942b6d83a0d166480c1f7a4ebab6d433fbd1c42475b941\": rpc error: code = NotFound desc = could not find container \"61bf95adf0df8a2fc0942b6d83a0d166480c1f7a4ebab6d433fbd1c42475b941\": container with ID starting with 61bf95adf0df8a2fc0942b6d83a0d166480c1f7a4ebab6d433fbd1c42475b941 not found: ID does not exist" Apr 24 21:29:47.408034 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.408008 2578 scope.go:117] "RemoveContainer" containerID="b87006557774619b58cd8f29570688698f88f69aa38b90fce3c1bf83457e7b6e" Apr 24 21:29:47.408236 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:29:47.408207 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b87006557774619b58cd8f29570688698f88f69aa38b90fce3c1bf83457e7b6e\": container with ID starting with 
b87006557774619b58cd8f29570688698f88f69aa38b90fce3c1bf83457e7b6e not found: ID does not exist" containerID="b87006557774619b58cd8f29570688698f88f69aa38b90fce3c1bf83457e7b6e" Apr 24 21:29:47.408303 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.408235 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b87006557774619b58cd8f29570688698f88f69aa38b90fce3c1bf83457e7b6e"} err="failed to get container status \"b87006557774619b58cd8f29570688698f88f69aa38b90fce3c1bf83457e7b6e\": rpc error: code = NotFound desc = could not find container \"b87006557774619b58cd8f29570688698f88f69aa38b90fce3c1bf83457e7b6e\": container with ID starting with b87006557774619b58cd8f29570688698f88f69aa38b90fce3c1bf83457e7b6e not found: ID does not exist" Apr 24 21:29:47.408303 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.408261 2578 scope.go:117] "RemoveContainer" containerID="f1a8a9475caff9db2d6d0da98d444ca60c79fd8711bc7615321439b47b5eb5ef" Apr 24 21:29:47.408479 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:29:47.408462 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1a8a9475caff9db2d6d0da98d444ca60c79fd8711bc7615321439b47b5eb5ef\": container with ID starting with f1a8a9475caff9db2d6d0da98d444ca60c79fd8711bc7615321439b47b5eb5ef not found: ID does not exist" containerID="f1a8a9475caff9db2d6d0da98d444ca60c79fd8711bc7615321439b47b5eb5ef" Apr 24 21:29:47.408526 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.408483 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1a8a9475caff9db2d6d0da98d444ca60c79fd8711bc7615321439b47b5eb5ef"} err="failed to get container status \"f1a8a9475caff9db2d6d0da98d444ca60c79fd8711bc7615321439b47b5eb5ef\": rpc error: code = NotFound desc = could not find container \"f1a8a9475caff9db2d6d0da98d444ca60c79fd8711bc7615321439b47b5eb5ef\": container with ID starting with 
f1a8a9475caff9db2d6d0da98d444ca60c79fd8711bc7615321439b47b5eb5ef not found: ID does not exist" Apr 24 21:29:47.408526 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.408497 2578 scope.go:117] "RemoveContainer" containerID="ef143715f35d87c57c1c4a73b7f46eb79e46948a77a971f771167ddfdd9a6ecc" Apr 24 21:29:47.408698 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:29:47.408683 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef143715f35d87c57c1c4a73b7f46eb79e46948a77a971f771167ddfdd9a6ecc\": container with ID starting with ef143715f35d87c57c1c4a73b7f46eb79e46948a77a971f771167ddfdd9a6ecc not found: ID does not exist" containerID="ef143715f35d87c57c1c4a73b7f46eb79e46948a77a971f771167ddfdd9a6ecc" Apr 24 21:29:47.408735 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.408702 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef143715f35d87c57c1c4a73b7f46eb79e46948a77a971f771167ddfdd9a6ecc"} err="failed to get container status \"ef143715f35d87c57c1c4a73b7f46eb79e46948a77a971f771167ddfdd9a6ecc\": rpc error: code = NotFound desc = could not find container \"ef143715f35d87c57c1c4a73b7f46eb79e46948a77a971f771167ddfdd9a6ecc\": container with ID starting with ef143715f35d87c57c1c4a73b7f46eb79e46948a77a971f771167ddfdd9a6ecc not found: ID does not exist" Apr 24 21:29:47.408735 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.408713 2578 scope.go:117] "RemoveContainer" containerID="2279ea933f2d75faebc1f6ab23eed3b49c2a64bd874e5e109e6cd76e2ebf9073" Apr 24 21:29:47.408911 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:29:47.408875 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2279ea933f2d75faebc1f6ab23eed3b49c2a64bd874e5e109e6cd76e2ebf9073\": container with ID starting with 2279ea933f2d75faebc1f6ab23eed3b49c2a64bd874e5e109e6cd76e2ebf9073 not found: ID does not exist" 
containerID="2279ea933f2d75faebc1f6ab23eed3b49c2a64bd874e5e109e6cd76e2ebf9073" Apr 24 21:29:47.408980 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.408913 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2279ea933f2d75faebc1f6ab23eed3b49c2a64bd874e5e109e6cd76e2ebf9073"} err="failed to get container status \"2279ea933f2d75faebc1f6ab23eed3b49c2a64bd874e5e109e6cd76e2ebf9073\": rpc error: code = NotFound desc = could not find container \"2279ea933f2d75faebc1f6ab23eed3b49c2a64bd874e5e109e6cd76e2ebf9073\": container with ID starting with 2279ea933f2d75faebc1f6ab23eed3b49c2a64bd874e5e109e6cd76e2ebf9073 not found: ID does not exist" Apr 24 21:29:47.408980 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.408932 2578 scope.go:117] "RemoveContainer" containerID="e04bf3bc19c68717255762bf051d0cc1c22662b75f0b0b181a0ba82b31c6e707" Apr 24 21:29:47.409153 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:29:47.409137 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e04bf3bc19c68717255762bf051d0cc1c22662b75f0b0b181a0ba82b31c6e707\": container with ID starting with e04bf3bc19c68717255762bf051d0cc1c22662b75f0b0b181a0ba82b31c6e707 not found: ID does not exist" containerID="e04bf3bc19c68717255762bf051d0cc1c22662b75f0b0b181a0ba82b31c6e707" Apr 24 21:29:47.409195 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.409158 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04bf3bc19c68717255762bf051d0cc1c22662b75f0b0b181a0ba82b31c6e707"} err="failed to get container status \"e04bf3bc19c68717255762bf051d0cc1c22662b75f0b0b181a0ba82b31c6e707\": rpc error: code = NotFound desc = could not find container \"e04bf3bc19c68717255762bf051d0cc1c22662b75f0b0b181a0ba82b31c6e707\": container with ID starting with e04bf3bc19c68717255762bf051d0cc1c22662b75f0b0b181a0ba82b31c6e707 not found: ID does not exist" Apr 24 
21:29:47.409195 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.409171 2578 scope.go:117] "RemoveContainer" containerID="7b71c124071bf0f362b1af67cf1935edcc40f62f8d6d28b1e27aa6d063fb44d0" Apr 24 21:29:47.409404 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:29:47.409387 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b71c124071bf0f362b1af67cf1935edcc40f62f8d6d28b1e27aa6d063fb44d0\": container with ID starting with 7b71c124071bf0f362b1af67cf1935edcc40f62f8d6d28b1e27aa6d063fb44d0 not found: ID does not exist" containerID="7b71c124071bf0f362b1af67cf1935edcc40f62f8d6d28b1e27aa6d063fb44d0" Apr 24 21:29:47.409442 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.409410 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b71c124071bf0f362b1af67cf1935edcc40f62f8d6d28b1e27aa6d063fb44d0"} err="failed to get container status \"7b71c124071bf0f362b1af67cf1935edcc40f62f8d6d28b1e27aa6d063fb44d0\": rpc error: code = NotFound desc = could not find container \"7b71c124071bf0f362b1af67cf1935edcc40f62f8d6d28b1e27aa6d063fb44d0\": container with ID starting with 7b71c124071bf0f362b1af67cf1935edcc40f62f8d6d28b1e27aa6d063fb44d0 not found: ID does not exist" Apr 24 21:29:47.435651 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.435630 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:29:47.435930 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.435917 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48670ceb-a251-45ea-8e95-b12bab903ddd" containerName="console" Apr 24 21:29:47.435982 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.435932 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="48670ceb-a251-45ea-8e95-b12bab903ddd" containerName="console" Apr 24 21:29:47.435982 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.435943 2578 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="kube-rbac-proxy-metric" Apr 24 21:29:47.435982 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.435950 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="kube-rbac-proxy-metric" Apr 24 21:29:47.435982 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.435961 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="init-config-reloader" Apr 24 21:29:47.435982 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.435967 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="init-config-reloader" Apr 24 21:29:47.435982 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.435974 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="config-reloader" Apr 24 21:29:47.435982 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.435979 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="config-reloader" Apr 24 21:29:47.435982 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.435984 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="prom-label-proxy" Apr 24 21:29:47.436204 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.435990 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="prom-label-proxy" Apr 24 21:29:47.436204 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.435998 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="alertmanager" Apr 24 21:29:47.436204 
ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.436003 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="alertmanager" Apr 24 21:29:47.436204 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.436010 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="kube-rbac-proxy" Apr 24 21:29:47.436204 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.436015 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="kube-rbac-proxy" Apr 24 21:29:47.436204 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.436023 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="kube-rbac-proxy-web" Apr 24 21:29:47.436204 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.436028 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="kube-rbac-proxy-web" Apr 24 21:29:47.436204 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.436067 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="48670ceb-a251-45ea-8e95-b12bab903ddd" containerName="console" Apr 24 21:29:47.436204 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.436075 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="kube-rbac-proxy-web" Apr 24 21:29:47.436204 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.436081 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="kube-rbac-proxy-metric" Apr 24 21:29:47.436204 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.436088 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="config-reloader" Apr 24 
21:29:47.436204 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.436094 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="kube-rbac-proxy" Apr 24 21:29:47.436204 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.436099 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="prom-label-proxy" Apr 24 21:29:47.436204 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.436105 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" containerName="alertmanager" Apr 24 21:29:47.440955 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.440939 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:47.449422 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.449406 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 21:29:47.449562 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.449544 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 21:29:47.449623 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.449576 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 21:29:47.449675 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.449661 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-wjtxz\"" Apr 24 21:29:47.450110 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.450093 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 
21:29:47.450443 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.450385 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 21:29:47.450443 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.450438 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 21:29:47.450591 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.450453 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 21:29:47.450591 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.450463 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 21:29:47.456606 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.456587 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 21:29:47.465141 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.465118 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:29:47.474147 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.474127 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d98a752-54c8-4291-8475-dcb2e5810620-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:47.474248 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.474162 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/4d98a752-54c8-4291-8475-dcb2e5810620-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:47.474248 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.474197 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d98a752-54c8-4291-8475-dcb2e5810620-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:47.474248 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.474227 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4d98a752-54c8-4291-8475-dcb2e5810620-web-config\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:47.474399 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.474307 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4d98a752-54c8-4291-8475-dcb2e5810620-config-volume\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:47.474399 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.474332 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4d98a752-54c8-4291-8475-dcb2e5810620-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:47.474399 ip-10-0-139-184 kubenswrapper[2578]: 
I0424 21:29:47.474371 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4d98a752-54c8-4291-8475-dcb2e5810620-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:47.474521 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.474399 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4d98a752-54c8-4291-8475-dcb2e5810620-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:47.474521 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.474425 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4d98a752-54c8-4291-8475-dcb2e5810620-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:47.474521 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.474488 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhltm\" (UniqueName: \"kubernetes.io/projected/4d98a752-54c8-4291-8475-dcb2e5810620-kube-api-access-xhltm\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:47.474521 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.474514 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4d98a752-54c8-4291-8475-dcb2e5810620-secret-alertmanager-main-tls\") 
pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.474660 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.474544 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4d98a752-54c8-4291-8475-dcb2e5810620-config-out\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.474660 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.474571 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4d98a752-54c8-4291-8475-dcb2e5810620-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.575564 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.575537 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d98a752-54c8-4291-8475-dcb2e5810620-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.575663 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.575569 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4d98a752-54c8-4291-8475-dcb2e5810620-web-config\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.575663 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.575593 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4d98a752-54c8-4291-8475-dcb2e5810620-config-volume\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.575663 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.575610 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4d98a752-54c8-4291-8475-dcb2e5810620-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.575663 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.575632 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4d98a752-54c8-4291-8475-dcb2e5810620-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.575663 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.575648 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4d98a752-54c8-4291-8475-dcb2e5810620-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.575663 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.575662 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4d98a752-54c8-4291-8475-dcb2e5810620-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.576021 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.575682 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhltm\" (UniqueName: \"kubernetes.io/projected/4d98a752-54c8-4291-8475-dcb2e5810620-kube-api-access-xhltm\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.576021 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.575703 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4d98a752-54c8-4291-8475-dcb2e5810620-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.576021 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.575730 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4d98a752-54c8-4291-8475-dcb2e5810620-config-out\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.576021 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.575767 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4d98a752-54c8-4291-8475-dcb2e5810620-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.576021 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.575804 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d98a752-54c8-4291-8475-dcb2e5810620-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.576021 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.575833 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4d98a752-54c8-4291-8475-dcb2e5810620-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.576316 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.576160 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4d98a752-54c8-4291-8475-dcb2e5810620-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.577978 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.577948 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d98a752-54c8-4291-8475-dcb2e5810620-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.578646 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.578537 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4d98a752-54c8-4291-8475-dcb2e5810620-web-config\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.578646 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.578549 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4d98a752-54c8-4291-8475-dcb2e5810620-config-out\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.578793 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.578641 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4d98a752-54c8-4291-8475-dcb2e5810620-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.578855 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.578792 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d98a752-54c8-4291-8475-dcb2e5810620-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.578994 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.578972 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4d98a752-54c8-4291-8475-dcb2e5810620-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.578994 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.578982 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4d98a752-54c8-4291-8475-dcb2e5810620-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.579091 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.578985 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4d98a752-54c8-4291-8475-dcb2e5810620-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.579091 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.579029 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4d98a752-54c8-4291-8475-dcb2e5810620-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.579411 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.579391 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4d98a752-54c8-4291-8475-dcb2e5810620-config-volume\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.580305 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.580290 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4d98a752-54c8-4291-8475-dcb2e5810620-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.589744 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.589727 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhltm\" (UniqueName: \"kubernetes.io/projected/4d98a752-54c8-4291-8475-dcb2e5810620-kube-api-access-xhltm\") pod \"alertmanager-main-0\" (UID: \"4d98a752-54c8-4291-8475-dcb2e5810620\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.749814 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.749728 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:47.762624 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.762598 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebd75cff-1e6f-42c9-825f-055e9c04ab8e" path="/var/lib/kubelet/pods/ebd75cff-1e6f-42c9-825f-055e9c04ab8e/volumes"
Apr 24 21:29:47.881020 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:47.881001 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:29:47.882819 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:29:47.882787 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d98a752_54c8_4291_8475_dcb2e5810620.slice/crio-5c968f7aec4feb7a572825cce9419fb3dd02a31ea5dde63d879f403b018ed40a WatchSource:0}: Error finding container 5c968f7aec4feb7a572825cce9419fb3dd02a31ea5dde63d879f403b018ed40a: Status 404 returned error can't find the container with id 5c968f7aec4feb7a572825cce9419fb3dd02a31ea5dde63d879f403b018ed40a
Apr 24 21:29:48.371053 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:48.371022 2578 generic.go:358] "Generic (PLEG): container finished" podID="4d98a752-54c8-4291-8475-dcb2e5810620" containerID="18fe4c0a40bd1281cf9ba155c9445f6eeaed111df1a213a74b5f1270d3bf988e" exitCode=0
Apr 24 21:29:48.371382 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:48.371085 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4d98a752-54c8-4291-8475-dcb2e5810620","Type":"ContainerDied","Data":"18fe4c0a40bd1281cf9ba155c9445f6eeaed111df1a213a74b5f1270d3bf988e"}
Apr 24 21:29:48.371382 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:48.371105 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4d98a752-54c8-4291-8475-dcb2e5810620","Type":"ContainerStarted","Data":"5c968f7aec4feb7a572825cce9419fb3dd02a31ea5dde63d879f403b018ed40a"}
Apr 24 21:29:49.376806 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:49.376771 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4d98a752-54c8-4291-8475-dcb2e5810620","Type":"ContainerStarted","Data":"924c4b93e66dd4a3e2fdea52fab625678808ba850f5cae28a16b1d5eff517780"}
Apr 24 21:29:49.376806 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:49.376806 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4d98a752-54c8-4291-8475-dcb2e5810620","Type":"ContainerStarted","Data":"bec57753ca57d3412ed6ffc44d68e8d7069aefbe969a856ecf81ed21813f7767"}
Apr 24 21:29:49.376806 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:49.376816 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4d98a752-54c8-4291-8475-dcb2e5810620","Type":"ContainerStarted","Data":"4146281398b011358434b64ea9cb9cc4a47c4a5c0cad3dc5399e0ba5e96a166a"}
Apr 24 21:29:49.377398 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:49.376824 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4d98a752-54c8-4291-8475-dcb2e5810620","Type":"ContainerStarted","Data":"b2a4132874d7a76491dcd3ee404a93800fce1d8925f7c8b4b8a7c86e486a088f"}
Apr 24 21:29:49.377398 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:49.376832 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4d98a752-54c8-4291-8475-dcb2e5810620","Type":"ContainerStarted","Data":"a9fc2f81eb8d0f68c4e65fd17b2af12d51d2e24f0db2e5bd9fa9bd0300860be4"}
Apr 24 21:29:49.377398 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:49.376839 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4d98a752-54c8-4291-8475-dcb2e5810620","Type":"ContainerStarted","Data":"1a387cc37a1d5fef0e00040bb7c770bd7c00de4c3d55b65334542d8673bab4bd"}
Apr 24 21:29:49.408416 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:49.408047 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.408029136 podStartE2EDuration="2.408029136s" podCreationTimestamp="2026-04-24 21:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:29:49.407037349 +0000 UTC m=+146.234907481" watchObservedRunningTime="2026-04-24 21:29:49.408029136 +0000 UTC m=+146.235899269"
Apr 24 21:29:50.134538 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.134501 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 21:29:50.135582 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.135119 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="prometheus" containerID="cri-o://e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755" gracePeriod=600
Apr 24 21:29:50.135582 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.135266 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="kube-rbac-proxy-thanos" containerID="cri-o://b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d" gracePeriod=600
Apr 24 21:29:50.135582 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.135326 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="kube-rbac-proxy" containerID="cri-o://5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95" gracePeriod=600
Apr 24 21:29:50.135582 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.135371 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="thanos-sidecar" containerID="cri-o://a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3" gracePeriod=600
Apr 24 21:29:50.135582 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.135390 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="config-reloader" containerID="cri-o://e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686" gracePeriod=600
Apr 24 21:29:50.135582 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.135474 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="kube-rbac-proxy-web" containerID="cri-o://250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d" gracePeriod=600
Apr 24 21:29:50.383616 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.383587 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:50.384854 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.384794 2578 generic.go:358] "Generic (PLEG): container finished" podID="75481aac-d973-499e-8800-c6a68ac5be43" containerID="b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d" exitCode=0
Apr 24 21:29:50.384854 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.384815 2578 generic.go:358] "Generic (PLEG): container finished" podID="75481aac-d973-499e-8800-c6a68ac5be43" containerID="5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95" exitCode=0
Apr 24 21:29:50.384854 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.384822 2578 generic.go:358] "Generic (PLEG): container finished" podID="75481aac-d973-499e-8800-c6a68ac5be43" containerID="250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d" exitCode=0
Apr 24 21:29:50.384854 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.384827 2578 generic.go:358] "Generic (PLEG): container finished" podID="75481aac-d973-499e-8800-c6a68ac5be43" containerID="a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3" exitCode=0
Apr 24 21:29:50.384854 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.384833 2578 generic.go:358] "Generic (PLEG): container finished" podID="75481aac-d973-499e-8800-c6a68ac5be43" containerID="e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686" exitCode=0
Apr 24 21:29:50.384854 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.384837 2578 generic.go:358] "Generic (PLEG): container finished" podID="75481aac-d973-499e-8800-c6a68ac5be43" containerID="e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755" exitCode=0
Apr 24 21:29:50.385176 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.384864 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75481aac-d973-499e-8800-c6a68ac5be43","Type":"ContainerDied","Data":"b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d"}
Apr 24 21:29:50.385176 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.384912 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75481aac-d973-499e-8800-c6a68ac5be43","Type":"ContainerDied","Data":"5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95"}
Apr 24 21:29:50.385176 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.384925 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75481aac-d973-499e-8800-c6a68ac5be43","Type":"ContainerDied","Data":"250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d"}
Apr 24 21:29:50.385176 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.384937 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75481aac-d973-499e-8800-c6a68ac5be43","Type":"ContainerDied","Data":"a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3"}
Apr 24 21:29:50.385176 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.384946 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75481aac-d973-499e-8800-c6a68ac5be43","Type":"ContainerDied","Data":"e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686"}
Apr 24 21:29:50.385176 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.384957 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75481aac-d973-499e-8800-c6a68ac5be43","Type":"ContainerDied","Data":"e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755"}
Apr 24 21:29:50.385176 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.384967 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75481aac-d973-499e-8800-c6a68ac5be43","Type":"ContainerDied","Data":"3ae6e01462843d78822f0dcaf0997b1f7d71c190ce513173b8430096b47710bb"}
Apr 24 21:29:50.385176 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.384987 2578 scope.go:117] "RemoveContainer" containerID="b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d"
Apr 24 21:29:50.392497 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.392481 2578 scope.go:117] "RemoveContainer" containerID="5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95"
Apr 24 21:29:50.399968 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.399949 2578 scope.go:117] "RemoveContainer" containerID="250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d"
Apr 24 21:29:50.406700 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.406670 2578 scope.go:117] "RemoveContainer" containerID="a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3"
Apr 24 21:29:50.413832 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.413723 2578 scope.go:117] "RemoveContainer" containerID="e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686"
Apr 24 21:29:50.421727 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.421707 2578 scope.go:117] "RemoveContainer" containerID="e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755"
Apr 24 21:29:50.431972 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.431942 2578 scope.go:117] "RemoveContainer" containerID="8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b"
Apr 24 21:29:50.440155 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.440139 2578 scope.go:117] "RemoveContainer" containerID="b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d"
Apr 24 21:29:50.440425 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:29:50.440404 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d\": container with ID starting with b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d not found: ID does not exist" containerID="b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d"
Apr 24 21:29:50.440472 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.440438 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d"} err="failed to get container status \"b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d\": rpc error: code = NotFound desc = could not find container \"b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d\": container with ID starting with b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d not found: ID does not exist"
Apr 24 21:29:50.440472 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.440456 2578 scope.go:117] "RemoveContainer" containerID="5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95"
Apr 24 21:29:50.440713 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:29:50.440698 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95\": container with ID starting with 5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95 not found: ID does not exist" containerID="5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95"
Apr 24 21:29:50.440773 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.440715 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95"} err="failed to get container status \"5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95\": rpc error: code = NotFound desc = could not find container \"5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95\": container with ID starting with 5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95 not found: ID does not exist"
Apr 24 21:29:50.440773 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.440728 2578 scope.go:117] "RemoveContainer" containerID="250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d"
Apr 24 21:29:50.441080 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:29:50.441063 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d\": container with ID starting with 250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d not found: ID does not exist" containerID="250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d"
Apr 24 21:29:50.441148 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.441086 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d"} err="failed to get container status \"250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d\": rpc error: code = NotFound desc = could not find container \"250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d\": container with ID starting with 250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d not found: ID does not exist"
Apr 24 21:29:50.441148 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.441107 2578 scope.go:117] "RemoveContainer" containerID="a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3"
Apr 24 21:29:50.441361 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:29:50.441339 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3\": container with ID starting with a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3 not found: ID does not exist" containerID="a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3"
Apr 24 21:29:50.441409 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.441374 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3"} err="failed to get container status \"a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3\": rpc error: code = NotFound desc = could not find container \"a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3\": container with ID starting with a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3 not found: ID does not exist"
Apr 24 21:29:50.441409 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.441398 2578 scope.go:117] "RemoveContainer" containerID="e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686"
Apr 24 21:29:50.441658 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:29:50.441642 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686\": container with ID starting with e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686 not found: ID does not exist" containerID="e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686"
Apr 24 21:29:50.441705 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.441663 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686"} err="failed to get container status \"e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686\": rpc error: code = NotFound desc = could not find container \"e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686\": container with ID starting with e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686 not found: ID does not exist"
Apr 24 21:29:50.441705 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.441676 2578 scope.go:117] "RemoveContainer" containerID="e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755"
Apr 24 21:29:50.441884 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:29:50.441868 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755\": container with ID starting with e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755 not found: ID does not exist" containerID="e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755"
Apr 24 21:29:50.441955 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.441905 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755"} err="failed to get container status \"e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755\": rpc error: code = NotFound desc = could not find container \"e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755\": container with ID starting with e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755 not found: ID does not exist"
Apr 24 21:29:50.441955 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.441923 2578 scope.go:117] "RemoveContainer" containerID="8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b"
Apr 24 21:29:50.442192 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:29:50.442173 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b\": container with ID starting with 8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b not found: ID does not exist" containerID="8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b"
Apr 24 21:29:50.442273 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.442200 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b"} err="failed to get container status \"8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b\": rpc error: code = NotFound desc = could not find container \"8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b\": container with ID starting with 8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b not found: ID does not exist"
Apr 24 21:29:50.442273 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.442221 2578 scope.go:117] "RemoveContainer" containerID="b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d"
Apr 24 21:29:50.442479 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.442442 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d"} err="failed to get container status \"b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d\": rpc error: code = NotFound desc = could not find container \"b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d\": container with ID starting with b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d not found: ID does not exist"
Apr 24 21:29:50.442535 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.442482 2578 scope.go:117] "RemoveContainer" containerID="5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95"
Apr 24 21:29:50.442734 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.442715 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95"} err="failed to get container status \"5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95\": rpc error: code = NotFound desc = could not find container \"5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95\": container with ID starting with 5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95 not found: ID does not exist"
Apr 24 21:29:50.442788 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.442736 2578 scope.go:117] "RemoveContainer" containerID="250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d"
Apr 24 21:29:50.442980 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.442961 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d"} err="failed to get container status \"250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d\": rpc error: code = NotFound desc = could not find container \"250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d\": container with ID starting with 250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d not found: ID does not exist"
Apr 24 21:29:50.443030 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.442980 2578 scope.go:117] "RemoveContainer" containerID="a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3"
Apr 24 21:29:50.443228 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.443207 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3"} err="failed to get container status \"a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3\": rpc error: code = NotFound desc = could not find container \"a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3\": container with ID starting with a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3 not found: ID does not exist"
Apr 24 21:29:50.443306 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.443230 2578 scope.go:117] "RemoveContainer" containerID="e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686"
Apr 24 21:29:50.443459 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.443436 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686"} err="failed to get container status \"e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686\": rpc error: code = NotFound desc = could not find container \"e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686\": container with ID starting with e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686 not found: ID does not exist"
Apr 24 21:29:50.443539 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.443460 2578 scope.go:117] "RemoveContainer" containerID="e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755"
Apr 24 21:29:50.445322 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.445264 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755"} err="failed to get container status \"e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755\": rpc error: code = NotFound desc = could not find container \"e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755\": container with ID starting with e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755 not found: ID does not exist"
Apr 24 21:29:50.445322 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.445291 2578 scope.go:117] "RemoveContainer" containerID="8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b"
Apr 24 21:29:50.445675 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.445649 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b"} err="failed to get container status \"8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b\": rpc error: code = NotFound desc = could not find container \"8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b\": container with ID starting with 8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b not found: ID does not exist"
Apr 24 21:29:50.445775 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.445676 2578 scope.go:117] "RemoveContainer" containerID="b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d"
Apr 24 21:29:50.445969 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.445939 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d"} err="failed to get container status \"b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d\": rpc error: code = NotFound desc = could not find container \"b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d\": container with ID starting with b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d not found: ID does not exist"
Apr 24 21:29:50.445969 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.445959 2578 scope.go:117] "RemoveContainer" containerID="5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95"
Apr 24 21:29:50.446228 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.446201 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95"} err="failed to get container status \"5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95\": rpc error: code = NotFound desc = could not find container \"5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95\": container with ID starting with
5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95 not found: ID does not exist" Apr 24 21:29:50.446291 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.446230 2578 scope.go:117] "RemoveContainer" containerID="250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d" Apr 24 21:29:50.446480 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.446458 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d"} err="failed to get container status \"250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d\": rpc error: code = NotFound desc = could not find container \"250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d\": container with ID starting with 250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d not found: ID does not exist" Apr 24 21:29:50.446575 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.446489 2578 scope.go:117] "RemoveContainer" containerID="a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3" Apr 24 21:29:50.446743 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.446726 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3"} err="failed to get container status \"a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3\": rpc error: code = NotFound desc = could not find container \"a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3\": container with ID starting with a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3 not found: ID does not exist" Apr 24 21:29:50.446793 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.446744 2578 scope.go:117] "RemoveContainer" containerID="e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686" Apr 24 21:29:50.446982 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.446962 2578 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686"} err="failed to get container status \"e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686\": rpc error: code = NotFound desc = could not find container \"e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686\": container with ID starting with e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686 not found: ID does not exist" Apr 24 21:29:50.446982 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.446983 2578 scope.go:117] "RemoveContainer" containerID="e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755" Apr 24 21:29:50.447240 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.447216 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755"} err="failed to get container status \"e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755\": rpc error: code = NotFound desc = could not find container \"e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755\": container with ID starting with e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755 not found: ID does not exist" Apr 24 21:29:50.447310 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.447242 2578 scope.go:117] "RemoveContainer" containerID="8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b" Apr 24 21:29:50.447520 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.447499 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b"} err="failed to get container status \"8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b\": rpc error: code = NotFound desc = could not find container 
\"8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b\": container with ID starting with 8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b not found: ID does not exist" Apr 24 21:29:50.447597 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.447520 2578 scope.go:117] "RemoveContainer" containerID="b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d" Apr 24 21:29:50.447765 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.447740 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d"} err="failed to get container status \"b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d\": rpc error: code = NotFound desc = could not find container \"b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d\": container with ID starting with b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d not found: ID does not exist" Apr 24 21:29:50.447819 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.447768 2578 scope.go:117] "RemoveContainer" containerID="5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95" Apr 24 21:29:50.448168 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.448136 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95"} err="failed to get container status \"5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95\": rpc error: code = NotFound desc = could not find container \"5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95\": container with ID starting with 5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95 not found: ID does not exist" Apr 24 21:29:50.448168 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.448164 2578 scope.go:117] "RemoveContainer" 
containerID="250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d" Apr 24 21:29:50.448421 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.448402 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d"} err="failed to get container status \"250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d\": rpc error: code = NotFound desc = could not find container \"250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d\": container with ID starting with 250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d not found: ID does not exist" Apr 24 21:29:50.448497 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.448422 2578 scope.go:117] "RemoveContainer" containerID="a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3" Apr 24 21:29:50.448675 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.448658 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3"} err="failed to get container status \"a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3\": rpc error: code = NotFound desc = could not find container \"a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3\": container with ID starting with a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3 not found: ID does not exist" Apr 24 21:29:50.448740 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.448677 2578 scope.go:117] "RemoveContainer" containerID="e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686" Apr 24 21:29:50.448935 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.448912 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686"} err="failed to get container status 
\"e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686\": rpc error: code = NotFound desc = could not find container \"e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686\": container with ID starting with e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686 not found: ID does not exist" Apr 24 21:29:50.449014 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.448938 2578 scope.go:117] "RemoveContainer" containerID="e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755" Apr 24 21:29:50.449185 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.449165 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755"} err="failed to get container status \"e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755\": rpc error: code = NotFound desc = could not find container \"e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755\": container with ID starting with e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755 not found: ID does not exist" Apr 24 21:29:50.449250 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.449187 2578 scope.go:117] "RemoveContainer" containerID="8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b" Apr 24 21:29:50.449399 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.449380 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b"} err="failed to get container status \"8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b\": rpc error: code = NotFound desc = could not find container \"8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b\": container with ID starting with 8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b not found: ID does not exist" Apr 24 21:29:50.449440 ip-10-0-139-184 
kubenswrapper[2578]: I0424 21:29:50.449400 2578 scope.go:117] "RemoveContainer" containerID="b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d" Apr 24 21:29:50.449627 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.449604 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d"} err="failed to get container status \"b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d\": rpc error: code = NotFound desc = could not find container \"b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d\": container with ID starting with b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d not found: ID does not exist" Apr 24 21:29:50.449706 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.449630 2578 scope.go:117] "RemoveContainer" containerID="5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95" Apr 24 21:29:50.449839 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.449819 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95"} err="failed to get container status \"5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95\": rpc error: code = NotFound desc = could not find container \"5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95\": container with ID starting with 5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95 not found: ID does not exist" Apr 24 21:29:50.449880 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.449842 2578 scope.go:117] "RemoveContainer" containerID="250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d" Apr 24 21:29:50.450101 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.450082 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d"} err="failed to get container status \"250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d\": rpc error: code = NotFound desc = could not find container \"250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d\": container with ID starting with 250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d not found: ID does not exist" Apr 24 21:29:50.450101 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.450101 2578 scope.go:117] "RemoveContainer" containerID="a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3" Apr 24 21:29:50.450371 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.450352 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3"} err="failed to get container status \"a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3\": rpc error: code = NotFound desc = could not find container \"a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3\": container with ID starting with a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3 not found: ID does not exist" Apr 24 21:29:50.450371 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.450371 2578 scope.go:117] "RemoveContainer" containerID="e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686" Apr 24 21:29:50.450729 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.450706 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686"} err="failed to get container status \"e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686\": rpc error: code = NotFound desc = could not find container \"e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686\": container with ID starting with 
e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686 not found: ID does not exist" Apr 24 21:29:50.450729 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.450729 2578 scope.go:117] "RemoveContainer" containerID="e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755" Apr 24 21:29:50.451024 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.450999 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755"} err="failed to get container status \"e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755\": rpc error: code = NotFound desc = could not find container \"e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755\": container with ID starting with e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755 not found: ID does not exist" Apr 24 21:29:50.451174 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.451026 2578 scope.go:117] "RemoveContainer" containerID="8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b" Apr 24 21:29:50.451342 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.451272 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b"} err="failed to get container status \"8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b\": rpc error: code = NotFound desc = could not find container \"8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b\": container with ID starting with 8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b not found: ID does not exist" Apr 24 21:29:50.451342 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.451297 2578 scope.go:117] "RemoveContainer" containerID="b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d" Apr 24 21:29:50.453203 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.451529 2578 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d"} err="failed to get container status \"b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d\": rpc error: code = NotFound desc = could not find container \"b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d\": container with ID starting with b5b7d07add5fd8ce726e0e4b1ce5c933322bc5876c0242d3102aecbe1a62cf2d not found: ID does not exist" Apr 24 21:29:50.453203 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.451551 2578 scope.go:117] "RemoveContainer" containerID="5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95" Apr 24 21:29:50.453203 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.451784 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95"} err="failed to get container status \"5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95\": rpc error: code = NotFound desc = could not find container \"5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95\": container with ID starting with 5901e7861279bceb9dc9fb0fdf78dc61874b0b7db4577e1f74e487c381e68f95 not found: ID does not exist" Apr 24 21:29:50.453203 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.451804 2578 scope.go:117] "RemoveContainer" containerID="250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d" Apr 24 21:29:50.453203 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.452146 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d"} err="failed to get container status \"250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d\": rpc error: code = NotFound desc = could not find container 
\"250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d\": container with ID starting with 250ed2bc3af05a14093ad943dbfa9a7296f164d453e79c72a7e5d0bab1b9a24d not found: ID does not exist" Apr 24 21:29:50.453203 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.452165 2578 scope.go:117] "RemoveContainer" containerID="a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3" Apr 24 21:29:50.453203 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.452357 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3"} err="failed to get container status \"a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3\": rpc error: code = NotFound desc = could not find container \"a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3\": container with ID starting with a1790e931c837e804984548295487d601bc923a0c0e6deca45270148b84485f3 not found: ID does not exist" Apr 24 21:29:50.453203 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.452375 2578 scope.go:117] "RemoveContainer" containerID="e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686" Apr 24 21:29:50.453203 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.452557 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686"} err="failed to get container status \"e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686\": rpc error: code = NotFound desc = could not find container \"e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686\": container with ID starting with e0f0c3ccec3857afb20e833da4735aa35e5a471b70ff5243a2394eee64564686 not found: ID does not exist" Apr 24 21:29:50.453203 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.452575 2578 scope.go:117] "RemoveContainer" 
containerID="e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755" Apr 24 21:29:50.453203 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.452794 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755"} err="failed to get container status \"e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755\": rpc error: code = NotFound desc = could not find container \"e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755\": container with ID starting with e6afc3b073d05ceec174c8209f28537883ca2ce9df13e7e5307d6f38cd769755 not found: ID does not exist" Apr 24 21:29:50.453203 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.452815 2578 scope.go:117] "RemoveContainer" containerID="8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b" Apr 24 21:29:50.453203 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.453032 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b"} err="failed to get container status \"8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b\": rpc error: code = NotFound desc = could not find container \"8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b\": container with ID starting with 8fc3b0e37c91868f99b412e3978f88c2ae9f64267c552e7f8b8bc561ca1a267b not found: ID does not exist" Apr 24 21:29:50.500529 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.500503 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-prometheus-trusted-ca-bundle\") pod \"75481aac-d973-499e-8800-c6a68ac5be43\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " Apr 24 21:29:50.500667 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.500545 2578 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-prometheus-k8s-tls\") pod \"75481aac-d973-499e-8800-c6a68ac5be43\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " Apr 24 21:29:50.500667 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.500563 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-config\") pod \"75481aac-d973-499e-8800-c6a68ac5be43\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " Apr 24 21:29:50.500667 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.500592 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-kube-rbac-proxy\") pod \"75481aac-d973-499e-8800-c6a68ac5be43\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " Apr 24 21:29:50.500667 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.500644 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-metrics-client-certs\") pod \"75481aac-d973-499e-8800-c6a68ac5be43\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " Apr 24 21:29:50.500853 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.500682 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75481aac-d973-499e-8800-c6a68ac5be43-tls-assets\") pod \"75481aac-d973-499e-8800-c6a68ac5be43\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " Apr 24 21:29:50.500853 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.500709 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" 
(UniqueName: \"kubernetes.io/empty-dir/75481aac-d973-499e-8800-c6a68ac5be43-prometheus-k8s-db\") pod \"75481aac-d973-499e-8800-c6a68ac5be43\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " Apr 24 21:29:50.500853 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.500734 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"75481aac-d973-499e-8800-c6a68ac5be43\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " Apr 24 21:29:50.500853 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.500773 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75481aac-d973-499e-8800-c6a68ac5be43-config-out\") pod \"75481aac-d973-499e-8800-c6a68ac5be43\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " Apr 24 21:29:50.500853 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.500796 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-prometheus-k8s-rulefiles-0\") pod \"75481aac-d973-499e-8800-c6a68ac5be43\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " Apr 24 21:29:50.500853 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.500833 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"75481aac-d973-499e-8800-c6a68ac5be43\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") " Apr 24 21:29:50.501172 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.500862 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-configmap-metrics-client-ca\") pod \"75481aac-d973-499e-8800-c6a68ac5be43\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") "
Apr 24 21:29:50.501172 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.500909 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-configmap-kubelet-serving-ca-bundle\") pod \"75481aac-d973-499e-8800-c6a68ac5be43\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") "
Apr 24 21:29:50.501172 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.500937 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-web-config\") pod \"75481aac-d973-499e-8800-c6a68ac5be43\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") "
Apr 24 21:29:50.501172 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.500967 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-configmap-serving-certs-ca-bundle\") pod \"75481aac-d973-499e-8800-c6a68ac5be43\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") "
Apr 24 21:29:50.501172 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.500993 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn95z\" (UniqueName: \"kubernetes.io/projected/75481aac-d973-499e-8800-c6a68ac5be43-kube-api-access-nn95z\") pod \"75481aac-d973-499e-8800-c6a68ac5be43\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") "
Apr 24 21:29:50.501172 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.501016 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "75481aac-d973-499e-8800-c6a68ac5be43" (UID: "75481aac-d973-499e-8800-c6a68ac5be43"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:29:50.501172 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.501046 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-thanos-prometheus-http-client-file\") pod \"75481aac-d973-499e-8800-c6a68ac5be43\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") "
Apr 24 21:29:50.501172 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.501078 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-grpc-tls\") pod \"75481aac-d973-499e-8800-c6a68ac5be43\" (UID: \"75481aac-d973-499e-8800-c6a68ac5be43\") "
Apr 24 21:29:50.501533 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.501295 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "75481aac-d973-499e-8800-c6a68ac5be43" (UID: "75481aac-d973-499e-8800-c6a68ac5be43"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:29:50.501533 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.501449 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-prometheus-trusted-ca-bundle\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:29:50.501533 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.501466 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-configmap-metrics-client-ca\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:29:50.502056 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.502027 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "75481aac-d973-499e-8800-c6a68ac5be43" (UID: "75481aac-d973-499e-8800-c6a68ac5be43"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:29:50.502385 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.502357 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "75481aac-d973-499e-8800-c6a68ac5be43" (UID: "75481aac-d973-499e-8800-c6a68ac5be43"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:29:50.502478 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.502459 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75481aac-d973-499e-8800-c6a68ac5be43-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "75481aac-d973-499e-8800-c6a68ac5be43" (UID: "75481aac-d973-499e-8800-c6a68ac5be43"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:29:50.503664 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.503305 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "75481aac-d973-499e-8800-c6a68ac5be43" (UID: "75481aac-d973-499e-8800-c6a68ac5be43"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:29:50.503664 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.503540 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "75481aac-d973-499e-8800-c6a68ac5be43" (UID: "75481aac-d973-499e-8800-c6a68ac5be43"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:29:50.504725 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.504697 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75481aac-d973-499e-8800-c6a68ac5be43-config-out" (OuterVolumeSpecName: "config-out") pod "75481aac-d973-499e-8800-c6a68ac5be43" (UID: "75481aac-d973-499e-8800-c6a68ac5be43"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:29:50.504991 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.504958 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "75481aac-d973-499e-8800-c6a68ac5be43" (UID: "75481aac-d973-499e-8800-c6a68ac5be43"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:29:50.505141 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.505121 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-config" (OuterVolumeSpecName: "config") pod "75481aac-d973-499e-8800-c6a68ac5be43" (UID: "75481aac-d973-499e-8800-c6a68ac5be43"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:29:50.505432 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.505388 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "75481aac-d973-499e-8800-c6a68ac5be43" (UID: "75481aac-d973-499e-8800-c6a68ac5be43"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:29:50.505681 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.505653 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "75481aac-d973-499e-8800-c6a68ac5be43" (UID: "75481aac-d973-499e-8800-c6a68ac5be43"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:29:50.505820 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.505796 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75481aac-d973-499e-8800-c6a68ac5be43-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "75481aac-d973-499e-8800-c6a68ac5be43" (UID: "75481aac-d973-499e-8800-c6a68ac5be43"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:29:50.505971 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.505913 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "75481aac-d973-499e-8800-c6a68ac5be43" (UID: "75481aac-d973-499e-8800-c6a68ac5be43"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:29:50.506063 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.505985 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "75481aac-d973-499e-8800-c6a68ac5be43" (UID: "75481aac-d973-499e-8800-c6a68ac5be43"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:29:50.506162 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.506137 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75481aac-d973-499e-8800-c6a68ac5be43-kube-api-access-nn95z" (OuterVolumeSpecName: "kube-api-access-nn95z") pod "75481aac-d973-499e-8800-c6a68ac5be43" (UID: "75481aac-d973-499e-8800-c6a68ac5be43"). InnerVolumeSpecName "kube-api-access-nn95z". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:29:50.506936 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.506915 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "75481aac-d973-499e-8800-c6a68ac5be43" (UID: "75481aac-d973-499e-8800-c6a68ac5be43"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:29:50.514239 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.514219 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-web-config" (OuterVolumeSpecName: "web-config") pod "75481aac-d973-499e-8800-c6a68ac5be43" (UID: "75481aac-d973-499e-8800-c6a68ac5be43"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:29:50.602075 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.602042 2578 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-metrics-client-certs\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:29:50.602075 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.602071 2578 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75481aac-d973-499e-8800-c6a68ac5be43-tls-assets\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:29:50.602075 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.602081 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75481aac-d973-499e-8800-c6a68ac5be43-prometheus-k8s-db\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:29:50.602293 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.602091 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:29:50.602293 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.602101 2578 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75481aac-d973-499e-8800-c6a68ac5be43-config-out\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:29:50.602293 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.602110 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:29:50.602293 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.602118 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:29:50.602293 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.602127 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:29:50.602293 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.602136 2578 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-web-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:29:50.602293 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.602146 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75481aac-d973-499e-8800-c6a68ac5be43-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:29:50.602293 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.602156 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nn95z\" (UniqueName: \"kubernetes.io/projected/75481aac-d973-499e-8800-c6a68ac5be43-kube-api-access-nn95z\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:29:50.602293 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.602166 2578 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-thanos-prometheus-http-client-file\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:29:50.602293 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.602174 2578 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-grpc-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:29:50.602293 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.602183 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-prometheus-k8s-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:29:50.602293 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.602193 2578 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:29:50.602293 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:50.602202 2578 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75481aac-d973-499e-8800-c6a68ac5be43-secret-kube-rbac-proxy\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:29:51.388733 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.388701 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.414284 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.414221 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 21:29:51.419151 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.419125 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 21:29:51.446631 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.446611 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 21:29:51.446866 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.446854 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="config-reloader"
Apr 24 21:29:51.446924 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.446868 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="config-reloader"
Apr 24 21:29:51.446924 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.446880 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="prometheus"
Apr 24 21:29:51.446924 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.446900 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="prometheus"
Apr 24 21:29:51.446924 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.446909 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="kube-rbac-proxy-thanos"
Apr 24 21:29:51.446924 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.446915 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="kube-rbac-proxy-thanos"
Apr 24 21:29:51.446924 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.446925 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="init-config-reloader"
Apr 24 21:29:51.447098 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.446931 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="init-config-reloader"
Apr 24 21:29:51.447098 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.446937 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="kube-rbac-proxy"
Apr 24 21:29:51.447098 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.446943 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="kube-rbac-proxy"
Apr 24 21:29:51.447098 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.446951 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="thanos-sidecar"
Apr 24 21:29:51.447098 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.446956 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="thanos-sidecar"
Apr 24 21:29:51.447098 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.446963 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="kube-rbac-proxy-web"
Apr 24 21:29:51.447098 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.446968 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="kube-rbac-proxy-web"
Apr 24 21:29:51.447098 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.447019 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="kube-rbac-proxy-thanos"
Apr 24 21:29:51.447098 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.447026 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="config-reloader"
Apr 24 21:29:51.447098 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.447032 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="kube-rbac-proxy-web"
Apr 24 21:29:51.447098 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.447039 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="thanos-sidecar"
Apr 24 21:29:51.447098 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.447046 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="prometheus"
Apr 24 21:29:51.447098 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.447052 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="75481aac-d973-499e-8800-c6a68ac5be43" containerName="kube-rbac-proxy"
Apr 24 21:29:51.452533 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.452502 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.455423 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.455394 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 24 21:29:51.455423 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.455408 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 24 21:29:51.455571 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.455504 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 24 21:29:51.455571 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.455564 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 24 21:29:51.455660 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.455620 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 24 21:29:51.455660 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.455648 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5ir6j5tecvjda\""
Apr 24 21:29:51.455735 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.455662 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 24 21:29:51.455845 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.455825 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 24 21:29:51.456050 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.456033 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 24 21:29:51.456050 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.456050 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-gs4tm\""
Apr 24 21:29:51.456223 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.456210 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 24 21:29:51.456303 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.456288 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 24 21:29:51.456858 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.456831 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 24 21:29:51.458599 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.458582 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 24 21:29:51.461725 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.461709 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 24 21:29:51.466942 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.466923 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 21:29:51.508005 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.507983 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.508098 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.508016 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/56428658-1c56-461f-912d-7d5323992858-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.508098 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.508036 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.508098 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.508052 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56428658-1c56-461f-912d-7d5323992858-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.508098 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.508072 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-web-config\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.508098 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.508089 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/56428658-1c56-461f-912d-7d5323992858-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.508333 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.508105 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.508333 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.508123 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56428658-1c56-461f-912d-7d5323992858-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.508333 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.508140 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj8mx\" (UniqueName: \"kubernetes.io/projected/56428658-1c56-461f-912d-7d5323992858-kube-api-access-tj8mx\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.508333 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.508220 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/56428658-1c56-461f-912d-7d5323992858-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.508333 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.508253 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/56428658-1c56-461f-912d-7d5323992858-config-out\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.508333 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.508281 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56428658-1c56-461f-912d-7d5323992858-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.508333 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.508323 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.508574 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.508353 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.508574 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.508381 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/56428658-1c56-461f-912d-7d5323992858-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.508574 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.508397 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-config\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.508574 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.508412 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.508574 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.508428 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.609548 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.609520 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/56428658-1c56-461f-912d-7d5323992858-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.609640 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.609557 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.609640 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.609585 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56428658-1c56-461f-912d-7d5323992858-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.609738 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.609634 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tj8mx\" (UniqueName: \"kubernetes.io/projected/56428658-1c56-461f-912d-7d5323992858-kube-api-access-tj8mx\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.609738 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.609681 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/56428658-1c56-461f-912d-7d5323992858-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.609828 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.609803 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/56428658-1c56-461f-912d-7d5323992858-config-out\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.609884 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.609834 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56428658-1c56-461f-912d-7d5323992858-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.609884 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.609873 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.610052 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.609926 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.610052 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.609953 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/56428658-1c56-461f-912d-7d5323992858-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.610052 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.609988 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/56428658-1c56-461f-912d-7d5323992858-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.610495 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.610441 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56428658-1c56-461f-912d-7d5323992858-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.610611 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.610518 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/56428658-1c56-461f-912d-7d5323992858-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.610679 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.610648 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-config\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.610728 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.610681 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:51.610728 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.610709 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") "
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:51.610831 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.610754 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:51.610831 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.610801 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/56428658-1c56-461f-912d-7d5323992858-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:51.610831 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.610820 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56428658-1c56-461f-912d-7d5323992858-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:51.611004 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.610835 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:51.611004 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.610868 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/56428658-1c56-461f-912d-7d5323992858-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:51.611004 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.610944 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-web-config\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:51.613386 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.612743 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56428658-1c56-461f-912d-7d5323992858-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:51.613386 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.612879 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:51.613386 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.613192 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:51.613743 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.613700 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" 
(UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:51.613983 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.613936 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/56428658-1c56-461f-912d-7d5323992858-config-out\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:51.614335 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.614310 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-web-config\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:51.614454 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.614388 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/56428658-1c56-461f-912d-7d5323992858-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:51.614586 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.614555 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:51.614586 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.614572 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/56428658-1c56-461f-912d-7d5323992858-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:51.615428 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.615405 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:51.616320 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.616293 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:51.616408 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.616293 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:51.616861 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.616839 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/56428658-1c56-461f-912d-7d5323992858-config\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:51.619021 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.619001 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-tj8mx\" (UniqueName: \"kubernetes.io/projected/56428658-1c56-461f-912d-7d5323992858-kube-api-access-tj8mx\") pod \"prometheus-k8s-0\" (UID: \"56428658-1c56-461f-912d-7d5323992858\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:51.763167 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.763142 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75481aac-d973-499e-8800-c6a68ac5be43" path="/var/lib/kubelet/pods/75481aac-d973-499e-8800-c6a68ac5be43/volumes" Apr 24 21:29:51.763780 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.763763 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:51.890213 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:51.890185 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:29:51.893482 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:29:51.893457 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56428658_1c56_461f_912d_7d5323992858.slice/crio-45adc34cbf6f3c77fa02ac58a18861a888e3e28d3fd41eb26da301cc0f8716ec WatchSource:0}: Error finding container 45adc34cbf6f3c77fa02ac58a18861a888e3e28d3fd41eb26da301cc0f8716ec: Status 404 returned error can't find the container with id 45adc34cbf6f3c77fa02ac58a18861a888e3e28d3fd41eb26da301cc0f8716ec Apr 24 21:29:52.392754 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:52.392718 2578 generic.go:358] "Generic (PLEG): container finished" podID="56428658-1c56-461f-912d-7d5323992858" containerID="524b921a100c868bbe9fd11599794d2c227dd3493b7b7316d41d819d7229ee22" exitCode=0 Apr 24 21:29:52.393137 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:52.392803 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"56428658-1c56-461f-912d-7d5323992858","Type":"ContainerDied","Data":"524b921a100c868bbe9fd11599794d2c227dd3493b7b7316d41d819d7229ee22"} Apr 24 21:29:52.393137 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:52.392838 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56428658-1c56-461f-912d-7d5323992858","Type":"ContainerStarted","Data":"45adc34cbf6f3c77fa02ac58a18861a888e3e28d3fd41eb26da301cc0f8716ec"} Apr 24 21:29:53.398723 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:53.398691 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56428658-1c56-461f-912d-7d5323992858","Type":"ContainerStarted","Data":"7cb24d623775d0ea4e82080895678749b74a4f2bda8e17ab3ecdfd1b6728c1bf"} Apr 24 21:29:53.398723 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:53.398725 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56428658-1c56-461f-912d-7d5323992858","Type":"ContainerStarted","Data":"ef9b10ff0f5b07e4b6cf9f5559b9eb0d2b956a8e5eb179cf156bc74a08757099"} Apr 24 21:29:53.399134 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:53.398736 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56428658-1c56-461f-912d-7d5323992858","Type":"ContainerStarted","Data":"9b3d78117ba3a356511f8f9c5a8b4413f954d76d6a7f918cfe7b988cec81b358"} Apr 24 21:29:53.399134 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:53.398746 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56428658-1c56-461f-912d-7d5323992858","Type":"ContainerStarted","Data":"4df51c033169a9d4c82292aba1ff11e4258f5964aea2b432b20a271c037c6faa"} Apr 24 21:29:53.399134 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:53.398754 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"56428658-1c56-461f-912d-7d5323992858","Type":"ContainerStarted","Data":"ddff1d6f26be7a3bd8dd68ea39ddba4beef804b595d2dfb35fedbfb8457d851f"} Apr 24 21:29:53.399134 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:53.398762 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56428658-1c56-461f-912d-7d5323992858","Type":"ContainerStarted","Data":"1cbaac2228cf85a410e73752c5dd5429ca20c933934a98014912fd3de70c42bd"} Apr 24 21:29:53.432452 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:53.432403 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.432383912 podStartE2EDuration="2.432383912s" podCreationTimestamp="2026-04-24 21:29:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:29:53.43181011 +0000 UTC m=+150.259680252" watchObservedRunningTime="2026-04-24 21:29:53.432383912 +0000 UTC m=+150.260254044" Apr 24 21:29:56.764135 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:29:56.764101 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:51.764582 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:30:51.764544 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:51.779680 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:30:51.779653 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:52.575294 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:30:52.575261 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:23.630993 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:32:23.630965 2578 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krb9p_cde5bc87-530f-4ee7-8f38-39b875bbd4e6/ovn-acl-logging/0.log" Apr 24 21:32:23.633182 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:32:23.633155 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krb9p_cde5bc87-530f-4ee7-8f38-39b875bbd4e6/ovn-acl-logging/0.log" Apr 24 21:32:23.637454 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:32:23.637428 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 21:32:24.860832 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:32:24.860795 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-w6nkn"] Apr 24 21:32:24.864015 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:32:24.863989 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-w6nkn" Apr 24 21:32:24.867515 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:32:24.867488 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 21:32:24.867639 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:32:24.867488 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 21:32:24.868381 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:32:24.868361 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-czl9m\"" Apr 24 21:32:24.868484 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:32:24.868365 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 21:32:24.882330 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:32:24.882297 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-w6nkn"] Apr 24 21:32:24.956611 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:32:24.956569 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/6ce76489-f504-4d62-9d64-cc62a81ad768-data\") pod \"seaweedfs-86cc847c5c-w6nkn\" (UID: \"6ce76489-f504-4d62-9d64-cc62a81ad768\") " pod="kserve/seaweedfs-86cc847c5c-w6nkn" Apr 24 21:32:24.956789 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:32:24.956632 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hnwk\" (UniqueName: \"kubernetes.io/projected/6ce76489-f504-4d62-9d64-cc62a81ad768-kube-api-access-6hnwk\") pod \"seaweedfs-86cc847c5c-w6nkn\" (UID: \"6ce76489-f504-4d62-9d64-cc62a81ad768\") " pod="kserve/seaweedfs-86cc847c5c-w6nkn" Apr 24 21:32:25.057493 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:32:25.057457 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hnwk\" (UniqueName: \"kubernetes.io/projected/6ce76489-f504-4d62-9d64-cc62a81ad768-kube-api-access-6hnwk\") pod \"seaweedfs-86cc847c5c-w6nkn\" (UID: \"6ce76489-f504-4d62-9d64-cc62a81ad768\") " pod="kserve/seaweedfs-86cc847c5c-w6nkn" Apr 24 21:32:25.057691 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:32:25.057527 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/6ce76489-f504-4d62-9d64-cc62a81ad768-data\") pod \"seaweedfs-86cc847c5c-w6nkn\" (UID: \"6ce76489-f504-4d62-9d64-cc62a81ad768\") " pod="kserve/seaweedfs-86cc847c5c-w6nkn" Apr 24 21:32:25.057939 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:32:25.057915 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/6ce76489-f504-4d62-9d64-cc62a81ad768-data\") pod \"seaweedfs-86cc847c5c-w6nkn\" (UID: \"6ce76489-f504-4d62-9d64-cc62a81ad768\") " pod="kserve/seaweedfs-86cc847c5c-w6nkn" Apr 24 21:32:25.067216 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:32:25.067184 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hnwk\" (UniqueName: \"kubernetes.io/projected/6ce76489-f504-4d62-9d64-cc62a81ad768-kube-api-access-6hnwk\") pod \"seaweedfs-86cc847c5c-w6nkn\" (UID: \"6ce76489-f504-4d62-9d64-cc62a81ad768\") " pod="kserve/seaweedfs-86cc847c5c-w6nkn" Apr 24 21:32:25.173424 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:32:25.173391 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-w6nkn" Apr 24 21:32:25.305917 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:32:25.305774 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-w6nkn"] Apr 24 21:32:25.308667 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:32:25.308637 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ce76489_f504_4d62_9d64_cc62a81ad768.slice/crio-a6ee684341f3befba167a10cbc132baebe24db613dd05eb6108bbcb7eb286588 WatchSource:0}: Error finding container a6ee684341f3befba167a10cbc132baebe24db613dd05eb6108bbcb7eb286588: Status 404 returned error can't find the container with id a6ee684341f3befba167a10cbc132baebe24db613dd05eb6108bbcb7eb286588 Apr 24 21:32:25.315166 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:32:25.310288 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:32:25.814873 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:32:25.814833 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-w6nkn" event={"ID":"6ce76489-f504-4d62-9d64-cc62a81ad768","Type":"ContainerStarted","Data":"a6ee684341f3befba167a10cbc132baebe24db613dd05eb6108bbcb7eb286588"} Apr 24 21:32:28.826125 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:32:28.826089 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-w6nkn" 
event={"ID":"6ce76489-f504-4d62-9d64-cc62a81ad768","Type":"ContainerStarted","Data":"3889715c10c7d1761841f411ac6ce33d77685bc664aa37217e7e44d5bd7ac13d"} Apr 24 21:32:28.826532 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:32:28.826157 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-w6nkn" Apr 24 21:32:28.844019 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:32:28.843961 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-w6nkn" podStartSLOduration=2.125370582 podStartE2EDuration="4.843944264s" podCreationTimestamp="2026-04-24 21:32:24 +0000 UTC" firstStartedPulling="2026-04-24 21:32:25.310486524 +0000 UTC m=+302.138356649" lastFinishedPulling="2026-04-24 21:32:28.029060212 +0000 UTC m=+304.856930331" observedRunningTime="2026-04-24 21:32:28.843587624 +0000 UTC m=+305.671457756" watchObservedRunningTime="2026-04-24 21:32:28.843944264 +0000 UTC m=+305.671814396" Apr 24 21:32:34.831080 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:32:34.831049 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-w6nkn" Apr 24 21:33:01.746817 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:01.746776 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-84b6647887-v847j"] Apr 24 21:33:01.749170 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:01.749147 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84b6647887-v847j" Apr 24 21:33:01.752681 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:01.752662 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-wtklc\"" Apr 24 21:33:01.752791 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:01.752687 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 24 21:33:01.758171 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:01.758142 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-v847j"] Apr 24 21:33:01.862645 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:01.862610 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/866f00a9-a524-4cfb-a324-423d668078f9-cert\") pod \"kserve-controller-manager-84b6647887-v847j\" (UID: \"866f00a9-a524-4cfb-a324-423d668078f9\") " pod="kserve/kserve-controller-manager-84b6647887-v847j" Apr 24 21:33:01.862645 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:01.862652 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4zhh\" (UniqueName: \"kubernetes.io/projected/866f00a9-a524-4cfb-a324-423d668078f9-kube-api-access-l4zhh\") pod \"kserve-controller-manager-84b6647887-v847j\" (UID: \"866f00a9-a524-4cfb-a324-423d668078f9\") " pod="kserve/kserve-controller-manager-84b6647887-v847j" Apr 24 21:33:01.963772 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:01.963714 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/866f00a9-a524-4cfb-a324-423d668078f9-cert\") pod \"kserve-controller-manager-84b6647887-v847j\" (UID: \"866f00a9-a524-4cfb-a324-423d668078f9\") " 
pod="kserve/kserve-controller-manager-84b6647887-v847j" Apr 24 21:33:01.963772 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:01.963772 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4zhh\" (UniqueName: \"kubernetes.io/projected/866f00a9-a524-4cfb-a324-423d668078f9-kube-api-access-l4zhh\") pod \"kserve-controller-manager-84b6647887-v847j\" (UID: \"866f00a9-a524-4cfb-a324-423d668078f9\") " pod="kserve/kserve-controller-manager-84b6647887-v847j" Apr 24 21:33:01.966230 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:01.966210 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/866f00a9-a524-4cfb-a324-423d668078f9-cert\") pod \"kserve-controller-manager-84b6647887-v847j\" (UID: \"866f00a9-a524-4cfb-a324-423d668078f9\") " pod="kserve/kserve-controller-manager-84b6647887-v847j" Apr 24 21:33:01.973337 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:01.973308 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4zhh\" (UniqueName: \"kubernetes.io/projected/866f00a9-a524-4cfb-a324-423d668078f9-kube-api-access-l4zhh\") pod \"kserve-controller-manager-84b6647887-v847j\" (UID: \"866f00a9-a524-4cfb-a324-423d668078f9\") " pod="kserve/kserve-controller-manager-84b6647887-v847j" Apr 24 21:33:02.059601 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:02.059499 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84b6647887-v847j" Apr 24 21:33:02.181339 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:02.181306 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-v847j"] Apr 24 21:33:02.184576 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:33:02.184539 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod866f00a9_a524_4cfb_a324_423d668078f9.slice/crio-3547aa173cddda7da84e9c994c1490bc69210d9916195ee19a750b09b20e37c9 WatchSource:0}: Error finding container 3547aa173cddda7da84e9c994c1490bc69210d9916195ee19a750b09b20e37c9: Status 404 returned error can't find the container with id 3547aa173cddda7da84e9c994c1490bc69210d9916195ee19a750b09b20e37c9 Apr 24 21:33:02.916825 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:02.916779 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84b6647887-v847j" event={"ID":"866f00a9-a524-4cfb-a324-423d668078f9","Type":"ContainerStarted","Data":"3547aa173cddda7da84e9c994c1490bc69210d9916195ee19a750b09b20e37c9"} Apr 24 21:33:04.923647 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:04.923620 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84b6647887-v847j" event={"ID":"866f00a9-a524-4cfb-a324-423d668078f9","Type":"ContainerStarted","Data":"8a50dc161b7eb6697dedf31a55c6804612cd1d943d22536b4b560499e9a5c3e2"} Apr 24 21:33:04.924028 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:04.923728 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-84b6647887-v847j" Apr 24 21:33:04.942821 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:04.942781 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-84b6647887-v847j" podStartSLOduration=1.482471705 
podStartE2EDuration="3.942767856s" podCreationTimestamp="2026-04-24 21:33:01 +0000 UTC" firstStartedPulling="2026-04-24 21:33:02.185982475 +0000 UTC m=+339.013852593" lastFinishedPulling="2026-04-24 21:33:04.646278631 +0000 UTC m=+341.474148744" observedRunningTime="2026-04-24 21:33:04.941082947 +0000 UTC m=+341.768953079" watchObservedRunningTime="2026-04-24 21:33:04.942767856 +0000 UTC m=+341.770637989" Apr 24 21:33:35.932219 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:35.932152 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-84b6647887-v847j" Apr 24 21:33:52.454938 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:52.454885 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-z7thh"] Apr 24 21:33:52.456949 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:52.456931 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-z7thh" Apr 24 21:33:52.464553 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:52.464530 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-z7thh"] Apr 24 21:33:52.548374 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:52.548338 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9r4n\" (UniqueName: \"kubernetes.io/projected/0d43d473-09e2-4943-958f-11ec70b2b290-kube-api-access-p9r4n\") pod \"s3-init-z7thh\" (UID: \"0d43d473-09e2-4943-958f-11ec70b2b290\") " pod="kserve/s3-init-z7thh" Apr 24 21:33:52.649850 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:52.649802 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p9r4n\" (UniqueName: \"kubernetes.io/projected/0d43d473-09e2-4943-958f-11ec70b2b290-kube-api-access-p9r4n\") pod \"s3-init-z7thh\" (UID: \"0d43d473-09e2-4943-958f-11ec70b2b290\") " pod="kserve/s3-init-z7thh" Apr 24 21:33:52.660291 ip-10-0-139-184 kubenswrapper[2578]: I0424 
21:33:52.660253 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9r4n\" (UniqueName: \"kubernetes.io/projected/0d43d473-09e2-4943-958f-11ec70b2b290-kube-api-access-p9r4n\") pod \"s3-init-z7thh\" (UID: \"0d43d473-09e2-4943-958f-11ec70b2b290\") " pod="kserve/s3-init-z7thh" Apr 24 21:33:52.778766 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:52.778684 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-z7thh" Apr 24 21:33:52.908173 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:52.908142 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-z7thh"] Apr 24 21:33:52.911064 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:33:52.911039 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d43d473_09e2_4943_958f_11ec70b2b290.slice/crio-fc46efcb4e01f73ceda626999daf6a1cf03ea7ecc9c1635bb8a8242b284b06e8 WatchSource:0}: Error finding container fc46efcb4e01f73ceda626999daf6a1cf03ea7ecc9c1635bb8a8242b284b06e8: Status 404 returned error can't find the container with id fc46efcb4e01f73ceda626999daf6a1cf03ea7ecc9c1635bb8a8242b284b06e8 Apr 24 21:33:53.056859 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:53.056774 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-z7thh" event={"ID":"0d43d473-09e2-4943-958f-11ec70b2b290","Type":"ContainerStarted","Data":"fc46efcb4e01f73ceda626999daf6a1cf03ea7ecc9c1635bb8a8242b284b06e8"} Apr 24 21:33:58.073934 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:58.073879 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-z7thh" event={"ID":"0d43d473-09e2-4943-958f-11ec70b2b290","Type":"ContainerStarted","Data":"a705f860aeb15f21acf24edd1d65f7446c644aaa10c9db399f08fa4a6ae3d735"} Apr 24 21:33:58.096171 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:33:58.096124 2578 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="kserve/s3-init-z7thh" podStartSLOduration=1.8038487079999999 podStartE2EDuration="6.096110112s" podCreationTimestamp="2026-04-24 21:33:52 +0000 UTC" firstStartedPulling="2026-04-24 21:33:52.912983374 +0000 UTC m=+389.740853489" lastFinishedPulling="2026-04-24 21:33:57.205244763 +0000 UTC m=+394.033114893" observedRunningTime="2026-04-24 21:33:58.095232557 +0000 UTC m=+394.923102689" watchObservedRunningTime="2026-04-24 21:33:58.096110112 +0000 UTC m=+394.923980243" Apr 24 21:34:01.083766 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:01.083731 2578 generic.go:358] "Generic (PLEG): container finished" podID="0d43d473-09e2-4943-958f-11ec70b2b290" containerID="a705f860aeb15f21acf24edd1d65f7446c644aaa10c9db399f08fa4a6ae3d735" exitCode=0 Apr 24 21:34:01.084152 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:01.083804 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-z7thh" event={"ID":"0d43d473-09e2-4943-958f-11ec70b2b290","Type":"ContainerDied","Data":"a705f860aeb15f21acf24edd1d65f7446c644aaa10c9db399f08fa4a6ae3d735"} Apr 24 21:34:02.218499 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:02.218478 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-z7thh" Apr 24 21:34:02.325539 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:02.325503 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9r4n\" (UniqueName: \"kubernetes.io/projected/0d43d473-09e2-4943-958f-11ec70b2b290-kube-api-access-p9r4n\") pod \"0d43d473-09e2-4943-958f-11ec70b2b290\" (UID: \"0d43d473-09e2-4943-958f-11ec70b2b290\") " Apr 24 21:34:02.327715 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:02.327675 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d43d473-09e2-4943-958f-11ec70b2b290-kube-api-access-p9r4n" (OuterVolumeSpecName: "kube-api-access-p9r4n") pod "0d43d473-09e2-4943-958f-11ec70b2b290" (UID: "0d43d473-09e2-4943-958f-11ec70b2b290"). InnerVolumeSpecName "kube-api-access-p9r4n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:34:02.427034 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:02.427007 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p9r4n\" (UniqueName: \"kubernetes.io/projected/0d43d473-09e2-4943-958f-11ec70b2b290-kube-api-access-p9r4n\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:34:03.090745 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:03.090706 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-z7thh" event={"ID":"0d43d473-09e2-4943-958f-11ec70b2b290","Type":"ContainerDied","Data":"fc46efcb4e01f73ceda626999daf6a1cf03ea7ecc9c1635bb8a8242b284b06e8"} Apr 24 21:34:03.090745 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:03.090740 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc46efcb4e01f73ceda626999daf6a1cf03ea7ecc9c1635bb8a8242b284b06e8" Apr 24 21:34:03.090745 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:03.090720 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-z7thh" Apr 24 21:34:12.219324 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.219291 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl"] Apr 24 21:34:12.219799 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.219583 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d43d473-09e2-4943-958f-11ec70b2b290" containerName="s3-init" Apr 24 21:34:12.219799 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.219593 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d43d473-09e2-4943-958f-11ec70b2b290" containerName="s3-init" Apr 24 21:34:12.219799 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.219649 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d43d473-09e2-4943-958f-11ec70b2b290" containerName="s3-init" Apr 24 21:34:12.221732 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.221715 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:34:12.224402 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.224370 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-8t9rb\"" Apr 24 21:34:12.224402 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.224372 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 21:34:12.224618 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.224412 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:34:12.224706 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.224689 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-2f47a-kube-rbac-proxy-sar-config\"" Apr 24 21:34:12.225818 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.225795 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-2f47a-predictor-serving-cert\"" Apr 24 21:34:12.235625 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.235595 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl"] Apr 24 21:34:12.406687 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.406647 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l46t\" (UniqueName: \"kubernetes.io/projected/4650e380-d5f3-407b-8763-5e988c00f004-kube-api-access-7l46t\") pod \"isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl\" (UID: \"4650e380-d5f3-407b-8763-5e988c00f004\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:34:12.406687 
ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.406688 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4650e380-d5f3-407b-8763-5e988c00f004-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl\" (UID: \"4650e380-d5f3-407b-8763-5e988c00f004\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:34:12.406984 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.406728 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-raw-sklearn-batcher-2f47a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4650e380-d5f3-407b-8763-5e988c00f004-isvc-raw-sklearn-batcher-2f47a-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl\" (UID: \"4650e380-d5f3-407b-8763-5e988c00f004\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:34:12.406984 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.406850 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4650e380-d5f3-407b-8763-5e988c00f004-proxy-tls\") pod \"isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl\" (UID: \"4650e380-d5f3-407b-8763-5e988c00f004\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:34:12.508218 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.508109 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4650e380-d5f3-407b-8763-5e988c00f004-proxy-tls\") pod \"isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl\" (UID: \"4650e380-d5f3-407b-8763-5e988c00f004\") " 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:34:12.508218 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.508174 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7l46t\" (UniqueName: \"kubernetes.io/projected/4650e380-d5f3-407b-8763-5e988c00f004-kube-api-access-7l46t\") pod \"isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl\" (UID: \"4650e380-d5f3-407b-8763-5e988c00f004\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:34:12.508218 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.508192 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4650e380-d5f3-407b-8763-5e988c00f004-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl\" (UID: \"4650e380-d5f3-407b-8763-5e988c00f004\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:34:12.508218 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.508229 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-raw-sklearn-batcher-2f47a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4650e380-d5f3-407b-8763-5e988c00f004-isvc-raw-sklearn-batcher-2f47a-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl\" (UID: \"4650e380-d5f3-407b-8763-5e988c00f004\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:34:12.508706 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.508681 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4650e380-d5f3-407b-8763-5e988c00f004-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl\" (UID: 
\"4650e380-d5f3-407b-8763-5e988c00f004\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:34:12.508987 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.508968 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-raw-sklearn-batcher-2f47a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4650e380-d5f3-407b-8763-5e988c00f004-isvc-raw-sklearn-batcher-2f47a-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl\" (UID: \"4650e380-d5f3-407b-8763-5e988c00f004\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:34:12.510608 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.510584 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4650e380-d5f3-407b-8763-5e988c00f004-proxy-tls\") pod \"isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl\" (UID: \"4650e380-d5f3-407b-8763-5e988c00f004\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:34:12.519624 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.519603 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l46t\" (UniqueName: \"kubernetes.io/projected/4650e380-d5f3-407b-8763-5e988c00f004-kube-api-access-7l46t\") pod \"isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl\" (UID: \"4650e380-d5f3-407b-8763-5e988c00f004\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:34:12.531610 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.531586 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:34:12.659381 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:12.659345 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl"] Apr 24 21:34:12.662409 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:34:12.662382 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4650e380_d5f3_407b_8763_5e988c00f004.slice/crio-04f37559a140b79e69782935da408f1087b5f121ddc48f0810d51b6665907025 WatchSource:0}: Error finding container 04f37559a140b79e69782935da408f1087b5f121ddc48f0810d51b6665907025: Status 404 returned error can't find the container with id 04f37559a140b79e69782935da408f1087b5f121ddc48f0810d51b6665907025 Apr 24 21:34:13.118992 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:13.118962 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" event={"ID":"4650e380-d5f3-407b-8763-5e988c00f004","Type":"ContainerStarted","Data":"04f37559a140b79e69782935da408f1087b5f121ddc48f0810d51b6665907025"} Apr 24 21:34:17.134961 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:17.134858 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" event={"ID":"4650e380-d5f3-407b-8763-5e988c00f004","Type":"ContainerStarted","Data":"7cd6bada683dd23fb2c42d3647a9a2eec927f562ce7067f5f579e91f60b08b6e"} Apr 24 21:34:21.149043 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:21.149007 2578 generic.go:358] "Generic (PLEG): container finished" podID="4650e380-d5f3-407b-8763-5e988c00f004" containerID="7cd6bada683dd23fb2c42d3647a9a2eec927f562ce7067f5f579e91f60b08b6e" exitCode=0 Apr 24 21:34:21.149448 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:21.149057 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" event={"ID":"4650e380-d5f3-407b-8763-5e988c00f004","Type":"ContainerDied","Data":"7cd6bada683dd23fb2c42d3647a9a2eec927f562ce7067f5f579e91f60b08b6e"} Apr 24 21:34:35.200159 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:35.200121 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" event={"ID":"4650e380-d5f3-407b-8763-5e988c00f004","Type":"ContainerStarted","Data":"ce8a7cbeb21792b3614691877c8b89870c874b70cfb4cacc9d2553ccf2ffa4c5"} Apr 24 21:34:37.208111 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:37.208075 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" event={"ID":"4650e380-d5f3-407b-8763-5e988c00f004","Type":"ContainerStarted","Data":"577bb69ba894723349ff3e4108f0f33b1f2b4e667e38d9cc1fd0eec3b2e829f3"} Apr 24 21:34:40.219382 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:40.219342 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" event={"ID":"4650e380-d5f3-407b-8763-5e988c00f004","Type":"ContainerStarted","Data":"2261285d9ba189bac4abed0f61794d3090ca8cece2aeadb1d1007845c48435fb"} Apr 24 21:34:40.220016 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:40.219559 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:34:40.220016 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:40.219585 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:34:40.220016 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:40.219601 2578 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:34:40.220988 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:40.220935 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 24 21:34:40.221620 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:40.221599 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:34:40.244316 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:40.244266 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podStartSLOduration=1.5833805170000002 podStartE2EDuration="28.244251427s" podCreationTimestamp="2026-04-24 21:34:12 +0000 UTC" firstStartedPulling="2026-04-24 21:34:12.66455338 +0000 UTC m=+409.492423489" lastFinishedPulling="2026-04-24 21:34:39.325424287 +0000 UTC m=+436.153294399" observedRunningTime="2026-04-24 21:34:40.243652258 +0000 UTC m=+437.071522403" watchObservedRunningTime="2026-04-24 21:34:40.244251427 +0000 UTC m=+437.072121572" Apr 24 21:34:41.222418 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:41.222377 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 24 21:34:41.222903 ip-10-0-139-184 
kubenswrapper[2578]: I0424 21:34:41.222827 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:34:41.226009 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:41.225989 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:34:42.227745 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:42.227702 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 24 21:34:42.228174 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:42.228083 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:34:52.227686 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:52.227630 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 24 21:34:52.228239 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:34:52.228168 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="agent" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:35:02.228151 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:02.228106 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 24 21:35:02.228673 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:02.228646 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:35:12.228248 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:12.228195 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 24 21:35:12.228655 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:12.228598 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:35:22.228324 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:22.228275 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 24 21:35:22.229045 ip-10-0-139-184 
kubenswrapper[2578]: I0424 21:35:22.229005 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:35:32.227943 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:32.227883 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 24 21:35:32.228404 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:32.228332 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:35:42.228115 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:42.228084 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:35:42.228589 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:42.228283 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:35:57.221879 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.221841 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl"] Apr 24 21:35:57.222364 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.222308 2578 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kserve-container" containerID="cri-o://ce8a7cbeb21792b3614691877c8b89870c874b70cfb4cacc9d2553ccf2ffa4c5" gracePeriod=30 Apr 24 21:35:57.222454 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.222390 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="agent" containerID="cri-o://2261285d9ba189bac4abed0f61794d3090ca8cece2aeadb1d1007845c48435fb" gracePeriod=30 Apr 24 21:35:57.222509 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.222447 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kube-rbac-proxy" containerID="cri-o://577bb69ba894723349ff3e4108f0f33b1f2b4e667e38d9cc1fd0eec3b2e829f3" gracePeriod=30 Apr 24 21:35:57.303576 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.303549 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk"] Apr 24 21:35:57.305884 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.305866 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" Apr 24 21:35:57.308305 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.308256 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-7bb74-predictor-serving-cert\"" Apr 24 21:35:57.308305 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.308289 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-7bb74-kube-rbac-proxy-sar-config\"" Apr 24 21:35:57.318689 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.318670 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk"] Apr 24 21:35:57.347464 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.347435 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-7bb74-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a1cfccd-3d5e-4f76-a972-59d416230e95-isvc-sklearn-graph-raw-7bb74-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk\" (UID: \"0a1cfccd-3d5e-4f76-a972-59d416230e95\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" Apr 24 21:35:57.347592 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.347529 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz7t2\" (UniqueName: \"kubernetes.io/projected/0a1cfccd-3d5e-4f76-a972-59d416230e95-kube-api-access-tz7t2\") pod \"isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk\" (UID: \"0a1cfccd-3d5e-4f76-a972-59d416230e95\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" Apr 24 21:35:57.347592 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.347577 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a1cfccd-3d5e-4f76-a972-59d416230e95-proxy-tls\") pod \"isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk\" (UID: \"0a1cfccd-3d5e-4f76-a972-59d416230e95\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" Apr 24 21:35:57.347696 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.347628 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a1cfccd-3d5e-4f76-a972-59d416230e95-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk\" (UID: \"0a1cfccd-3d5e-4f76-a972-59d416230e95\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" Apr 24 21:35:57.396275 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.396250 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t"] Apr 24 21:35:57.398481 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.398464 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" Apr 24 21:35:57.400811 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.400792 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-7bb74-predictor-serving-cert\"" Apr 24 21:35:57.400934 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.400830 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-7bb74-kube-rbac-proxy-sar-config\"" Apr 24 21:35:57.409536 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.409516 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t"] Apr 24 21:35:57.438000 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.437979 2578 generic.go:358] "Generic (PLEG): container finished" podID="4650e380-d5f3-407b-8763-5e988c00f004" containerID="577bb69ba894723349ff3e4108f0f33b1f2b4e667e38d9cc1fd0eec3b2e829f3" exitCode=2 Apr 24 21:35:57.438096 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.438036 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" event={"ID":"4650e380-d5f3-407b-8763-5e988c00f004","Type":"ContainerDied","Data":"577bb69ba894723349ff3e4108f0f33b1f2b4e667e38d9cc1fd0eec3b2e829f3"} Apr 24 21:35:57.448340 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.448316 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tz7t2\" (UniqueName: \"kubernetes.io/projected/0a1cfccd-3d5e-4f76-a972-59d416230e95-kube-api-access-tz7t2\") pod \"isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk\" (UID: \"0a1cfccd-3d5e-4f76-a972-59d416230e95\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" Apr 24 21:35:57.448424 ip-10-0-139-184 
kubenswrapper[2578]: I0424 21:35:57.448351 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52lvm\" (UniqueName: \"kubernetes.io/projected/90218cfe-6610-442a-9cdf-aedb36d45ffe-kube-api-access-52lvm\") pod \"isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t\" (UID: \"90218cfe-6610-442a-9cdf-aedb36d45ffe\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" Apr 24 21:35:57.448424 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.448383 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a1cfccd-3d5e-4f76-a972-59d416230e95-proxy-tls\") pod \"isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk\" (UID: \"0a1cfccd-3d5e-4f76-a972-59d416230e95\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" Apr 24 21:35:57.448566 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.448548 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a1cfccd-3d5e-4f76-a972-59d416230e95-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk\" (UID: \"0a1cfccd-3d5e-4f76-a972-59d416230e95\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" Apr 24 21:35:57.448615 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.448591 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90218cfe-6610-442a-9cdf-aedb36d45ffe-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t\" (UID: \"90218cfe-6610-442a-9cdf-aedb36d45ffe\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" Apr 24 21:35:57.448673 ip-10-0-139-184 kubenswrapper[2578]: I0424 
21:35:57.448623 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-7bb74-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a1cfccd-3d5e-4f76-a972-59d416230e95-isvc-sklearn-graph-raw-7bb74-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk\" (UID: \"0a1cfccd-3d5e-4f76-a972-59d416230e95\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" Apr 24 21:35:57.448673 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.448648 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-7bb74-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/90218cfe-6610-442a-9cdf-aedb36d45ffe-isvc-xgboost-graph-raw-7bb74-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t\" (UID: \"90218cfe-6610-442a-9cdf-aedb36d45ffe\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" Apr 24 21:35:57.448782 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.448701 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/90218cfe-6610-442a-9cdf-aedb36d45ffe-proxy-tls\") pod \"isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t\" (UID: \"90218cfe-6610-442a-9cdf-aedb36d45ffe\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" Apr 24 21:35:57.448952 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.448936 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a1cfccd-3d5e-4f76-a972-59d416230e95-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk\" (UID: \"0a1cfccd-3d5e-4f76-a972-59d416230e95\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" Apr 24 21:35:57.449265 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.449246 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-7bb74-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a1cfccd-3d5e-4f76-a972-59d416230e95-isvc-sklearn-graph-raw-7bb74-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk\" (UID: \"0a1cfccd-3d5e-4f76-a972-59d416230e95\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" Apr 24 21:35:57.450682 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.450665 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a1cfccd-3d5e-4f76-a972-59d416230e95-proxy-tls\") pod \"isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk\" (UID: \"0a1cfccd-3d5e-4f76-a972-59d416230e95\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" Apr 24 21:35:57.456754 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.456731 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz7t2\" (UniqueName: \"kubernetes.io/projected/0a1cfccd-3d5e-4f76-a972-59d416230e95-kube-api-access-tz7t2\") pod \"isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk\" (UID: \"0a1cfccd-3d5e-4f76-a972-59d416230e95\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" Apr 24 21:35:57.549795 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.549749 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52lvm\" (UniqueName: \"kubernetes.io/projected/90218cfe-6610-442a-9cdf-aedb36d45ffe-kube-api-access-52lvm\") pod \"isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t\" (UID: \"90218cfe-6610-442a-9cdf-aedb36d45ffe\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" Apr 24 21:35:57.549795 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.549792 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90218cfe-6610-442a-9cdf-aedb36d45ffe-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t\" (UID: \"90218cfe-6610-442a-9cdf-aedb36d45ffe\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" Apr 24 21:35:57.550008 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.549820 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-7bb74-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/90218cfe-6610-442a-9cdf-aedb36d45ffe-isvc-xgboost-graph-raw-7bb74-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t\" (UID: \"90218cfe-6610-442a-9cdf-aedb36d45ffe\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" Apr 24 21:35:57.550008 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.549856 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/90218cfe-6610-442a-9cdf-aedb36d45ffe-proxy-tls\") pod \"isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t\" (UID: \"90218cfe-6610-442a-9cdf-aedb36d45ffe\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" Apr 24 21:35:57.554999 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.550452 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90218cfe-6610-442a-9cdf-aedb36d45ffe-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t\" (UID: \"90218cfe-6610-442a-9cdf-aedb36d45ffe\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" Apr 24 21:35:57.554999 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:35:57.550553 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-serving-cert: secret "isvc-xgboost-graph-raw-7bb74-predictor-serving-cert" not found Apr 24 21:35:57.554999 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:35:57.550746 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90218cfe-6610-442a-9cdf-aedb36d45ffe-proxy-tls podName:90218cfe-6610-442a-9cdf-aedb36d45ffe nodeName:}" failed. No retries permitted until 2026-04-24 21:35:58.050711557 +0000 UTC m=+514.878581685 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/90218cfe-6610-442a-9cdf-aedb36d45ffe-proxy-tls") pod "isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" (UID: "90218cfe-6610-442a-9cdf-aedb36d45ffe") : secret "isvc-xgboost-graph-raw-7bb74-predictor-serving-cert" not found Apr 24 21:35:57.554999 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.551012 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-7bb74-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/90218cfe-6610-442a-9cdf-aedb36d45ffe-isvc-xgboost-graph-raw-7bb74-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t\" (UID: \"90218cfe-6610-442a-9cdf-aedb36d45ffe\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" Apr 24 21:35:57.559273 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.559241 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52lvm\" (UniqueName: \"kubernetes.io/projected/90218cfe-6610-442a-9cdf-aedb36d45ffe-kube-api-access-52lvm\") pod \"isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t\" (UID: \"90218cfe-6610-442a-9cdf-aedb36d45ffe\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" Apr 24 21:35:57.615947 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.615917 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" Apr 24 21:35:57.734388 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:57.734368 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk"] Apr 24 21:35:57.736402 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:35:57.736371 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a1cfccd_3d5e_4f76_a972_59d416230e95.slice/crio-fd19b81443c508ff62044189a3f2c78903733dad6252df2ab663b39421001335 WatchSource:0}: Error finding container fd19b81443c508ff62044189a3f2c78903733dad6252df2ab663b39421001335: Status 404 returned error can't find the container with id fd19b81443c508ff62044189a3f2c78903733dad6252df2ab663b39421001335 Apr 24 21:35:58.054013 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:58.053976 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/90218cfe-6610-442a-9cdf-aedb36d45ffe-proxy-tls\") pod \"isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t\" (UID: \"90218cfe-6610-442a-9cdf-aedb36d45ffe\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" Apr 24 21:35:58.056229 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:58.056202 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/90218cfe-6610-442a-9cdf-aedb36d45ffe-proxy-tls\") pod \"isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t\" (UID: \"90218cfe-6610-442a-9cdf-aedb36d45ffe\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" Apr 24 21:35:58.307952 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:58.307853 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" Apr 24 21:35:58.423249 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:58.423177 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t"] Apr 24 21:35:58.425202 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:35:58.425177 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90218cfe_6610_442a_9cdf_aedb36d45ffe.slice/crio-5b1db1cbd3654ce9665eed3490e56d669c969eff469d2894aaa7be64e7f9732e WatchSource:0}: Error finding container 5b1db1cbd3654ce9665eed3490e56d669c969eff469d2894aaa7be64e7f9732e: Status 404 returned error can't find the container with id 5b1db1cbd3654ce9665eed3490e56d669c969eff469d2894aaa7be64e7f9732e Apr 24 21:35:58.442739 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:58.442711 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" event={"ID":"0a1cfccd-3d5e-4f76-a972-59d416230e95","Type":"ContainerStarted","Data":"6461fefd40f51fd2fbaec1cfbe7f375f0d0dc46dbf7d15d96af24364c43a5172"} Apr 24 21:35:58.442866 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:58.442747 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" event={"ID":"0a1cfccd-3d5e-4f76-a972-59d416230e95","Type":"ContainerStarted","Data":"fd19b81443c508ff62044189a3f2c78903733dad6252df2ab663b39421001335"} Apr 24 21:35:58.444051 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:58.444028 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" event={"ID":"90218cfe-6610-442a-9cdf-aedb36d45ffe","Type":"ContainerStarted","Data":"5b1db1cbd3654ce9665eed3490e56d669c969eff469d2894aaa7be64e7f9732e"} Apr 24 21:35:59.451664 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:35:59.451618 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" event={"ID":"90218cfe-6610-442a-9cdf-aedb36d45ffe","Type":"ContainerStarted","Data":"72d14678a915d6adaedb2e7f71dca685260055fba50dcc169368ee81a01c92d2"} Apr 24 21:36:01.223174 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:01.223137 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.20:8643/healthz\": dial tcp 10.133.0.20:8643: connect: connection refused" Apr 24 21:36:02.228132 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:02.228076 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 24 21:36:02.228554 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:02.228424 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:36:02.462219 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:02.462182 2578 generic.go:358] "Generic (PLEG): container finished" podID="0a1cfccd-3d5e-4f76-a972-59d416230e95" 
containerID="6461fefd40f51fd2fbaec1cfbe7f375f0d0dc46dbf7d15d96af24364c43a5172" exitCode=0 Apr 24 21:36:02.462389 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:02.462302 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" event={"ID":"0a1cfccd-3d5e-4f76-a972-59d416230e95","Type":"ContainerDied","Data":"6461fefd40f51fd2fbaec1cfbe7f375f0d0dc46dbf7d15d96af24364c43a5172"} Apr 24 21:36:02.464165 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:02.464132 2578 generic.go:358] "Generic (PLEG): container finished" podID="90218cfe-6610-442a-9cdf-aedb36d45ffe" containerID="72d14678a915d6adaedb2e7f71dca685260055fba50dcc169368ee81a01c92d2" exitCode=0 Apr 24 21:36:02.464275 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:02.464185 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" event={"ID":"90218cfe-6610-442a-9cdf-aedb36d45ffe","Type":"ContainerDied","Data":"72d14678a915d6adaedb2e7f71dca685260055fba50dcc169368ee81a01c92d2"} Apr 24 21:36:03.469405 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:03.469360 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" event={"ID":"0a1cfccd-3d5e-4f76-a972-59d416230e95","Type":"ContainerStarted","Data":"860bab1f06f66c0fd107cb24335daf4e11f6faee42f3c7c15c1e26ab8aea72e4"} Apr 24 21:36:03.469862 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:03.469419 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" event={"ID":"0a1cfccd-3d5e-4f76-a972-59d416230e95","Type":"ContainerStarted","Data":"fc4c06ec637ab48b5aed62d3734e8131f88afd67d3d8afb094d29a24024c1a7f"} Apr 24 21:36:03.470062 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:03.469939 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" Apr 24 21:36:03.470062 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:03.469986 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" Apr 24 21:36:03.471360 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:03.471282 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" podUID="0a1cfccd-3d5e-4f76-a972-59d416230e95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:36:03.494331 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:03.492879 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" podStartSLOduration=6.492860302 podStartE2EDuration="6.492860302s" podCreationTimestamp="2026-04-24 21:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:36:03.490304881 +0000 UTC m=+520.318175027" watchObservedRunningTime="2026-04-24 21:36:03.492860302 +0000 UTC m=+520.320730435" Apr 24 21:36:04.474204 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:04.474089 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" podUID="0a1cfccd-3d5e-4f76-a972-59d416230e95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:36:05.478763 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:05.478727 2578 generic.go:358] "Generic (PLEG): container finished" podID="4650e380-d5f3-407b-8763-5e988c00f004" containerID="ce8a7cbeb21792b3614691877c8b89870c874b70cfb4cacc9d2553ccf2ffa4c5" exitCode=0 Apr 24 
21:36:05.479129 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:05.478778 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" event={"ID":"4650e380-d5f3-407b-8763-5e988c00f004","Type":"ContainerDied","Data":"ce8a7cbeb21792b3614691877c8b89870c874b70cfb4cacc9d2553ccf2ffa4c5"} Apr 24 21:36:06.223499 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:06.223457 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.20:8643/healthz\": dial tcp 10.133.0.20:8643: connect: connection refused" Apr 24 21:36:09.479184 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:09.479157 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" Apr 24 21:36:09.479739 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:09.479709 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" podUID="0a1cfccd-3d5e-4f76-a972-59d416230e95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:36:11.222659 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:11.222615 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.20:8643/healthz\": dial tcp 10.133.0.20:8643: connect: connection refused" Apr 24 21:36:11.223113 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:11.222794 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:36:12.228599 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:12.228441 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 24 21:36:12.229426 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:12.229358 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:36:16.223230 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:16.223182 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.20:8643/healthz\": dial tcp 10.133.0.20:8643: connect: connection refused" Apr 24 21:36:19.479777 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:19.479739 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" podUID="0a1cfccd-3d5e-4f76-a972-59d416230e95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:36:20.527591 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:20.527557 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" 
event={"ID":"90218cfe-6610-442a-9cdf-aedb36d45ffe","Type":"ContainerStarted","Data":"8ee7e3f3f51268d336050f747f2204a613ca2be6dea9d2286d24ed83b1ad39d5"} Apr 24 21:36:20.527971 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:20.527603 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" event={"ID":"90218cfe-6610-442a-9cdf-aedb36d45ffe","Type":"ContainerStarted","Data":"c212867ef71762a7d94afd37bb79f3ff6fe6b388a9aa36e396af1bdb3c42ea5b"} Apr 24 21:36:20.527971 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:20.527806 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" Apr 24 21:36:20.548151 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:20.548105 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" podStartSLOduration=6.382013212 podStartE2EDuration="23.548092308s" podCreationTimestamp="2026-04-24 21:35:57 +0000 UTC" firstStartedPulling="2026-04-24 21:36:02.465566763 +0000 UTC m=+519.293436873" lastFinishedPulling="2026-04-24 21:36:19.631645859 +0000 UTC m=+536.459515969" observedRunningTime="2026-04-24 21:36:20.546997978 +0000 UTC m=+537.374868112" watchObservedRunningTime="2026-04-24 21:36:20.548092308 +0000 UTC m=+537.375962439" Apr 24 21:36:21.222484 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:21.222441 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.20:8643/healthz\": dial tcp 10.133.0.20:8643: connect: connection refused" Apr 24 21:36:21.531029 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:21.530947 2578 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" Apr 24 21:36:21.532004 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:21.531968 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" podUID="90218cfe-6610-442a-9cdf-aedb36d45ffe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 21:36:22.228401 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:22.228358 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused" Apr 24 21:36:22.228599 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:22.228548 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:36:22.228665 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:22.228633 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:36:22.228748 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:22.228732 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:36:22.534793 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:22.534714 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" 
podUID="90218cfe-6610-442a-9cdf-aedb36d45ffe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 21:36:26.222518 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:26.222473 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.20:8643/healthz\": dial tcp 10.133.0.20:8643: connect: connection refused" Apr 24 21:36:27.538901 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:27.538808 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" Apr 24 21:36:27.539441 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:27.539411 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" podUID="90218cfe-6610-442a-9cdf-aedb36d45ffe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 21:36:27.883103 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:27.883084 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:36:27.918787 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:27.918759 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4650e380-d5f3-407b-8763-5e988c00f004-kserve-provision-location\") pod \"4650e380-d5f3-407b-8763-5e988c00f004\" (UID: \"4650e380-d5f3-407b-8763-5e988c00f004\") " Apr 24 21:36:27.918948 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:27.918798 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4650e380-d5f3-407b-8763-5e988c00f004-proxy-tls\") pod \"4650e380-d5f3-407b-8763-5e988c00f004\" (UID: \"4650e380-d5f3-407b-8763-5e988c00f004\") " Apr 24 21:36:27.918948 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:27.918836 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-raw-sklearn-batcher-2f47a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4650e380-d5f3-407b-8763-5e988c00f004-isvc-raw-sklearn-batcher-2f47a-kube-rbac-proxy-sar-config\") pod \"4650e380-d5f3-407b-8763-5e988c00f004\" (UID: \"4650e380-d5f3-407b-8763-5e988c00f004\") " Apr 24 21:36:27.918948 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:27.918937 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7l46t\" (UniqueName: \"kubernetes.io/projected/4650e380-d5f3-407b-8763-5e988c00f004-kube-api-access-7l46t\") pod \"4650e380-d5f3-407b-8763-5e988c00f004\" (UID: \"4650e380-d5f3-407b-8763-5e988c00f004\") " Apr 24 21:36:27.919220 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:27.919125 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4650e380-d5f3-407b-8763-5e988c00f004-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "4650e380-d5f3-407b-8763-5e988c00f004" (UID: "4650e380-d5f3-407b-8763-5e988c00f004"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:36:27.919344 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:27.919270 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4650e380-d5f3-407b-8763-5e988c00f004-isvc-raw-sklearn-batcher-2f47a-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-raw-sklearn-batcher-2f47a-kube-rbac-proxy-sar-config") pod "4650e380-d5f3-407b-8763-5e988c00f004" (UID: "4650e380-d5f3-407b-8763-5e988c00f004"). InnerVolumeSpecName "isvc-raw-sklearn-batcher-2f47a-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:36:27.921097 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:27.921073 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4650e380-d5f3-407b-8763-5e988c00f004-kube-api-access-7l46t" (OuterVolumeSpecName: "kube-api-access-7l46t") pod "4650e380-d5f3-407b-8763-5e988c00f004" (UID: "4650e380-d5f3-407b-8763-5e988c00f004"). InnerVolumeSpecName "kube-api-access-7l46t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:36:27.921194 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:27.921174 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4650e380-d5f3-407b-8763-5e988c00f004-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4650e380-d5f3-407b-8763-5e988c00f004" (UID: "4650e380-d5f3-407b-8763-5e988c00f004"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:36:28.020450 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:28.020421 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4650e380-d5f3-407b-8763-5e988c00f004-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:36:28.020450 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:28.020447 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-raw-sklearn-batcher-2f47a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4650e380-d5f3-407b-8763-5e988c00f004-isvc-raw-sklearn-batcher-2f47a-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:36:28.020450 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:28.020460 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7l46t\" (UniqueName: \"kubernetes.io/projected/4650e380-d5f3-407b-8763-5e988c00f004-kube-api-access-7l46t\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:36:28.020668 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:28.020469 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4650e380-d5f3-407b-8763-5e988c00f004-kserve-provision-location\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:36:28.552256 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:28.552224 2578 generic.go:358] "Generic (PLEG): container finished" podID="4650e380-d5f3-407b-8763-5e988c00f004" containerID="2261285d9ba189bac4abed0f61794d3090ca8cece2aeadb1d1007845c48435fb" exitCode=0 Apr 24 21:36:28.552662 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:28.552279 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" 
event={"ID":"4650e380-d5f3-407b-8763-5e988c00f004","Type":"ContainerDied","Data":"2261285d9ba189bac4abed0f61794d3090ca8cece2aeadb1d1007845c48435fb"} Apr 24 21:36:28.552662 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:28.552305 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" event={"ID":"4650e380-d5f3-407b-8763-5e988c00f004","Type":"ContainerDied","Data":"04f37559a140b79e69782935da408f1087b5f121ddc48f0810d51b6665907025"} Apr 24 21:36:28.552662 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:28.552310 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl" Apr 24 21:36:28.552662 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:28.552320 2578 scope.go:117] "RemoveContainer" containerID="2261285d9ba189bac4abed0f61794d3090ca8cece2aeadb1d1007845c48435fb" Apr 24 21:36:28.560672 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:28.560651 2578 scope.go:117] "RemoveContainer" containerID="577bb69ba894723349ff3e4108f0f33b1f2b4e667e38d9cc1fd0eec3b2e829f3" Apr 24 21:36:28.569014 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:28.568998 2578 scope.go:117] "RemoveContainer" containerID="ce8a7cbeb21792b3614691877c8b89870c874b70cfb4cacc9d2553ccf2ffa4c5" Apr 24 21:36:28.576595 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:28.576577 2578 scope.go:117] "RemoveContainer" containerID="7cd6bada683dd23fb2c42d3647a9a2eec927f562ce7067f5f579e91f60b08b6e" Apr 24 21:36:28.577446 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:28.577249 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl"] Apr 24 21:36:28.579137 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:28.579116 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2f47a-predictor-5c664b4786-666dl"] 
Apr 24 21:36:28.583770 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:28.583752 2578 scope.go:117] "RemoveContainer" containerID="2261285d9ba189bac4abed0f61794d3090ca8cece2aeadb1d1007845c48435fb" Apr 24 21:36:28.584020 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:36:28.584003 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2261285d9ba189bac4abed0f61794d3090ca8cece2aeadb1d1007845c48435fb\": container with ID starting with 2261285d9ba189bac4abed0f61794d3090ca8cece2aeadb1d1007845c48435fb not found: ID does not exist" containerID="2261285d9ba189bac4abed0f61794d3090ca8cece2aeadb1d1007845c48435fb" Apr 24 21:36:28.584083 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:28.584027 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2261285d9ba189bac4abed0f61794d3090ca8cece2aeadb1d1007845c48435fb"} err="failed to get container status \"2261285d9ba189bac4abed0f61794d3090ca8cece2aeadb1d1007845c48435fb\": rpc error: code = NotFound desc = could not find container \"2261285d9ba189bac4abed0f61794d3090ca8cece2aeadb1d1007845c48435fb\": container with ID starting with 2261285d9ba189bac4abed0f61794d3090ca8cece2aeadb1d1007845c48435fb not found: ID does not exist" Apr 24 21:36:28.584083 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:28.584044 2578 scope.go:117] "RemoveContainer" containerID="577bb69ba894723349ff3e4108f0f33b1f2b4e667e38d9cc1fd0eec3b2e829f3" Apr 24 21:36:28.584265 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:36:28.584243 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"577bb69ba894723349ff3e4108f0f33b1f2b4e667e38d9cc1fd0eec3b2e829f3\": container with ID starting with 577bb69ba894723349ff3e4108f0f33b1f2b4e667e38d9cc1fd0eec3b2e829f3 not found: ID does not exist" containerID="577bb69ba894723349ff3e4108f0f33b1f2b4e667e38d9cc1fd0eec3b2e829f3" Apr 24 21:36:28.584303 
ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:28.584270 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"577bb69ba894723349ff3e4108f0f33b1f2b4e667e38d9cc1fd0eec3b2e829f3"} err="failed to get container status \"577bb69ba894723349ff3e4108f0f33b1f2b4e667e38d9cc1fd0eec3b2e829f3\": rpc error: code = NotFound desc = could not find container \"577bb69ba894723349ff3e4108f0f33b1f2b4e667e38d9cc1fd0eec3b2e829f3\": container with ID starting with 577bb69ba894723349ff3e4108f0f33b1f2b4e667e38d9cc1fd0eec3b2e829f3 not found: ID does not exist" Apr 24 21:36:28.584303 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:28.584284 2578 scope.go:117] "RemoveContainer" containerID="ce8a7cbeb21792b3614691877c8b89870c874b70cfb4cacc9d2553ccf2ffa4c5" Apr 24 21:36:28.584503 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:36:28.584484 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce8a7cbeb21792b3614691877c8b89870c874b70cfb4cacc9d2553ccf2ffa4c5\": container with ID starting with ce8a7cbeb21792b3614691877c8b89870c874b70cfb4cacc9d2553ccf2ffa4c5 not found: ID does not exist" containerID="ce8a7cbeb21792b3614691877c8b89870c874b70cfb4cacc9d2553ccf2ffa4c5" Apr 24 21:36:28.584571 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:28.584520 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce8a7cbeb21792b3614691877c8b89870c874b70cfb4cacc9d2553ccf2ffa4c5"} err="failed to get container status \"ce8a7cbeb21792b3614691877c8b89870c874b70cfb4cacc9d2553ccf2ffa4c5\": rpc error: code = NotFound desc = could not find container \"ce8a7cbeb21792b3614691877c8b89870c874b70cfb4cacc9d2553ccf2ffa4c5\": container with ID starting with ce8a7cbeb21792b3614691877c8b89870c874b70cfb4cacc9d2553ccf2ffa4c5 not found: ID does not exist" Apr 24 21:36:28.584571 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:28.584543 2578 scope.go:117] 
"RemoveContainer" containerID="7cd6bada683dd23fb2c42d3647a9a2eec927f562ce7067f5f579e91f60b08b6e" Apr 24 21:36:28.584764 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:36:28.584743 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cd6bada683dd23fb2c42d3647a9a2eec927f562ce7067f5f579e91f60b08b6e\": container with ID starting with 7cd6bada683dd23fb2c42d3647a9a2eec927f562ce7067f5f579e91f60b08b6e not found: ID does not exist" containerID="7cd6bada683dd23fb2c42d3647a9a2eec927f562ce7067f5f579e91f60b08b6e" Apr 24 21:36:28.584806 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:28.584771 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cd6bada683dd23fb2c42d3647a9a2eec927f562ce7067f5f579e91f60b08b6e"} err="failed to get container status \"7cd6bada683dd23fb2c42d3647a9a2eec927f562ce7067f5f579e91f60b08b6e\": rpc error: code = NotFound desc = could not find container \"7cd6bada683dd23fb2c42d3647a9a2eec927f562ce7067f5f579e91f60b08b6e\": container with ID starting with 7cd6bada683dd23fb2c42d3647a9a2eec927f562ce7067f5f579e91f60b08b6e not found: ID does not exist" Apr 24 21:36:29.479955 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:29.479910 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" podUID="0a1cfccd-3d5e-4f76-a972-59d416230e95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:36:29.762240 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:29.762168 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4650e380-d5f3-407b-8763-5e988c00f004" path="/var/lib/kubelet/pods/4650e380-d5f3-407b-8763-5e988c00f004/volumes" Apr 24 21:36:37.540294 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:37.540256 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" podUID="90218cfe-6610-442a-9cdf-aedb36d45ffe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 21:36:39.480642 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:39.480599 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" podUID="0a1cfccd-3d5e-4f76-a972-59d416230e95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:36:47.540152 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:47.540105 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" podUID="90218cfe-6610-442a-9cdf-aedb36d45ffe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 21:36:49.479618 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:49.479574 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" podUID="0a1cfccd-3d5e-4f76-a972-59d416230e95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:36:57.539945 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:57.539903 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" podUID="90218cfe-6610-442a-9cdf-aedb36d45ffe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 21:36:59.480415 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:36:59.480370 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" podUID="0a1cfccd-3d5e-4f76-a972-59d416230e95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:37:07.539761 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:07.539711 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" podUID="90218cfe-6610-442a-9cdf-aedb36d45ffe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 21:37:09.480404 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:09.480377 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" Apr 24 21:37:17.539687 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:17.539650 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" podUID="90218cfe-6610-442a-9cdf-aedb36d45ffe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 24 21:37:23.654662 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:23.654634 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krb9p_cde5bc87-530f-4ee7-8f38-39b875bbd4e6/ovn-acl-logging/0.log" Apr 24 21:37:23.655133 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:23.654878 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krb9p_cde5bc87-530f-4ee7-8f38-39b875bbd4e6/ovn-acl-logging/0.log" Apr 24 21:37:27.540643 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:27.540615 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" Apr 24 
21:37:47.576207 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.576164 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk"] Apr 24 21:37:47.576653 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.576603 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" podUID="0a1cfccd-3d5e-4f76-a972-59d416230e95" containerName="kserve-container" containerID="cri-o://fc4c06ec637ab48b5aed62d3734e8131f88afd67d3d8afb094d29a24024c1a7f" gracePeriod=30 Apr 24 21:37:47.576725 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.576627 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" podUID="0a1cfccd-3d5e-4f76-a972-59d416230e95" containerName="kube-rbac-proxy" containerID="cri-o://860bab1f06f66c0fd107cb24335daf4e11f6faee42f3c7c15c1e26ab8aea72e4" gracePeriod=30 Apr 24 21:37:47.643879 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.643839 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524"] Apr 24 21:37:47.644385 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.644364 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="agent" Apr 24 21:37:47.644439 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.644390 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="agent" Apr 24 21:37:47.644439 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.644419 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="storage-initializer" Apr 24 21:37:47.644439 ip-10-0-139-184 kubenswrapper[2578]: 
I0424 21:37:47.644431 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="storage-initializer" Apr 24 21:37:47.644527 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.644454 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kserve-container" Apr 24 21:37:47.644527 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.644464 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kserve-container" Apr 24 21:37:47.644527 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.644479 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kube-rbac-proxy" Apr 24 21:37:47.644527 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.644487 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kube-rbac-proxy" Apr 24 21:37:47.644669 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.644564 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kube-rbac-proxy" Apr 24 21:37:47.644669 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.644579 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="agent" Apr 24 21:37:47.644669 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.644592 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="4650e380-d5f3-407b-8763-5e988c00f004" containerName="kserve-container" Apr 24 21:37:47.647061 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.647026 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" Apr 24 21:37:47.649542 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.649520 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-21211-predictor-serving-cert\"" Apr 24 21:37:47.649684 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.649536 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-21211-kube-rbac-proxy-sar-config\"" Apr 24 21:37:47.657719 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.657685 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524"] Apr 24 21:37:47.697473 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.697443 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t"] Apr 24 21:37:47.697845 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.697796 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" podUID="90218cfe-6610-442a-9cdf-aedb36d45ffe" containerName="kube-rbac-proxy" containerID="cri-o://8ee7e3f3f51268d336050f747f2204a613ca2be6dea9d2286d24ed83b1ad39d5" gracePeriod=30 Apr 24 21:37:47.697961 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.697773 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" podUID="90218cfe-6610-442a-9cdf-aedb36d45ffe" containerName="kserve-container" containerID="cri-o://c212867ef71762a7d94afd37bb79f3ff6fe6b388a9aa36e396af1bdb3c42ea5b" gracePeriod=30 Apr 24 21:37:47.737090 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.737049 2578 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj"] Apr 24 21:37:47.739700 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.739676 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" Apr 24 21:37:47.742720 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.742688 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-21211-predictor-serving-cert\"" Apr 24 21:37:47.742868 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.742803 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-21211-kube-rbac-proxy-sar-config\"" Apr 24 21:37:47.754135 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.754104 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj"] Apr 24 21:37:47.770062 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.770023 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmsmm\" (UniqueName: \"kubernetes.io/projected/6f3f4204-24d6-4d92-ba57-df58612146cf-kube-api-access-nmsmm\") pod \"isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524\" (UID: \"6f3f4204-24d6-4d92-ba57-df58612146cf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" Apr 24 21:37:47.770062 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.770064 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f3f4204-24d6-4d92-ba57-df58612146cf-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524\" (UID: 
\"6f3f4204-24d6-4d92-ba57-df58612146cf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" Apr 24 21:37:47.770261 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.770089 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f3f4204-24d6-4d92-ba57-df58612146cf-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524\" (UID: \"6f3f4204-24d6-4d92-ba57-df58612146cf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" Apr 24 21:37:47.770261 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.770226 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-hpa-21211-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6f3f4204-24d6-4d92-ba57-df58612146cf-isvc-sklearn-graph-raw-hpa-21211-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524\" (UID: \"6f3f4204-24d6-4d92-ba57-df58612146cf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" Apr 24 21:37:47.806839 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.806810 2578 generic.go:358] "Generic (PLEG): container finished" podID="0a1cfccd-3d5e-4f76-a972-59d416230e95" containerID="860bab1f06f66c0fd107cb24335daf4e11f6faee42f3c7c15c1e26ab8aea72e4" exitCode=2 Apr 24 21:37:47.807010 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.806880 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" event={"ID":"0a1cfccd-3d5e-4f76-a972-59d416230e95","Type":"ContainerDied","Data":"860bab1f06f66c0fd107cb24335daf4e11f6faee42f3c7c15c1e26ab8aea72e4"} Apr 24 21:37:47.871016 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.870927 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj\" (UID: \"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" Apr 24 21:37:47.871016 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.870975 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmsmm\" (UniqueName: \"kubernetes.io/projected/6f3f4204-24d6-4d92-ba57-df58612146cf-kube-api-access-nmsmm\") pod \"isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524\" (UID: \"6f3f4204-24d6-4d92-ba57-df58612146cf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" Apr 24 21:37:47.871234 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.871036 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f3f4204-24d6-4d92-ba57-df58612146cf-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524\" (UID: \"6f3f4204-24d6-4d92-ba57-df58612146cf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" Apr 24 21:37:47.871234 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.871102 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-hpa-21211-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5-isvc-xgboost-graph-raw-hpa-21211-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj\" (UID: \"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" Apr 24 
21:37:47.871234 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.871139 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f3f4204-24d6-4d92-ba57-df58612146cf-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524\" (UID: \"6f3f4204-24d6-4d92-ba57-df58612146cf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" Apr 24 21:37:47.871234 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.871215 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d459r\" (UniqueName: \"kubernetes.io/projected/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5-kube-api-access-d459r\") pod \"isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj\" (UID: \"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" Apr 24 21:37:47.871441 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.871260 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj\" (UID: \"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" Apr 24 21:37:47.871441 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.871363 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-hpa-21211-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6f3f4204-24d6-4d92-ba57-df58612146cf-isvc-sklearn-graph-raw-hpa-21211-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524\" (UID: \"6f3f4204-24d6-4d92-ba57-df58612146cf\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" Apr 24 21:37:47.871544 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.871444 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f3f4204-24d6-4d92-ba57-df58612146cf-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524\" (UID: \"6f3f4204-24d6-4d92-ba57-df58612146cf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" Apr 24 21:37:47.871970 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.871951 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-hpa-21211-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6f3f4204-24d6-4d92-ba57-df58612146cf-isvc-sklearn-graph-raw-hpa-21211-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524\" (UID: \"6f3f4204-24d6-4d92-ba57-df58612146cf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" Apr 24 21:37:47.873688 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.873668 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f3f4204-24d6-4d92-ba57-df58612146cf-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524\" (UID: \"6f3f4204-24d6-4d92-ba57-df58612146cf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" Apr 24 21:37:47.881119 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.881091 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmsmm\" (UniqueName: \"kubernetes.io/projected/6f3f4204-24d6-4d92-ba57-df58612146cf-kube-api-access-nmsmm\") pod \"isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524\" (UID: \"6f3f4204-24d6-4d92-ba57-df58612146cf\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524"
Apr 24 21:37:47.959923 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.959875 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524"
Apr 24 21:37:47.972780 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.972748 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj\" (UID: \"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj"
Apr 24 21:37:47.972949 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.972797 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-hpa-21211-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5-isvc-xgboost-graph-raw-hpa-21211-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj\" (UID: \"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj"
Apr 24 21:37:47.972949 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.972830 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d459r\" (UniqueName: \"kubernetes.io/projected/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5-kube-api-access-d459r\") pod \"isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj\" (UID: \"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj"
Apr 24 21:37:47.972949 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.972855 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj\" (UID: \"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj"
Apr 24 21:37:47.973260 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.973233 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj\" (UID: \"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj"
Apr 24 21:37:47.973480 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.973461 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-hpa-21211-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5-isvc-xgboost-graph-raw-hpa-21211-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj\" (UID: \"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj"
Apr 24 21:37:47.975227 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.975207 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj\" (UID: \"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj"
Apr 24 21:37:47.982155 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:47.982127 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d459r\" (UniqueName: \"kubernetes.io/projected/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5-kube-api-access-d459r\") pod \"isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj\" (UID: \"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj"
Apr 24 21:37:48.050528 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:48.050496 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj"
Apr 24 21:37:48.086910 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:48.086857 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524"]
Apr 24 21:37:48.089757 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:37:48.089716 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f3f4204_24d6_4d92_ba57_df58612146cf.slice/crio-c433c46279efbff8f5709237ec3ccb52823c09f559509e0372823cf035616419 WatchSource:0}: Error finding container c433c46279efbff8f5709237ec3ccb52823c09f559509e0372823cf035616419: Status 404 returned error can't find the container with id c433c46279efbff8f5709237ec3ccb52823c09f559509e0372823cf035616419
Apr 24 21:37:48.092254 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:48.092234 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:37:48.202617 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:48.202584 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj"]
Apr 24 21:37:48.206691 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:37:48.206648 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd3e79e1_ba1d_4249_a3fc_c46bfd68dce5.slice/crio-215b63eea681b07993e9d31c8c10555776968e4bfd8dc6fce007f3f6a89ea735 WatchSource:0}: Error finding container 215b63eea681b07993e9d31c8c10555776968e4bfd8dc6fce007f3f6a89ea735: Status 404 returned error can't find the container with id 215b63eea681b07993e9d31c8c10555776968e4bfd8dc6fce007f3f6a89ea735
Apr 24 21:37:48.811447 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:48.811405 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" event={"ID":"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5","Type":"ContainerStarted","Data":"c0750feebd86ee63480069cd84de01346f3dcb5d0604e409e8c55059cc14f03a"}
Apr 24 21:37:48.811447 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:48.811450 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" event={"ID":"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5","Type":"ContainerStarted","Data":"215b63eea681b07993e9d31c8c10555776968e4bfd8dc6fce007f3f6a89ea735"}
Apr 24 21:37:48.812848 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:48.812818 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" event={"ID":"6f3f4204-24d6-4d92-ba57-df58612146cf","Type":"ContainerStarted","Data":"8798fe987c63d73209248521db3b6b5411cb8590eda6585945240075f34aa92c"}
Apr 24 21:37:48.812973 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:48.812853 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" event={"ID":"6f3f4204-24d6-4d92-ba57-df58612146cf","Type":"ContainerStarted","Data":"c433c46279efbff8f5709237ec3ccb52823c09f559509e0372823cf035616419"}
Apr 24 21:37:48.814514 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:48.814493 2578 generic.go:358] "Generic (PLEG): container finished" podID="90218cfe-6610-442a-9cdf-aedb36d45ffe" containerID="8ee7e3f3f51268d336050f747f2204a613ca2be6dea9d2286d24ed83b1ad39d5" exitCode=2
Apr 24 21:37:48.814593 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:48.814548 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" event={"ID":"90218cfe-6610-442a-9cdf-aedb36d45ffe","Type":"ContainerDied","Data":"8ee7e3f3f51268d336050f747f2204a613ca2be6dea9d2286d24ed83b1ad39d5"}
Apr 24 21:37:49.474786 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:49.474737 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" podUID="0a1cfccd-3d5e-4f76-a972-59d416230e95" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.21:8643/healthz\": dial tcp 10.133.0.21:8643: connect: connection refused"
Apr 24 21:37:49.480151 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:49.480116 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" podUID="0a1cfccd-3d5e-4f76-a972-59d416230e95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused"
Apr 24 21:37:51.539853 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.539829 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t"
Apr 24 21:37:51.706187 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.706152 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/90218cfe-6610-442a-9cdf-aedb36d45ffe-proxy-tls\") pod \"90218cfe-6610-442a-9cdf-aedb36d45ffe\" (UID: \"90218cfe-6610-442a-9cdf-aedb36d45ffe\") "
Apr 24 21:37:51.706187 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.706197 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90218cfe-6610-442a-9cdf-aedb36d45ffe-kserve-provision-location\") pod \"90218cfe-6610-442a-9cdf-aedb36d45ffe\" (UID: \"90218cfe-6610-442a-9cdf-aedb36d45ffe\") "
Apr 24 21:37:51.706432 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.706241 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52lvm\" (UniqueName: \"kubernetes.io/projected/90218cfe-6610-442a-9cdf-aedb36d45ffe-kube-api-access-52lvm\") pod \"90218cfe-6610-442a-9cdf-aedb36d45ffe\" (UID: \"90218cfe-6610-442a-9cdf-aedb36d45ffe\") "
Apr 24 21:37:51.706432 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.706345 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-7bb74-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/90218cfe-6610-442a-9cdf-aedb36d45ffe-isvc-xgboost-graph-raw-7bb74-kube-rbac-proxy-sar-config\") pod \"90218cfe-6610-442a-9cdf-aedb36d45ffe\" (UID: \"90218cfe-6610-442a-9cdf-aedb36d45ffe\") "
Apr 24 21:37:51.706644 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.706608 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90218cfe-6610-442a-9cdf-aedb36d45ffe-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "90218cfe-6610-442a-9cdf-aedb36d45ffe" (UID: "90218cfe-6610-442a-9cdf-aedb36d45ffe"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:37:51.706763 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.706681 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90218cfe-6610-442a-9cdf-aedb36d45ffe-isvc-xgboost-graph-raw-7bb74-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-7bb74-kube-rbac-proxy-sar-config") pod "90218cfe-6610-442a-9cdf-aedb36d45ffe" (UID: "90218cfe-6610-442a-9cdf-aedb36d45ffe"). InnerVolumeSpecName "isvc-xgboost-graph-raw-7bb74-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:37:51.708265 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.708241 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90218cfe-6610-442a-9cdf-aedb36d45ffe-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "90218cfe-6610-442a-9cdf-aedb36d45ffe" (UID: "90218cfe-6610-442a-9cdf-aedb36d45ffe"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:37:51.708350 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.708273 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90218cfe-6610-442a-9cdf-aedb36d45ffe-kube-api-access-52lvm" (OuterVolumeSpecName: "kube-api-access-52lvm") pod "90218cfe-6610-442a-9cdf-aedb36d45ffe" (UID: "90218cfe-6610-442a-9cdf-aedb36d45ffe"). InnerVolumeSpecName "kube-api-access-52lvm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:37:51.808002 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.807957 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-7bb74-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/90218cfe-6610-442a-9cdf-aedb36d45ffe-isvc-xgboost-graph-raw-7bb74-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:37:51.808002 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.807999 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/90218cfe-6610-442a-9cdf-aedb36d45ffe-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:37:51.808210 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.808015 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90218cfe-6610-442a-9cdf-aedb36d45ffe-kserve-provision-location\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:37:51.808210 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.808030 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-52lvm\" (UniqueName: \"kubernetes.io/projected/90218cfe-6610-442a-9cdf-aedb36d45ffe-kube-api-access-52lvm\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:37:51.826774 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.826739 2578 generic.go:358] "Generic (PLEG): container finished" podID="90218cfe-6610-442a-9cdf-aedb36d45ffe" containerID="c212867ef71762a7d94afd37bb79f3ff6fe6b388a9aa36e396af1bdb3c42ea5b" exitCode=0
Apr 24 21:37:51.826963 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.826798 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" event={"ID":"90218cfe-6610-442a-9cdf-aedb36d45ffe","Type":"ContainerDied","Data":"c212867ef71762a7d94afd37bb79f3ff6fe6b388a9aa36e396af1bdb3c42ea5b"}
Apr 24 21:37:51.826963 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.826827 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t"
Apr 24 21:37:51.826963 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.826838 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t" event={"ID":"90218cfe-6610-442a-9cdf-aedb36d45ffe","Type":"ContainerDied","Data":"5b1db1cbd3654ce9665eed3490e56d669c969eff469d2894aaa7be64e7f9732e"}
Apr 24 21:37:51.826963 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.826859 2578 scope.go:117] "RemoveContainer" containerID="8ee7e3f3f51268d336050f747f2204a613ca2be6dea9d2286d24ed83b1ad39d5"
Apr 24 21:37:51.914884 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.914862 2578 scope.go:117] "RemoveContainer" containerID="c212867ef71762a7d94afd37bb79f3ff6fe6b388a9aa36e396af1bdb3c42ea5b"
Apr 24 21:37:51.922441 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.922421 2578 scope.go:117] "RemoveContainer" containerID="72d14678a915d6adaedb2e7f71dca685260055fba50dcc169368ee81a01c92d2"
Apr 24 21:37:51.928262 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.928231 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t"]
Apr 24 21:37:51.930450 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.930430 2578 scope.go:117] "RemoveContainer" containerID="8ee7e3f3f51268d336050f747f2204a613ca2be6dea9d2286d24ed83b1ad39d5"
Apr 24 21:37:51.930807 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:37:51.930783 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ee7e3f3f51268d336050f747f2204a613ca2be6dea9d2286d24ed83b1ad39d5\": container with ID starting with 8ee7e3f3f51268d336050f747f2204a613ca2be6dea9d2286d24ed83b1ad39d5 not found: ID does not exist" containerID="8ee7e3f3f51268d336050f747f2204a613ca2be6dea9d2286d24ed83b1ad39d5"
Apr 24 21:37:51.930916 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.930840 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ee7e3f3f51268d336050f747f2204a613ca2be6dea9d2286d24ed83b1ad39d5"} err="failed to get container status \"8ee7e3f3f51268d336050f747f2204a613ca2be6dea9d2286d24ed83b1ad39d5\": rpc error: code = NotFound desc = could not find container \"8ee7e3f3f51268d336050f747f2204a613ca2be6dea9d2286d24ed83b1ad39d5\": container with ID starting with 8ee7e3f3f51268d336050f747f2204a613ca2be6dea9d2286d24ed83b1ad39d5 not found: ID does not exist"
Apr 24 21:37:51.930916 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.930866 2578 scope.go:117] "RemoveContainer" containerID="c212867ef71762a7d94afd37bb79f3ff6fe6b388a9aa36e396af1bdb3c42ea5b"
Apr 24 21:37:51.931598 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:37:51.931577 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c212867ef71762a7d94afd37bb79f3ff6fe6b388a9aa36e396af1bdb3c42ea5b\": container with ID starting with c212867ef71762a7d94afd37bb79f3ff6fe6b388a9aa36e396af1bdb3c42ea5b not found: ID does not exist" containerID="c212867ef71762a7d94afd37bb79f3ff6fe6b388a9aa36e396af1bdb3c42ea5b"
Apr 24 21:37:51.931798 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.931756 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c212867ef71762a7d94afd37bb79f3ff6fe6b388a9aa36e396af1bdb3c42ea5b"} err="failed to get container status \"c212867ef71762a7d94afd37bb79f3ff6fe6b388a9aa36e396af1bdb3c42ea5b\": rpc error: code = NotFound desc = could not find container \"c212867ef71762a7d94afd37bb79f3ff6fe6b388a9aa36e396af1bdb3c42ea5b\": container with ID starting with c212867ef71762a7d94afd37bb79f3ff6fe6b388a9aa36e396af1bdb3c42ea5b not found: ID does not exist"
Apr 24 21:37:51.931798 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.931797 2578 scope.go:117] "RemoveContainer" containerID="72d14678a915d6adaedb2e7f71dca685260055fba50dcc169368ee81a01c92d2"
Apr 24 21:37:51.932326 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:37:51.932302 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72d14678a915d6adaedb2e7f71dca685260055fba50dcc169368ee81a01c92d2\": container with ID starting with 72d14678a915d6adaedb2e7f71dca685260055fba50dcc169368ee81a01c92d2 not found: ID does not exist" containerID="72d14678a915d6adaedb2e7f71dca685260055fba50dcc169368ee81a01c92d2"
Apr 24 21:37:51.932377 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.932333 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72d14678a915d6adaedb2e7f71dca685260055fba50dcc169368ee81a01c92d2"} err="failed to get container status \"72d14678a915d6adaedb2e7f71dca685260055fba50dcc169368ee81a01c92d2\": rpc error: code = NotFound desc = could not find container \"72d14678a915d6adaedb2e7f71dca685260055fba50dcc169368ee81a01c92d2\": container with ID starting with 72d14678a915d6adaedb2e7f71dca685260055fba50dcc169368ee81a01c92d2 not found: ID does not exist"
Apr 24 21:37:51.933228 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:51.933208 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-7bb74-predictor-7bf6b85585-w9q5t"]
Apr 24 21:37:52.299907 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.299866 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk"
Apr 24 21:37:52.414877 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.414838 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a1cfccd-3d5e-4f76-a972-59d416230e95-proxy-tls\") pod \"0a1cfccd-3d5e-4f76-a972-59d416230e95\" (UID: \"0a1cfccd-3d5e-4f76-a972-59d416230e95\") "
Apr 24 21:37:52.415037 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.414899 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-7bb74-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a1cfccd-3d5e-4f76-a972-59d416230e95-isvc-sklearn-graph-raw-7bb74-kube-rbac-proxy-sar-config\") pod \"0a1cfccd-3d5e-4f76-a972-59d416230e95\" (UID: \"0a1cfccd-3d5e-4f76-a972-59d416230e95\") "
Apr 24 21:37:52.415037 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.414946 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a1cfccd-3d5e-4f76-a972-59d416230e95-kserve-provision-location\") pod \"0a1cfccd-3d5e-4f76-a972-59d416230e95\" (UID: \"0a1cfccd-3d5e-4f76-a972-59d416230e95\") "
Apr 24 21:37:52.415037 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.414986 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz7t2\" (UniqueName: \"kubernetes.io/projected/0a1cfccd-3d5e-4f76-a972-59d416230e95-kube-api-access-tz7t2\") pod \"0a1cfccd-3d5e-4f76-a972-59d416230e95\" (UID: \"0a1cfccd-3d5e-4f76-a972-59d416230e95\") "
Apr 24 21:37:52.415316 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.415290 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a1cfccd-3d5e-4f76-a972-59d416230e95-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0a1cfccd-3d5e-4f76-a972-59d416230e95" (UID: "0a1cfccd-3d5e-4f76-a972-59d416230e95"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:37:52.415384 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.415294 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a1cfccd-3d5e-4f76-a972-59d416230e95-isvc-sklearn-graph-raw-7bb74-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-7bb74-kube-rbac-proxy-sar-config") pod "0a1cfccd-3d5e-4f76-a972-59d416230e95" (UID: "0a1cfccd-3d5e-4f76-a972-59d416230e95"). InnerVolumeSpecName "isvc-sklearn-graph-raw-7bb74-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:37:52.417063 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.417040 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a1cfccd-3d5e-4f76-a972-59d416230e95-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0a1cfccd-3d5e-4f76-a972-59d416230e95" (UID: "0a1cfccd-3d5e-4f76-a972-59d416230e95"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:37:52.417146 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.417109 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a1cfccd-3d5e-4f76-a972-59d416230e95-kube-api-access-tz7t2" (OuterVolumeSpecName: "kube-api-access-tz7t2") pod "0a1cfccd-3d5e-4f76-a972-59d416230e95" (UID: "0a1cfccd-3d5e-4f76-a972-59d416230e95"). InnerVolumeSpecName "kube-api-access-tz7t2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:37:52.516474 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.516437 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a1cfccd-3d5e-4f76-a972-59d416230e95-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:37:52.516474 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.516467 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-7bb74-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a1cfccd-3d5e-4f76-a972-59d416230e95-isvc-sklearn-graph-raw-7bb74-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:37:52.516474 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.516478 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a1cfccd-3d5e-4f76-a972-59d416230e95-kserve-provision-location\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:37:52.516691 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.516489 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tz7t2\" (UniqueName: \"kubernetes.io/projected/0a1cfccd-3d5e-4f76-a972-59d416230e95-kube-api-access-tz7t2\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:37:52.831265 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.831230 2578 generic.go:358] "Generic (PLEG): container finished" podID="fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" containerID="c0750feebd86ee63480069cd84de01346f3dcb5d0604e409e8c55059cc14f03a" exitCode=0
Apr 24 21:37:52.831725 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.831308 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" event={"ID":"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5","Type":"ContainerDied","Data":"c0750feebd86ee63480069cd84de01346f3dcb5d0604e409e8c55059cc14f03a"}
Apr 24 21:37:52.832619 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.832598 2578 generic.go:358] "Generic (PLEG): container finished" podID="6f3f4204-24d6-4d92-ba57-df58612146cf" containerID="8798fe987c63d73209248521db3b6b5411cb8590eda6585945240075f34aa92c" exitCode=0
Apr 24 21:37:52.832688 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.832671 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" event={"ID":"6f3f4204-24d6-4d92-ba57-df58612146cf","Type":"ContainerDied","Data":"8798fe987c63d73209248521db3b6b5411cb8590eda6585945240075f34aa92c"}
Apr 24 21:37:52.834525 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.834496 2578 generic.go:358] "Generic (PLEG): container finished" podID="0a1cfccd-3d5e-4f76-a972-59d416230e95" containerID="fc4c06ec637ab48b5aed62d3734e8131f88afd67d3d8afb094d29a24024c1a7f" exitCode=0
Apr 24 21:37:52.834619 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.834525 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" event={"ID":"0a1cfccd-3d5e-4f76-a972-59d416230e95","Type":"ContainerDied","Data":"fc4c06ec637ab48b5aed62d3734e8131f88afd67d3d8afb094d29a24024c1a7f"}
Apr 24 21:37:52.834619 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.834559 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk" event={"ID":"0a1cfccd-3d5e-4f76-a972-59d416230e95","Type":"ContainerDied","Data":"fd19b81443c508ff62044189a3f2c78903733dad6252df2ab663b39421001335"}
Apr 24 21:37:52.834619 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.834580 2578 scope.go:117] "RemoveContainer" containerID="860bab1f06f66c0fd107cb24335daf4e11f6faee42f3c7c15c1e26ab8aea72e4"
Apr 24 21:37:52.834619 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.834584 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk"
Apr 24 21:37:52.846685 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.846662 2578 scope.go:117] "RemoveContainer" containerID="fc4c06ec637ab48b5aed62d3734e8131f88afd67d3d8afb094d29a24024c1a7f"
Apr 24 21:37:52.857337 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.857311 2578 scope.go:117] "RemoveContainer" containerID="6461fefd40f51fd2fbaec1cfbe7f375f0d0dc46dbf7d15d96af24364c43a5172"
Apr 24 21:37:52.876444 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.876416 2578 scope.go:117] "RemoveContainer" containerID="860bab1f06f66c0fd107cb24335daf4e11f6faee42f3c7c15c1e26ab8aea72e4"
Apr 24 21:37:52.877163 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:37:52.877141 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"860bab1f06f66c0fd107cb24335daf4e11f6faee42f3c7c15c1e26ab8aea72e4\": container with ID starting with 860bab1f06f66c0fd107cb24335daf4e11f6faee42f3c7c15c1e26ab8aea72e4 not found: ID does not exist" containerID="860bab1f06f66c0fd107cb24335daf4e11f6faee42f3c7c15c1e26ab8aea72e4"
Apr 24 21:37:52.877282 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.877176 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"860bab1f06f66c0fd107cb24335daf4e11f6faee42f3c7c15c1e26ab8aea72e4"} err="failed to get container status \"860bab1f06f66c0fd107cb24335daf4e11f6faee42f3c7c15c1e26ab8aea72e4\": rpc error: code = NotFound desc = could not find container \"860bab1f06f66c0fd107cb24335daf4e11f6faee42f3c7c15c1e26ab8aea72e4\": container with ID starting with 860bab1f06f66c0fd107cb24335daf4e11f6faee42f3c7c15c1e26ab8aea72e4 not found: ID does not exist"
Apr 24 21:37:52.877282 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.877197 2578 scope.go:117] "RemoveContainer" containerID="fc4c06ec637ab48b5aed62d3734e8131f88afd67d3d8afb094d29a24024c1a7f"
Apr 24 21:37:52.877493 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:37:52.877475 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc4c06ec637ab48b5aed62d3734e8131f88afd67d3d8afb094d29a24024c1a7f\": container with ID starting with fc4c06ec637ab48b5aed62d3734e8131f88afd67d3d8afb094d29a24024c1a7f not found: ID does not exist" containerID="fc4c06ec637ab48b5aed62d3734e8131f88afd67d3d8afb094d29a24024c1a7f"
Apr 24 21:37:52.877547 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.877498 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc4c06ec637ab48b5aed62d3734e8131f88afd67d3d8afb094d29a24024c1a7f"} err="failed to get container status \"fc4c06ec637ab48b5aed62d3734e8131f88afd67d3d8afb094d29a24024c1a7f\": rpc error: code = NotFound desc = could not find container \"fc4c06ec637ab48b5aed62d3734e8131f88afd67d3d8afb094d29a24024c1a7f\": container with ID starting with fc4c06ec637ab48b5aed62d3734e8131f88afd67d3d8afb094d29a24024c1a7f not found: ID does not exist"
Apr 24 21:37:52.877547 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.877518 2578 scope.go:117] "RemoveContainer" containerID="6461fefd40f51fd2fbaec1cfbe7f375f0d0dc46dbf7d15d96af24364c43a5172"
Apr 24 21:37:52.877812 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:37:52.877785 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6461fefd40f51fd2fbaec1cfbe7f375f0d0dc46dbf7d15d96af24364c43a5172\": container with ID starting with 6461fefd40f51fd2fbaec1cfbe7f375f0d0dc46dbf7d15d96af24364c43a5172 not found: ID does not exist" containerID="6461fefd40f51fd2fbaec1cfbe7f375f0d0dc46dbf7d15d96af24364c43a5172"
Apr 24 21:37:52.877936 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.877815 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6461fefd40f51fd2fbaec1cfbe7f375f0d0dc46dbf7d15d96af24364c43a5172"} err="failed to get container status \"6461fefd40f51fd2fbaec1cfbe7f375f0d0dc46dbf7d15d96af24364c43a5172\": rpc error: code = NotFound desc = could not find container \"6461fefd40f51fd2fbaec1cfbe7f375f0d0dc46dbf7d15d96af24364c43a5172\": container with ID starting with 6461fefd40f51fd2fbaec1cfbe7f375f0d0dc46dbf7d15d96af24364c43a5172 not found: ID does not exist"
Apr 24 21:37:52.888966 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.888932 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk"]
Apr 24 21:37:52.890987 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:52.890957 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-7bb74-predictor-76f8949cdb-bvczk"]
Apr 24 21:37:53.763002 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:53.762969 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a1cfccd-3d5e-4f76-a972-59d416230e95" path="/var/lib/kubelet/pods/0a1cfccd-3d5e-4f76-a972-59d416230e95/volumes"
Apr 24 21:37:53.763484 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:53.763470 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90218cfe-6610-442a-9cdf-aedb36d45ffe" path="/var/lib/kubelet/pods/90218cfe-6610-442a-9cdf-aedb36d45ffe/volumes"
Apr 24 21:37:53.839669 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:53.839628 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" event={"ID":"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5","Type":"ContainerStarted","Data":"3c1c46c11ec1b742d69d17eadb10508b35752ec78f30d2e01d0ef1ac95b0d2f4"}
Apr 24 21:37:53.839669 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:53.839672 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" event={"ID":"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5","Type":"ContainerStarted","Data":"e9d7de623b9c85f71fc5a1dfbaf28623a5d049199b992d6c8069da0cde4db83c"}
Apr 24 21:37:53.840169 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:53.840008 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj"
Apr 24 21:37:53.840169 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:53.840037 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj"
Apr 24 21:37:53.841355 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:53.841331 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" podUID="fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 24 21:37:53.841779 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:53.841759 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" event={"ID":"6f3f4204-24d6-4d92-ba57-df58612146cf","Type":"ContainerStarted","Data":"06c7d03390a9d293ea6b6d6ea50520463ad84ba20f9ddad777ff6eda936d9e9e"}
Apr 24 21:37:53.841873 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:53.841788 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" event={"ID":"6f3f4204-24d6-4d92-ba57-df58612146cf","Type":"ContainerStarted","Data":"478a567e40cab153d56536481c00ae726988375c37377badd7f50349dd6cb24d"}
Apr 24 21:37:53.842131 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:53.842098 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524"
Apr 24 21:37:53.842131 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:53.842129 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524"
Apr 24 21:37:53.843095 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:53.843072 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" podUID="6f3f4204-24d6-4d92-ba57-df58612146cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 24 21:37:53.859051 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:53.858994 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" podStartSLOduration=6.85897539 podStartE2EDuration="6.85897539s" podCreationTimestamp="2026-04-24 21:37:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:37:53.857328532 +0000 UTC m=+630.685198662" watchObservedRunningTime="2026-04-24 21:37:53.85897539 +0000 UTC m=+630.686845525"
Apr 24 21:37:53.876051 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:53.876004 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" podStartSLOduration=6.875988479 podStartE2EDuration="6.875988479s" podCreationTimestamp="2026-04-24 21:37:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:37:53.875119943 +0000 UTC m=+630.702990073" watchObservedRunningTime="2026-04-24 21:37:53.875988479 +0000 UTC m=+630.703858610"
Apr 24 21:37:54.846364 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:54.846327 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" podUID="6f3f4204-24d6-4d92-ba57-df58612146cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 24 21:37:54.846734 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:54.846436 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" podUID="fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 24 21:37:59.850882 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:59.850854 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj"
Apr 24 21:37:59.851406 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:59.851385 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524"
Apr 24 21:37:59.851469 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:59.851397 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" podUID="fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 24 21:37:59.851792 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:37:59.851773 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524"
podUID="6f3f4204-24d6-4d92-ba57-df58612146cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 21:38:09.851797 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:38:09.851759 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" podUID="fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:38:09.852209 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:38:09.851759 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" podUID="6f3f4204-24d6-4d92-ba57-df58612146cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 21:38:19.851784 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:38:19.851740 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" podUID="fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:38:19.852348 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:38:19.851884 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" podUID="6f3f4204-24d6-4d92-ba57-df58612146cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 21:38:29.852243 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:38:29.852197 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" podUID="6f3f4204-24d6-4d92-ba57-df58612146cf" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 21:38:29.852725 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:38:29.852257 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" podUID="fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:38:39.851625 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:38:39.851588 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" podUID="fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:38:39.852021 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:38:39.851807 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" podUID="6f3f4204-24d6-4d92-ba57-df58612146cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 21:38:49.852221 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:38:49.852176 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" podUID="6f3f4204-24d6-4d92-ba57-df58612146cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 21:38:49.852221 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:38:49.852176 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" podUID="fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:38:59.851865 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:38:59.851833 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" Apr 24 21:38:59.852629 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:38:59.852610 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" Apr 24 21:39:27.854213 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.854004 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj"] Apr 24 21:39:27.854581 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.854411 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" podUID="fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" containerName="kserve-container" containerID="cri-o://e9d7de623b9c85f71fc5a1dfbaf28623a5d049199b992d6c8069da0cde4db83c" gracePeriod=30 Apr 24 21:39:27.854581 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.854549 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" podUID="fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" containerName="kube-rbac-proxy" containerID="cri-o://3c1c46c11ec1b742d69d17eadb10508b35752ec78f30d2e01d0ef1ac95b0d2f4" gracePeriod=30 Apr 24 21:39:27.911187 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.911156 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524"] Apr 24 21:39:27.911537 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.911484 2578 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" podUID="6f3f4204-24d6-4d92-ba57-df58612146cf" containerName="kserve-container" containerID="cri-o://478a567e40cab153d56536481c00ae726988375c37377badd7f50349dd6cb24d" gracePeriod=30 Apr 24 21:39:27.911653 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.911621 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" podUID="6f3f4204-24d6-4d92-ba57-df58612146cf" containerName="kube-rbac-proxy" containerID="cri-o://06c7d03390a9d293ea6b6d6ea50520463ad84ba20f9ddad777ff6eda936d9e9e" gracePeriod=30 Apr 24 21:39:27.937500 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.937469 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28"] Apr 24 21:39:27.937796 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.937783 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90218cfe-6610-442a-9cdf-aedb36d45ffe" containerName="kube-rbac-proxy" Apr 24 21:39:27.937841 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.937798 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="90218cfe-6610-442a-9cdf-aedb36d45ffe" containerName="kube-rbac-proxy" Apr 24 21:39:27.937841 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.937810 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90218cfe-6610-442a-9cdf-aedb36d45ffe" containerName="storage-initializer" Apr 24 21:39:27.937841 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.937816 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="90218cfe-6610-442a-9cdf-aedb36d45ffe" containerName="storage-initializer" Apr 24 21:39:27.937841 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.937829 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="0a1cfccd-3d5e-4f76-a972-59d416230e95" containerName="kserve-container" Apr 24 21:39:27.937841 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.937835 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a1cfccd-3d5e-4f76-a972-59d416230e95" containerName="kserve-container" Apr 24 21:39:27.937841 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.937841 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a1cfccd-3d5e-4f76-a972-59d416230e95" containerName="kube-rbac-proxy" Apr 24 21:39:27.938042 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.937846 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a1cfccd-3d5e-4f76-a972-59d416230e95" containerName="kube-rbac-proxy" Apr 24 21:39:27.938042 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.937858 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a1cfccd-3d5e-4f76-a972-59d416230e95" containerName="storage-initializer" Apr 24 21:39:27.938042 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.937863 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a1cfccd-3d5e-4f76-a972-59d416230e95" containerName="storage-initializer" Apr 24 21:39:27.938042 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.937869 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90218cfe-6610-442a-9cdf-aedb36d45ffe" containerName="kserve-container" Apr 24 21:39:27.938042 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.937875 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="90218cfe-6610-442a-9cdf-aedb36d45ffe" containerName="kserve-container" Apr 24 21:39:27.938042 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.937937 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a1cfccd-3d5e-4f76-a972-59d416230e95" containerName="kserve-container" Apr 24 21:39:27.938042 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.937946 2578 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="0a1cfccd-3d5e-4f76-a972-59d416230e95" containerName="kube-rbac-proxy" Apr 24 21:39:27.938042 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.937953 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="90218cfe-6610-442a-9cdf-aedb36d45ffe" containerName="kube-rbac-proxy" Apr 24 21:39:27.938042 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.937962 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="90218cfe-6610-442a-9cdf-aedb36d45ffe" containerName="kserve-container" Apr 24 21:39:27.941129 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.941111 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" Apr 24 21:39:27.943466 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.943441 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-9a20e-kube-rbac-proxy-sar-config\"" Apr 24 21:39:27.943581 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.943441 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-9a20e-predictor-serving-cert\"" Apr 24 21:39:27.950744 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.950713 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28"] Apr 24 21:39:27.989979 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.989933 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-raw-9a20e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5131ad7e-9174-49ef-9b25-764060e0f422-message-dumper-raw-9a20e-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28\" (UID: \"5131ad7e-9174-49ef-9b25-764060e0f422\") " 
pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" Apr 24 21:39:27.990129 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.989998 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxdhh\" (UniqueName: \"kubernetes.io/projected/5131ad7e-9174-49ef-9b25-764060e0f422-kube-api-access-gxdhh\") pod \"message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28\" (UID: \"5131ad7e-9174-49ef-9b25-764060e0f422\") " pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" Apr 24 21:39:27.990129 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:27.990036 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5131ad7e-9174-49ef-9b25-764060e0f422-proxy-tls\") pod \"message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28\" (UID: \"5131ad7e-9174-49ef-9b25-764060e0f422\") " pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" Apr 24 21:39:28.091210 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:28.091156 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-raw-9a20e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5131ad7e-9174-49ef-9b25-764060e0f422-message-dumper-raw-9a20e-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28\" (UID: \"5131ad7e-9174-49ef-9b25-764060e0f422\") " pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" Apr 24 21:39:28.091421 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:28.091219 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxdhh\" (UniqueName: \"kubernetes.io/projected/5131ad7e-9174-49ef-9b25-764060e0f422-kube-api-access-gxdhh\") pod \"message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28\" (UID: \"5131ad7e-9174-49ef-9b25-764060e0f422\") " 
pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" Apr 24 21:39:28.091421 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:28.091254 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5131ad7e-9174-49ef-9b25-764060e0f422-proxy-tls\") pod \"message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28\" (UID: \"5131ad7e-9174-49ef-9b25-764060e0f422\") " pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" Apr 24 21:39:28.091556 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:39:28.091451 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-serving-cert: secret "message-dumper-raw-9a20e-predictor-serving-cert" not found Apr 24 21:39:28.091556 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:39:28.091543 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5131ad7e-9174-49ef-9b25-764060e0f422-proxy-tls podName:5131ad7e-9174-49ef-9b25-764060e0f422 nodeName:}" failed. No retries permitted until 2026-04-24 21:39:28.591520835 +0000 UTC m=+725.419390960 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5131ad7e-9174-49ef-9b25-764060e0f422-proxy-tls") pod "message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" (UID: "5131ad7e-9174-49ef-9b25-764060e0f422") : secret "message-dumper-raw-9a20e-predictor-serving-cert" not found Apr 24 21:39:28.091808 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:28.091786 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-raw-9a20e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5131ad7e-9174-49ef-9b25-764060e0f422-message-dumper-raw-9a20e-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28\" (UID: \"5131ad7e-9174-49ef-9b25-764060e0f422\") " pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" Apr 24 21:39:28.100306 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:28.100276 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxdhh\" (UniqueName: \"kubernetes.io/projected/5131ad7e-9174-49ef-9b25-764060e0f422-kube-api-access-gxdhh\") pod \"message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28\" (UID: \"5131ad7e-9174-49ef-9b25-764060e0f422\") " pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" Apr 24 21:39:28.127085 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:28.127008 2578 generic.go:358] "Generic (PLEG): container finished" podID="fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" containerID="3c1c46c11ec1b742d69d17eadb10508b35752ec78f30d2e01d0ef1ac95b0d2f4" exitCode=2 Apr 24 21:39:28.127214 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:28.127086 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" event={"ID":"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5","Type":"ContainerDied","Data":"3c1c46c11ec1b742d69d17eadb10508b35752ec78f30d2e01d0ef1ac95b0d2f4"} Apr 24 21:39:28.128841 ip-10-0-139-184 
kubenswrapper[2578]: I0424 21:39:28.128815 2578 generic.go:358] "Generic (PLEG): container finished" podID="6f3f4204-24d6-4d92-ba57-df58612146cf" containerID="06c7d03390a9d293ea6b6d6ea50520463ad84ba20f9ddad777ff6eda936d9e9e" exitCode=2 Apr 24 21:39:28.128965 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:28.128877 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" event={"ID":"6f3f4204-24d6-4d92-ba57-df58612146cf","Type":"ContainerDied","Data":"06c7d03390a9d293ea6b6d6ea50520463ad84ba20f9ddad777ff6eda936d9e9e"} Apr 24 21:39:28.596736 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:28.596698 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5131ad7e-9174-49ef-9b25-764060e0f422-proxy-tls\") pod \"message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28\" (UID: \"5131ad7e-9174-49ef-9b25-764060e0f422\") " pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" Apr 24 21:39:28.599076 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:28.599049 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5131ad7e-9174-49ef-9b25-764060e0f422-proxy-tls\") pod \"message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28\" (UID: \"5131ad7e-9174-49ef-9b25-764060e0f422\") " pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" Apr 24 21:39:28.853102 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:28.853018 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" Apr 24 21:39:28.974618 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:28.974591 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28"] Apr 24 21:39:28.976459 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:39:28.976416 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5131ad7e_9174_49ef_9b25_764060e0f422.slice/crio-1c57761d22cc883807187644aa1ba5caf06c1a0be32b8a0cf7534f5862612c40 WatchSource:0}: Error finding container 1c57761d22cc883807187644aa1ba5caf06c1a0be32b8a0cf7534f5862612c40: Status 404 returned error can't find the container with id 1c57761d22cc883807187644aa1ba5caf06c1a0be32b8a0cf7534f5862612c40 Apr 24 21:39:29.133211 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:29.133131 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" event={"ID":"5131ad7e-9174-49ef-9b25-764060e0f422","Type":"ContainerStarted","Data":"1c57761d22cc883807187644aa1ba5caf06c1a0be32b8a0cf7534f5862612c40"} Apr 24 21:39:29.846608 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:29.846569 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" podUID="fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.24:8643/healthz\": dial tcp 10.133.0.24:8643: connect: connection refused" Apr 24 21:39:29.846826 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:29.846569 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" podUID="6f3f4204-24d6-4d92-ba57-df58612146cf" containerName="kube-rbac-proxy" probeResult="failure" 
output="Get \"https://10.133.0.23:8643/healthz\": dial tcp 10.133.0.23:8643: connect: connection refused" Apr 24 21:39:29.851928 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:29.851906 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" podUID="6f3f4204-24d6-4d92-ba57-df58612146cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 24 21:39:29.851928 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:29.851918 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" podUID="fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:39:30.137425 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:30.137396 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" event={"ID":"5131ad7e-9174-49ef-9b25-764060e0f422","Type":"ContainerStarted","Data":"349a83e7eefc77973374819e6568ebccd237ea1158217a1c45599d6294ad5b71"} Apr 24 21:39:31.142120 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:31.142082 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" event={"ID":"5131ad7e-9174-49ef-9b25-764060e0f422","Type":"ContainerStarted","Data":"adf8c9e7aa7de3010be6d46d548575b8d750ff39eb21beed0d84d15c3db48e78"} Apr 24 21:39:31.142576 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:31.142247 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" Apr 24 21:39:31.161457 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:31.161408 2578 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" podStartSLOduration=3.090996573 podStartE2EDuration="4.16139427s" podCreationTimestamp="2026-04-24 21:39:27 +0000 UTC" firstStartedPulling="2026-04-24 21:39:28.978159264 +0000 UTC m=+725.806029375" lastFinishedPulling="2026-04-24 21:39:30.048556953 +0000 UTC m=+726.876427072" observedRunningTime="2026-04-24 21:39:31.15984749 +0000 UTC m=+727.987717626" watchObservedRunningTime="2026-04-24 21:39:31.16139427 +0000 UTC m=+727.989264404" Apr 24 21:39:31.398314 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:31.398250 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" Apr 24 21:39:31.525094 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:31.525065 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d459r\" (UniqueName: \"kubernetes.io/projected/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5-kube-api-access-d459r\") pod \"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5\" (UID: \"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5\") " Apr 24 21:39:31.525259 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:31.525156 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5-proxy-tls\") pod \"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5\" (UID: \"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5\") " Apr 24 21:39:31.525259 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:31.525194 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5-kserve-provision-location\") pod \"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5\" (UID: \"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5\") " Apr 24 21:39:31.525259 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:31.525234 
2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-hpa-21211-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5-isvc-xgboost-graph-raw-hpa-21211-kube-rbac-proxy-sar-config\") pod \"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5\" (UID: \"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5\") " Apr 24 21:39:31.525535 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:31.525510 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" (UID: "fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:39:31.525614 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:31.525562 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5-isvc-xgboost-graph-raw-hpa-21211-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-hpa-21211-kube-rbac-proxy-sar-config") pod "fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" (UID: "fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5"). InnerVolumeSpecName "isvc-xgboost-graph-raw-hpa-21211-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:39:31.527209 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:31.527181 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5-kube-api-access-d459r" (OuterVolumeSpecName: "kube-api-access-d459r") pod "fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" (UID: "fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5"). InnerVolumeSpecName "kube-api-access-d459r". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:39:31.527342 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:31.527325 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" (UID: "fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:39:31.626652 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:31.626613 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:39:31.626652 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:31.626646 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5-kserve-provision-location\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:39:31.626840 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:31.626660 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-hpa-21211-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5-isvc-xgboost-graph-raw-hpa-21211-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:39:31.626840 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:31.626676 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d459r\" (UniqueName: \"kubernetes.io/projected/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5-kube-api-access-d459r\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:39:32.130860 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.130838 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" Apr 24 21:39:32.146025 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.145995 2578 generic.go:358] "Generic (PLEG): container finished" podID="fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" containerID="e9d7de623b9c85f71fc5a1dfbaf28623a5d049199b992d6c8069da0cde4db83c" exitCode=0 Apr 24 21:39:32.146476 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.146092 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" Apr 24 21:39:32.146476 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.146089 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" event={"ID":"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5","Type":"ContainerDied","Data":"e9d7de623b9c85f71fc5a1dfbaf28623a5d049199b992d6c8069da0cde4db83c"} Apr 24 21:39:32.146476 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.146210 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj" event={"ID":"fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5","Type":"ContainerDied","Data":"215b63eea681b07993e9d31c8c10555776968e4bfd8dc6fce007f3f6a89ea735"} Apr 24 21:39:32.146476 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.146233 2578 scope.go:117] "RemoveContainer" containerID="3c1c46c11ec1b742d69d17eadb10508b35752ec78f30d2e01d0ef1ac95b0d2f4" Apr 24 21:39:32.148465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.148394 2578 generic.go:358] "Generic (PLEG): container finished" podID="6f3f4204-24d6-4d92-ba57-df58612146cf" containerID="478a567e40cab153d56536481c00ae726988375c37377badd7f50349dd6cb24d" exitCode=0 Apr 24 21:39:32.148580 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.148527 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" Apr 24 21:39:32.148580 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.148548 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" event={"ID":"6f3f4204-24d6-4d92-ba57-df58612146cf","Type":"ContainerDied","Data":"478a567e40cab153d56536481c00ae726988375c37377badd7f50349dd6cb24d"} Apr 24 21:39:32.148688 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.148589 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524" event={"ID":"6f3f4204-24d6-4d92-ba57-df58612146cf","Type":"ContainerDied","Data":"c433c46279efbff8f5709237ec3ccb52823c09f559509e0372823cf035616419"} Apr 24 21:39:32.148922 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.148904 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" Apr 24 21:39:32.150953 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.150934 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" Apr 24 21:39:32.156753 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.156719 2578 scope.go:117] "RemoveContainer" containerID="e9d7de623b9c85f71fc5a1dfbaf28623a5d049199b992d6c8069da0cde4db83c" Apr 24 21:39:32.164433 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.164414 2578 scope.go:117] "RemoveContainer" containerID="c0750feebd86ee63480069cd84de01346f3dcb5d0604e409e8c55059cc14f03a" Apr 24 21:39:32.169088 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.169064 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj"] Apr 24 21:39:32.173015 ip-10-0-139-184 kubenswrapper[2578]: 
I0424 21:39:32.172993 2578 scope.go:117] "RemoveContainer" containerID="3c1c46c11ec1b742d69d17eadb10508b35752ec78f30d2e01d0ef1ac95b0d2f4" Apr 24 21:39:32.173338 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:39:32.173317 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c1c46c11ec1b742d69d17eadb10508b35752ec78f30d2e01d0ef1ac95b0d2f4\": container with ID starting with 3c1c46c11ec1b742d69d17eadb10508b35752ec78f30d2e01d0ef1ac95b0d2f4 not found: ID does not exist" containerID="3c1c46c11ec1b742d69d17eadb10508b35752ec78f30d2e01d0ef1ac95b0d2f4" Apr 24 21:39:32.173427 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.173345 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c1c46c11ec1b742d69d17eadb10508b35752ec78f30d2e01d0ef1ac95b0d2f4"} err="failed to get container status \"3c1c46c11ec1b742d69d17eadb10508b35752ec78f30d2e01d0ef1ac95b0d2f4\": rpc error: code = NotFound desc = could not find container \"3c1c46c11ec1b742d69d17eadb10508b35752ec78f30d2e01d0ef1ac95b0d2f4\": container with ID starting with 3c1c46c11ec1b742d69d17eadb10508b35752ec78f30d2e01d0ef1ac95b0d2f4 not found: ID does not exist" Apr 24 21:39:32.173427 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.173365 2578 scope.go:117] "RemoveContainer" containerID="e9d7de623b9c85f71fc5a1dfbaf28623a5d049199b992d6c8069da0cde4db83c" Apr 24 21:39:32.173662 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:39:32.173643 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9d7de623b9c85f71fc5a1dfbaf28623a5d049199b992d6c8069da0cde4db83c\": container with ID starting with e9d7de623b9c85f71fc5a1dfbaf28623a5d049199b992d6c8069da0cde4db83c not found: ID does not exist" containerID="e9d7de623b9c85f71fc5a1dfbaf28623a5d049199b992d6c8069da0cde4db83c" Apr 24 21:39:32.173716 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.173668 
2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9d7de623b9c85f71fc5a1dfbaf28623a5d049199b992d6c8069da0cde4db83c"} err="failed to get container status \"e9d7de623b9c85f71fc5a1dfbaf28623a5d049199b992d6c8069da0cde4db83c\": rpc error: code = NotFound desc = could not find container \"e9d7de623b9c85f71fc5a1dfbaf28623a5d049199b992d6c8069da0cde4db83c\": container with ID starting with e9d7de623b9c85f71fc5a1dfbaf28623a5d049199b992d6c8069da0cde4db83c not found: ID does not exist" Apr 24 21:39:32.173716 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.173687 2578 scope.go:117] "RemoveContainer" containerID="c0750feebd86ee63480069cd84de01346f3dcb5d0604e409e8c55059cc14f03a" Apr 24 21:39:32.173787 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.173685 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-21211-predictor-69cf7fd86f-zr7qj"] Apr 24 21:39:32.174059 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:39:32.174043 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0750feebd86ee63480069cd84de01346f3dcb5d0604e409e8c55059cc14f03a\": container with ID starting with c0750feebd86ee63480069cd84de01346f3dcb5d0604e409e8c55059cc14f03a not found: ID does not exist" containerID="c0750feebd86ee63480069cd84de01346f3dcb5d0604e409e8c55059cc14f03a" Apr 24 21:39:32.174125 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.174063 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0750feebd86ee63480069cd84de01346f3dcb5d0604e409e8c55059cc14f03a"} err="failed to get container status \"c0750feebd86ee63480069cd84de01346f3dcb5d0604e409e8c55059cc14f03a\": rpc error: code = NotFound desc = could not find container \"c0750feebd86ee63480069cd84de01346f3dcb5d0604e409e8c55059cc14f03a\": container with ID starting with 
c0750feebd86ee63480069cd84de01346f3dcb5d0604e409e8c55059cc14f03a not found: ID does not exist" Apr 24 21:39:32.174125 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.174077 2578 scope.go:117] "RemoveContainer" containerID="06c7d03390a9d293ea6b6d6ea50520463ad84ba20f9ddad777ff6eda936d9e9e" Apr 24 21:39:32.180867 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.180852 2578 scope.go:117] "RemoveContainer" containerID="478a567e40cab153d56536481c00ae726988375c37377badd7f50349dd6cb24d" Apr 24 21:39:32.187578 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.187556 2578 scope.go:117] "RemoveContainer" containerID="8798fe987c63d73209248521db3b6b5411cb8590eda6585945240075f34aa92c" Apr 24 21:39:32.194420 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.194398 2578 scope.go:117] "RemoveContainer" containerID="06c7d03390a9d293ea6b6d6ea50520463ad84ba20f9ddad777ff6eda936d9e9e" Apr 24 21:39:32.194678 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:39:32.194657 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06c7d03390a9d293ea6b6d6ea50520463ad84ba20f9ddad777ff6eda936d9e9e\": container with ID starting with 06c7d03390a9d293ea6b6d6ea50520463ad84ba20f9ddad777ff6eda936d9e9e not found: ID does not exist" containerID="06c7d03390a9d293ea6b6d6ea50520463ad84ba20f9ddad777ff6eda936d9e9e" Apr 24 21:39:32.194727 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.194689 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06c7d03390a9d293ea6b6d6ea50520463ad84ba20f9ddad777ff6eda936d9e9e"} err="failed to get container status \"06c7d03390a9d293ea6b6d6ea50520463ad84ba20f9ddad777ff6eda936d9e9e\": rpc error: code = NotFound desc = could not find container \"06c7d03390a9d293ea6b6d6ea50520463ad84ba20f9ddad777ff6eda936d9e9e\": container with ID starting with 06c7d03390a9d293ea6b6d6ea50520463ad84ba20f9ddad777ff6eda936d9e9e not found: ID does not exist" Apr 24 
21:39:32.194727 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.194708 2578 scope.go:117] "RemoveContainer" containerID="478a567e40cab153d56536481c00ae726988375c37377badd7f50349dd6cb24d" Apr 24 21:39:32.194971 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:39:32.194950 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"478a567e40cab153d56536481c00ae726988375c37377badd7f50349dd6cb24d\": container with ID starting with 478a567e40cab153d56536481c00ae726988375c37377badd7f50349dd6cb24d not found: ID does not exist" containerID="478a567e40cab153d56536481c00ae726988375c37377badd7f50349dd6cb24d" Apr 24 21:39:32.195065 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.194974 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"478a567e40cab153d56536481c00ae726988375c37377badd7f50349dd6cb24d"} err="failed to get container status \"478a567e40cab153d56536481c00ae726988375c37377badd7f50349dd6cb24d\": rpc error: code = NotFound desc = could not find container \"478a567e40cab153d56536481c00ae726988375c37377badd7f50349dd6cb24d\": container with ID starting with 478a567e40cab153d56536481c00ae726988375c37377badd7f50349dd6cb24d not found: ID does not exist" Apr 24 21:39:32.195065 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.194989 2578 scope.go:117] "RemoveContainer" containerID="8798fe987c63d73209248521db3b6b5411cb8590eda6585945240075f34aa92c" Apr 24 21:39:32.195250 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:39:32.195231 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8798fe987c63d73209248521db3b6b5411cb8590eda6585945240075f34aa92c\": container with ID starting with 8798fe987c63d73209248521db3b6b5411cb8590eda6585945240075f34aa92c not found: ID does not exist" containerID="8798fe987c63d73209248521db3b6b5411cb8590eda6585945240075f34aa92c" Apr 24 21:39:32.195293 
ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.195255 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8798fe987c63d73209248521db3b6b5411cb8590eda6585945240075f34aa92c"} err="failed to get container status \"8798fe987c63d73209248521db3b6b5411cb8590eda6585945240075f34aa92c\": rpc error: code = NotFound desc = could not find container \"8798fe987c63d73209248521db3b6b5411cb8590eda6585945240075f34aa92c\": container with ID starting with 8798fe987c63d73209248521db3b6b5411cb8590eda6585945240075f34aa92c not found: ID does not exist" Apr 24 21:39:32.232519 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.232494 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmsmm\" (UniqueName: \"kubernetes.io/projected/6f3f4204-24d6-4d92-ba57-df58612146cf-kube-api-access-nmsmm\") pod \"6f3f4204-24d6-4d92-ba57-df58612146cf\" (UID: \"6f3f4204-24d6-4d92-ba57-df58612146cf\") " Apr 24 21:39:32.232623 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.232561 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-hpa-21211-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6f3f4204-24d6-4d92-ba57-df58612146cf-isvc-sklearn-graph-raw-hpa-21211-kube-rbac-proxy-sar-config\") pod \"6f3f4204-24d6-4d92-ba57-df58612146cf\" (UID: \"6f3f4204-24d6-4d92-ba57-df58612146cf\") " Apr 24 21:39:32.232623 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.232603 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f3f4204-24d6-4d92-ba57-df58612146cf-proxy-tls\") pod \"6f3f4204-24d6-4d92-ba57-df58612146cf\" (UID: \"6f3f4204-24d6-4d92-ba57-df58612146cf\") " Apr 24 21:39:32.232692 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.232642 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/6f3f4204-24d6-4d92-ba57-df58612146cf-kserve-provision-location\") pod \"6f3f4204-24d6-4d92-ba57-df58612146cf\" (UID: \"6f3f4204-24d6-4d92-ba57-df58612146cf\") " Apr 24 21:39:32.232998 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.232969 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f3f4204-24d6-4d92-ba57-df58612146cf-isvc-sklearn-graph-raw-hpa-21211-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-hpa-21211-kube-rbac-proxy-sar-config") pod "6f3f4204-24d6-4d92-ba57-df58612146cf" (UID: "6f3f4204-24d6-4d92-ba57-df58612146cf"). InnerVolumeSpecName "isvc-sklearn-graph-raw-hpa-21211-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:39:32.233096 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.233013 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f3f4204-24d6-4d92-ba57-df58612146cf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6f3f4204-24d6-4d92-ba57-df58612146cf" (UID: "6f3f4204-24d6-4d92-ba57-df58612146cf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:39:32.234634 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.234613 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f3f4204-24d6-4d92-ba57-df58612146cf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6f3f4204-24d6-4d92-ba57-df58612146cf" (UID: "6f3f4204-24d6-4d92-ba57-df58612146cf"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:39:32.234714 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.234666 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f3f4204-24d6-4d92-ba57-df58612146cf-kube-api-access-nmsmm" (OuterVolumeSpecName: "kube-api-access-nmsmm") pod "6f3f4204-24d6-4d92-ba57-df58612146cf" (UID: "6f3f4204-24d6-4d92-ba57-df58612146cf"). InnerVolumeSpecName "kube-api-access-nmsmm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:39:32.333953 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.333856 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-hpa-21211-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6f3f4204-24d6-4d92-ba57-df58612146cf-isvc-sklearn-graph-raw-hpa-21211-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:39:32.333953 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.333885 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f3f4204-24d6-4d92-ba57-df58612146cf-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:39:32.333953 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.333933 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f3f4204-24d6-4d92-ba57-df58612146cf-kserve-provision-location\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:39:32.333953 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.333943 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nmsmm\" (UniqueName: \"kubernetes.io/projected/6f3f4204-24d6-4d92-ba57-df58612146cf-kube-api-access-nmsmm\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:39:32.476144 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.476113 2578 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524"] Apr 24 21:39:32.483299 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:32.483272 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-21211-predictor-5f96d8846-c4524"] Apr 24 21:39:33.763465 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:33.763402 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f3f4204-24d6-4d92-ba57-df58612146cf" path="/var/lib/kubelet/pods/6f3f4204-24d6-4d92-ba57-df58612146cf/volumes" Apr 24 21:39:33.764115 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:33.764092 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" path="/var/lib/kubelet/pods/fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5/volumes" Apr 24 21:39:39.160024 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:39.159995 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" Apr 24 21:39:48.033908 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.033861 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv"] Apr 24 21:39:48.034317 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.034188 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f3f4204-24d6-4d92-ba57-df58612146cf" containerName="storage-initializer" Apr 24 21:39:48.034317 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.034199 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f3f4204-24d6-4d92-ba57-df58612146cf" containerName="storage-initializer" Apr 24 21:39:48.034317 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.034211 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f3f4204-24d6-4d92-ba57-df58612146cf" 
containerName="kube-rbac-proxy" Apr 24 21:39:48.034317 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.034217 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f3f4204-24d6-4d92-ba57-df58612146cf" containerName="kube-rbac-proxy" Apr 24 21:39:48.034317 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.034225 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f3f4204-24d6-4d92-ba57-df58612146cf" containerName="kserve-container" Apr 24 21:39:48.034317 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.034230 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f3f4204-24d6-4d92-ba57-df58612146cf" containerName="kserve-container" Apr 24 21:39:48.034317 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.034238 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" containerName="kube-rbac-proxy" Apr 24 21:39:48.034317 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.034243 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" containerName="kube-rbac-proxy" Apr 24 21:39:48.034317 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.034249 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" containerName="storage-initializer" Apr 24 21:39:48.034317 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.034255 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" containerName="storage-initializer" Apr 24 21:39:48.034317 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.034262 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" containerName="kserve-container" Apr 24 21:39:48.034317 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.034267 2578 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" containerName="kserve-container" Apr 24 21:39:48.034317 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.034316 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" containerName="kube-rbac-proxy" Apr 24 21:39:48.034317 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.034323 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f3f4204-24d6-4d92-ba57-df58612146cf" containerName="kube-rbac-proxy" Apr 24 21:39:48.034721 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.034329 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f3f4204-24d6-4d92-ba57-df58612146cf" containerName="kserve-container" Apr 24 21:39:48.034721 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.034339 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd3e79e1-ba1d-4249-a3fc-c46bfd68dce5" containerName="kserve-container" Apr 24 21:39:48.037481 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.037460 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" Apr 24 21:39:48.040173 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.040141 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-9a20e-predictor-serving-cert\"" Apr 24 21:39:48.040307 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.040156 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-9a20e-kube-rbac-proxy-sar-config\"" Apr 24 21:39:48.055289 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.055257 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv"] Apr 24 21:39:48.156562 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.156530 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-raw-9a20e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/536557ba-cfa7-4d51-8905-09b279dcd213-isvc-logger-raw-9a20e-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv\" (UID: \"536557ba-cfa7-4d51-8905-09b279dcd213\") " pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" Apr 24 21:39:48.156562 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.156569 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/536557ba-cfa7-4d51-8905-09b279dcd213-kserve-provision-location\") pod \"isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv\" (UID: \"536557ba-cfa7-4d51-8905-09b279dcd213\") " pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" Apr 24 21:39:48.156775 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.156647 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/536557ba-cfa7-4d51-8905-09b279dcd213-proxy-tls\") pod \"isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv\" (UID: \"536557ba-cfa7-4d51-8905-09b279dcd213\") " pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" Apr 24 21:39:48.156775 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.156734 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndf9j\" (UniqueName: \"kubernetes.io/projected/536557ba-cfa7-4d51-8905-09b279dcd213-kube-api-access-ndf9j\") pod \"isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv\" (UID: \"536557ba-cfa7-4d51-8905-09b279dcd213\") " pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" Apr 24 21:39:48.257117 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.257082 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndf9j\" (UniqueName: \"kubernetes.io/projected/536557ba-cfa7-4d51-8905-09b279dcd213-kube-api-access-ndf9j\") pod \"isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv\" (UID: \"536557ba-cfa7-4d51-8905-09b279dcd213\") " pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" Apr 24 21:39:48.257264 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.257130 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-raw-9a20e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/536557ba-cfa7-4d51-8905-09b279dcd213-isvc-logger-raw-9a20e-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv\" (UID: \"536557ba-cfa7-4d51-8905-09b279dcd213\") " pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" Apr 24 21:39:48.257264 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.257249 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/536557ba-cfa7-4d51-8905-09b279dcd213-kserve-provision-location\") pod \"isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv\" (UID: \"536557ba-cfa7-4d51-8905-09b279dcd213\") " pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" Apr 24 21:39:48.257371 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.257332 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/536557ba-cfa7-4d51-8905-09b279dcd213-proxy-tls\") pod \"isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv\" (UID: \"536557ba-cfa7-4d51-8905-09b279dcd213\") " pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" Apr 24 21:39:48.257636 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.257616 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/536557ba-cfa7-4d51-8905-09b279dcd213-kserve-provision-location\") pod \"isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv\" (UID: \"536557ba-cfa7-4d51-8905-09b279dcd213\") " pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" Apr 24 21:39:48.257744 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.257726 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-raw-9a20e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/536557ba-cfa7-4d51-8905-09b279dcd213-isvc-logger-raw-9a20e-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv\" (UID: \"536557ba-cfa7-4d51-8905-09b279dcd213\") " pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" Apr 24 21:39:48.259638 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.259620 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/536557ba-cfa7-4d51-8905-09b279dcd213-proxy-tls\") pod 
\"isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv\" (UID: \"536557ba-cfa7-4d51-8905-09b279dcd213\") " pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" Apr 24 21:39:48.270753 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.270725 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndf9j\" (UniqueName: \"kubernetes.io/projected/536557ba-cfa7-4d51-8905-09b279dcd213-kube-api-access-ndf9j\") pod \"isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv\" (UID: \"536557ba-cfa7-4d51-8905-09b279dcd213\") " pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" Apr 24 21:39:48.347818 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.347751 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" Apr 24 21:39:48.476846 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:48.476814 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv"] Apr 24 21:39:48.480550 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:39:48.480516 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod536557ba_cfa7_4d51_8905_09b279dcd213.slice/crio-619cdb5618f7aca4c4b734b4b8c4ded7321c1810a0bc6424ea31ca964dcb7a25 WatchSource:0}: Error finding container 619cdb5618f7aca4c4b734b4b8c4ded7321c1810a0bc6424ea31ca964dcb7a25: Status 404 returned error can't find the container with id 619cdb5618f7aca4c4b734b4b8c4ded7321c1810a0bc6424ea31ca964dcb7a25 Apr 24 21:39:49.200869 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:49.200832 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" event={"ID":"536557ba-cfa7-4d51-8905-09b279dcd213","Type":"ContainerStarted","Data":"cc56d00f8312c88ce8e9ae6211525f50148ad8bab659da17eadca8b74b6a7527"} 
Apr 24 21:39:49.200869 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:49.200868 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" event={"ID":"536557ba-cfa7-4d51-8905-09b279dcd213","Type":"ContainerStarted","Data":"619cdb5618f7aca4c4b734b4b8c4ded7321c1810a0bc6424ea31ca964dcb7a25"}
Apr 24 21:39:52.210649 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:52.210615 2578 generic.go:358] "Generic (PLEG): container finished" podID="536557ba-cfa7-4d51-8905-09b279dcd213" containerID="cc56d00f8312c88ce8e9ae6211525f50148ad8bab659da17eadca8b74b6a7527" exitCode=0
Apr 24 21:39:52.211048 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:52.210695 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" event={"ID":"536557ba-cfa7-4d51-8905-09b279dcd213","Type":"ContainerDied","Data":"cc56d00f8312c88ce8e9ae6211525f50148ad8bab659da17eadca8b74b6a7527"}
Apr 24 21:39:53.215541 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:53.215505 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" event={"ID":"536557ba-cfa7-4d51-8905-09b279dcd213","Type":"ContainerStarted","Data":"e4ec4ea1f2a779e2e9396392291d6c1da18f9131be7e8c5ccd69bafc15658d6e"}
Apr 24 21:39:53.215541 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:53.215546 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" event={"ID":"536557ba-cfa7-4d51-8905-09b279dcd213","Type":"ContainerStarted","Data":"7d3af2bddb581fb3846596ff3defd39be91aefc401683d417a3e8d7a71e31ccc"}
Apr 24 21:39:53.215975 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:53.215557 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" event={"ID":"536557ba-cfa7-4d51-8905-09b279dcd213","Type":"ContainerStarted","Data":"992c9f4b5c584798e0d1ed9627ddad052785ddeef976ec1ccb02a587db0f69ea"}
Apr 24 21:39:53.215975 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:53.215909 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv"
Apr 24 21:39:53.215975 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:53.215942 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv"
Apr 24 21:39:53.217154 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:53.217128 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 24 21:39:53.237737 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:53.237692 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podStartSLOduration=5.23767635 podStartE2EDuration="5.23767635s" podCreationTimestamp="2026-04-24 21:39:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:39:53.236489744 +0000 UTC m=+750.064359866" watchObservedRunningTime="2026-04-24 21:39:53.23767635 +0000 UTC m=+750.065546481"
Apr 24 21:39:54.218385 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:54.218347 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv"
Apr 24 21:39:54.218840 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:54.218501 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 24 21:39:54.219374 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:54.219349 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:39:55.221395 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:55.221343 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 24 21:39:55.221839 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:39:55.221742 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:40:00.225140 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:40:00.225111 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv"
Apr 24 21:40:00.225761 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:40:00.225724 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 24 21:40:00.226056 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:40:00.226031 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:40:10.225940 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:40:10.225881 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 24 21:40:10.226340 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:40:10.226321 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:40:20.226238 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:40:20.226198 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 24 21:40:20.226679 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:40:20.226598 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:40:30.226463 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:40:30.226417 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 24 21:40:30.226942 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:40:30.226918 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:40:40.226161 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:40:40.226113 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 24 21:40:40.226622 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:40:40.226594 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:40:50.225707 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:40:50.225656 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 24 21:40:50.226133 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:40:50.226095 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:41:00.226866 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:00.226836 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv"
Apr 24 21:41:00.227412 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:00.227029 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv"
Apr 24 21:41:12.983523 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:12.983491 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28_5131ad7e-9174-49ef-9b25-764060e0f422/kserve-container/0.log"
Apr 24 21:41:13.129359 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.129330 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28"]
Apr 24 21:41:13.129631 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.129585 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" podUID="5131ad7e-9174-49ef-9b25-764060e0f422" containerName="kserve-container" containerID="cri-o://349a83e7eefc77973374819e6568ebccd237ea1158217a1c45599d6294ad5b71" gracePeriod=30
Apr 24 21:41:13.129720 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.129657 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" podUID="5131ad7e-9174-49ef-9b25-764060e0f422" containerName="kube-rbac-proxy" containerID="cri-o://adf8c9e7aa7de3010be6d46d548575b8d750ff39eb21beed0d84d15c3db48e78" gracePeriod=30
Apr 24 21:41:13.193377 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.193351 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv"]
Apr 24 21:41:13.193754 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.193699 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kserve-container" containerID="cri-o://992c9f4b5c584798e0d1ed9627ddad052785ddeef976ec1ccb02a587db0f69ea" gracePeriod=30
Apr 24 21:41:13.193919 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.193732 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="agent" containerID="cri-o://e4ec4ea1f2a779e2e9396392291d6c1da18f9131be7e8c5ccd69bafc15658d6e" gracePeriod=30
Apr 24 21:41:13.193919 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.193779 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kube-rbac-proxy" containerID="cri-o://7d3af2bddb581fb3846596ff3defd39be91aefc401683d417a3e8d7a71e31ccc" gracePeriod=30
Apr 24 21:41:13.221117 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.221095 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd"]
Apr 24 21:41:13.224650 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.224634 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd"
Apr 24 21:41:13.227027 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.227004 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-9e9e5-kube-rbac-proxy-sar-config\""
Apr 24 21:41:13.227128 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.227025 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-9e9e5-predictor-serving-cert\""
Apr 24 21:41:13.236498 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.236451 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd"]
Apr 24 21:41:13.259094 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.259063 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df4lb\" (UniqueName: \"kubernetes.io/projected/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-kube-api-access-df4lb\") pod \"isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd\" (UID: \"e7a6b280-586b-4d38-8a28-672f8fb0c8f7\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd"
Apr 24 21:41:13.259181 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.259128 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-scale-raw-9e9e5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-isvc-sklearn-scale-raw-9e9e5-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd\" (UID: \"e7a6b280-586b-4d38-8a28-672f8fb0c8f7\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd"
Apr 24 21:41:13.259240 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.259207 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-proxy-tls\") pod \"isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd\" (UID: \"e7a6b280-586b-4d38-8a28-672f8fb0c8f7\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd"
Apr 24 21:41:13.259277 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.259243 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd\" (UID: \"e7a6b280-586b-4d38-8a28-672f8fb0c8f7\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd"
Apr 24 21:41:13.359708 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.359672 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-scale-raw-9e9e5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-isvc-sklearn-scale-raw-9e9e5-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd\" (UID: \"e7a6b280-586b-4d38-8a28-672f8fb0c8f7\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd"
Apr 24 21:41:13.359801 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.359749 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-proxy-tls\") pod \"isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd\" (UID: \"e7a6b280-586b-4d38-8a28-672f8fb0c8f7\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd"
Apr 24 21:41:13.359875 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:41:13.359847 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-serving-cert: secret "isvc-sklearn-scale-raw-9e9e5-predictor-serving-cert" not found
Apr 24 21:41:13.359875 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.359849 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd\" (UID: \"e7a6b280-586b-4d38-8a28-672f8fb0c8f7\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd"
Apr 24 21:41:13.360029 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:41:13.359936 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-proxy-tls podName:e7a6b280-586b-4d38-8a28-672f8fb0c8f7 nodeName:}" failed. No retries permitted until 2026-04-24 21:41:13.859903269 +0000 UTC m=+830.687773396 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-proxy-tls") pod "isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" (UID: "e7a6b280-586b-4d38-8a28-672f8fb0c8f7") : secret "isvc-sklearn-scale-raw-9e9e5-predictor-serving-cert" not found
Apr 24 21:41:13.360029 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.359988 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-df4lb\" (UniqueName: \"kubernetes.io/projected/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-kube-api-access-df4lb\") pod \"isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd\" (UID: \"e7a6b280-586b-4d38-8a28-672f8fb0c8f7\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd"
Apr 24 21:41:13.360263 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.360234 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd\" (UID: \"e7a6b280-586b-4d38-8a28-672f8fb0c8f7\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd"
Apr 24 21:41:13.360966 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.360944 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-scale-raw-9e9e5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-isvc-sklearn-scale-raw-9e9e5-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd\" (UID: \"e7a6b280-586b-4d38-8a28-672f8fb0c8f7\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd"
Apr 24 21:41:13.370981 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.370956 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-df4lb\" (UniqueName: \"kubernetes.io/projected/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-kube-api-access-df4lb\") pod \"isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd\" (UID: \"e7a6b280-586b-4d38-8a28-672f8fb0c8f7\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd"
Apr 24 21:41:13.410022 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.410005 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28"
Apr 24 21:41:13.449770 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.449746 2578 generic.go:358] "Generic (PLEG): container finished" podID="536557ba-cfa7-4d51-8905-09b279dcd213" containerID="7d3af2bddb581fb3846596ff3defd39be91aefc401683d417a3e8d7a71e31ccc" exitCode=2
Apr 24 21:41:13.449879 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.449830 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" event={"ID":"536557ba-cfa7-4d51-8905-09b279dcd213","Type":"ContainerDied","Data":"7d3af2bddb581fb3846596ff3defd39be91aefc401683d417a3e8d7a71e31ccc"}
Apr 24 21:41:13.451012 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.450991 2578 generic.go:358] "Generic (PLEG): container finished" podID="5131ad7e-9174-49ef-9b25-764060e0f422" containerID="adf8c9e7aa7de3010be6d46d548575b8d750ff39eb21beed0d84d15c3db48e78" exitCode=2
Apr 24 21:41:13.451012 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.451010 2578 generic.go:358] "Generic (PLEG): container finished" podID="5131ad7e-9174-49ef-9b25-764060e0f422" containerID="349a83e7eefc77973374819e6568ebccd237ea1158217a1c45599d6294ad5b71" exitCode=2
Apr 24 21:41:13.451160 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.451028 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" event={"ID":"5131ad7e-9174-49ef-9b25-764060e0f422","Type":"ContainerDied","Data":"adf8c9e7aa7de3010be6d46d548575b8d750ff39eb21beed0d84d15c3db48e78"}
Apr 24 21:41:13.451160 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.451048 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" event={"ID":"5131ad7e-9174-49ef-9b25-764060e0f422","Type":"ContainerDied","Data":"349a83e7eefc77973374819e6568ebccd237ea1158217a1c45599d6294ad5b71"}
Apr 24 21:41:13.451160 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.451058 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28" event={"ID":"5131ad7e-9174-49ef-9b25-764060e0f422","Type":"ContainerDied","Data":"1c57761d22cc883807187644aa1ba5caf06c1a0be32b8a0cf7534f5862612c40"}
Apr 24 21:41:13.451160 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.451072 2578 scope.go:117] "RemoveContainer" containerID="adf8c9e7aa7de3010be6d46d548575b8d750ff39eb21beed0d84d15c3db48e78"
Apr 24 21:41:13.451160 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.451079 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28"
Apr 24 21:41:13.458394 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.458377 2578 scope.go:117] "RemoveContainer" containerID="349a83e7eefc77973374819e6568ebccd237ea1158217a1c45599d6294ad5b71"
Apr 24 21:41:13.460356 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.460337 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5131ad7e-9174-49ef-9b25-764060e0f422-proxy-tls\") pod \"5131ad7e-9174-49ef-9b25-764060e0f422\" (UID: \"5131ad7e-9174-49ef-9b25-764060e0f422\") "
Apr 24 21:41:13.460460 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.460396 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-raw-9a20e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5131ad7e-9174-49ef-9b25-764060e0f422-message-dumper-raw-9a20e-kube-rbac-proxy-sar-config\") pod \"5131ad7e-9174-49ef-9b25-764060e0f422\" (UID: \"5131ad7e-9174-49ef-9b25-764060e0f422\") "
Apr 24 21:41:13.460460 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.460421 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxdhh\" (UniqueName: \"kubernetes.io/projected/5131ad7e-9174-49ef-9b25-764060e0f422-kube-api-access-gxdhh\") pod \"5131ad7e-9174-49ef-9b25-764060e0f422\" (UID: \"5131ad7e-9174-49ef-9b25-764060e0f422\") "
Apr 24 21:41:13.460738 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.460717 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5131ad7e-9174-49ef-9b25-764060e0f422-message-dumper-raw-9a20e-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-raw-9a20e-kube-rbac-proxy-sar-config") pod "5131ad7e-9174-49ef-9b25-764060e0f422" (UID: "5131ad7e-9174-49ef-9b25-764060e0f422"). InnerVolumeSpecName "message-dumper-raw-9a20e-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:41:13.462267 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.462246 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5131ad7e-9174-49ef-9b25-764060e0f422-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5131ad7e-9174-49ef-9b25-764060e0f422" (UID: "5131ad7e-9174-49ef-9b25-764060e0f422"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:41:13.462364 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.462341 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5131ad7e-9174-49ef-9b25-764060e0f422-kube-api-access-gxdhh" (OuterVolumeSpecName: "kube-api-access-gxdhh") pod "5131ad7e-9174-49ef-9b25-764060e0f422" (UID: "5131ad7e-9174-49ef-9b25-764060e0f422"). InnerVolumeSpecName "kube-api-access-gxdhh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:41:13.465341 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.465322 2578 scope.go:117] "RemoveContainer" containerID="adf8c9e7aa7de3010be6d46d548575b8d750ff39eb21beed0d84d15c3db48e78"
Apr 24 21:41:13.465576 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:41:13.465556 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adf8c9e7aa7de3010be6d46d548575b8d750ff39eb21beed0d84d15c3db48e78\": container with ID starting with adf8c9e7aa7de3010be6d46d548575b8d750ff39eb21beed0d84d15c3db48e78 not found: ID does not exist" containerID="adf8c9e7aa7de3010be6d46d548575b8d750ff39eb21beed0d84d15c3db48e78"
Apr 24 21:41:13.465633 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.465585 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adf8c9e7aa7de3010be6d46d548575b8d750ff39eb21beed0d84d15c3db48e78"} err="failed to get container status \"adf8c9e7aa7de3010be6d46d548575b8d750ff39eb21beed0d84d15c3db48e78\": rpc error: code = NotFound desc = could not find container \"adf8c9e7aa7de3010be6d46d548575b8d750ff39eb21beed0d84d15c3db48e78\": container with ID starting with adf8c9e7aa7de3010be6d46d548575b8d750ff39eb21beed0d84d15c3db48e78 not found: ID does not exist"
Apr 24 21:41:13.465633 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.465603 2578 scope.go:117] "RemoveContainer" containerID="349a83e7eefc77973374819e6568ebccd237ea1158217a1c45599d6294ad5b71"
Apr 24 21:41:13.465822 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:41:13.465808 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"349a83e7eefc77973374819e6568ebccd237ea1158217a1c45599d6294ad5b71\": container with ID starting with 349a83e7eefc77973374819e6568ebccd237ea1158217a1c45599d6294ad5b71 not found: ID does not exist" containerID="349a83e7eefc77973374819e6568ebccd237ea1158217a1c45599d6294ad5b71"
Apr 24 21:41:13.465874 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.465825 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"349a83e7eefc77973374819e6568ebccd237ea1158217a1c45599d6294ad5b71"} err="failed to get container status \"349a83e7eefc77973374819e6568ebccd237ea1158217a1c45599d6294ad5b71\": rpc error: code = NotFound desc = could not find container \"349a83e7eefc77973374819e6568ebccd237ea1158217a1c45599d6294ad5b71\": container with ID starting with 349a83e7eefc77973374819e6568ebccd237ea1158217a1c45599d6294ad5b71 not found: ID does not exist"
Apr 24 21:41:13.465874 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.465839 2578 scope.go:117] "RemoveContainer" containerID="adf8c9e7aa7de3010be6d46d548575b8d750ff39eb21beed0d84d15c3db48e78"
Apr 24 21:41:13.466068 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.466052 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adf8c9e7aa7de3010be6d46d548575b8d750ff39eb21beed0d84d15c3db48e78"} err="failed to get container status \"adf8c9e7aa7de3010be6d46d548575b8d750ff39eb21beed0d84d15c3db48e78\": rpc error: code = NotFound desc = could not find container \"adf8c9e7aa7de3010be6d46d548575b8d750ff39eb21beed0d84d15c3db48e78\": container with ID starting with adf8c9e7aa7de3010be6d46d548575b8d750ff39eb21beed0d84d15c3db48e78 not found: ID does not exist"
Apr 24 21:41:13.466110 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.466068 2578 scope.go:117] "RemoveContainer" containerID="349a83e7eefc77973374819e6568ebccd237ea1158217a1c45599d6294ad5b71"
Apr 24 21:41:13.466269 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.466252 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"349a83e7eefc77973374819e6568ebccd237ea1158217a1c45599d6294ad5b71"} err="failed to get container status \"349a83e7eefc77973374819e6568ebccd237ea1158217a1c45599d6294ad5b71\": rpc error: code = NotFound desc = could not find container \"349a83e7eefc77973374819e6568ebccd237ea1158217a1c45599d6294ad5b71\": container with ID starting with 349a83e7eefc77973374819e6568ebccd237ea1158217a1c45599d6294ad5b71 not found: ID does not exist"
Apr 24 21:41:13.561287 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.561236 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5131ad7e-9174-49ef-9b25-764060e0f422-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:41:13.561287 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.561264 2578 reconciler_common.go:299] "Volume detached for volume \"message-dumper-raw-9a20e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5131ad7e-9174-49ef-9b25-764060e0f422-message-dumper-raw-9a20e-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:41:13.561287 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.561275 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gxdhh\" (UniqueName: \"kubernetes.io/projected/5131ad7e-9174-49ef-9b25-764060e0f422-kube-api-access-gxdhh\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:41:13.775512 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.775481 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28"]
Apr 24 21:41:13.781702 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.781682 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-9a20e-predictor-7bfbff9d59-cjz28"]
Apr 24 21:41:13.863423 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:13.863360 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-proxy-tls\") pod \"isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd\" (UID: \"e7a6b280-586b-4d38-8a28-672f8fb0c8f7\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd"
Apr 24 21:41:13.863516 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:41:13.863460 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-serving-cert: secret "isvc-sklearn-scale-raw-9e9e5-predictor-serving-cert" not found
Apr 24 21:41:13.863516 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:41:13.863506 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-proxy-tls podName:e7a6b280-586b-4d38-8a28-672f8fb0c8f7 nodeName:}" failed. No retries permitted until 2026-04-24 21:41:14.863493261 +0000 UTC m=+831.691363370 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-proxy-tls") pod "isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" (UID: "e7a6b280-586b-4d38-8a28-672f8fb0c8f7") : secret "isvc-sklearn-scale-raw-9e9e5-predictor-serving-cert" not found
Apr 24 21:41:14.870999 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:14.870965 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-proxy-tls\") pod \"isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd\" (UID: \"e7a6b280-586b-4d38-8a28-672f8fb0c8f7\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd"
Apr 24 21:41:14.873274 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:14.873246 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-proxy-tls\") pod \"isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd\" (UID: \"e7a6b280-586b-4d38-8a28-672f8fb0c8f7\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd"
Apr 24 21:41:15.035231 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:15.035182 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd"
Apr 24 21:41:15.157545 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:15.157473 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd"]
Apr 24 21:41:15.160570 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:41:15.160538 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7a6b280_586b_4d38_8a28_672f8fb0c8f7.slice/crio-f11fa32fa4dc6d23e6dbe6015773acfb3a626e19332410d4293a0152ead8af77 WatchSource:0}: Error finding container f11fa32fa4dc6d23e6dbe6015773acfb3a626e19332410d4293a0152ead8af77: Status 404 returned error can't find the container with id f11fa32fa4dc6d23e6dbe6015773acfb3a626e19332410d4293a0152ead8af77
Apr 24 21:41:15.222341 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:15.222307 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.26:8643/healthz\": dial tcp 10.133.0.26:8643: connect: connection refused"
Apr 24 21:41:15.458744 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:15.458705 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" event={"ID":"e7a6b280-586b-4d38-8a28-672f8fb0c8f7","Type":"ContainerStarted","Data":"ad501262ea56b797897ea9f034e5d00ecd8a804a19eb4bc33d1476af6c8af4b5"}
Apr 24 21:41:15.458744 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:15.458741 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" event={"ID":"e7a6b280-586b-4d38-8a28-672f8fb0c8f7","Type":"ContainerStarted","Data":"f11fa32fa4dc6d23e6dbe6015773acfb3a626e19332410d4293a0152ead8af77"}
Apr 24 21:41:15.763864 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:15.763782 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5131ad7e-9174-49ef-9b25-764060e0f422" path="/var/lib/kubelet/pods/5131ad7e-9174-49ef-9b25-764060e0f422/volumes"
Apr 24 21:41:17.467006 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:17.466885 2578 generic.go:358] "Generic (PLEG): container finished" podID="536557ba-cfa7-4d51-8905-09b279dcd213" containerID="992c9f4b5c584798e0d1ed9627ddad052785ddeef976ec1ccb02a587db0f69ea" exitCode=0
Apr 24 21:41:17.467006 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:17.466945 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" event={"ID":"536557ba-cfa7-4d51-8905-09b279dcd213","Type":"ContainerDied","Data":"992c9f4b5c584798e0d1ed9627ddad052785ddeef976ec1ccb02a587db0f69ea"}
Apr 24 21:41:19.474457 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:19.474424 2578 generic.go:358] "Generic (PLEG): container finished" podID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerID="ad501262ea56b797897ea9f034e5d00ecd8a804a19eb4bc33d1476af6c8af4b5" exitCode=0
Apr 24 21:41:19.474817 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:19.474472 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" event={"ID":"e7a6b280-586b-4d38-8a28-672f8fb0c8f7","Type":"ContainerDied","Data":"ad501262ea56b797897ea9f034e5d00ecd8a804a19eb4bc33d1476af6c8af4b5"}
Apr 24 21:41:20.222459 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:20.222419 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kube-rbac-proxy"
probeResult="failure" output="Get \"https://10.133.0.26:8643/healthz\": dial tcp 10.133.0.26:8643: connect: connection refused" Apr 24 21:41:20.225803 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:20.225776 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:41:20.226120 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:20.226095 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:41:20.479858 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:20.479773 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" event={"ID":"e7a6b280-586b-4d38-8a28-672f8fb0c8f7","Type":"ContainerStarted","Data":"9d296da6937fbc2ee041343f03ae22f6150a905eb207c8894e316d191b93aa1b"} Apr 24 21:41:20.479858 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:20.479812 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" event={"ID":"e7a6b280-586b-4d38-8a28-672f8fb0c8f7","Type":"ContainerStarted","Data":"37e13817a521eca88a1579845b93e82df06f96b84274ad0a41f77b44e5f771a0"} Apr 24 21:41:20.480306 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:20.480095 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" Apr 24 21:41:20.480306 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:20.480121 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" Apr 24 21:41:20.481417 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:20.481396 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:41:20.500010 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:20.499964 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" podStartSLOduration=7.499952178 podStartE2EDuration="7.499952178s" podCreationTimestamp="2026-04-24 21:41:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:41:20.498512487 +0000 UTC m=+837.326382621" watchObservedRunningTime="2026-04-24 21:41:20.499952178 +0000 UTC m=+837.327822304" Apr 24 21:41:21.483915 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:21.483866 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:41:25.222219 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:25.222180 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.26:8643/healthz\": dial tcp 10.133.0.26:8643: connect: connection refused" Apr 24 21:41:25.222649 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:25.222287 2578 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" Apr 24 21:41:26.489174 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:26.489143 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" Apr 24 21:41:26.489700 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:26.489673 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:41:30.222177 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:30.222134 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.26:8643/healthz\": dial tcp 10.133.0.26:8643: connect: connection refused" Apr 24 21:41:30.225923 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:30.225865 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:41:30.226216 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:30.226193 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:41:35.222576 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:35.222527 2578 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.26:8643/healthz\": dial tcp 10.133.0.26:8643: connect: connection refused" Apr 24 21:41:36.489900 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:36.489846 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:41:40.221716 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:40.221670 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.26:8643/healthz\": dial tcp 10.133.0.26:8643: connect: connection refused" Apr 24 21:41:40.225919 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:40.225875 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:41:40.226059 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:40.226043 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" Apr 24 21:41:40.226245 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:40.226222 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" 
containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:41:40.226322 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:40.226312 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" Apr 24 21:41:43.365973 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.365946 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" Apr 24 21:41:43.504703 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.504613 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/536557ba-cfa7-4d51-8905-09b279dcd213-kserve-provision-location\") pod \"536557ba-cfa7-4d51-8905-09b279dcd213\" (UID: \"536557ba-cfa7-4d51-8905-09b279dcd213\") " Apr 24 21:41:43.504703 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.504671 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndf9j\" (UniqueName: \"kubernetes.io/projected/536557ba-cfa7-4d51-8905-09b279dcd213-kube-api-access-ndf9j\") pod \"536557ba-cfa7-4d51-8905-09b279dcd213\" (UID: \"536557ba-cfa7-4d51-8905-09b279dcd213\") " Apr 24 21:41:43.504703 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.504701 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-raw-9a20e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/536557ba-cfa7-4d51-8905-09b279dcd213-isvc-logger-raw-9a20e-kube-rbac-proxy-sar-config\") pod \"536557ba-cfa7-4d51-8905-09b279dcd213\" (UID: \"536557ba-cfa7-4d51-8905-09b279dcd213\") " Apr 24 21:41:43.505006 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.504735 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/536557ba-cfa7-4d51-8905-09b279dcd213-proxy-tls\") pod \"536557ba-cfa7-4d51-8905-09b279dcd213\" (UID: \"536557ba-cfa7-4d51-8905-09b279dcd213\") " Apr 24 21:41:43.505067 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.505046 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/536557ba-cfa7-4d51-8905-09b279dcd213-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "536557ba-cfa7-4d51-8905-09b279dcd213" (UID: "536557ba-cfa7-4d51-8905-09b279dcd213"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:41:43.505124 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.505079 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/536557ba-cfa7-4d51-8905-09b279dcd213-isvc-logger-raw-9a20e-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-raw-9a20e-kube-rbac-proxy-sar-config") pod "536557ba-cfa7-4d51-8905-09b279dcd213" (UID: "536557ba-cfa7-4d51-8905-09b279dcd213"). InnerVolumeSpecName "isvc-logger-raw-9a20e-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:41:43.506741 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.506720 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/536557ba-cfa7-4d51-8905-09b279dcd213-kube-api-access-ndf9j" (OuterVolumeSpecName: "kube-api-access-ndf9j") pod "536557ba-cfa7-4d51-8905-09b279dcd213" (UID: "536557ba-cfa7-4d51-8905-09b279dcd213"). InnerVolumeSpecName "kube-api-access-ndf9j". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:41:43.506846 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.506828 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/536557ba-cfa7-4d51-8905-09b279dcd213-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "536557ba-cfa7-4d51-8905-09b279dcd213" (UID: "536557ba-cfa7-4d51-8905-09b279dcd213"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:41:43.554553 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.554513 2578 generic.go:358] "Generic (PLEG): container finished" podID="536557ba-cfa7-4d51-8905-09b279dcd213" containerID="e4ec4ea1f2a779e2e9396392291d6c1da18f9131be7e8c5ccd69bafc15658d6e" exitCode=0 Apr 24 21:41:43.554712 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.554596 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" event={"ID":"536557ba-cfa7-4d51-8905-09b279dcd213","Type":"ContainerDied","Data":"e4ec4ea1f2a779e2e9396392291d6c1da18f9131be7e8c5ccd69bafc15658d6e"} Apr 24 21:41:43.554712 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.554639 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" event={"ID":"536557ba-cfa7-4d51-8905-09b279dcd213","Type":"ContainerDied","Data":"619cdb5618f7aca4c4b734b4b8c4ded7321c1810a0bc6424ea31ca964dcb7a25"} Apr 24 21:41:43.554712 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.554656 2578 scope.go:117] "RemoveContainer" containerID="e4ec4ea1f2a779e2e9396392291d6c1da18f9131be7e8c5ccd69bafc15658d6e" Apr 24 21:41:43.554712 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.554608 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv" Apr 24 21:41:43.563149 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.563128 2578 scope.go:117] "RemoveContainer" containerID="7d3af2bddb581fb3846596ff3defd39be91aefc401683d417a3e8d7a71e31ccc" Apr 24 21:41:43.570152 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.570132 2578 scope.go:117] "RemoveContainer" containerID="992c9f4b5c584798e0d1ed9627ddad052785ddeef976ec1ccb02a587db0f69ea" Apr 24 21:41:43.576693 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.576676 2578 scope.go:117] "RemoveContainer" containerID="cc56d00f8312c88ce8e9ae6211525f50148ad8bab659da17eadca8b74b6a7527" Apr 24 21:41:43.580662 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.580544 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv"] Apr 24 21:41:43.582370 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.582349 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-9a20e-predictor-5976c8f66-dfrcv"] Apr 24 21:41:43.584765 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.584749 2578 scope.go:117] "RemoveContainer" containerID="e4ec4ea1f2a779e2e9396392291d6c1da18f9131be7e8c5ccd69bafc15658d6e" Apr 24 21:41:43.585040 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:41:43.585024 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4ec4ea1f2a779e2e9396392291d6c1da18f9131be7e8c5ccd69bafc15658d6e\": container with ID starting with e4ec4ea1f2a779e2e9396392291d6c1da18f9131be7e8c5ccd69bafc15658d6e not found: ID does not exist" containerID="e4ec4ea1f2a779e2e9396392291d6c1da18f9131be7e8c5ccd69bafc15658d6e" Apr 24 21:41:43.585077 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.585050 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e4ec4ea1f2a779e2e9396392291d6c1da18f9131be7e8c5ccd69bafc15658d6e"} err="failed to get container status \"e4ec4ea1f2a779e2e9396392291d6c1da18f9131be7e8c5ccd69bafc15658d6e\": rpc error: code = NotFound desc = could not find container \"e4ec4ea1f2a779e2e9396392291d6c1da18f9131be7e8c5ccd69bafc15658d6e\": container with ID starting with e4ec4ea1f2a779e2e9396392291d6c1da18f9131be7e8c5ccd69bafc15658d6e not found: ID does not exist" Apr 24 21:41:43.585077 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.585068 2578 scope.go:117] "RemoveContainer" containerID="7d3af2bddb581fb3846596ff3defd39be91aefc401683d417a3e8d7a71e31ccc" Apr 24 21:41:43.585301 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:41:43.585281 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d3af2bddb581fb3846596ff3defd39be91aefc401683d417a3e8d7a71e31ccc\": container with ID starting with 7d3af2bddb581fb3846596ff3defd39be91aefc401683d417a3e8d7a71e31ccc not found: ID does not exist" containerID="7d3af2bddb581fb3846596ff3defd39be91aefc401683d417a3e8d7a71e31ccc" Apr 24 21:41:43.585351 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.585309 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d3af2bddb581fb3846596ff3defd39be91aefc401683d417a3e8d7a71e31ccc"} err="failed to get container status \"7d3af2bddb581fb3846596ff3defd39be91aefc401683d417a3e8d7a71e31ccc\": rpc error: code = NotFound desc = could not find container \"7d3af2bddb581fb3846596ff3defd39be91aefc401683d417a3e8d7a71e31ccc\": container with ID starting with 7d3af2bddb581fb3846596ff3defd39be91aefc401683d417a3e8d7a71e31ccc not found: ID does not exist" Apr 24 21:41:43.585351 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.585327 2578 scope.go:117] "RemoveContainer" containerID="992c9f4b5c584798e0d1ed9627ddad052785ddeef976ec1ccb02a587db0f69ea" Apr 24 21:41:43.585555 ip-10-0-139-184 
kubenswrapper[2578]: E0424 21:41:43.585528 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"992c9f4b5c584798e0d1ed9627ddad052785ddeef976ec1ccb02a587db0f69ea\": container with ID starting with 992c9f4b5c584798e0d1ed9627ddad052785ddeef976ec1ccb02a587db0f69ea not found: ID does not exist" containerID="992c9f4b5c584798e0d1ed9627ddad052785ddeef976ec1ccb02a587db0f69ea" Apr 24 21:41:43.585621 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.585565 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"992c9f4b5c584798e0d1ed9627ddad052785ddeef976ec1ccb02a587db0f69ea"} err="failed to get container status \"992c9f4b5c584798e0d1ed9627ddad052785ddeef976ec1ccb02a587db0f69ea\": rpc error: code = NotFound desc = could not find container \"992c9f4b5c584798e0d1ed9627ddad052785ddeef976ec1ccb02a587db0f69ea\": container with ID starting with 992c9f4b5c584798e0d1ed9627ddad052785ddeef976ec1ccb02a587db0f69ea not found: ID does not exist" Apr 24 21:41:43.585621 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.585588 2578 scope.go:117] "RemoveContainer" containerID="cc56d00f8312c88ce8e9ae6211525f50148ad8bab659da17eadca8b74b6a7527" Apr 24 21:41:43.585802 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:41:43.585784 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc56d00f8312c88ce8e9ae6211525f50148ad8bab659da17eadca8b74b6a7527\": container with ID starting with cc56d00f8312c88ce8e9ae6211525f50148ad8bab659da17eadca8b74b6a7527 not found: ID does not exist" containerID="cc56d00f8312c88ce8e9ae6211525f50148ad8bab659da17eadca8b74b6a7527" Apr 24 21:41:43.585842 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.585807 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc56d00f8312c88ce8e9ae6211525f50148ad8bab659da17eadca8b74b6a7527"} 
err="failed to get container status \"cc56d00f8312c88ce8e9ae6211525f50148ad8bab659da17eadca8b74b6a7527\": rpc error: code = NotFound desc = could not find container \"cc56d00f8312c88ce8e9ae6211525f50148ad8bab659da17eadca8b74b6a7527\": container with ID starting with cc56d00f8312c88ce8e9ae6211525f50148ad8bab659da17eadca8b74b6a7527 not found: ID does not exist" Apr 24 21:41:43.606222 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.606197 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-raw-9a20e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/536557ba-cfa7-4d51-8905-09b279dcd213-isvc-logger-raw-9a20e-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:41:43.606222 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.606217 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/536557ba-cfa7-4d51-8905-09b279dcd213-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:41:43.606352 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.606230 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/536557ba-cfa7-4d51-8905-09b279dcd213-kserve-provision-location\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:41:43.606352 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.606240 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ndf9j\" (UniqueName: \"kubernetes.io/projected/536557ba-cfa7-4d51-8905-09b279dcd213-kube-api-access-ndf9j\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:41:43.763084 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:43.762990 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" path="/var/lib/kubelet/pods/536557ba-cfa7-4d51-8905-09b279dcd213/volumes" Apr 24 21:41:46.490376 
ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:46.490341 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:41:56.489742 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:41:56.489702 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:42:06.489724 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:42:06.489686 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:42:16.489680 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:42:16.489593 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:42:23.675394 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:42:23.675367 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krb9p_cde5bc87-530f-4ee7-8f38-39b875bbd4e6/ovn-acl-logging/0.log" Apr 24 21:42:23.677562 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:42:23.677539 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krb9p_cde5bc87-530f-4ee7-8f38-39b875bbd4e6/ovn-acl-logging/0.log" Apr 24 21:42:26.490240 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:42:26.490192 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:42:31.759146 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:42:31.759105 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:42:41.759220 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:42:41.759177 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:42:51.759672 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:42:51.759633 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:43:01.759441 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:01.759394 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.27:8080: connect: connection refused" Apr 24 21:43:11.759081 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:11.759029 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:43:21.759263 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:21.759214 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:43:31.759979 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:31.759931 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:43:41.763252 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:41.763218 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" Apr 24 21:43:43.395436 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.395345 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd"] Apr 24 21:43:43.395995 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.395792 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kserve-container" 
containerID="cri-o://37e13817a521eca88a1579845b93e82df06f96b84274ad0a41f77b44e5f771a0" gracePeriod=30 Apr 24 21:43:43.395995 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.395842 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kube-rbac-proxy" containerID="cri-o://9d296da6937fbc2ee041343f03ae22f6150a905eb207c8894e316d191b93aa1b" gracePeriod=30 Apr 24 21:43:43.494121 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.494085 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22"] Apr 24 21:43:43.494555 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.494515 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5131ad7e-9174-49ef-9b25-764060e0f422" containerName="kube-rbac-proxy" Apr 24 21:43:43.494692 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.494557 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5131ad7e-9174-49ef-9b25-764060e0f422" containerName="kube-rbac-proxy" Apr 24 21:43:43.494692 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.494569 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="storage-initializer" Apr 24 21:43:43.494692 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.494577 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="storage-initializer" Apr 24 21:43:43.494692 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.494602 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5131ad7e-9174-49ef-9b25-764060e0f422" containerName="kserve-container" Apr 24 21:43:43.494692 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.494611 2578 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="5131ad7e-9174-49ef-9b25-764060e0f422" containerName="kserve-container" Apr 24 21:43:43.494692 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.494622 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kube-rbac-proxy" Apr 24 21:43:43.494692 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.494631 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kube-rbac-proxy" Apr 24 21:43:43.494692 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.494642 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kserve-container" Apr 24 21:43:43.494692 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.494652 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kserve-container" Apr 24 21:43:43.494692 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.494668 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="agent" Apr 24 21:43:43.494692 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.494676 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="agent" Apr 24 21:43:43.495268 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.494749 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kserve-container" Apr 24 21:43:43.495268 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.494762 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="kube-rbac-proxy" Apr 24 21:43:43.495268 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.494777 2578 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="5131ad7e-9174-49ef-9b25-764060e0f422" containerName="kserve-container" Apr 24 21:43:43.495268 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.494786 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="5131ad7e-9174-49ef-9b25-764060e0f422" containerName="kube-rbac-proxy" Apr 24 21:43:43.495268 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.494798 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="536557ba-cfa7-4d51-8905-09b279dcd213" containerName="agent" Apr 24 21:43:43.498188 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.498169 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" Apr 24 21:43:43.500782 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.500764 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-ab1d4b-predictor-serving-cert\"" Apr 24 21:43:43.500880 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.500767 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-ab1d4b-kube-rbac-proxy-sar-config\"" Apr 24 21:43:43.507621 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.507597 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22"] Apr 24 21:43:43.633879 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.633840 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-kserve-provision-location\") pod \"isvc-primary-ab1d4b-predictor-669544d645-n5l22\" (UID: \"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e\") " pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" Apr 24 21:43:43.634082 ip-10-0-139-184 kubenswrapper[2578]: I0424 
21:43:43.633960 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-proxy-tls\") pod \"isvc-primary-ab1d4b-predictor-669544d645-n5l22\" (UID: \"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e\") " pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" Apr 24 21:43:43.634082 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.633991 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-ab1d4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-isvc-primary-ab1d4b-kube-rbac-proxy-sar-config\") pod \"isvc-primary-ab1d4b-predictor-669544d645-n5l22\" (UID: \"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e\") " pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" Apr 24 21:43:43.634082 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.634025 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77gw6\" (UniqueName: \"kubernetes.io/projected/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-kube-api-access-77gw6\") pod \"isvc-primary-ab1d4b-predictor-669544d645-n5l22\" (UID: \"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e\") " pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" Apr 24 21:43:43.734604 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.734566 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-proxy-tls\") pod \"isvc-primary-ab1d4b-predictor-669544d645-n5l22\" (UID: \"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e\") " pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" Apr 24 21:43:43.734604 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.734610 2578 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"isvc-primary-ab1d4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-isvc-primary-ab1d4b-kube-rbac-proxy-sar-config\") pod \"isvc-primary-ab1d4b-predictor-669544d645-n5l22\" (UID: \"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e\") " pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" Apr 24 21:43:43.734864 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.734640 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77gw6\" (UniqueName: \"kubernetes.io/projected/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-kube-api-access-77gw6\") pod \"isvc-primary-ab1d4b-predictor-669544d645-n5l22\" (UID: \"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e\") " pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" Apr 24 21:43:43.734864 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.734659 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-kserve-provision-location\") pod \"isvc-primary-ab1d4b-predictor-669544d645-n5l22\" (UID: \"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e\") " pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" Apr 24 21:43:43.734864 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:43:43.734725 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-serving-cert: secret "isvc-primary-ab1d4b-predictor-serving-cert" not found Apr 24 21:43:43.734864 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:43:43.734795 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-proxy-tls podName:19f3e54f-d5d6-4a7a-b4d9-b86fea03954e nodeName:}" failed. No retries permitted until 2026-04-24 21:43:44.234774917 +0000 UTC m=+981.062645048 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-proxy-tls") pod "isvc-primary-ab1d4b-predictor-669544d645-n5l22" (UID: "19f3e54f-d5d6-4a7a-b4d9-b86fea03954e") : secret "isvc-primary-ab1d4b-predictor-serving-cert" not found Apr 24 21:43:43.735102 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.735086 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-kserve-provision-location\") pod \"isvc-primary-ab1d4b-predictor-669544d645-n5l22\" (UID: \"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e\") " pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" Apr 24 21:43:43.735327 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.735307 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-ab1d4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-isvc-primary-ab1d4b-kube-rbac-proxy-sar-config\") pod \"isvc-primary-ab1d4b-predictor-669544d645-n5l22\" (UID: \"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e\") " pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" Apr 24 21:43:43.745801 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.745772 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77gw6\" (UniqueName: \"kubernetes.io/projected/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-kube-api-access-77gw6\") pod \"isvc-primary-ab1d4b-predictor-669544d645-n5l22\" (UID: \"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e\") " pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" Apr 24 21:43:43.889523 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.889487 2578 generic.go:358] "Generic (PLEG): container finished" podID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" 
containerID="9d296da6937fbc2ee041343f03ae22f6150a905eb207c8894e316d191b93aa1b" exitCode=2 Apr 24 21:43:43.890008 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:43.889550 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" event={"ID":"e7a6b280-586b-4d38-8a28-672f8fb0c8f7","Type":"ContainerDied","Data":"9d296da6937fbc2ee041343f03ae22f6150a905eb207c8894e316d191b93aa1b"} Apr 24 21:43:44.239934 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:44.239872 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-proxy-tls\") pod \"isvc-primary-ab1d4b-predictor-669544d645-n5l22\" (UID: \"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e\") " pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" Apr 24 21:43:44.242206 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:44.242185 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-proxy-tls\") pod \"isvc-primary-ab1d4b-predictor-669544d645-n5l22\" (UID: \"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e\") " pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" Apr 24 21:43:44.410733 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:44.410691 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" Apr 24 21:43:44.537478 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:44.537411 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22"] Apr 24 21:43:44.539947 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:43:44.539913 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19f3e54f_d5d6_4a7a_b4d9_b86fea03954e.slice/crio-bdade82c6a3703078705ab3cec39669971d161b7b3b36fcfccf626ef9084c12c WatchSource:0}: Error finding container bdade82c6a3703078705ab3cec39669971d161b7b3b36fcfccf626ef9084c12c: Status 404 returned error can't find the container with id bdade82c6a3703078705ab3cec39669971d161b7b3b36fcfccf626ef9084c12c Apr 24 21:43:44.541943 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:44.541923 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:43:44.893981 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:44.893881 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" event={"ID":"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e","Type":"ContainerStarted","Data":"55090098533ec652b151cb4b57aebe0e9ebc57618417b0816fcf4a921afde20a"} Apr 24 21:43:44.893981 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:44.893934 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" event={"ID":"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e","Type":"ContainerStarted","Data":"bdade82c6a3703078705ab3cec39669971d161b7b3b36fcfccf626ef9084c12c"} Apr 24 21:43:46.484979 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:46.484931 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" 
podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.27:8643/healthz\": dial tcp 10.133.0.27:8643: connect: connection refused" Apr 24 21:43:48.909072 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:48.909037 2578 generic.go:358] "Generic (PLEG): container finished" podID="19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" containerID="55090098533ec652b151cb4b57aebe0e9ebc57618417b0816fcf4a921afde20a" exitCode=0 Apr 24 21:43:48.909457 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:48.909106 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" event={"ID":"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e","Type":"ContainerDied","Data":"55090098533ec652b151cb4b57aebe0e9ebc57618417b0816fcf4a921afde20a"} Apr 24 21:43:49.913911 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:49.913863 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" event={"ID":"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e","Type":"ContainerStarted","Data":"2d4e8146aabc7854a9a91c9d9fcc8223226b090eb1569fbe93319fd67be169cb"} Apr 24 21:43:49.914295 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:49.913928 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" event={"ID":"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e","Type":"ContainerStarted","Data":"521398d8cc3cd878301a8c797f5532a9593e048d6068dade4388b07c2848abce"} Apr 24 21:43:49.914295 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:49.914165 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" Apr 24 21:43:49.935152 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:49.935102 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" podStartSLOduration=6.935085877 podStartE2EDuration="6.935085877s" podCreationTimestamp="2026-04-24 21:43:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:43:49.933660306 +0000 UTC m=+986.761530441" watchObservedRunningTime="2026-04-24 21:43:49.935085877 +0000 UTC m=+986.762956008" Apr 24 21:43:50.916872 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:50.916840 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" Apr 24 21:43:50.918160 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:50.918131 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" podUID="19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 21:43:51.485051 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:51.485008 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.27:8643/healthz\": dial tcp 10.133.0.27:8643: connect: connection refused" Apr 24 21:43:51.760168 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:51.760062 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:43:51.919389 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:51.919354 2578 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" podUID="19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 21:43:52.730783 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.730758 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" Apr 24 21:43:52.810079 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.810031 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-df4lb\" (UniqueName: \"kubernetes.io/projected/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-kube-api-access-df4lb\") pod \"e7a6b280-586b-4d38-8a28-672f8fb0c8f7\" (UID: \"e7a6b280-586b-4d38-8a28-672f8fb0c8f7\") " Apr 24 21:43:52.810079 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.810097 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-kserve-provision-location\") pod \"e7a6b280-586b-4d38-8a28-672f8fb0c8f7\" (UID: \"e7a6b280-586b-4d38-8a28-672f8fb0c8f7\") " Apr 24 21:43:52.810296 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.810141 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-scale-raw-9e9e5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-isvc-sklearn-scale-raw-9e9e5-kube-rbac-proxy-sar-config\") pod \"e7a6b280-586b-4d38-8a28-672f8fb0c8f7\" (UID: \"e7a6b280-586b-4d38-8a28-672f8fb0c8f7\") " Apr 24 21:43:52.810296 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.810172 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-proxy-tls\") pod \"e7a6b280-586b-4d38-8a28-672f8fb0c8f7\" (UID: \"e7a6b280-586b-4d38-8a28-672f8fb0c8f7\") " Apr 24 21:43:52.810509 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.810479 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e7a6b280-586b-4d38-8a28-672f8fb0c8f7" (UID: "e7a6b280-586b-4d38-8a28-672f8fb0c8f7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:43:52.810574 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.810515 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-isvc-sklearn-scale-raw-9e9e5-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-scale-raw-9e9e5-kube-rbac-proxy-sar-config") pod "e7a6b280-586b-4d38-8a28-672f8fb0c8f7" (UID: "e7a6b280-586b-4d38-8a28-672f8fb0c8f7"). InnerVolumeSpecName "isvc-sklearn-scale-raw-9e9e5-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:43:52.812275 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.812245 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-kube-api-access-df4lb" (OuterVolumeSpecName: "kube-api-access-df4lb") pod "e7a6b280-586b-4d38-8a28-672f8fb0c8f7" (UID: "e7a6b280-586b-4d38-8a28-672f8fb0c8f7"). InnerVolumeSpecName "kube-api-access-df4lb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:43:52.812275 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.812250 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e7a6b280-586b-4d38-8a28-672f8fb0c8f7" (UID: "e7a6b280-586b-4d38-8a28-672f8fb0c8f7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:43:52.910957 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.910845 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-df4lb\" (UniqueName: \"kubernetes.io/projected/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-kube-api-access-df4lb\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:43:52.910957 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.910876 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-kserve-provision-location\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:43:52.910957 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.910908 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-scale-raw-9e9e5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-isvc-sklearn-scale-raw-9e9e5-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:43:52.910957 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.910918 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7a6b280-586b-4d38-8a28-672f8fb0c8f7-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:43:52.923089 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.923054 2578 generic.go:358] "Generic (PLEG): container finished" 
podID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerID="37e13817a521eca88a1579845b93e82df06f96b84274ad0a41f77b44e5f771a0" exitCode=0 Apr 24 21:43:52.923518 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.923139 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" event={"ID":"e7a6b280-586b-4d38-8a28-672f8fb0c8f7","Type":"ContainerDied","Data":"37e13817a521eca88a1579845b93e82df06f96b84274ad0a41f77b44e5f771a0"} Apr 24 21:43:52.923518 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.923178 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" event={"ID":"e7a6b280-586b-4d38-8a28-672f8fb0c8f7","Type":"ContainerDied","Data":"f11fa32fa4dc6d23e6dbe6015773acfb3a626e19332410d4293a0152ead8af77"} Apr 24 21:43:52.923518 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.923194 2578 scope.go:117] "RemoveContainer" containerID="9d296da6937fbc2ee041343f03ae22f6150a905eb207c8894e316d191b93aa1b" Apr 24 21:43:52.923518 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.923144 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd" Apr 24 21:43:52.931526 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.931508 2578 scope.go:117] "RemoveContainer" containerID="37e13817a521eca88a1579845b93e82df06f96b84274ad0a41f77b44e5f771a0" Apr 24 21:43:52.938651 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.938632 2578 scope.go:117] "RemoveContainer" containerID="ad501262ea56b797897ea9f034e5d00ecd8a804a19eb4bc33d1476af6c8af4b5" Apr 24 21:43:52.945469 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.945446 2578 scope.go:117] "RemoveContainer" containerID="9d296da6937fbc2ee041343f03ae22f6150a905eb207c8894e316d191b93aa1b" Apr 24 21:43:52.946165 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:43:52.945883 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d296da6937fbc2ee041343f03ae22f6150a905eb207c8894e316d191b93aa1b\": container with ID starting with 9d296da6937fbc2ee041343f03ae22f6150a905eb207c8894e316d191b93aa1b not found: ID does not exist" containerID="9d296da6937fbc2ee041343f03ae22f6150a905eb207c8894e316d191b93aa1b" Apr 24 21:43:52.946165 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.945940 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d296da6937fbc2ee041343f03ae22f6150a905eb207c8894e316d191b93aa1b"} err="failed to get container status \"9d296da6937fbc2ee041343f03ae22f6150a905eb207c8894e316d191b93aa1b\": rpc error: code = NotFound desc = could not find container \"9d296da6937fbc2ee041343f03ae22f6150a905eb207c8894e316d191b93aa1b\": container with ID starting with 9d296da6937fbc2ee041343f03ae22f6150a905eb207c8894e316d191b93aa1b not found: ID does not exist" Apr 24 21:43:52.946165 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.945966 2578 scope.go:117] "RemoveContainer" containerID="37e13817a521eca88a1579845b93e82df06f96b84274ad0a41f77b44e5f771a0" Apr 24 
21:43:52.946400 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:43:52.946249 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37e13817a521eca88a1579845b93e82df06f96b84274ad0a41f77b44e5f771a0\": container with ID starting with 37e13817a521eca88a1579845b93e82df06f96b84274ad0a41f77b44e5f771a0 not found: ID does not exist" containerID="37e13817a521eca88a1579845b93e82df06f96b84274ad0a41f77b44e5f771a0" Apr 24 21:43:52.946400 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.946283 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37e13817a521eca88a1579845b93e82df06f96b84274ad0a41f77b44e5f771a0"} err="failed to get container status \"37e13817a521eca88a1579845b93e82df06f96b84274ad0a41f77b44e5f771a0\": rpc error: code = NotFound desc = could not find container \"37e13817a521eca88a1579845b93e82df06f96b84274ad0a41f77b44e5f771a0\": container with ID starting with 37e13817a521eca88a1579845b93e82df06f96b84274ad0a41f77b44e5f771a0 not found: ID does not exist" Apr 24 21:43:52.946400 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.946302 2578 scope.go:117] "RemoveContainer" containerID="ad501262ea56b797897ea9f034e5d00ecd8a804a19eb4bc33d1476af6c8af4b5" Apr 24 21:43:52.946573 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:43:52.946552 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad501262ea56b797897ea9f034e5d00ecd8a804a19eb4bc33d1476af6c8af4b5\": container with ID starting with ad501262ea56b797897ea9f034e5d00ecd8a804a19eb4bc33d1476af6c8af4b5 not found: ID does not exist" containerID="ad501262ea56b797897ea9f034e5d00ecd8a804a19eb4bc33d1476af6c8af4b5" Apr 24 21:43:52.946626 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.946580 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ad501262ea56b797897ea9f034e5d00ecd8a804a19eb4bc33d1476af6c8af4b5"} err="failed to get container status \"ad501262ea56b797897ea9f034e5d00ecd8a804a19eb4bc33d1476af6c8af4b5\": rpc error: code = NotFound desc = could not find container \"ad501262ea56b797897ea9f034e5d00ecd8a804a19eb4bc33d1476af6c8af4b5\": container with ID starting with ad501262ea56b797897ea9f034e5d00ecd8a804a19eb4bc33d1476af6c8af4b5 not found: ID does not exist" Apr 24 21:43:52.947705 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.947686 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd"] Apr 24 21:43:52.952025 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:52.952005 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-9e9e5-predictor-db45f49bd-nqbvd"] Apr 24 21:43:53.762861 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:53.762819 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" path="/var/lib/kubelet/pods/e7a6b280-586b-4d38-8a28-672f8fb0c8f7/volumes" Apr 24 21:43:56.923736 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:56.923704 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" Apr 24 21:43:56.924250 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:43:56.924227 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" podUID="19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 21:44:06.924961 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:44:06.924919 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" 
podUID="19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 21:44:16.924734 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:44:16.924697 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" podUID="19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 21:44:26.924639 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:44:26.924596 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" podUID="19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 21:44:36.924885 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:44:36.924846 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" podUID="19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 21:44:46.924688 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:44:46.924644 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" podUID="19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 21:44:56.924818 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:44:56.924791 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" Apr 24 21:45:03.656656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.656618 2578 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f"] Apr 24 21:45:03.657054 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.656948 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="storage-initializer" Apr 24 21:45:03.657054 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.656959 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="storage-initializer" Apr 24 21:45:03.657054 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.656970 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kserve-container" Apr 24 21:45:03.657054 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.656976 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kserve-container" Apr 24 21:45:03.657054 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.656986 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kube-rbac-proxy" Apr 24 21:45:03.657054 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.656992 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kube-rbac-proxy" Apr 24 21:45:03.657054 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.657036 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kserve-container" Apr 24 21:45:03.657054 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.657045 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7a6b280-586b-4d38-8a28-672f8fb0c8f7" containerName="kube-rbac-proxy" Apr 24 21:45:03.660326 ip-10-0-139-184 kubenswrapper[2578]: I0424 
21:45:03.660307 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" Apr 24 21:45:03.662955 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.662931 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-ab1d4b-kube-rbac-proxy-sar-config\"" Apr 24 21:45:03.663083 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.662942 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-ab1d4b-predictor-serving-cert\"" Apr 24 21:45:03.663083 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.662987 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-ab1d4b-dockercfg-5wzkj\"" Apr 24 21:45:03.663083 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.663047 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-ab1d4b\"" Apr 24 21:45:03.663231 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.663218 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 21:45:03.673481 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.672148 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f"] Apr 24 21:45:03.737881 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.737843 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a04a75f1-4fff-4cef-b530-d1dba24e65e1-proxy-tls\") pod \"isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f\" (UID: \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\") " pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" Apr 24 21:45:03.738080 
ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.737918 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-ab1d4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a04a75f1-4fff-4cef-b530-d1dba24e65e1-isvc-secondary-ab1d4b-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f\" (UID: \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\") " pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" Apr 24 21:45:03.738080 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.737958 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jlcd\" (UniqueName: \"kubernetes.io/projected/a04a75f1-4fff-4cef-b530-d1dba24e65e1-kube-api-access-6jlcd\") pod \"isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f\" (UID: \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\") " pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" Apr 24 21:45:03.738080 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.737995 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a04a75f1-4fff-4cef-b530-d1dba24e65e1-cabundle-cert\") pod \"isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f\" (UID: \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\") " pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" Apr 24 21:45:03.738080 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.738014 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a04a75f1-4fff-4cef-b530-d1dba24e65e1-kserve-provision-location\") pod \"isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f\" (UID: \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\") " pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" Apr 24 
21:45:03.838498 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.838458 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a04a75f1-4fff-4cef-b530-d1dba24e65e1-proxy-tls\") pod \"isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f\" (UID: \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\") " pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" Apr 24 21:45:03.838703 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.838538 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-ab1d4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a04a75f1-4fff-4cef-b530-d1dba24e65e1-isvc-secondary-ab1d4b-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f\" (UID: \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\") " pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" Apr 24 21:45:03.838775 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:45:03.838705 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-serving-cert: secret "isvc-secondary-ab1d4b-predictor-serving-cert" not found Apr 24 21:45:03.838832 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:45:03.838792 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a04a75f1-4fff-4cef-b530-d1dba24e65e1-proxy-tls podName:a04a75f1-4fff-4cef-b530-d1dba24e65e1 nodeName:}" failed. No retries permitted until 2026-04-24 21:45:04.33876751 +0000 UTC m=+1061.166637625 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a04a75f1-4fff-4cef-b530-d1dba24e65e1-proxy-tls") pod "isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" (UID: "a04a75f1-4fff-4cef-b530-d1dba24e65e1") : secret "isvc-secondary-ab1d4b-predictor-serving-cert" not found Apr 24 21:45:03.838927 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.838839 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jlcd\" (UniqueName: \"kubernetes.io/projected/a04a75f1-4fff-4cef-b530-d1dba24e65e1-kube-api-access-6jlcd\") pod \"isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f\" (UID: \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\") " pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" Apr 24 21:45:03.838997 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.838976 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a04a75f1-4fff-4cef-b530-d1dba24e65e1-cabundle-cert\") pod \"isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f\" (UID: \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\") " pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" Apr 24 21:45:03.839054 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.839027 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a04a75f1-4fff-4cef-b530-d1dba24e65e1-kserve-provision-location\") pod \"isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f\" (UID: \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\") " pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" Apr 24 21:45:03.839394 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.839371 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-ab1d4b-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/a04a75f1-4fff-4cef-b530-d1dba24e65e1-isvc-secondary-ab1d4b-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f\" (UID: \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\") " pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" Apr 24 21:45:03.839394 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.839389 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a04a75f1-4fff-4cef-b530-d1dba24e65e1-kserve-provision-location\") pod \"isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f\" (UID: \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\") " pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" Apr 24 21:45:03.839580 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.839562 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a04a75f1-4fff-4cef-b530-d1dba24e65e1-cabundle-cert\") pod \"isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f\" (UID: \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\") " pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" Apr 24 21:45:03.850086 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:03.850066 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jlcd\" (UniqueName: \"kubernetes.io/projected/a04a75f1-4fff-4cef-b530-d1dba24e65e1-kube-api-access-6jlcd\") pod \"isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f\" (UID: \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\") " pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" Apr 24 21:45:04.343853 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:04.343812 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a04a75f1-4fff-4cef-b530-d1dba24e65e1-proxy-tls\") pod \"isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f\" 
(UID: \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\") " pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" Apr 24 21:45:04.346247 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:04.346211 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a04a75f1-4fff-4cef-b530-d1dba24e65e1-proxy-tls\") pod \"isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f\" (UID: \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\") " pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" Apr 24 21:45:04.572011 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:04.571955 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" Apr 24 21:45:04.692905 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:04.692737 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f"] Apr 24 21:45:04.695747 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:45:04.695721 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda04a75f1_4fff_4cef_b530_d1dba24e65e1.slice/crio-bf083e86d15f564933d2fe8fef436b720c6516eb80a3726273155532456ee9d0 WatchSource:0}: Error finding container bf083e86d15f564933d2fe8fef436b720c6516eb80a3726273155532456ee9d0: Status 404 returned error can't find the container with id bf083e86d15f564933d2fe8fef436b720c6516eb80a3726273155532456ee9d0 Apr 24 21:45:05.134655 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:05.134623 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" event={"ID":"a04a75f1-4fff-4cef-b530-d1dba24e65e1","Type":"ContainerStarted","Data":"780faf2aa91c980ecb0b7799e6d0ee8fc22ce850d9de92cf4270a2217b1f751d"} Apr 24 21:45:05.134655 ip-10-0-139-184 kubenswrapper[2578]: I0424 
21:45:05.134658 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" event={"ID":"a04a75f1-4fff-4cef-b530-d1dba24e65e1","Type":"ContainerStarted","Data":"bf083e86d15f564933d2fe8fef436b720c6516eb80a3726273155532456ee9d0"} Apr 24 21:45:11.152499 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:11.152469 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f_a04a75f1-4fff-4cef-b530-d1dba24e65e1/storage-initializer/0.log" Apr 24 21:45:11.152934 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:11.152509 2578 generic.go:358] "Generic (PLEG): container finished" podID="a04a75f1-4fff-4cef-b530-d1dba24e65e1" containerID="780faf2aa91c980ecb0b7799e6d0ee8fc22ce850d9de92cf4270a2217b1f751d" exitCode=1 Apr 24 21:45:11.152934 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:11.152562 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" event={"ID":"a04a75f1-4fff-4cef-b530-d1dba24e65e1","Type":"ContainerDied","Data":"780faf2aa91c980ecb0b7799e6d0ee8fc22ce850d9de92cf4270a2217b1f751d"} Apr 24 21:45:12.157750 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:12.157720 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f_a04a75f1-4fff-4cef-b530-d1dba24e65e1/storage-initializer/0.log" Apr 24 21:45:12.158155 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:12.157827 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" event={"ID":"a04a75f1-4fff-4cef-b530-d1dba24e65e1","Type":"ContainerStarted","Data":"92ef3c3377dcde3ef65db3bb3dd78ce9cbd13e4bac56522d416b0fc9c8c89baa"} Apr 24 21:45:16.170723 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:16.170695 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f_a04a75f1-4fff-4cef-b530-d1dba24e65e1/storage-initializer/1.log" Apr 24 21:45:16.171138 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:16.171029 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f_a04a75f1-4fff-4cef-b530-d1dba24e65e1/storage-initializer/0.log" Apr 24 21:45:16.171138 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:16.171064 2578 generic.go:358] "Generic (PLEG): container finished" podID="a04a75f1-4fff-4cef-b530-d1dba24e65e1" containerID="92ef3c3377dcde3ef65db3bb3dd78ce9cbd13e4bac56522d416b0fc9c8c89baa" exitCode=1 Apr 24 21:45:16.171210 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:16.171139 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" event={"ID":"a04a75f1-4fff-4cef-b530-d1dba24e65e1","Type":"ContainerDied","Data":"92ef3c3377dcde3ef65db3bb3dd78ce9cbd13e4bac56522d416b0fc9c8c89baa"} Apr 24 21:45:16.171210 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:16.171179 2578 scope.go:117] "RemoveContainer" containerID="780faf2aa91c980ecb0b7799e6d0ee8fc22ce850d9de92cf4270a2217b1f751d" Apr 24 21:45:16.171574 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:16.171559 2578 scope.go:117] "RemoveContainer" containerID="780faf2aa91c980ecb0b7799e6d0ee8fc22ce850d9de92cf4270a2217b1f751d" Apr 24 21:45:16.181468 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:45:16.181444 2578 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f_kserve-ci-e2e-test_a04a75f1-4fff-4cef-b530-d1dba24e65e1_0 in pod sandbox bf083e86d15f564933d2fe8fef436b720c6516eb80a3726273155532456ee9d0 from index: no such id: '780faf2aa91c980ecb0b7799e6d0ee8fc22ce850d9de92cf4270a2217b1f751d'" 
containerID="780faf2aa91c980ecb0b7799e6d0ee8fc22ce850d9de92cf4270a2217b1f751d" Apr 24 21:45:16.181544 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:45:16.181486 2578 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f_kserve-ci-e2e-test_a04a75f1-4fff-4cef-b530-d1dba24e65e1_0 in pod sandbox bf083e86d15f564933d2fe8fef436b720c6516eb80a3726273155532456ee9d0 from index: no such id: '780faf2aa91c980ecb0b7799e6d0ee8fc22ce850d9de92cf4270a2217b1f751d'; Skipping pod \"isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f_kserve-ci-e2e-test(a04a75f1-4fff-4cef-b530-d1dba24e65e1)\"" logger="UnhandledError" Apr 24 21:45:16.182794 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:45:16.182774 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f_kserve-ci-e2e-test(a04a75f1-4fff-4cef-b530-d1dba24e65e1)\"" pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" podUID="a04a75f1-4fff-4cef-b530-d1dba24e65e1" Apr 24 21:45:17.175982 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:17.175956 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f_a04a75f1-4fff-4cef-b530-d1dba24e65e1/storage-initializer/1.log" Apr 24 21:45:21.690832 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.690804 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22"] Apr 24 21:45:21.691230 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.691126 2578 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" podUID="19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" containerName="kserve-container" containerID="cri-o://521398d8cc3cd878301a8c797f5532a9593e048d6068dade4388b07c2848abce" gracePeriod=30 Apr 24 21:45:21.691230 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.691182 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" podUID="19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" containerName="kube-rbac-proxy" containerID="cri-o://2d4e8146aabc7854a9a91c9d9fcc8223226b090eb1569fbe93319fd67be169cb" gracePeriod=30 Apr 24 21:45:21.780533 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.780504 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f"] Apr 24 21:45:21.890532 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.890503 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx"] Apr 24 21:45:21.894021 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.893998 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" Apr 24 21:45:21.896988 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.896965 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-e9425b-predictor-serving-cert\"" Apr 24 21:45:21.897180 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.897156 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-e9425b-kube-rbac-proxy-sar-config\"" Apr 24 21:45:21.897399 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.897384 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-e9425b\"" Apr 24 21:45:21.897511 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.897493 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-e9425b-dockercfg-9dcqj\"" Apr 24 21:45:21.907055 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.907038 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f_a04a75f1-4fff-4cef-b530-d1dba24e65e1/storage-initializer/1.log" Apr 24 21:45:21.907137 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.907093 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" Apr 24 21:45:21.911905 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.911834 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx"] Apr 24 21:45:21.919776 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.919751 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" podUID="19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.28:8643/healthz\": dial tcp 10.133.0.28:8643: connect: connection refused" Apr 24 21:45:21.975901 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.975833 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a04a75f1-4fff-4cef-b530-d1dba24e65e1-kserve-provision-location\") pod \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\" (UID: \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\") " Apr 24 21:45:21.975901 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.975876 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a04a75f1-4fff-4cef-b530-d1dba24e65e1-proxy-tls\") pod \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\" (UID: \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\") " Apr 24 21:45:21.976035 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.975933 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jlcd\" (UniqueName: \"kubernetes.io/projected/a04a75f1-4fff-4cef-b530-d1dba24e65e1-kube-api-access-6jlcd\") pod \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\" (UID: \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\") " Apr 24 21:45:21.976035 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.975962 2578 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"isvc-secondary-ab1d4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a04a75f1-4fff-4cef-b530-d1dba24e65e1-isvc-secondary-ab1d4b-kube-rbac-proxy-sar-config\") pod \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\" (UID: \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\") " Apr 24 21:45:21.976035 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.975989 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a04a75f1-4fff-4cef-b530-d1dba24e65e1-cabundle-cert\") pod \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\" (UID: \"a04a75f1-4fff-4cef-b530-d1dba24e65e1\") " Apr 24 21:45:21.976186 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.976131 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4a20173a-6851-4bc3-b559-cfa77ead9848-proxy-tls\") pod \"isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx\" (UID: \"4a20173a-6851-4bc3-b559-cfa77ead9848\") " pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" Apr 24 21:45:21.976186 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.976143 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a04a75f1-4fff-4cef-b530-d1dba24e65e1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a04a75f1-4fff-4cef-b530-d1dba24e65e1" (UID: "a04a75f1-4fff-4cef-b530-d1dba24e65e1"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:21.976293 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.976212 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r4np\" (UniqueName: \"kubernetes.io/projected/4a20173a-6851-4bc3-b559-cfa77ead9848-kube-api-access-9r4np\") pod \"isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx\" (UID: \"4a20173a-6851-4bc3-b559-cfa77ead9848\") " pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" Apr 24 21:45:21.976293 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.976288 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-e9425b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4a20173a-6851-4bc3-b559-cfa77ead9848-isvc-init-fail-e9425b-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx\" (UID: \"4a20173a-6851-4bc3-b559-cfa77ead9848\") " pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" Apr 24 21:45:21.976407 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.976311 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4a20173a-6851-4bc3-b559-cfa77ead9848-cabundle-cert\") pod \"isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx\" (UID: \"4a20173a-6851-4bc3-b559-cfa77ead9848\") " pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" Apr 24 21:45:21.976407 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.976333 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a04a75f1-4fff-4cef-b530-d1dba24e65e1-isvc-secondary-ab1d4b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-ab1d4b-kube-rbac-proxy-sar-config") pod "a04a75f1-4fff-4cef-b530-d1dba24e65e1" (UID: "a04a75f1-4fff-4cef-b530-d1dba24e65e1"). 
InnerVolumeSpecName "isvc-secondary-ab1d4b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:45:21.976407 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.976374 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a20173a-6851-4bc3-b559-cfa77ead9848-kserve-provision-location\") pod \"isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx\" (UID: \"4a20173a-6851-4bc3-b559-cfa77ead9848\") " pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" Apr 24 21:45:21.976407 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.976390 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a04a75f1-4fff-4cef-b530-d1dba24e65e1-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "a04a75f1-4fff-4cef-b530-d1dba24e65e1" (UID: "a04a75f1-4fff-4cef-b530-d1dba24e65e1"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:45:21.976546 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.976444 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-ab1d4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a04a75f1-4fff-4cef-b530-d1dba24e65e1-isvc-secondary-ab1d4b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:45:21.976546 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.976458 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a04a75f1-4fff-4cef-b530-d1dba24e65e1-kserve-provision-location\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:45:21.977920 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.977881 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a04a75f1-4fff-4cef-b530-d1dba24e65e1-kube-api-access-6jlcd" (OuterVolumeSpecName: "kube-api-access-6jlcd") pod "a04a75f1-4fff-4cef-b530-d1dba24e65e1" (UID: "a04a75f1-4fff-4cef-b530-d1dba24e65e1"). InnerVolumeSpecName "kube-api-access-6jlcd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:45:21.978066 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:21.978050 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a04a75f1-4fff-4cef-b530-d1dba24e65e1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a04a75f1-4fff-4cef-b530-d1dba24e65e1" (UID: "a04a75f1-4fff-4cef-b530-d1dba24e65e1"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:45:22.077490 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:22.077463 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-e9425b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4a20173a-6851-4bc3-b559-cfa77ead9848-isvc-init-fail-e9425b-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx\" (UID: \"4a20173a-6851-4bc3-b559-cfa77ead9848\") " pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" Apr 24 21:45:22.077598 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:22.077495 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4a20173a-6851-4bc3-b559-cfa77ead9848-cabundle-cert\") pod \"isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx\" (UID: \"4a20173a-6851-4bc3-b559-cfa77ead9848\") " pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" Apr 24 21:45:22.077598 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:22.077516 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a20173a-6851-4bc3-b559-cfa77ead9848-kserve-provision-location\") pod \"isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx\" (UID: \"4a20173a-6851-4bc3-b559-cfa77ead9848\") " pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" Apr 24 21:45:22.077706 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:22.077624 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4a20173a-6851-4bc3-b559-cfa77ead9848-proxy-tls\") pod \"isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx\" (UID: \"4a20173a-6851-4bc3-b559-cfa77ead9848\") " pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" Apr 24 21:45:22.077706 ip-10-0-139-184 
kubenswrapper[2578]: I0424 21:45:22.077684 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9r4np\" (UniqueName: \"kubernetes.io/projected/4a20173a-6851-4bc3-b559-cfa77ead9848-kube-api-access-9r4np\") pod \"isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx\" (UID: \"4a20173a-6851-4bc3-b559-cfa77ead9848\") " pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" Apr 24 21:45:22.077792 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:22.077752 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a04a75f1-4fff-4cef-b530-d1dba24e65e1-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:45:22.077792 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:45:22.077771 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-serving-cert: secret "isvc-init-fail-e9425b-predictor-serving-cert" not found Apr 24 21:45:22.077792 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:22.077770 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6jlcd\" (UniqueName: \"kubernetes.io/projected/a04a75f1-4fff-4cef-b530-d1dba24e65e1-kube-api-access-6jlcd\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:45:22.077922 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:22.077798 2578 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a04a75f1-4fff-4cef-b530-d1dba24e65e1-cabundle-cert\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:45:22.077922 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:45:22.077852 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a20173a-6851-4bc3-b559-cfa77ead9848-proxy-tls podName:4a20173a-6851-4bc3-b559-cfa77ead9848 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:45:22.577829943 +0000 UTC m=+1079.405700054 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4a20173a-6851-4bc3-b559-cfa77ead9848-proxy-tls") pod "isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" (UID: "4a20173a-6851-4bc3-b559-cfa77ead9848") : secret "isvc-init-fail-e9425b-predictor-serving-cert" not found Apr 24 21:45:22.078000 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:22.077922 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a20173a-6851-4bc3-b559-cfa77ead9848-kserve-provision-location\") pod \"isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx\" (UID: \"4a20173a-6851-4bc3-b559-cfa77ead9848\") " pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" Apr 24 21:45:22.078165 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:22.078148 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-e9425b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4a20173a-6851-4bc3-b559-cfa77ead9848-isvc-init-fail-e9425b-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx\" (UID: \"4a20173a-6851-4bc3-b559-cfa77ead9848\") " pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" Apr 24 21:45:22.078218 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:22.078201 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4a20173a-6851-4bc3-b559-cfa77ead9848-cabundle-cert\") pod \"isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx\" (UID: \"4a20173a-6851-4bc3-b559-cfa77ead9848\") " pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" Apr 24 21:45:22.086787 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:22.086767 2578 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-9r4np\" (UniqueName: \"kubernetes.io/projected/4a20173a-6851-4bc3-b559-cfa77ead9848-kube-api-access-9r4np\") pod \"isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx\" (UID: \"4a20173a-6851-4bc3-b559-cfa77ead9848\") " pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" Apr 24 21:45:22.192571 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:22.192539 2578 generic.go:358] "Generic (PLEG): container finished" podID="19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" containerID="2d4e8146aabc7854a9a91c9d9fcc8223226b090eb1569fbe93319fd67be169cb" exitCode=2 Apr 24 21:45:22.192685 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:22.192605 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" event={"ID":"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e","Type":"ContainerDied","Data":"2d4e8146aabc7854a9a91c9d9fcc8223226b090eb1569fbe93319fd67be169cb"} Apr 24 21:45:22.193781 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:22.193764 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f_a04a75f1-4fff-4cef-b530-d1dba24e65e1/storage-initializer/1.log" Apr 24 21:45:22.193923 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:22.193836 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" event={"ID":"a04a75f1-4fff-4cef-b530-d1dba24e65e1","Type":"ContainerDied","Data":"bf083e86d15f564933d2fe8fef436b720c6516eb80a3726273155532456ee9d0"} Apr 24 21:45:22.193923 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:22.193860 2578 scope.go:117] "RemoveContainer" containerID="92ef3c3377dcde3ef65db3bb3dd78ce9cbd13e4bac56522d416b0fc9c8c89baa" Apr 24 21:45:22.193923 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:22.193905 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f" Apr 24 21:45:22.231983 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:22.231880 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f"] Apr 24 21:45:22.237284 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:22.237262 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-ab1d4b-predictor-5c9b76987d-qd29f"] Apr 24 21:45:22.581722 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:22.581631 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4a20173a-6851-4bc3-b559-cfa77ead9848-proxy-tls\") pod \"isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx\" (UID: \"4a20173a-6851-4bc3-b559-cfa77ead9848\") " pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" Apr 24 21:45:22.583784 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:22.583757 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4a20173a-6851-4bc3-b559-cfa77ead9848-proxy-tls\") pod \"isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx\" (UID: \"4a20173a-6851-4bc3-b559-cfa77ead9848\") " pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" Apr 24 21:45:22.816326 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:22.816293 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" Apr 24 21:45:22.932971 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:22.932917 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx"] Apr 24 21:45:22.935161 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:45:22.935133 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a20173a_6851_4bc3_b559_cfa77ead9848.slice/crio-fbf5ac55690b57586aa757e87f1dbd45adbee77fdac31ef36f811a0e4dc20924 WatchSource:0}: Error finding container fbf5ac55690b57586aa757e87f1dbd45adbee77fdac31ef36f811a0e4dc20924: Status 404 returned error can't find the container with id fbf5ac55690b57586aa757e87f1dbd45adbee77fdac31ef36f811a0e4dc20924 Apr 24 21:45:23.198659 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:23.198622 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" event={"ID":"4a20173a-6851-4bc3-b559-cfa77ead9848","Type":"ContainerStarted","Data":"fe6acd5d19bd1d6398d81cf6198790e9af61a74bf53ad9f02cb8534e8a776c26"} Apr 24 21:45:23.198659 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:23.198658 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" event={"ID":"4a20173a-6851-4bc3-b559-cfa77ead9848","Type":"ContainerStarted","Data":"fbf5ac55690b57586aa757e87f1dbd45adbee77fdac31ef36f811a0e4dc20924"} Apr 24 21:45:23.762571 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:23.762543 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a04a75f1-4fff-4cef-b530-d1dba24e65e1" path="/var/lib/kubelet/pods/a04a75f1-4fff-4cef-b530-d1dba24e65e1/volumes" Apr 24 21:45:26.030955 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.030935 2578 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" Apr 24 21:45:26.107551 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.107486 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77gw6\" (UniqueName: \"kubernetes.io/projected/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-kube-api-access-77gw6\") pod \"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e\" (UID: \"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e\") " Apr 24 21:45:26.107676 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.107559 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-kserve-provision-location\") pod \"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e\" (UID: \"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e\") " Apr 24 21:45:26.107676 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.107580 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-proxy-tls\") pod \"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e\" (UID: \"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e\") " Apr 24 21:45:26.107676 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.107628 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-ab1d4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-isvc-primary-ab1d4b-kube-rbac-proxy-sar-config\") pod \"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e\" (UID: \"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e\") " Apr 24 21:45:26.107989 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.107961 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" (UID: "19f3e54f-d5d6-4a7a-b4d9-b86fea03954e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:26.108077 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.108029 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-isvc-primary-ab1d4b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-ab1d4b-kube-rbac-proxy-sar-config") pod "19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" (UID: "19f3e54f-d5d6-4a7a-b4d9-b86fea03954e"). InnerVolumeSpecName "isvc-primary-ab1d4b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:45:26.109576 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.109556 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-kube-api-access-77gw6" (OuterVolumeSpecName: "kube-api-access-77gw6") pod "19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" (UID: "19f3e54f-d5d6-4a7a-b4d9-b86fea03954e"). InnerVolumeSpecName "kube-api-access-77gw6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:45:26.109642 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.109606 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" (UID: "19f3e54f-d5d6-4a7a-b4d9-b86fea03954e"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:45:26.208229 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.208200 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-kserve-provision-location\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:45:26.208229 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.208226 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:45:26.208367 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.208237 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-ab1d4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-isvc-primary-ab1d4b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:45:26.208367 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.208247 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-77gw6\" (UniqueName: \"kubernetes.io/projected/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e-kube-api-access-77gw6\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:45:26.211090 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.211065 2578 generic.go:358] "Generic (PLEG): container finished" podID="19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" containerID="521398d8cc3cd878301a8c797f5532a9593e048d6068dade4388b07c2848abce" exitCode=0 Apr 24 21:45:26.211179 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.211121 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" 
event={"ID":"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e","Type":"ContainerDied","Data":"521398d8cc3cd878301a8c797f5532a9593e048d6068dade4388b07c2848abce"} Apr 24 21:45:26.211179 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.211155 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" event={"ID":"19f3e54f-d5d6-4a7a-b4d9-b86fea03954e","Type":"ContainerDied","Data":"bdade82c6a3703078705ab3cec39669971d161b7b3b36fcfccf626ef9084c12c"} Apr 24 21:45:26.211179 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.211158 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22" Apr 24 21:45:26.211284 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.211224 2578 scope.go:117] "RemoveContainer" containerID="2d4e8146aabc7854a9a91c9d9fcc8223226b090eb1569fbe93319fd67be169cb" Apr 24 21:45:26.218881 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.218825 2578 scope.go:117] "RemoveContainer" containerID="521398d8cc3cd878301a8c797f5532a9593e048d6068dade4388b07c2848abce" Apr 24 21:45:26.225664 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.225646 2578 scope.go:117] "RemoveContainer" containerID="55090098533ec652b151cb4b57aebe0e9ebc57618417b0816fcf4a921afde20a" Apr 24 21:45:26.231857 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.231841 2578 scope.go:117] "RemoveContainer" containerID="2d4e8146aabc7854a9a91c9d9fcc8223226b090eb1569fbe93319fd67be169cb" Apr 24 21:45:26.232289 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:45:26.232180 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d4e8146aabc7854a9a91c9d9fcc8223226b090eb1569fbe93319fd67be169cb\": container with ID starting with 2d4e8146aabc7854a9a91c9d9fcc8223226b090eb1569fbe93319fd67be169cb not found: ID does not exist" 
containerID="2d4e8146aabc7854a9a91c9d9fcc8223226b090eb1569fbe93319fd67be169cb" Apr 24 21:45:26.232289 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.232216 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d4e8146aabc7854a9a91c9d9fcc8223226b090eb1569fbe93319fd67be169cb"} err="failed to get container status \"2d4e8146aabc7854a9a91c9d9fcc8223226b090eb1569fbe93319fd67be169cb\": rpc error: code = NotFound desc = could not find container \"2d4e8146aabc7854a9a91c9d9fcc8223226b090eb1569fbe93319fd67be169cb\": container with ID starting with 2d4e8146aabc7854a9a91c9d9fcc8223226b090eb1569fbe93319fd67be169cb not found: ID does not exist" Apr 24 21:45:26.232289 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.232241 2578 scope.go:117] "RemoveContainer" containerID="521398d8cc3cd878301a8c797f5532a9593e048d6068dade4388b07c2848abce" Apr 24 21:45:26.232712 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:45:26.232678 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"521398d8cc3cd878301a8c797f5532a9593e048d6068dade4388b07c2848abce\": container with ID starting with 521398d8cc3cd878301a8c797f5532a9593e048d6068dade4388b07c2848abce not found: ID does not exist" containerID="521398d8cc3cd878301a8c797f5532a9593e048d6068dade4388b07c2848abce" Apr 24 21:45:26.232821 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.232709 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"521398d8cc3cd878301a8c797f5532a9593e048d6068dade4388b07c2848abce"} err="failed to get container status \"521398d8cc3cd878301a8c797f5532a9593e048d6068dade4388b07c2848abce\": rpc error: code = NotFound desc = could not find container \"521398d8cc3cd878301a8c797f5532a9593e048d6068dade4388b07c2848abce\": container with ID starting with 521398d8cc3cd878301a8c797f5532a9593e048d6068dade4388b07c2848abce not found: ID does not exist" Apr 24 
21:45:26.232821 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.232733 2578 scope.go:117] "RemoveContainer" containerID="55090098533ec652b151cb4b57aebe0e9ebc57618417b0816fcf4a921afde20a" Apr 24 21:45:26.233291 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:45:26.233265 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55090098533ec652b151cb4b57aebe0e9ebc57618417b0816fcf4a921afde20a\": container with ID starting with 55090098533ec652b151cb4b57aebe0e9ebc57618417b0816fcf4a921afde20a not found: ID does not exist" containerID="55090098533ec652b151cb4b57aebe0e9ebc57618417b0816fcf4a921afde20a" Apr 24 21:45:26.233401 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.233297 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55090098533ec652b151cb4b57aebe0e9ebc57618417b0816fcf4a921afde20a"} err="failed to get container status \"55090098533ec652b151cb4b57aebe0e9ebc57618417b0816fcf4a921afde20a\": rpc error: code = NotFound desc = could not find container \"55090098533ec652b151cb4b57aebe0e9ebc57618417b0816fcf4a921afde20a\": container with ID starting with 55090098533ec652b151cb4b57aebe0e9ebc57618417b0816fcf4a921afde20a not found: ID does not exist" Apr 24 21:45:26.234617 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.234597 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22"] Apr 24 21:45:26.238211 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:26.238190 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-ab1d4b-predictor-669544d645-n5l22"] Apr 24 21:45:27.762711 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:27.762668 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" path="/var/lib/kubelet/pods/19f3e54f-d5d6-4a7a-b4d9-b86fea03954e/volumes" Apr 24 21:45:29.222503 
ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:29.222479 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx_4a20173a-6851-4bc3-b559-cfa77ead9848/storage-initializer/0.log" Apr 24 21:45:29.222813 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:29.222514 2578 generic.go:358] "Generic (PLEG): container finished" podID="4a20173a-6851-4bc3-b559-cfa77ead9848" containerID="fe6acd5d19bd1d6398d81cf6198790e9af61a74bf53ad9f02cb8534e8a776c26" exitCode=1 Apr 24 21:45:29.222813 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:29.222544 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" event={"ID":"4a20173a-6851-4bc3-b559-cfa77ead9848","Type":"ContainerDied","Data":"fe6acd5d19bd1d6398d81cf6198790e9af61a74bf53ad9f02cb8534e8a776c26"} Apr 24 21:45:30.227182 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:30.227156 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx_4a20173a-6851-4bc3-b559-cfa77ead9848/storage-initializer/0.log" Apr 24 21:45:30.227549 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:30.227262 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" event={"ID":"4a20173a-6851-4bc3-b559-cfa77ead9848","Type":"ContainerStarted","Data":"e0691e0c98dded516bb94a6baab21120d0a072a9bfc56530539e2440200532b5"} Apr 24 21:45:31.849513 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.849428 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx"] Apr 24 21:45:31.849957 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.849681 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" 
podUID="4a20173a-6851-4bc3-b559-cfa77ead9848" containerName="storage-initializer" containerID="cri-o://e0691e0c98dded516bb94a6baab21120d0a072a9bfc56530539e2440200532b5" gracePeriod=30 Apr 24 21:45:31.977649 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.977627 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx_4a20173a-6851-4bc3-b559-cfa77ead9848/storage-initializer/1.log" Apr 24 21:45:31.978002 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.977988 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx_4a20173a-6851-4bc3-b559-cfa77ead9848/storage-initializer/0.log" Apr 24 21:45:31.978091 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.978050 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" Apr 24 21:45:31.982186 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.982163 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44"] Apr 24 21:45:31.982452 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.982440 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" containerName="kube-rbac-proxy" Apr 24 21:45:31.982504 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.982453 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" containerName="kube-rbac-proxy" Apr 24 21:45:31.982504 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.982487 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a04a75f1-4fff-4cef-b530-d1dba24e65e1" containerName="storage-initializer" Apr 24 21:45:31.982504 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.982494 2578 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a04a75f1-4fff-4cef-b530-d1dba24e65e1" containerName="storage-initializer" Apr 24 21:45:31.982609 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.982506 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" containerName="storage-initializer" Apr 24 21:45:31.982609 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.982512 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" containerName="storage-initializer" Apr 24 21:45:31.982609 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.982520 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" containerName="kserve-container" Apr 24 21:45:31.982609 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.982525 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" containerName="kserve-container" Apr 24 21:45:31.982609 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.982536 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a20173a-6851-4bc3-b559-cfa77ead9848" containerName="storage-initializer" Apr 24 21:45:31.982609 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.982541 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a20173a-6851-4bc3-b559-cfa77ead9848" containerName="storage-initializer" Apr 24 21:45:31.982609 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.982547 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a20173a-6851-4bc3-b559-cfa77ead9848" containerName="storage-initializer" Apr 24 21:45:31.982609 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.982565 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a20173a-6851-4bc3-b559-cfa77ead9848" containerName="storage-initializer" Apr 24 21:45:31.982870 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.982638 
2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="4a20173a-6851-4bc3-b559-cfa77ead9848" containerName="storage-initializer" Apr 24 21:45:31.982870 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.982650 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a04a75f1-4fff-4cef-b530-d1dba24e65e1" containerName="storage-initializer" Apr 24 21:45:31.982870 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.982659 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" containerName="kube-rbac-proxy" Apr 24 21:45:31.982870 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.982666 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="19f3e54f-d5d6-4a7a-b4d9-b86fea03954e" containerName="kserve-container" Apr 24 21:45:31.982870 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.982686 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a04a75f1-4fff-4cef-b530-d1dba24e65e1" containerName="storage-initializer" Apr 24 21:45:31.982870 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.982692 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="4a20173a-6851-4bc3-b559-cfa77ead9848" containerName="storage-initializer" Apr 24 21:45:31.982870 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.982773 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a04a75f1-4fff-4cef-b530-d1dba24e65e1" containerName="storage-initializer" Apr 24 21:45:31.982870 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.982782 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a04a75f1-4fff-4cef-b530-d1dba24e65e1" containerName="storage-initializer" Apr 24 21:45:31.985787 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.985771 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" Apr 24 21:45:31.988117 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.988098 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-3a3ae-predictor-serving-cert\"" Apr 24 21:45:31.988220 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.988117 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-8t9rb\"" Apr 24 21:45:31.988220 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.988138 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-3a3ae-kube-rbac-proxy-sar-config\"" Apr 24 21:45:31.993712 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:31.993693 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44"] Apr 24 21:45:32.049284 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.049255 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a20173a-6851-4bc3-b559-cfa77ead9848-kserve-provision-location\") pod \"4a20173a-6851-4bc3-b559-cfa77ead9848\" (UID: \"4a20173a-6851-4bc3-b559-cfa77ead9848\") " Apr 24 21:45:32.049473 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.049290 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-e9425b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4a20173a-6851-4bc3-b559-cfa77ead9848-isvc-init-fail-e9425b-kube-rbac-proxy-sar-config\") pod \"4a20173a-6851-4bc3-b559-cfa77ead9848\" (UID: \"4a20173a-6851-4bc3-b559-cfa77ead9848\") " Apr 24 21:45:32.049473 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.049311 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" 
(UniqueName: \"kubernetes.io/configmap/4a20173a-6851-4bc3-b559-cfa77ead9848-cabundle-cert\") pod \"4a20173a-6851-4bc3-b559-cfa77ead9848\" (UID: \"4a20173a-6851-4bc3-b559-cfa77ead9848\") " Apr 24 21:45:32.049473 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.049349 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4a20173a-6851-4bc3-b559-cfa77ead9848-proxy-tls\") pod \"4a20173a-6851-4bc3-b559-cfa77ead9848\" (UID: \"4a20173a-6851-4bc3-b559-cfa77ead9848\") " Apr 24 21:45:32.049473 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.049375 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r4np\" (UniqueName: \"kubernetes.io/projected/4a20173a-6851-4bc3-b559-cfa77ead9848-kube-api-access-9r4np\") pod \"4a20173a-6851-4bc3-b559-cfa77ead9848\" (UID: \"4a20173a-6851-4bc3-b559-cfa77ead9848\") " Apr 24 21:45:32.049705 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.049516 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9163e3dd-d311-4173-ae91-0ca8da093322-proxy-tls\") pod \"raw-sklearn-3a3ae-predictor-7769cb5696-vds44\" (UID: \"9163e3dd-d311-4173-ae91-0ca8da093322\") " pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" Apr 24 21:45:32.049705 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.049566 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-3a3ae-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9163e3dd-d311-4173-ae91-0ca8da093322-raw-sklearn-3a3ae-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-3a3ae-predictor-7769cb5696-vds44\" (UID: \"9163e3dd-d311-4173-ae91-0ca8da093322\") " pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" Apr 24 21:45:32.049705 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.049586 
2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a20173a-6851-4bc3-b559-cfa77ead9848-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4a20173a-6851-4bc3-b559-cfa77ead9848" (UID: "4a20173a-6851-4bc3-b559-cfa77ead9848"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:32.049705 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.049617 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfcnw\" (UniqueName: \"kubernetes.io/projected/9163e3dd-d311-4173-ae91-0ca8da093322-kube-api-access-pfcnw\") pod \"raw-sklearn-3a3ae-predictor-7769cb5696-vds44\" (UID: \"9163e3dd-d311-4173-ae91-0ca8da093322\") " pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" Apr 24 21:45:32.049705 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.049654 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9163e3dd-d311-4173-ae91-0ca8da093322-kserve-provision-location\") pod \"raw-sklearn-3a3ae-predictor-7769cb5696-vds44\" (UID: \"9163e3dd-d311-4173-ae91-0ca8da093322\") " pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" Apr 24 21:45:32.049705 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.049667 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a20173a-6851-4bc3-b559-cfa77ead9848-isvc-init-fail-e9425b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-e9425b-kube-rbac-proxy-sar-config") pod "4a20173a-6851-4bc3-b559-cfa77ead9848" (UID: "4a20173a-6851-4bc3-b559-cfa77ead9848"). InnerVolumeSpecName "isvc-init-fail-e9425b-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:45:32.049944 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.049704 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a20173a-6851-4bc3-b559-cfa77ead9848-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "4a20173a-6851-4bc3-b559-cfa77ead9848" (UID: "4a20173a-6851-4bc3-b559-cfa77ead9848"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:45:32.049944 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.049725 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a20173a-6851-4bc3-b559-cfa77ead9848-kserve-provision-location\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:45:32.051547 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.051524 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a20173a-6851-4bc3-b559-cfa77ead9848-kube-api-access-9r4np" (OuterVolumeSpecName: "kube-api-access-9r4np") pod "4a20173a-6851-4bc3-b559-cfa77ead9848" (UID: "4a20173a-6851-4bc3-b559-cfa77ead9848"). InnerVolumeSpecName "kube-api-access-9r4np". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:45:32.051612 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.051556 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a20173a-6851-4bc3-b559-cfa77ead9848-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4a20173a-6851-4bc3-b559-cfa77ead9848" (UID: "4a20173a-6851-4bc3-b559-cfa77ead9848"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:45:32.150649 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.150558 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9163e3dd-d311-4173-ae91-0ca8da093322-proxy-tls\") pod \"raw-sklearn-3a3ae-predictor-7769cb5696-vds44\" (UID: \"9163e3dd-d311-4173-ae91-0ca8da093322\") " pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" Apr 24 21:45:32.150649 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.150613 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-3a3ae-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9163e3dd-d311-4173-ae91-0ca8da093322-raw-sklearn-3a3ae-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-3a3ae-predictor-7769cb5696-vds44\" (UID: \"9163e3dd-d311-4173-ae91-0ca8da093322\") " pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" Apr 24 21:45:32.150649 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.150642 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pfcnw\" (UniqueName: \"kubernetes.io/projected/9163e3dd-d311-4173-ae91-0ca8da093322-kube-api-access-pfcnw\") pod \"raw-sklearn-3a3ae-predictor-7769cb5696-vds44\" (UID: \"9163e3dd-d311-4173-ae91-0ca8da093322\") " pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" Apr 24 21:45:32.150918 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.150669 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9163e3dd-d311-4173-ae91-0ca8da093322-kserve-provision-location\") pod \"raw-sklearn-3a3ae-predictor-7769cb5696-vds44\" (UID: \"9163e3dd-d311-4173-ae91-0ca8da093322\") " pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" Apr 24 21:45:32.150918 ip-10-0-139-184 kubenswrapper[2578]: I0424 
21:45:32.150712 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4a20173a-6851-4bc3-b559-cfa77ead9848-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:45:32.150918 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.150727 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9r4np\" (UniqueName: \"kubernetes.io/projected/4a20173a-6851-4bc3-b559-cfa77ead9848-kube-api-access-9r4np\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:45:32.150918 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.150741 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-e9425b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4a20173a-6851-4bc3-b559-cfa77ead9848-isvc-init-fail-e9425b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:45:32.150918 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.150755 2578 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4a20173a-6851-4bc3-b559-cfa77ead9848-cabundle-cert\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:45:32.151158 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.151134 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9163e3dd-d311-4173-ae91-0ca8da093322-kserve-provision-location\") pod \"raw-sklearn-3a3ae-predictor-7769cb5696-vds44\" (UID: \"9163e3dd-d311-4173-ae91-0ca8da093322\") " pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" Apr 24 21:45:32.151451 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.151428 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-3a3ae-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/9163e3dd-d311-4173-ae91-0ca8da093322-raw-sklearn-3a3ae-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-3a3ae-predictor-7769cb5696-vds44\" (UID: \"9163e3dd-d311-4173-ae91-0ca8da093322\") " pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" Apr 24 21:45:32.153136 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.153114 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9163e3dd-d311-4173-ae91-0ca8da093322-proxy-tls\") pod \"raw-sklearn-3a3ae-predictor-7769cb5696-vds44\" (UID: \"9163e3dd-d311-4173-ae91-0ca8da093322\") " pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" Apr 24 21:45:32.161271 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.161244 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfcnw\" (UniqueName: \"kubernetes.io/projected/9163e3dd-d311-4173-ae91-0ca8da093322-kube-api-access-pfcnw\") pod \"raw-sklearn-3a3ae-predictor-7769cb5696-vds44\" (UID: \"9163e3dd-d311-4173-ae91-0ca8da093322\") " pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" Apr 24 21:45:32.233970 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.233944 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx_4a20173a-6851-4bc3-b559-cfa77ead9848/storage-initializer/1.log" Apr 24 21:45:32.234296 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.234282 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx_4a20173a-6851-4bc3-b559-cfa77ead9848/storage-initializer/0.log" Apr 24 21:45:32.234351 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.234329 2578 generic.go:358] "Generic (PLEG): container finished" podID="4a20173a-6851-4bc3-b559-cfa77ead9848" containerID="e0691e0c98dded516bb94a6baab21120d0a072a9bfc56530539e2440200532b5" exitCode=1 
Apr 24 21:45:32.234416 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.234401 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" Apr 24 21:45:32.234452 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.234409 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" event={"ID":"4a20173a-6851-4bc3-b559-cfa77ead9848","Type":"ContainerDied","Data":"e0691e0c98dded516bb94a6baab21120d0a072a9bfc56530539e2440200532b5"} Apr 24 21:45:32.234452 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.234446 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx" event={"ID":"4a20173a-6851-4bc3-b559-cfa77ead9848","Type":"ContainerDied","Data":"fbf5ac55690b57586aa757e87f1dbd45adbee77fdac31ef36f811a0e4dc20924"} Apr 24 21:45:32.234526 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.234463 2578 scope.go:117] "RemoveContainer" containerID="e0691e0c98dded516bb94a6baab21120d0a072a9bfc56530539e2440200532b5" Apr 24 21:45:32.242639 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.242620 2578 scope.go:117] "RemoveContainer" containerID="fe6acd5d19bd1d6398d81cf6198790e9af61a74bf53ad9f02cb8534e8a776c26" Apr 24 21:45:32.249337 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.249320 2578 scope.go:117] "RemoveContainer" containerID="e0691e0c98dded516bb94a6baab21120d0a072a9bfc56530539e2440200532b5" Apr 24 21:45:32.249598 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:45:32.249580 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0691e0c98dded516bb94a6baab21120d0a072a9bfc56530539e2440200532b5\": container with ID starting with e0691e0c98dded516bb94a6baab21120d0a072a9bfc56530539e2440200532b5 not found: ID does not exist" 
containerID="e0691e0c98dded516bb94a6baab21120d0a072a9bfc56530539e2440200532b5" Apr 24 21:45:32.249650 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.249606 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0691e0c98dded516bb94a6baab21120d0a072a9bfc56530539e2440200532b5"} err="failed to get container status \"e0691e0c98dded516bb94a6baab21120d0a072a9bfc56530539e2440200532b5\": rpc error: code = NotFound desc = could not find container \"e0691e0c98dded516bb94a6baab21120d0a072a9bfc56530539e2440200532b5\": container with ID starting with e0691e0c98dded516bb94a6baab21120d0a072a9bfc56530539e2440200532b5 not found: ID does not exist" Apr 24 21:45:32.249650 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.249624 2578 scope.go:117] "RemoveContainer" containerID="fe6acd5d19bd1d6398d81cf6198790e9af61a74bf53ad9f02cb8534e8a776c26" Apr 24 21:45:32.249852 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:45:32.249836 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe6acd5d19bd1d6398d81cf6198790e9af61a74bf53ad9f02cb8534e8a776c26\": container with ID starting with fe6acd5d19bd1d6398d81cf6198790e9af61a74bf53ad9f02cb8534e8a776c26 not found: ID does not exist" containerID="fe6acd5d19bd1d6398d81cf6198790e9af61a74bf53ad9f02cb8534e8a776c26" Apr 24 21:45:32.249965 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.249857 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe6acd5d19bd1d6398d81cf6198790e9af61a74bf53ad9f02cb8534e8a776c26"} err="failed to get container status \"fe6acd5d19bd1d6398d81cf6198790e9af61a74bf53ad9f02cb8534e8a776c26\": rpc error: code = NotFound desc = could not find container \"fe6acd5d19bd1d6398d81cf6198790e9af61a74bf53ad9f02cb8534e8a776c26\": container with ID starting with fe6acd5d19bd1d6398d81cf6198790e9af61a74bf53ad9f02cb8534e8a776c26 not found: ID does not exist" Apr 24 
21:45:32.271049 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.271023 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx"] Apr 24 21:45:32.275392 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.275368 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-e9425b-predictor-75f7ff4474-bpdjx"] Apr 24 21:45:32.296163 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.296139 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" Apr 24 21:45:32.413940 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:32.413911 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44"] Apr 24 21:45:32.416223 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:45:32.416191 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9163e3dd_d311_4173_ae91_0ca8da093322.slice/crio-7234da7f4d4a8db8d180dde0f032785e187b8cf7f13be451900724afb3c788d7 WatchSource:0}: Error finding container 7234da7f4d4a8db8d180dde0f032785e187b8cf7f13be451900724afb3c788d7: Status 404 returned error can't find the container with id 7234da7f4d4a8db8d180dde0f032785e187b8cf7f13be451900724afb3c788d7 Apr 24 21:45:33.238156 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:33.238120 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" event={"ID":"9163e3dd-d311-4173-ae91-0ca8da093322","Type":"ContainerStarted","Data":"cbc3d8a42d8bdb2009c998abdaaa2fb8dc7e24c1126d8b997a3f310ed96fb356"} Apr 24 21:45:33.238156 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:33.238159 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" 
event={"ID":"9163e3dd-d311-4173-ae91-0ca8da093322","Type":"ContainerStarted","Data":"7234da7f4d4a8db8d180dde0f032785e187b8cf7f13be451900724afb3c788d7"} Apr 24 21:45:33.763155 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:33.763125 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a20173a-6851-4bc3-b559-cfa77ead9848" path="/var/lib/kubelet/pods/4a20173a-6851-4bc3-b559-cfa77ead9848/volumes" Apr 24 21:45:36.248131 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:36.248101 2578 generic.go:358] "Generic (PLEG): container finished" podID="9163e3dd-d311-4173-ae91-0ca8da093322" containerID="cbc3d8a42d8bdb2009c998abdaaa2fb8dc7e24c1126d8b997a3f310ed96fb356" exitCode=0 Apr 24 21:45:36.248487 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:36.248163 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" event={"ID":"9163e3dd-d311-4173-ae91-0ca8da093322","Type":"ContainerDied","Data":"cbc3d8a42d8bdb2009c998abdaaa2fb8dc7e24c1126d8b997a3f310ed96fb356"} Apr 24 21:45:37.253068 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:37.253036 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" event={"ID":"9163e3dd-d311-4173-ae91-0ca8da093322","Type":"ContainerStarted","Data":"ae9db03199a9673643033fed62700a2c79df12b4174af2cbb9e2c2d911dc15e9"} Apr 24 21:45:37.253068 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:37.253070 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" event={"ID":"9163e3dd-d311-4173-ae91-0ca8da093322","Type":"ContainerStarted","Data":"b003f4c699c0445e22a045f39c197d3737ab4a8c4153994bcf5e55fe438af921"} Apr 24 21:45:37.253476 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:37.253274 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" 
Apr 24 21:45:37.273387 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:37.273343 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" podStartSLOduration=6.27332957 podStartE2EDuration="6.27332957s" podCreationTimestamp="2026-04-24 21:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:45:37.272165377 +0000 UTC m=+1094.100035536" watchObservedRunningTime="2026-04-24 21:45:37.27332957 +0000 UTC m=+1094.101199702" Apr 24 21:45:38.256131 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:38.256098 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" Apr 24 21:45:38.257472 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:38.257444 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" podUID="9163e3dd-d311-4173-ae91-0ca8da093322" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 21:45:39.259268 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:39.259214 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" podUID="9163e3dd-d311-4173-ae91-0ca8da093322" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 21:45:44.263505 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:44.263476 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" Apr 24 21:45:44.264067 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:44.264040 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" podUID="9163e3dd-d311-4173-ae91-0ca8da093322" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 21:45:54.264584 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:45:54.264542 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" podUID="9163e3dd-d311-4173-ae91-0ca8da093322" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 21:46:04.264885 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:04.264843 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" podUID="9163e3dd-d311-4173-ae91-0ca8da093322" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 21:46:14.264844 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:14.264802 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" podUID="9163e3dd-d311-4173-ae91-0ca8da093322" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 21:46:24.265012 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:24.264971 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" podUID="9163e3dd-d311-4173-ae91-0ca8da093322" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 21:46:34.264305 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:34.264263 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" podUID="9163e3dd-d311-4173-ae91-0ca8da093322" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 21:46:44.264609 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:44.264531 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" Apr 24 21:46:52.089379 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:52.089350 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44"] Apr 24 21:46:52.089864 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:52.089660 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" podUID="9163e3dd-d311-4173-ae91-0ca8da093322" containerName="kserve-container" containerID="cri-o://b003f4c699c0445e22a045f39c197d3737ab4a8c4153994bcf5e55fe438af921" gracePeriod=30 Apr 24 21:46:52.089864 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:52.089703 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" podUID="9163e3dd-d311-4173-ae91-0ca8da093322" containerName="kube-rbac-proxy" containerID="cri-o://ae9db03199a9673643033fed62700a2c79df12b4174af2cbb9e2c2d911dc15e9" gracePeriod=30 Apr 24 21:46:52.177422 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:52.177390 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2"] Apr 24 21:46:52.180947 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:52.180926 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" Apr 24 21:46:52.183686 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:52.183663 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-fba15-predictor-serving-cert\"" Apr 24 21:46:52.183776 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:52.183689 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-fba15-kube-rbac-proxy-sar-config\"" Apr 24 21:46:52.190877 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:52.190854 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2"] Apr 24 21:46:52.296284 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:52.296239 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f621ee3d-e2a2-4f8a-85f4-155e3135575c-kserve-provision-location\") pod \"raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2\" (UID: \"f621ee3d-e2a2-4f8a-85f4-155e3135575c\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" Apr 24 21:46:52.296459 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:52.296371 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxklm\" (UniqueName: \"kubernetes.io/projected/f621ee3d-e2a2-4f8a-85f4-155e3135575c-kube-api-access-rxklm\") pod \"raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2\" (UID: \"f621ee3d-e2a2-4f8a-85f4-155e3135575c\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" Apr 24 21:46:52.296459 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:52.296431 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/f621ee3d-e2a2-4f8a-85f4-155e3135575c-proxy-tls\") pod \"raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2\" (UID: \"f621ee3d-e2a2-4f8a-85f4-155e3135575c\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" Apr 24 21:46:52.296548 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:52.296461 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-runtime-fba15-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f621ee3d-e2a2-4f8a-85f4-155e3135575c-raw-sklearn-runtime-fba15-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2\" (UID: \"f621ee3d-e2a2-4f8a-85f4-155e3135575c\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" Apr 24 21:46:52.397330 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:52.397233 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f621ee3d-e2a2-4f8a-85f4-155e3135575c-kserve-provision-location\") pod \"raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2\" (UID: \"f621ee3d-e2a2-4f8a-85f4-155e3135575c\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" Apr 24 21:46:52.397330 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:52.397304 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxklm\" (UniqueName: \"kubernetes.io/projected/f621ee3d-e2a2-4f8a-85f4-155e3135575c-kube-api-access-rxklm\") pod \"raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2\" (UID: \"f621ee3d-e2a2-4f8a-85f4-155e3135575c\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" Apr 24 21:46:52.397571 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:52.397351 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/f621ee3d-e2a2-4f8a-85f4-155e3135575c-proxy-tls\") pod \"raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2\" (UID: \"f621ee3d-e2a2-4f8a-85f4-155e3135575c\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" Apr 24 21:46:52.397571 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:52.397382 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-runtime-fba15-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f621ee3d-e2a2-4f8a-85f4-155e3135575c-raw-sklearn-runtime-fba15-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2\" (UID: \"f621ee3d-e2a2-4f8a-85f4-155e3135575c\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" Apr 24 21:46:52.397571 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:46:52.397492 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-serving-cert: secret "raw-sklearn-runtime-fba15-predictor-serving-cert" not found Apr 24 21:46:52.397571 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:46:52.397565 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f621ee3d-e2a2-4f8a-85f4-155e3135575c-proxy-tls podName:f621ee3d-e2a2-4f8a-85f4-155e3135575c nodeName:}" failed. No retries permitted until 2026-04-24 21:46:52.897543194 +0000 UTC m=+1169.725413307 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f621ee3d-e2a2-4f8a-85f4-155e3135575c-proxy-tls") pod "raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" (UID: "f621ee3d-e2a2-4f8a-85f4-155e3135575c") : secret "raw-sklearn-runtime-fba15-predictor-serving-cert" not found Apr 24 21:46:52.397798 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:52.397722 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f621ee3d-e2a2-4f8a-85f4-155e3135575c-kserve-provision-location\") pod \"raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2\" (UID: \"f621ee3d-e2a2-4f8a-85f4-155e3135575c\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" Apr 24 21:46:52.398010 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:52.397992 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-runtime-fba15-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f621ee3d-e2a2-4f8a-85f4-155e3135575c-raw-sklearn-runtime-fba15-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2\" (UID: \"f621ee3d-e2a2-4f8a-85f4-155e3135575c\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" Apr 24 21:46:52.406766 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:52.406742 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxklm\" (UniqueName: \"kubernetes.io/projected/f621ee3d-e2a2-4f8a-85f4-155e3135575c-kube-api-access-rxklm\") pod \"raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2\" (UID: \"f621ee3d-e2a2-4f8a-85f4-155e3135575c\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" Apr 24 21:46:52.465055 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:52.465025 2578 generic.go:358] "Generic (PLEG): container finished" podID="9163e3dd-d311-4173-ae91-0ca8da093322" 
containerID="ae9db03199a9673643033fed62700a2c79df12b4174af2cbb9e2c2d911dc15e9" exitCode=2
Apr 24 21:46:52.465202 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:52.465095 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" event={"ID":"9163e3dd-d311-4173-ae91-0ca8da093322","Type":"ContainerDied","Data":"ae9db03199a9673643033fed62700a2c79df12b4174af2cbb9e2c2d911dc15e9"}
Apr 24 21:46:52.901244 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:52.901208 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f621ee3d-e2a2-4f8a-85f4-155e3135575c-proxy-tls\") pod \"raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2\" (UID: \"f621ee3d-e2a2-4f8a-85f4-155e3135575c\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2"
Apr 24 21:46:52.903622 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:52.903601 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f621ee3d-e2a2-4f8a-85f4-155e3135575c-proxy-tls\") pod \"raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2\" (UID: \"f621ee3d-e2a2-4f8a-85f4-155e3135575c\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2"
Apr 24 21:46:53.091643 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:53.091599 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2"
Apr 24 21:46:53.220548 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:53.220343 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2"]
Apr 24 21:46:53.223317 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:46:53.223285 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf621ee3d_e2a2_4f8a_85f4_155e3135575c.slice/crio-2fd1c86fda6f722666d4a82fed4276e563ba6eae31704b682554acae87a89f63 WatchSource:0}: Error finding container 2fd1c86fda6f722666d4a82fed4276e563ba6eae31704b682554acae87a89f63: Status 404 returned error can't find the container with id 2fd1c86fda6f722666d4a82fed4276e563ba6eae31704b682554acae87a89f63
Apr 24 21:46:53.475232 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:53.475189 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" event={"ID":"f621ee3d-e2a2-4f8a-85f4-155e3135575c","Type":"ContainerStarted","Data":"2d2e38ed012bd0a9f1aa83895641c7c96719e523f62459b355e62d266b9ec555"}
Apr 24 21:46:53.475232 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:53.475232 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" event={"ID":"f621ee3d-e2a2-4f8a-85f4-155e3135575c","Type":"ContainerStarted","Data":"2fd1c86fda6f722666d4a82fed4276e563ba6eae31704b682554acae87a89f63"}
Apr 24 21:46:54.259815 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:54.259773 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" podUID="9163e3dd-d311-4173-ae91-0ca8da093322" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.31:8643/healthz\": dial tcp 10.133.0.31:8643: connect: connection refused"
Apr 24 21:46:54.264128 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:54.264099 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" podUID="9163e3dd-d311-4173-ae91-0ca8da093322" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 24 21:46:55.841009 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:55.840986 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44"
Apr 24 21:46:55.923708 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:55.923682 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9163e3dd-d311-4173-ae91-0ca8da093322-proxy-tls\") pod \"9163e3dd-d311-4173-ae91-0ca8da093322\" (UID: \"9163e3dd-d311-4173-ae91-0ca8da093322\") "
Apr 24 21:46:55.923852 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:55.923723 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfcnw\" (UniqueName: \"kubernetes.io/projected/9163e3dd-d311-4173-ae91-0ca8da093322-kube-api-access-pfcnw\") pod \"9163e3dd-d311-4173-ae91-0ca8da093322\" (UID: \"9163e3dd-d311-4173-ae91-0ca8da093322\") "
Apr 24 21:46:55.923852 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:55.923751 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9163e3dd-d311-4173-ae91-0ca8da093322-kserve-provision-location\") pod \"9163e3dd-d311-4173-ae91-0ca8da093322\" (UID: \"9163e3dd-d311-4173-ae91-0ca8da093322\") "
Apr 24 21:46:55.923852 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:55.923788 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-3a3ae-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9163e3dd-d311-4173-ae91-0ca8da093322-raw-sklearn-3a3ae-kube-rbac-proxy-sar-config\") pod \"9163e3dd-d311-4173-ae91-0ca8da093322\" (UID: \"9163e3dd-d311-4173-ae91-0ca8da093322\") "
Apr 24 21:46:55.924169 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:55.924132 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9163e3dd-d311-4173-ae91-0ca8da093322-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9163e3dd-d311-4173-ae91-0ca8da093322" (UID: "9163e3dd-d311-4173-ae91-0ca8da093322"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:46:55.924263 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:55.924189 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9163e3dd-d311-4173-ae91-0ca8da093322-raw-sklearn-3a3ae-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-3a3ae-kube-rbac-proxy-sar-config") pod "9163e3dd-d311-4173-ae91-0ca8da093322" (UID: "9163e3dd-d311-4173-ae91-0ca8da093322"). InnerVolumeSpecName "raw-sklearn-3a3ae-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:46:55.925715 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:55.925691 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9163e3dd-d311-4173-ae91-0ca8da093322-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9163e3dd-d311-4173-ae91-0ca8da093322" (UID: "9163e3dd-d311-4173-ae91-0ca8da093322"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:46:55.925803 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:55.925715 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9163e3dd-d311-4173-ae91-0ca8da093322-kube-api-access-pfcnw" (OuterVolumeSpecName: "kube-api-access-pfcnw") pod "9163e3dd-d311-4173-ae91-0ca8da093322" (UID: "9163e3dd-d311-4173-ae91-0ca8da093322"). InnerVolumeSpecName "kube-api-access-pfcnw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:46:56.024611 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:56.024559 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9163e3dd-d311-4173-ae91-0ca8da093322-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:46:56.024611 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:56.024581 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pfcnw\" (UniqueName: \"kubernetes.io/projected/9163e3dd-d311-4173-ae91-0ca8da093322-kube-api-access-pfcnw\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:46:56.024611 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:56.024592 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9163e3dd-d311-4173-ae91-0ca8da093322-kserve-provision-location\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:46:56.024611 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:56.024603 2578 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-3a3ae-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9163e3dd-d311-4173-ae91-0ca8da093322-raw-sklearn-3a3ae-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:46:56.486981 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:56.486947 2578 generic.go:358] "Generic (PLEG): container finished" podID="9163e3dd-d311-4173-ae91-0ca8da093322" containerID="b003f4c699c0445e22a045f39c197d3737ab4a8c4153994bcf5e55fe438af921" exitCode=0
Apr 24 21:46:56.487119 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:56.487044 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44"
Apr 24 21:46:56.487165 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:56.487033 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" event={"ID":"9163e3dd-d311-4173-ae91-0ca8da093322","Type":"ContainerDied","Data":"b003f4c699c0445e22a045f39c197d3737ab4a8c4153994bcf5e55fe438af921"}
Apr 24 21:46:56.487165 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:56.487162 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44" event={"ID":"9163e3dd-d311-4173-ae91-0ca8da093322","Type":"ContainerDied","Data":"7234da7f4d4a8db8d180dde0f032785e187b8cf7f13be451900724afb3c788d7"}
Apr 24 21:46:56.487236 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:56.487180 2578 scope.go:117] "RemoveContainer" containerID="ae9db03199a9673643033fed62700a2c79df12b4174af2cbb9e2c2d911dc15e9"
Apr 24 21:46:56.496036 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:56.496016 2578 scope.go:117] "RemoveContainer" containerID="b003f4c699c0445e22a045f39c197d3737ab4a8c4153994bcf5e55fe438af921"
Apr 24 21:46:56.503151 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:56.503131 2578 scope.go:117] "RemoveContainer" containerID="cbc3d8a42d8bdb2009c998abdaaa2fb8dc7e24c1126d8b997a3f310ed96fb356"
Apr 24 21:46:56.510251 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:56.510226 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44"]
Apr 24 21:46:56.510972 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:56.510947 2578 scope.go:117] "RemoveContainer" containerID="ae9db03199a9673643033fed62700a2c79df12b4174af2cbb9e2c2d911dc15e9"
Apr 24 21:46:56.511307 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:46:56.511281 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae9db03199a9673643033fed62700a2c79df12b4174af2cbb9e2c2d911dc15e9\": container with ID starting with ae9db03199a9673643033fed62700a2c79df12b4174af2cbb9e2c2d911dc15e9 not found: ID does not exist" containerID="ae9db03199a9673643033fed62700a2c79df12b4174af2cbb9e2c2d911dc15e9"
Apr 24 21:46:56.511411 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:56.511313 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae9db03199a9673643033fed62700a2c79df12b4174af2cbb9e2c2d911dc15e9"} err="failed to get container status \"ae9db03199a9673643033fed62700a2c79df12b4174af2cbb9e2c2d911dc15e9\": rpc error: code = NotFound desc = could not find container \"ae9db03199a9673643033fed62700a2c79df12b4174af2cbb9e2c2d911dc15e9\": container with ID starting with ae9db03199a9673643033fed62700a2c79df12b4174af2cbb9e2c2d911dc15e9 not found: ID does not exist"
Apr 24 21:46:56.511411 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:56.511331 2578 scope.go:117] "RemoveContainer" containerID="b003f4c699c0445e22a045f39c197d3737ab4a8c4153994bcf5e55fe438af921"
Apr 24 21:46:56.511616 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:46:56.511596 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b003f4c699c0445e22a045f39c197d3737ab4a8c4153994bcf5e55fe438af921\": container with ID starting with b003f4c699c0445e22a045f39c197d3737ab4a8c4153994bcf5e55fe438af921 not found: ID does not exist" containerID="b003f4c699c0445e22a045f39c197d3737ab4a8c4153994bcf5e55fe438af921"
Apr 24 21:46:56.511664 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:56.511627 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b003f4c699c0445e22a045f39c197d3737ab4a8c4153994bcf5e55fe438af921"} err="failed to get container status \"b003f4c699c0445e22a045f39c197d3737ab4a8c4153994bcf5e55fe438af921\": rpc error: code = NotFound desc = could not find container \"b003f4c699c0445e22a045f39c197d3737ab4a8c4153994bcf5e55fe438af921\": container with ID starting with b003f4c699c0445e22a045f39c197d3737ab4a8c4153994bcf5e55fe438af921 not found: ID does not exist"
Apr 24 21:46:56.511664 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:56.511645 2578 scope.go:117] "RemoveContainer" containerID="cbc3d8a42d8bdb2009c998abdaaa2fb8dc7e24c1126d8b997a3f310ed96fb356"
Apr 24 21:46:56.511924 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:46:56.511905 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbc3d8a42d8bdb2009c998abdaaa2fb8dc7e24c1126d8b997a3f310ed96fb356\": container with ID starting with cbc3d8a42d8bdb2009c998abdaaa2fb8dc7e24c1126d8b997a3f310ed96fb356 not found: ID does not exist" containerID="cbc3d8a42d8bdb2009c998abdaaa2fb8dc7e24c1126d8b997a3f310ed96fb356"
Apr 24 21:46:56.511993 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:56.511935 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbc3d8a42d8bdb2009c998abdaaa2fb8dc7e24c1126d8b997a3f310ed96fb356"} err="failed to get container status \"cbc3d8a42d8bdb2009c998abdaaa2fb8dc7e24c1126d8b997a3f310ed96fb356\": rpc error: code = NotFound desc = could not find container \"cbc3d8a42d8bdb2009c998abdaaa2fb8dc7e24c1126d8b997a3f310ed96fb356\": container with ID starting with cbc3d8a42d8bdb2009c998abdaaa2fb8dc7e24c1126d8b997a3f310ed96fb356 not found: ID does not exist"
Apr 24 21:46:56.514514 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:56.514494 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-3a3ae-predictor-7769cb5696-vds44"]
Apr 24 21:46:57.491345 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:57.491316 2578 generic.go:358] "Generic (PLEG): container finished" podID="f621ee3d-e2a2-4f8a-85f4-155e3135575c" containerID="2d2e38ed012bd0a9f1aa83895641c7c96719e523f62459b355e62d266b9ec555" exitCode=0
Apr 24 21:46:57.491670 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:57.491348 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" event={"ID":"f621ee3d-e2a2-4f8a-85f4-155e3135575c","Type":"ContainerDied","Data":"2d2e38ed012bd0a9f1aa83895641c7c96719e523f62459b355e62d266b9ec555"}
Apr 24 21:46:57.767415 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:57.767342 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9163e3dd-d311-4173-ae91-0ca8da093322" path="/var/lib/kubelet/pods/9163e3dd-d311-4173-ae91-0ca8da093322/volumes"
Apr 24 21:46:58.496377 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:58.496340 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" event={"ID":"f621ee3d-e2a2-4f8a-85f4-155e3135575c","Type":"ContainerStarted","Data":"d40fe2e4231a9645d2158833c846e06f91d59471e8778e64023c1e7c62ad0dea"}
Apr 24 21:46:58.496723 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:58.496383 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" event={"ID":"f621ee3d-e2a2-4f8a-85f4-155e3135575c","Type":"ContainerStarted","Data":"e859088e7a93b136834b6e6a4e009d4d3a20d3affce3ac883890f7096aa19169"}
Apr 24 21:46:58.496723 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:58.496615 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2"
Apr 24 21:46:58.521496 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:58.521456 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" podStartSLOduration=6.521445211 podStartE2EDuration="6.521445211s" podCreationTimestamp="2026-04-24 21:46:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:46:58.520594058 +0000 UTC m=+1175.348464190" watchObservedRunningTime="2026-04-24 21:46:58.521445211 +0000 UTC m=+1175.349315343"
Apr 24 21:46:59.499709 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:59.499672 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2"
Apr 24 21:46:59.500901 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:46:59.500858 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" podUID="f621ee3d-e2a2-4f8a-85f4-155e3135575c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 21:47:00.502711 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:47:00.502673 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" podUID="f621ee3d-e2a2-4f8a-85f4-155e3135575c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 21:47:05.508018 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:47:05.507982 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2"
Apr 24 21:47:05.508588 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:47:05.508557 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" podUID="f621ee3d-e2a2-4f8a-85f4-155e3135575c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 21:47:15.508480 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:47:15.508441 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" podUID="f621ee3d-e2a2-4f8a-85f4-155e3135575c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 21:47:23.697080 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:47:23.697049 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krb9p_cde5bc87-530f-4ee7-8f38-39b875bbd4e6/ovn-acl-logging/0.log"
Apr 24 21:47:23.699832 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:47:23.699810 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krb9p_cde5bc87-530f-4ee7-8f38-39b875bbd4e6/ovn-acl-logging/0.log"
Apr 24 21:47:25.509404 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:47:25.509366 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" podUID="f621ee3d-e2a2-4f8a-85f4-155e3135575c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 21:47:35.509425 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:47:35.509381 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" podUID="f621ee3d-e2a2-4f8a-85f4-155e3135575c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 21:47:45.508846 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:47:45.508805 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" podUID="f621ee3d-e2a2-4f8a-85f4-155e3135575c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 21:47:55.509049 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:47:55.509008 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" podUID="f621ee3d-e2a2-4f8a-85f4-155e3135575c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 21:48:05.509219 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:05.509183 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2"
Apr 24 21:48:12.262403 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:12.262365 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2"]
Apr 24 21:48:12.262850 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:12.262676 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" podUID="f621ee3d-e2a2-4f8a-85f4-155e3135575c" containerName="kserve-container" containerID="cri-o://e859088e7a93b136834b6e6a4e009d4d3a20d3affce3ac883890f7096aa19169" gracePeriod=30
Apr 24 21:48:12.262850 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:12.262715 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" podUID="f621ee3d-e2a2-4f8a-85f4-155e3135575c" containerName="kube-rbac-proxy" containerID="cri-o://d40fe2e4231a9645d2158833c846e06f91d59471e8778e64023c1e7c62ad0dea" gracePeriod=30
Apr 24 21:48:12.713319 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:12.713286 2578 generic.go:358] "Generic (PLEG): container finished" podID="f621ee3d-e2a2-4f8a-85f4-155e3135575c" containerID="d40fe2e4231a9645d2158833c846e06f91d59471e8778e64023c1e7c62ad0dea" exitCode=2
Apr 24 21:48:12.713474 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:12.713362 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" event={"ID":"f621ee3d-e2a2-4f8a-85f4-155e3135575c","Type":"ContainerDied","Data":"d40fe2e4231a9645d2158833c846e06f91d59471e8778e64023c1e7c62ad0dea"}
Apr 24 21:48:13.696269 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:13.696235 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6d6gw/must-gather-w9rsw"]
Apr 24 21:48:13.696671 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:13.696547 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9163e3dd-d311-4173-ae91-0ca8da093322" containerName="storage-initializer"
Apr 24 21:48:13.696671 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:13.696559 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9163e3dd-d311-4173-ae91-0ca8da093322" containerName="storage-initializer"
Apr 24 21:48:13.696671 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:13.696576 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9163e3dd-d311-4173-ae91-0ca8da093322" containerName="kserve-container"
Apr 24 21:48:13.696671 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:13.696582 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9163e3dd-d311-4173-ae91-0ca8da093322" containerName="kserve-container"
Apr 24 21:48:13.696671 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:13.696589 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9163e3dd-d311-4173-ae91-0ca8da093322" containerName="kube-rbac-proxy"
Apr 24 21:48:13.696671 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:13.696595 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9163e3dd-d311-4173-ae91-0ca8da093322" containerName="kube-rbac-proxy"
Apr 24 21:48:13.696671 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:13.696637 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="9163e3dd-d311-4173-ae91-0ca8da093322" containerName="kserve-container"
Apr 24 21:48:13.696671 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:13.696647 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="9163e3dd-d311-4173-ae91-0ca8da093322" containerName="kube-rbac-proxy"
Apr 24 21:48:13.699592 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:13.699575 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6d6gw/must-gather-w9rsw"
Apr 24 21:48:13.702323 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:13.702293 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6d6gw\"/\"kube-root-ca.crt\""
Apr 24 21:48:13.703419 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:13.703404 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6d6gw\"/\"default-dockercfg-5x5n6\""
Apr 24 21:48:13.703472 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:13.703408 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6d6gw\"/\"openshift-service-ca.crt\""
Apr 24 21:48:13.707119 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:13.707095 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6d6gw/must-gather-w9rsw"]
Apr 24 21:48:13.753095 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:13.753055 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c0675631-e176-4fa8-a420-59807bf7b747-must-gather-output\") pod \"must-gather-w9rsw\" (UID: \"c0675631-e176-4fa8-a420-59807bf7b747\") " pod="openshift-must-gather-6d6gw/must-gather-w9rsw"
Apr 24 21:48:13.753278 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:13.753106 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd5f6\" (UniqueName: \"kubernetes.io/projected/c0675631-e176-4fa8-a420-59807bf7b747-kube-api-access-pd5f6\") pod \"must-gather-w9rsw\" (UID: \"c0675631-e176-4fa8-a420-59807bf7b747\") " pod="openshift-must-gather-6d6gw/must-gather-w9rsw"
Apr 24 21:48:13.853878 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:13.853840 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c0675631-e176-4fa8-a420-59807bf7b747-must-gather-output\") pod \"must-gather-w9rsw\" (UID: \"c0675631-e176-4fa8-a420-59807bf7b747\") " pod="openshift-must-gather-6d6gw/must-gather-w9rsw"
Apr 24 21:48:13.854079 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:13.853917 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pd5f6\" (UniqueName: \"kubernetes.io/projected/c0675631-e176-4fa8-a420-59807bf7b747-kube-api-access-pd5f6\") pod \"must-gather-w9rsw\" (UID: \"c0675631-e176-4fa8-a420-59807bf7b747\") " pod="openshift-must-gather-6d6gw/must-gather-w9rsw"
Apr 24 21:48:13.854203 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:13.854178 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c0675631-e176-4fa8-a420-59807bf7b747-must-gather-output\") pod \"must-gather-w9rsw\" (UID: \"c0675631-e176-4fa8-a420-59807bf7b747\") " pod="openshift-must-gather-6d6gw/must-gather-w9rsw"
Apr 24 21:48:13.863198 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:13.863177 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd5f6\" (UniqueName: \"kubernetes.io/projected/c0675631-e176-4fa8-a420-59807bf7b747-kube-api-access-pd5f6\") pod \"must-gather-w9rsw\" (UID: \"c0675631-e176-4fa8-a420-59807bf7b747\") " pod="openshift-must-gather-6d6gw/must-gather-w9rsw"
Apr 24 21:48:14.020524 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:14.020431 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6d6gw/must-gather-w9rsw"
Apr 24 21:48:14.146736 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:14.146705 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6d6gw/must-gather-w9rsw"]
Apr 24 21:48:14.149422 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:48:14.149381 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0675631_e176_4fa8_a420_59807bf7b747.slice/crio-d2346821c53beda4a54ff94a385662ca3752b56a48d04a21a7e030b7f7e50c40 WatchSource:0}: Error finding container d2346821c53beda4a54ff94a385662ca3752b56a48d04a21a7e030b7f7e50c40: Status 404 returned error can't find the container with id d2346821c53beda4a54ff94a385662ca3752b56a48d04a21a7e030b7f7e50c40
Apr 24 21:48:14.720452 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:14.720413 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6d6gw/must-gather-w9rsw" event={"ID":"c0675631-e176-4fa8-a420-59807bf7b747","Type":"ContainerStarted","Data":"d2346821c53beda4a54ff94a385662ca3752b56a48d04a21a7e030b7f7e50c40"}
Apr 24 21:48:15.503445 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:15.503399 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" podUID="f621ee3d-e2a2-4f8a-85f4-155e3135575c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection refused"
Apr 24 21:48:15.508788 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:15.508750 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" podUID="f621ee3d-e2a2-4f8a-85f4-155e3135575c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 21:48:17.732370 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:17.732332 2578 generic.go:358] "Generic (PLEG): container finished" podID="f621ee3d-e2a2-4f8a-85f4-155e3135575c" containerID="e859088e7a93b136834b6e6a4e009d4d3a20d3affce3ac883890f7096aa19169" exitCode=0
Apr 24 21:48:17.732758 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:17.732413 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" event={"ID":"f621ee3d-e2a2-4f8a-85f4-155e3135575c","Type":"ContainerDied","Data":"e859088e7a93b136834b6e6a4e009d4d3a20d3affce3ac883890f7096aa19169"}
Apr 24 21:48:18.721154 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:18.721129 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2"
Apr 24 21:48:18.736306 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:18.736274 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2" event={"ID":"f621ee3d-e2a2-4f8a-85f4-155e3135575c","Type":"ContainerDied","Data":"2fd1c86fda6f722666d4a82fed4276e563ba6eae31704b682554acae87a89f63"}
Apr 24 21:48:18.736655 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:18.736314 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2"
Apr 24 21:48:18.736655 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:18.736326 2578 scope.go:117] "RemoveContainer" containerID="d40fe2e4231a9645d2158833c846e06f91d59471e8778e64023c1e7c62ad0dea"
Apr 24 21:48:18.751134 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:18.751091 2578 scope.go:117] "RemoveContainer" containerID="e859088e7a93b136834b6e6a4e009d4d3a20d3affce3ac883890f7096aa19169"
Apr 24 21:48:18.760145 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:18.760019 2578 scope.go:117] "RemoveContainer" containerID="2d2e38ed012bd0a9f1aa83895641c7c96719e523f62459b355e62d266b9ec555"
Apr 24 21:48:18.796998 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:18.796967 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-runtime-fba15-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f621ee3d-e2a2-4f8a-85f4-155e3135575c-raw-sklearn-runtime-fba15-kube-rbac-proxy-sar-config\") pod \"f621ee3d-e2a2-4f8a-85f4-155e3135575c\" (UID: \"f621ee3d-e2a2-4f8a-85f4-155e3135575c\") "
Apr 24 21:48:18.797134 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:18.797095 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f621ee3d-e2a2-4f8a-85f4-155e3135575c-kserve-provision-location\") pod \"f621ee3d-e2a2-4f8a-85f4-155e3135575c\" (UID: \"f621ee3d-e2a2-4f8a-85f4-155e3135575c\") "
Apr 24 21:48:18.797289 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:18.797263 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxklm\" (UniqueName: \"kubernetes.io/projected/f621ee3d-e2a2-4f8a-85f4-155e3135575c-kube-api-access-rxklm\") pod \"f621ee3d-e2a2-4f8a-85f4-155e3135575c\" (UID: \"f621ee3d-e2a2-4f8a-85f4-155e3135575c\") "
Apr 24 21:48:18.797399 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:18.797310 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f621ee3d-e2a2-4f8a-85f4-155e3135575c-proxy-tls\") pod \"f621ee3d-e2a2-4f8a-85f4-155e3135575c\" (UID: \"f621ee3d-e2a2-4f8a-85f4-155e3135575c\") "
Apr 24 21:48:18.797399 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:18.797337 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f621ee3d-e2a2-4f8a-85f4-155e3135575c-raw-sklearn-runtime-fba15-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-runtime-fba15-kube-rbac-proxy-sar-config") pod "f621ee3d-e2a2-4f8a-85f4-155e3135575c" (UID: "f621ee3d-e2a2-4f8a-85f4-155e3135575c"). InnerVolumeSpecName "raw-sklearn-runtime-fba15-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:48:18.797537 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:18.797405 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f621ee3d-e2a2-4f8a-85f4-155e3135575c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f621ee3d-e2a2-4f8a-85f4-155e3135575c" (UID: "f621ee3d-e2a2-4f8a-85f4-155e3135575c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:48:18.797596 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:18.797547 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f621ee3d-e2a2-4f8a-85f4-155e3135575c-kserve-provision-location\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:48:18.797596 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:18.797567 2578 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-runtime-fba15-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f621ee3d-e2a2-4f8a-85f4-155e3135575c-raw-sklearn-runtime-fba15-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:48:18.799926 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:18.799868 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f621ee3d-e2a2-4f8a-85f4-155e3135575c-kube-api-access-rxklm" (OuterVolumeSpecName: "kube-api-access-rxklm") pod "f621ee3d-e2a2-4f8a-85f4-155e3135575c" (UID: "f621ee3d-e2a2-4f8a-85f4-155e3135575c"). InnerVolumeSpecName "kube-api-access-rxklm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:48:18.800020 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:18.799975 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f621ee3d-e2a2-4f8a-85f4-155e3135575c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f621ee3d-e2a2-4f8a-85f4-155e3135575c" (UID: "f621ee3d-e2a2-4f8a-85f4-155e3135575c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:48:18.898798 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:18.898769 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f621ee3d-e2a2-4f8a-85f4-155e3135575c-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:48:18.898798 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:18.898799 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rxklm\" (UniqueName: \"kubernetes.io/projected/f621ee3d-e2a2-4f8a-85f4-155e3135575c-kube-api-access-rxklm\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 24 21:48:19.064985 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:19.064953 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2"]
Apr 24 21:48:19.068929 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:19.068880 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-fba15-predictor-5985f556c8-fcbb2"]
Apr 24 21:48:19.740807 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:19.740770 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6d6gw/must-gather-w9rsw" event={"ID":"c0675631-e176-4fa8-a420-59807bf7b747","Type":"ContainerStarted","Data":"465fe0e8920f730354d9d85c03f284ce9f57349a430c832dc6e9a4abf58286da"}
Apr 24 21:48:19.740807 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:19.740812 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6d6gw/must-gather-w9rsw" event={"ID":"c0675631-e176-4fa8-a420-59807bf7b747","Type":"ContainerStarted","Data":"2c4ecd94ed4a2dde3be5e8c0c2e47adc73e452f2929e2e3f55decea8bcdd8e61"}
Apr 24 21:48:19.756883 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:19.756827 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6d6gw/must-gather-w9rsw"
podStartSLOduration=2.132619339 podStartE2EDuration="6.756809374s" podCreationTimestamp="2026-04-24 21:48:13 +0000 UTC" firstStartedPulling="2026-04-24 21:48:14.151095424 +0000 UTC m=+1250.978965537" lastFinishedPulling="2026-04-24 21:48:18.775285462 +0000 UTC m=+1255.603155572" observedRunningTime="2026-04-24 21:48:19.756201671 +0000 UTC m=+1256.584071803" watchObservedRunningTime="2026-04-24 21:48:19.756809374 +0000 UTC m=+1256.584679506" Apr 24 21:48:19.762482 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:19.762449 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f621ee3d-e2a2-4f8a-85f4-155e3135575c" path="/var/lib/kubelet/pods/f621ee3d-e2a2-4f8a-85f4-155e3135575c/volumes" Apr 24 21:48:38.804627 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:38.804597 2578 generic.go:358] "Generic (PLEG): container finished" podID="c0675631-e176-4fa8-a420-59807bf7b747" containerID="2c4ecd94ed4a2dde3be5e8c0c2e47adc73e452f2929e2e3f55decea8bcdd8e61" exitCode=0 Apr 24 21:48:38.805058 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:38.804677 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6d6gw/must-gather-w9rsw" event={"ID":"c0675631-e176-4fa8-a420-59807bf7b747","Type":"ContainerDied","Data":"2c4ecd94ed4a2dde3be5e8c0c2e47adc73e452f2929e2e3f55decea8bcdd8e61"} Apr 24 21:48:38.805058 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:38.804968 2578 scope.go:117] "RemoveContainer" containerID="2c4ecd94ed4a2dde3be5e8c0c2e47adc73e452f2929e2e3f55decea8bcdd8e61" Apr 24 21:48:38.842602 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:38.842577 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6d6gw_must-gather-w9rsw_c0675631-e176-4fa8-a420-59807bf7b747/gather/0.log" Apr 24 21:48:39.379480 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:39.379450 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ngn69/must-gather-dwgpb"] Apr 24 21:48:39.379770 ip-10-0-139-184 
kubenswrapper[2578]: I0424 21:48:39.379757 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f621ee3d-e2a2-4f8a-85f4-155e3135575c" containerName="kube-rbac-proxy" Apr 24 21:48:39.379816 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:39.379773 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f621ee3d-e2a2-4f8a-85f4-155e3135575c" containerName="kube-rbac-proxy" Apr 24 21:48:39.379816 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:39.379788 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f621ee3d-e2a2-4f8a-85f4-155e3135575c" containerName="kserve-container" Apr 24 21:48:39.379816 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:39.379793 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f621ee3d-e2a2-4f8a-85f4-155e3135575c" containerName="kserve-container" Apr 24 21:48:39.379816 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:39.379810 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f621ee3d-e2a2-4f8a-85f4-155e3135575c" containerName="storage-initializer" Apr 24 21:48:39.379816 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:39.379816 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f621ee3d-e2a2-4f8a-85f4-155e3135575c" containerName="storage-initializer" Apr 24 21:48:39.380029 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:39.379870 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f621ee3d-e2a2-4f8a-85f4-155e3135575c" containerName="kube-rbac-proxy" Apr 24 21:48:39.380029 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:39.379881 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f621ee3d-e2a2-4f8a-85f4-155e3135575c" containerName="kserve-container" Apr 24 21:48:39.382871 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:39.382851 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ngn69/must-gather-dwgpb" Apr 24 21:48:39.385614 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:39.385597 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-ngn69\"/\"kube-root-ca.crt\"" Apr 24 21:48:39.386716 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:39.386700 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-ngn69\"/\"default-dockercfg-b5l6f\"" Apr 24 21:48:39.386787 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:39.386703 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-ngn69\"/\"openshift-service-ca.crt\"" Apr 24 21:48:39.390422 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:39.390400 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ngn69/must-gather-dwgpb"] Apr 24 21:48:39.478221 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:39.478194 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjqjv\" (UniqueName: \"kubernetes.io/projected/d90f6785-c349-4543-a074-4cbe08eb1897-kube-api-access-zjqjv\") pod \"must-gather-dwgpb\" (UID: \"d90f6785-c349-4543-a074-4cbe08eb1897\") " pod="openshift-must-gather-ngn69/must-gather-dwgpb" Apr 24 21:48:39.478317 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:39.478250 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d90f6785-c349-4543-a074-4cbe08eb1897-must-gather-output\") pod \"must-gather-dwgpb\" (UID: \"d90f6785-c349-4543-a074-4cbe08eb1897\") " pod="openshift-must-gather-ngn69/must-gather-dwgpb" Apr 24 21:48:39.578584 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:39.578558 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjqjv\" (UniqueName: 
\"kubernetes.io/projected/d90f6785-c349-4543-a074-4cbe08eb1897-kube-api-access-zjqjv\") pod \"must-gather-dwgpb\" (UID: \"d90f6785-c349-4543-a074-4cbe08eb1897\") " pod="openshift-must-gather-ngn69/must-gather-dwgpb" Apr 24 21:48:39.578690 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:39.578602 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d90f6785-c349-4543-a074-4cbe08eb1897-must-gather-output\") pod \"must-gather-dwgpb\" (UID: \"d90f6785-c349-4543-a074-4cbe08eb1897\") " pod="openshift-must-gather-ngn69/must-gather-dwgpb" Apr 24 21:48:39.578876 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:39.578850 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d90f6785-c349-4543-a074-4cbe08eb1897-must-gather-output\") pod \"must-gather-dwgpb\" (UID: \"d90f6785-c349-4543-a074-4cbe08eb1897\") " pod="openshift-must-gather-ngn69/must-gather-dwgpb" Apr 24 21:48:39.588421 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:39.588399 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjqjv\" (UniqueName: \"kubernetes.io/projected/d90f6785-c349-4543-a074-4cbe08eb1897-kube-api-access-zjqjv\") pod \"must-gather-dwgpb\" (UID: \"d90f6785-c349-4543-a074-4cbe08eb1897\") " pod="openshift-must-gather-ngn69/must-gather-dwgpb" Apr 24 21:48:39.692212 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:39.692191 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ngn69/must-gather-dwgpb" Apr 24 21:48:39.809456 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:39.809420 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ngn69/must-gather-dwgpb"] Apr 24 21:48:39.812065 ip-10-0-139-184 kubenswrapper[2578]: W0424 21:48:39.812039 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd90f6785_c349_4543_a074_4cbe08eb1897.slice/crio-c086610005ac095f683f4c80a9dc1f45abfa758d66ce1bb8057006202ce909a5 WatchSource:0}: Error finding container c086610005ac095f683f4c80a9dc1f45abfa758d66ce1bb8057006202ce909a5: Status 404 returned error can't find the container with id c086610005ac095f683f4c80a9dc1f45abfa758d66ce1bb8057006202ce909a5 Apr 24 21:48:40.811750 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:40.811715 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ngn69/must-gather-dwgpb" event={"ID":"d90f6785-c349-4543-a074-4cbe08eb1897","Type":"ContainerStarted","Data":"c086610005ac095f683f4c80a9dc1f45abfa758d66ce1bb8057006202ce909a5"} Apr 24 21:48:41.818588 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:41.818552 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ngn69/must-gather-dwgpb" event={"ID":"d90f6785-c349-4543-a074-4cbe08eb1897","Type":"ContainerStarted","Data":"a8f8969d22b9628e179fcd5f1db853a0e2b9859fed27f23bf113c20a123e279d"} Apr 24 21:48:41.819158 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:41.819131 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ngn69/must-gather-dwgpb" event={"ID":"d90f6785-c349-4543-a074-4cbe08eb1897","Type":"ContainerStarted","Data":"e722b989578a8e727284d80938f987e2ef42ea086593398970b9f1f1dff357f2"} Apr 24 21:48:41.842798 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:41.842733 2578 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-ngn69/must-gather-dwgpb" podStartSLOduration=1.9181126750000002 podStartE2EDuration="2.842713892s" podCreationTimestamp="2026-04-24 21:48:39 +0000 UTC" firstStartedPulling="2026-04-24 21:48:39.81376614 +0000 UTC m=+1276.641636255" lastFinishedPulling="2026-04-24 21:48:40.738367359 +0000 UTC m=+1277.566237472" observedRunningTime="2026-04-24 21:48:41.834729716 +0000 UTC m=+1278.662599874" watchObservedRunningTime="2026-04-24 21:48:41.842713892 +0000 UTC m=+1278.670584025" Apr 24 21:48:42.347068 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:42.347031 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-hxdgr_7a2b19a8-7cce-48ea-a91f-3306187c2d2a/global-pull-secret-syncer/0.log" Apr 24 21:48:42.428224 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:42.428192 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-sprws_70296c6e-ab82-4b02-8f53-04c16225df28/konnectivity-agent/0.log" Apr 24 21:48:42.542221 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:42.542191 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-184.ec2.internal_e3440d423eacb4ec58cf1cc320321a41/haproxy/0.log" Apr 24 21:48:44.220440 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:44.220406 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6d6gw/must-gather-w9rsw"] Apr 24 21:48:44.221425 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:44.221390 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-6d6gw/must-gather-w9rsw" podUID="c0675631-e176-4fa8-a420-59807bf7b747" containerName="copy" containerID="cri-o://465fe0e8920f730354d9d85c03f284ce9f57349a430c832dc6e9a4abf58286da" gracePeriod=2 Apr 24 21:48:44.223836 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:44.223794 2578 status_manager.go:895] "Failed to get status for pod" 
podUID="c0675631-e176-4fa8-a420-59807bf7b747" pod="openshift-must-gather-6d6gw/must-gather-w9rsw" err="pods \"must-gather-w9rsw\" is forbidden: User \"system:node:ip-10-0-139-184.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-6d6gw\": no relationship found between node 'ip-10-0-139-184.ec2.internal' and this object" Apr 24 21:48:44.226236 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:44.226210 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6d6gw/must-gather-w9rsw"] Apr 24 21:48:44.572775 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:44.571634 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6d6gw_must-gather-w9rsw_c0675631-e176-4fa8-a420-59807bf7b747/copy/0.log" Apr 24 21:48:44.572775 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:44.572056 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6d6gw/must-gather-w9rsw" Apr 24 21:48:44.575121 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:44.575081 2578 status_manager.go:895] "Failed to get status for pod" podUID="c0675631-e176-4fa8-a420-59807bf7b747" pod="openshift-must-gather-6d6gw/must-gather-w9rsw" err="pods \"must-gather-w9rsw\" is forbidden: User \"system:node:ip-10-0-139-184.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-6d6gw\": no relationship found between node 'ip-10-0-139-184.ec2.internal' and this object" Apr 24 21:48:44.724960 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:44.724277 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c0675631-e176-4fa8-a420-59807bf7b747-must-gather-output\") pod \"c0675631-e176-4fa8-a420-59807bf7b747\" (UID: \"c0675631-e176-4fa8-a420-59807bf7b747\") " Apr 24 21:48:44.724960 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:44.724391 
2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd5f6\" (UniqueName: \"kubernetes.io/projected/c0675631-e176-4fa8-a420-59807bf7b747-kube-api-access-pd5f6\") pod \"c0675631-e176-4fa8-a420-59807bf7b747\" (UID: \"c0675631-e176-4fa8-a420-59807bf7b747\") " Apr 24 21:48:44.728048 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:44.728017 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0675631-e176-4fa8-a420-59807bf7b747-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c0675631-e176-4fa8-a420-59807bf7b747" (UID: "c0675631-e176-4fa8-a420-59807bf7b747"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:48:44.731401 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:44.731305 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0675631-e176-4fa8-a420-59807bf7b747-kube-api-access-pd5f6" (OuterVolumeSpecName: "kube-api-access-pd5f6") pod "c0675631-e176-4fa8-a420-59807bf7b747" (UID: "c0675631-e176-4fa8-a420-59807bf7b747"). InnerVolumeSpecName "kube-api-access-pd5f6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:48:44.825953 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:44.825853 2578 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c0675631-e176-4fa8-a420-59807bf7b747-must-gather-output\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:48:44.826250 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:44.826230 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pd5f6\" (UniqueName: \"kubernetes.io/projected/c0675631-e176-4fa8-a420-59807bf7b747-kube-api-access-pd5f6\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 24 21:48:44.831072 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:44.831046 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6d6gw_must-gather-w9rsw_c0675631-e176-4fa8-a420-59807bf7b747/copy/0.log" Apr 24 21:48:44.831461 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:44.831428 2578 generic.go:358] "Generic (PLEG): container finished" podID="c0675631-e176-4fa8-a420-59807bf7b747" containerID="465fe0e8920f730354d9d85c03f284ce9f57349a430c832dc6e9a4abf58286da" exitCode=143 Apr 24 21:48:44.831578 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:44.831564 2578 scope.go:117] "RemoveContainer" containerID="465fe0e8920f730354d9d85c03f284ce9f57349a430c832dc6e9a4abf58286da" Apr 24 21:48:44.831722 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:44.831706 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6d6gw/must-gather-w9rsw" Apr 24 21:48:44.837404 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:44.837318 2578 status_manager.go:895] "Failed to get status for pod" podUID="c0675631-e176-4fa8-a420-59807bf7b747" pod="openshift-must-gather-6d6gw/must-gather-w9rsw" err="pods \"must-gather-w9rsw\" is forbidden: User \"system:node:ip-10-0-139-184.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-6d6gw\": no relationship found between node 'ip-10-0-139-184.ec2.internal' and this object" Apr 24 21:48:44.847013 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:44.846939 2578 scope.go:117] "RemoveContainer" containerID="2c4ecd94ed4a2dde3be5e8c0c2e47adc73e452f2929e2e3f55decea8bcdd8e61" Apr 24 21:48:44.855696 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:44.855655 2578 status_manager.go:895] "Failed to get status for pod" podUID="c0675631-e176-4fa8-a420-59807bf7b747" pod="openshift-must-gather-6d6gw/must-gather-w9rsw" err="pods \"must-gather-w9rsw\" is forbidden: User \"system:node:ip-10-0-139-184.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-6d6gw\": no relationship found between node 'ip-10-0-139-184.ec2.internal' and this object" Apr 24 21:48:44.870661 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:44.870558 2578 scope.go:117] "RemoveContainer" containerID="465fe0e8920f730354d9d85c03f284ce9f57349a430c832dc6e9a4abf58286da" Apr 24 21:48:44.871672 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:48:44.871402 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"465fe0e8920f730354d9d85c03f284ce9f57349a430c832dc6e9a4abf58286da\": container with ID starting with 465fe0e8920f730354d9d85c03f284ce9f57349a430c832dc6e9a4abf58286da not found: ID does not exist" containerID="465fe0e8920f730354d9d85c03f284ce9f57349a430c832dc6e9a4abf58286da" Apr 24 
21:48:44.871672 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:44.871442 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"465fe0e8920f730354d9d85c03f284ce9f57349a430c832dc6e9a4abf58286da"} err="failed to get container status \"465fe0e8920f730354d9d85c03f284ce9f57349a430c832dc6e9a4abf58286da\": rpc error: code = NotFound desc = could not find container \"465fe0e8920f730354d9d85c03f284ce9f57349a430c832dc6e9a4abf58286da\": container with ID starting with 465fe0e8920f730354d9d85c03f284ce9f57349a430c832dc6e9a4abf58286da not found: ID does not exist" Apr 24 21:48:44.871672 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:44.871469 2578 scope.go:117] "RemoveContainer" containerID="2c4ecd94ed4a2dde3be5e8c0c2e47adc73e452f2929e2e3f55decea8bcdd8e61" Apr 24 21:48:44.873173 ip-10-0-139-184 kubenswrapper[2578]: E0424 21:48:44.872483 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c4ecd94ed4a2dde3be5e8c0c2e47adc73e452f2929e2e3f55decea8bcdd8e61\": container with ID starting with 2c4ecd94ed4a2dde3be5e8c0c2e47adc73e452f2929e2e3f55decea8bcdd8e61 not found: ID does not exist" containerID="2c4ecd94ed4a2dde3be5e8c0c2e47adc73e452f2929e2e3f55decea8bcdd8e61" Apr 24 21:48:44.873173 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:44.872523 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c4ecd94ed4a2dde3be5e8c0c2e47adc73e452f2929e2e3f55decea8bcdd8e61"} err="failed to get container status \"2c4ecd94ed4a2dde3be5e8c0c2e47adc73e452f2929e2e3f55decea8bcdd8e61\": rpc error: code = NotFound desc = could not find container \"2c4ecd94ed4a2dde3be5e8c0c2e47adc73e452f2929e2e3f55decea8bcdd8e61\": container with ID starting with 2c4ecd94ed4a2dde3be5e8c0c2e47adc73e452f2929e2e3f55decea8bcdd8e61 not found: ID does not exist" Apr 24 21:48:45.539456 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:45.539419 2578 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4d98a752-54c8-4291-8475-dcb2e5810620/alertmanager/0.log" Apr 24 21:48:45.575183 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:45.575042 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4d98a752-54c8-4291-8475-dcb2e5810620/config-reloader/0.log" Apr 24 21:48:45.609773 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:45.609733 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4d98a752-54c8-4291-8475-dcb2e5810620/kube-rbac-proxy-web/0.log" Apr 24 21:48:45.646785 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:45.646726 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4d98a752-54c8-4291-8475-dcb2e5810620/kube-rbac-proxy/0.log" Apr 24 21:48:45.672659 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:45.672512 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4d98a752-54c8-4291-8475-dcb2e5810620/kube-rbac-proxy-metric/0.log" Apr 24 21:48:45.702129 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:45.702024 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4d98a752-54c8-4291-8475-dcb2e5810620/prom-label-proxy/0.log" Apr 24 21:48:45.727691 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:45.727661 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4d98a752-54c8-4291-8475-dcb2e5810620/init-config-reloader/0.log" Apr 24 21:48:45.762980 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:45.762950 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0675631-e176-4fa8-a420-59807bf7b747" path="/var/lib/kubelet/pods/c0675631-e176-4fa8-a420-59807bf7b747/volumes" Apr 24 21:48:45.794209 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:45.794080 
2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-4nrh7_21d7eb0f-d8da-459c-b217-8af29243babb/kube-state-metrics/0.log" Apr 24 21:48:45.822346 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:45.822315 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-4nrh7_21d7eb0f-d8da-459c-b217-8af29243babb/kube-rbac-proxy-main/0.log" Apr 24 21:48:45.846144 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:45.846113 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-4nrh7_21d7eb0f-d8da-459c-b217-8af29243babb/kube-rbac-proxy-self/0.log" Apr 24 21:48:45.909842 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:45.909810 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-jq98m_c27bccf5-72a8-41fb-9c10-af53b192d121/monitoring-plugin/0.log" Apr 24 21:48:46.034943 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:46.034914 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-88w6j_b016b4cc-60cb-4132-b4b1-86f46bcd2620/node-exporter/0.log" Apr 24 21:48:46.058930 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:46.058829 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-88w6j_b016b4cc-60cb-4132-b4b1-86f46bcd2620/kube-rbac-proxy/0.log" Apr 24 21:48:46.082626 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:46.082599 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-88w6j_b016b4cc-60cb-4132-b4b1-86f46bcd2620/init-textfile/0.log" Apr 24 21:48:46.306542 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:46.306508 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_56428658-1c56-461f-912d-7d5323992858/prometheus/0.log" Apr 24 21:48:46.327849 ip-10-0-139-184 
kubenswrapper[2578]: I0424 21:48:46.327763 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_56428658-1c56-461f-912d-7d5323992858/config-reloader/0.log" Apr 24 21:48:46.353667 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:46.353638 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_56428658-1c56-461f-912d-7d5323992858/thanos-sidecar/0.log" Apr 24 21:48:46.380304 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:46.380201 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_56428658-1c56-461f-912d-7d5323992858/kube-rbac-proxy-web/0.log" Apr 24 21:48:46.409949 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:46.409922 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_56428658-1c56-461f-912d-7d5323992858/kube-rbac-proxy/0.log" Apr 24 21:48:46.437589 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:46.437557 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_56428658-1c56-461f-912d-7d5323992858/kube-rbac-proxy-thanos/0.log" Apr 24 21:48:46.464879 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:46.464847 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_56428658-1c56-461f-912d-7d5323992858/init-config-reloader/0.log" Apr 24 21:48:46.502211 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:46.502182 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-zjcvd_f5b38541-c21b-4c73-a0c6-d8a6907a689c/prometheus-operator/0.log" Apr 24 21:48:46.528984 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:46.528947 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-zjcvd_f5b38541-c21b-4c73-a0c6-d8a6907a689c/kube-rbac-proxy/0.log" Apr 24 21:48:49.448434 
ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.448378 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54"]
Apr 24 21:48:49.449034 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.448845 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0675631-e176-4fa8-a420-59807bf7b747" containerName="gather"
Apr 24 21:48:49.449034 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.448865 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0675631-e176-4fa8-a420-59807bf7b747" containerName="gather"
Apr 24 21:48:49.449034 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.448904 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0675631-e176-4fa8-a420-59807bf7b747" containerName="copy"
Apr 24 21:48:49.449034 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.448915 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0675631-e176-4fa8-a420-59807bf7b747" containerName="copy"
Apr 24 21:48:49.449034 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.449027 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0675631-e176-4fa8-a420-59807bf7b747" containerName="copy"
Apr 24 21:48:49.449326 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.449044 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0675631-e176-4fa8-a420-59807bf7b747" containerName="gather"
Apr 24 21:48:49.452336 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.452315 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54"
Apr 24 21:48:49.463553 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.463524 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54"]
Apr 24 21:48:49.568203 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.568112 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f3b3e602-6112-4608-91fc-143478a8ea18-proc\") pod \"perf-node-gather-daemonset-gmj54\" (UID: \"f3b3e602-6112-4608-91fc-143478a8ea18\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54"
Apr 24 21:48:49.568402 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.568239 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f3b3e602-6112-4608-91fc-143478a8ea18-podres\") pod \"perf-node-gather-daemonset-gmj54\" (UID: \"f3b3e602-6112-4608-91fc-143478a8ea18\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54"
Apr 24 21:48:49.568402 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.568315 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grxfz\" (UniqueName: \"kubernetes.io/projected/f3b3e602-6112-4608-91fc-143478a8ea18-kube-api-access-grxfz\") pod \"perf-node-gather-daemonset-gmj54\" (UID: \"f3b3e602-6112-4608-91fc-143478a8ea18\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54"
Apr 24 21:48:49.568560 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.568349 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f3b3e602-6112-4608-91fc-143478a8ea18-lib-modules\") pod \"perf-node-gather-daemonset-gmj54\" (UID: \"f3b3e602-6112-4608-91fc-143478a8ea18\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54"
Apr 24 21:48:49.568618 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.568607 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f3b3e602-6112-4608-91fc-143478a8ea18-sys\") pod \"perf-node-gather-daemonset-gmj54\" (UID: \"f3b3e602-6112-4608-91fc-143478a8ea18\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54"
Apr 24 21:48:49.669000 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.668967 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f3b3e602-6112-4608-91fc-143478a8ea18-proc\") pod \"perf-node-gather-daemonset-gmj54\" (UID: \"f3b3e602-6112-4608-91fc-143478a8ea18\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54"
Apr 24 21:48:49.669176 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.669010 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f3b3e602-6112-4608-91fc-143478a8ea18-podres\") pod \"perf-node-gather-daemonset-gmj54\" (UID: \"f3b3e602-6112-4608-91fc-143478a8ea18\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54"
Apr 24 21:48:49.669176 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.669048 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grxfz\" (UniqueName: \"kubernetes.io/projected/f3b3e602-6112-4608-91fc-143478a8ea18-kube-api-access-grxfz\") pod \"perf-node-gather-daemonset-gmj54\" (UID: \"f3b3e602-6112-4608-91fc-143478a8ea18\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54"
Apr 24 21:48:49.669176 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.669070 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f3b3e602-6112-4608-91fc-143478a8ea18-lib-modules\") pod \"perf-node-gather-daemonset-gmj54\" (UID: \"f3b3e602-6112-4608-91fc-143478a8ea18\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54"
Apr 24 21:48:49.669176 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.669081 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f3b3e602-6112-4608-91fc-143478a8ea18-proc\") pod \"perf-node-gather-daemonset-gmj54\" (UID: \"f3b3e602-6112-4608-91fc-143478a8ea18\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54"
Apr 24 21:48:49.669176 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.669138 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f3b3e602-6112-4608-91fc-143478a8ea18-sys\") pod \"perf-node-gather-daemonset-gmj54\" (UID: \"f3b3e602-6112-4608-91fc-143478a8ea18\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54"
Apr 24 21:48:49.669394 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.669191 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f3b3e602-6112-4608-91fc-143478a8ea18-lib-modules\") pod \"perf-node-gather-daemonset-gmj54\" (UID: \"f3b3e602-6112-4608-91fc-143478a8ea18\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54"
Apr 24 21:48:49.669394 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.669196 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f3b3e602-6112-4608-91fc-143478a8ea18-podres\") pod \"perf-node-gather-daemonset-gmj54\" (UID: \"f3b3e602-6112-4608-91fc-143478a8ea18\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54"
Apr 24 21:48:49.669394 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.669266 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f3b3e602-6112-4608-91fc-143478a8ea18-sys\") pod \"perf-node-gather-daemonset-gmj54\" (UID: \"f3b3e602-6112-4608-91fc-143478a8ea18\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54"
Apr 24 21:48:49.678650 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.678619 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grxfz\" (UniqueName: \"kubernetes.io/projected/f3b3e602-6112-4608-91fc-143478a8ea18-kube-api-access-grxfz\") pod \"perf-node-gather-daemonset-gmj54\" (UID: \"f3b3e602-6112-4608-91fc-143478a8ea18\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54"
Apr 24 21:48:49.765884 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.765791 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54"
Apr 24 21:48:49.911469 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.911435 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54"]
Apr 24 21:48:49.920907 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:49.920062 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:48:50.296760 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:50.296729 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-thx4s_632b9ab1-38e5-4787-8970-d57a96875bdb/dns/0.log"
Apr 24 21:48:50.318576 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:50.318545 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-thx4s_632b9ab1-38e5-4787-8970-d57a96875bdb/kube-rbac-proxy/0.log"
Apr 24 21:48:50.392846 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:50.392820 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w7q5f_2556ea37-119f-46c5-bee7-7cfb12afca0f/dns-node-resolver/0.log"
Apr 24 21:48:50.852656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:50.852623 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54" event={"ID":"f3b3e602-6112-4608-91fc-143478a8ea18","Type":"ContainerStarted","Data":"d59afa2e648d069d14a8a2f60af392e70b781b99dcbc26dfd8675d927d4a9f8a"}
Apr 24 21:48:50.852656 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:50.852663 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54" event={"ID":"f3b3e602-6112-4608-91fc-143478a8ea18","Type":"ContainerStarted","Data":"170b835b50c3378c04dfd9e757a77a3d5a5bc24ffeae24e8ee811a23bcb6a762"}
Apr 24 21:48:50.853146 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:50.852756 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54"
Apr 24 21:48:50.869502 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:50.869452 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54" podStartSLOduration=1.8694345079999999 podStartE2EDuration="1.869434508s" podCreationTimestamp="2026-04-24 21:48:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:48:50.868862724 +0000 UTC m=+1287.696732857" watchObservedRunningTime="2026-04-24 21:48:50.869434508 +0000 UTC m=+1287.697304652"
Apr 24 21:48:50.911987 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:50.911961 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jzqgz_d0f7d715-6263-49f8-ac7b-21d48a5b4438/node-ca/0.log"
Apr 24 21:48:52.058728 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:52.058693 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9nl9q_917804a7-0eca-4b76-8504-d8f2e2dc5b74/serve-healthcheck-canary/0.log"
Apr 24 21:48:52.662020 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:52.661993 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hdm54_af5e838b-1b18-4b92-ba27-f6f304af2d94/kube-rbac-proxy/0.log"
Apr 24 21:48:52.687978 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:52.687952 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hdm54_af5e838b-1b18-4b92-ba27-f6f304af2d94/exporter/0.log"
Apr 24 21:48:52.724550 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:52.724529 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hdm54_af5e838b-1b18-4b92-ba27-f6f304af2d94/extractor/0.log"
Apr 24 21:48:54.651087 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:54.651012 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-84b6647887-v847j_866f00a9-a524-4cfb-a324-423d668078f9/manager/0.log"
Apr 24 21:48:54.833709 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:54.833681 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-z7thh_0d43d473-09e2-4943-958f-11ec70b2b290/s3-init/0.log"
Apr 24 21:48:54.864165 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:54.864134 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-w6nkn_6ce76489-f504-4d62-9d64-cc62a81ad768/seaweedfs/0.log"
Apr 24 21:48:56.865297 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:48:56.865267 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-gmj54"
Apr 24 21:49:01.075326 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:49:01.075294 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xs9hq_6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc/kube-multus-additional-cni-plugins/0.log"
Apr 24 21:49:01.111110 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:49:01.111086 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xs9hq_6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc/egress-router-binary-copy/0.log"
Apr 24 21:49:01.143355 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:49:01.143329 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xs9hq_6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc/cni-plugins/0.log"
Apr 24 21:49:01.180813 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:49:01.180789 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xs9hq_6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc/bond-cni-plugin/0.log"
Apr 24 21:49:01.207074 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:49:01.207007 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xs9hq_6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc/routeoverride-cni/0.log"
Apr 24 21:49:01.237260 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:49:01.237237 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xs9hq_6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc/whereabouts-cni-bincopy/0.log"
Apr 24 21:49:01.261368 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:49:01.261351 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xs9hq_6e5f4710-6ba6-44f1-ac71-c7d39ea48ffc/whereabouts-cni/0.log"
Apr 24 21:49:01.377551 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:49:01.377521 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xp275_94172220-e322-483c-ae3c-254c0bface83/kube-multus/0.log"
Apr 24 21:49:01.468662 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:49:01.468585 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lkk5b_20a034bb-c3d2-4d05-92de-ed16d2eda707/network-metrics-daemon/0.log"
Apr 24 21:49:01.503626 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:49:01.503600 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lkk5b_20a034bb-c3d2-4d05-92de-ed16d2eda707/kube-rbac-proxy/0.log"
Apr 24 21:49:02.664552 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:49:02.664509 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krb9p_cde5bc87-530f-4ee7-8f38-39b875bbd4e6/ovn-controller/0.log"
Apr 24 21:49:02.689286 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:49:02.689240 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krb9p_cde5bc87-530f-4ee7-8f38-39b875bbd4e6/ovn-acl-logging/0.log"
Apr 24 21:49:02.699434 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:49:02.699403 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krb9p_cde5bc87-530f-4ee7-8f38-39b875bbd4e6/ovn-acl-logging/1.log"
Apr 24 21:49:02.719376 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:49:02.719352 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krb9p_cde5bc87-530f-4ee7-8f38-39b875bbd4e6/kube-rbac-proxy-node/0.log"
Apr 24 21:49:02.753586 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:49:02.753558 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krb9p_cde5bc87-530f-4ee7-8f38-39b875bbd4e6/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 21:49:02.777232 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:49:02.777204 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krb9p_cde5bc87-530f-4ee7-8f38-39b875bbd4e6/northd/0.log"
Apr 24 21:49:02.801741 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:49:02.801708 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krb9p_cde5bc87-530f-4ee7-8f38-39b875bbd4e6/nbdb/0.log"
Apr 24 21:49:02.825183 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:49:02.825155 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krb9p_cde5bc87-530f-4ee7-8f38-39b875bbd4e6/sbdb/0.log"
Apr 24 21:49:02.968569 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:49:02.968536 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krb9p_cde5bc87-530f-4ee7-8f38-39b875bbd4e6/ovnkube-controller/0.log"
Apr 24 21:49:04.270823 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:49:04.270793 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-c56pf_af45c5fb-e377-44ea-ad83-ad7e5bea725b/network-check-target-container/0.log"
Apr 24 21:49:05.309546 ip-10-0-139-184 kubenswrapper[2578]: I0424 21:49:05.309519 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-rvc97_15075605-e0df-4d3a-90f3-7c8811d07731/iptables-alerter/0.log"