Apr 20 20:03:22.226800 ip-10-0-135-184 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 20:03:22.226813 ip-10-0-135-184 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 20:03:22.226820 ip-10-0-135-184 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 20:03:22.227038 ip-10-0-135-184 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 20:03:32.243094 ip-10-0-135-184 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 20:03:32.243112 ip-10-0-135-184 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot ce1fb4cec2dd4567802c918208ff9d9c --
Apr 20 20:05:58.845716 ip-10-0-135-184 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 20:05:59.266292 ip-10-0-135-184 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:05:59.266292 ip-10-0-135-184 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 20:05:59.266292 ip-10-0-135-184 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:05:59.266292 ip-10-0-135-184 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 20:05:59.266292 ip-10-0-135-184 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:05:59.268768 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.268682 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 20:05:59.274704 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274681 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:05:59.274704 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274699 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:05:59.274704 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274703 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:59.274704 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274707 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:05:59.274704 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274710 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:05:59.274902 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274713 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:05:59.274902 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274717 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:05:59.274902 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274729 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:05:59.274902 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274733 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:05:59.274902 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274735 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:59.274902 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274738 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:05:59.274902 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274741 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:59.274902 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274744 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:05:59.274902 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274746 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:05:59.274902 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274749 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:05:59.274902 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274752 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:05:59.274902 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274754 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:05:59.274902 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274757 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:05:59.274902 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274759 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:05:59.274902 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274763 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:05:59.274902 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274765 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:05:59.274902 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274768 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:05:59.274902 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274771 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:05:59.274902 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274774 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:05:59.274902 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274777 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:05:59.275386 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274780 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:05:59.275386 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274782 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:59.275386 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274785 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:05:59.275386 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274787 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:05:59.275386 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274790 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:05:59.275386 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274792 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:05:59.275386 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274795 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:05:59.275386 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274798 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:59.275386 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274800 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:05:59.275386 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274803 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:05:59.275386 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274805 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:05:59.275386 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274808 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:05:59.275386 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274811 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:05:59.275386 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274814 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:59.275386 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274817 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:05:59.275386 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274821 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:05:59.275386 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274825 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:59.275386 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274828 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:05:59.275386 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274831 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:05:59.275896 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274833 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:05:59.275896 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274836 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:59.275896 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274838 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:05:59.275896 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274841 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:05:59.275896 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274844 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:05:59.275896 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274846 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:05:59.275896 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274849 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:05:59.275896 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274852 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:05:59.275896 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274854 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:05:59.275896 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274857 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:05:59.275896 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274860 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:59.275896 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274864 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:05:59.275896 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274868 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:05:59.275896 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274871 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:59.275896 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274874 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:05:59.275896 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274876 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:05:59.275896 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274879 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:05:59.275896 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274883 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:05:59.275896 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274885 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:05:59.276363 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274888 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:05:59.276363 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274890 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:05:59.276363 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274893 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:05:59.276363 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274895 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:05:59.276363 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274898 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:05:59.276363 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274900 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:05:59.276363 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274903 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:05:59.276363 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274907 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:05:59.276363 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274910 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:05:59.276363 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274913 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:05:59.276363 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274915 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:05:59.276363 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274918 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:05:59.276363 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274921 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:05:59.276363 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274924 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:05:59.276363 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274927 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:05:59.276363 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274930 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:05:59.276363 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274933 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:05:59.276363 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274936 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:05:59.276363 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274938 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:59.276363 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274941 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:05:59.276870 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274944 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:05:59.276870 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274947 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:05:59.276870 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.274950 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:05:59.276870 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275308 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:05:59.276870 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275313 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:05:59.276870 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275317 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:05:59.276870 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275320 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:05:59.276870 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275322 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:05:59.276870 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275325 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:05:59.276870 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275328 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:05:59.276870 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275331 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:05:59.276870 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275334 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:05:59.276870 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275337 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:59.276870 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275339 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:05:59.276870 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275342 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:05:59.276870 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275345 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:59.276870 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275347 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:05:59.276870 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275350 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:05:59.276870 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275353 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:59.276870 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275356 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:05:59.277459 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275359 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:59.277459 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275362 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:05:59.277459 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275365 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:05:59.277459 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275367 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:05:59.277459 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275370 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:05:59.277459 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275373 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:05:59.277459 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275375 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:05:59.277459 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275379 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:05:59.277459 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275381 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:05:59.277459 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275384 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:05:59.277459 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275387 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:59.277459 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275389 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:05:59.277459 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275392 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:05:59.277459 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275395 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:05:59.277459 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275398 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:05:59.277459 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275400 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:05:59.277459 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275403 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:05:59.277459 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275405 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:05:59.277459 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275408 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:05:59.277459 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275410 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:05:59.277994 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275414 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:05:59.277994 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275418 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:05:59.277994 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275440 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:05:59.277994 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275443 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:05:59.277994 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275446 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:05:59.277994 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275449 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:05:59.277994 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275452 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:05:59.277994 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275455 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:59.277994 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275457 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:05:59.277994 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275460 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:05:59.277994 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275463 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:05:59.277994 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275466 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:05:59.277994 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275469 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:05:59.277994 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275471 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:05:59.277994 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275474 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:05:59.277994 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275476 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:05:59.277994 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275479 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:05:59.277994 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275482 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:05:59.277994 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275485 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:05:59.278495 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275487 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:59.278495 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275490 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:05:59.278495 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275493 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:05:59.278495 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275496 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:05:59.278495 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275498 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:05:59.278495 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275501 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:05:59.278495 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275504 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:05:59.278495 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275506 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:05:59.278495 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275509 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:05:59.278495 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275511 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:59.278495 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275514 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:59.278495 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275516 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:05:59.278495 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275519 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:05:59.278495 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275521 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:05:59.278495 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275524 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:05:59.278495 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275527 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:05:59.278495 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275529 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:05:59.278495 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275532 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:05:59.278495 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275535 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:05:59.278974 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275537 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:05:59.278974 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275540 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:05:59.278974 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275550 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:05:59.278974 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275553 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:59.278974 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275556 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:05:59.278974 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275559 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:05:59.278974 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275562 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:05:59.278974 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275565 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:05:59.278974 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275567 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:05:59.278974 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275570 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:05:59.278974 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.275573 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:59.278974 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276813 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 20:05:59.278974 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276822 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 20:05:59.278974 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276830 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 20:05:59.278974 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276834 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 20:05:59.278974 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276839 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 20:05:59.278974 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276843 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 20:05:59.278974 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276847 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 20:05:59.278974 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276852 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 20:05:59.278974 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276855 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 20:05:59.278974 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276858 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276862 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276865 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276868 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276871 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276874 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276877 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276880 2571 flags.go:64] FLAG: --cloud-config=""
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276883 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276886 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276891 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276893 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276897 2571 flags.go:64] FLAG: --config-dir=""
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276900 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276904 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276908 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276911 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276915 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276918 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276922 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276925 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276928 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276931 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276935 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276939 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 20:05:59.279511 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276942 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276945 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276948 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276952 2571 flags.go:64] FLAG: --enable-server="true"
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276955 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276960 2571 flags.go:64] FLAG: --event-burst="100"
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276963 2571 flags.go:64] FLAG: --event-qps="50"
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276966 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276969 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276972 2571 flags.go:64] FLAG: --eviction-hard=""
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276976 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276979 2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276982 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276985 2571 flags.go:64] FLAG: --eviction-soft=""
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276988 2571 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276991 2571 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276994 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.276997 2571 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277000 2571 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277003 2571 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277006 2571 flags.go:64] FLAG: --feature-gates=""
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277010 2571 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277013 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277016 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277019 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420
20:05:59.277022 2571 flags.go:64] FLAG: --healthz-port="10248" Apr 20 20:05:59.280119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277026 2571 flags.go:64] FLAG: --help="false" Apr 20 20:05:59.280809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277029 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-135-184.ec2.internal" Apr 20 20:05:59.280809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277032 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 20:05:59.280809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277035 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 20:05:59.280809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277038 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 20:05:59.280809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277041 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 20:05:59.280809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277045 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 20:05:59.280809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277048 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 20:05:59.280809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277051 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 20:05:59.280809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277054 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 20:05:59.280809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277057 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 20:05:59.280809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277060 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 20:05:59.280809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277063 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 
20:05:59.280809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277066 2571 flags.go:64] FLAG: --kube-reserved="" Apr 20 20:05:59.280809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277068 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 20:05:59.280809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277071 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 20:05:59.280809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277074 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 20:05:59.280809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277077 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 20:05:59.280809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277080 2571 flags.go:64] FLAG: --lock-file="" Apr 20 20:05:59.280809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277082 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 20:05:59.280809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277085 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 20:05:59.280809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277088 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 20:05:59.280809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277093 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 20:05:59.280809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277099 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 20:05:59.281374 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277102 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 20:05:59.281374 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277104 2571 flags.go:64] FLAG: --logging-format="text" Apr 20 20:05:59.281374 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277108 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 20:05:59.281374 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277111 2571 
flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 20:05:59.281374 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277114 2571 flags.go:64] FLAG: --manifest-url="" Apr 20 20:05:59.281374 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277117 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 20 20:05:59.281374 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277121 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 20:05:59.281374 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277125 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 20:05:59.281374 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277129 2571 flags.go:64] FLAG: --max-pods="110" Apr 20 20:05:59.281374 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277133 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 20:05:59.281374 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277136 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 20:05:59.281374 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277139 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 20:05:59.281374 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277142 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 20:05:59.281374 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277145 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 20:05:59.281374 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277147 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 20:05:59.281374 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277150 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 20:05:59.281374 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277158 2571 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 20:05:59.281374 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277162 2571 flags.go:64] FLAG: 
--node-status-update-frequency="10s" Apr 20 20:05:59.281374 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277165 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 20:05:59.281374 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277168 2571 flags.go:64] FLAG: --pod-cidr="" Apr 20 20:05:59.281374 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277171 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 20:05:59.281374 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277177 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 20:05:59.281374 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277180 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 20:05:59.281374 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277183 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277186 2571 flags.go:64] FLAG: --port="10250" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277189 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277192 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0fd9bb93d328bc90b" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277195 2571 flags.go:64] FLAG: --qos-reserved="" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277198 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277201 2571 flags.go:64] FLAG: --register-node="true" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277204 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277208 2571 flags.go:64] FLAG: --register-with-taints="" Apr 
20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277212 2571 flags.go:64] FLAG: --registry-burst="10" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277214 2571 flags.go:64] FLAG: --registry-qps="5" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277217 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277220 2571 flags.go:64] FLAG: --reserved-memory="" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277224 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277227 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277230 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277233 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277236 2571 flags.go:64] FLAG: --runonce="false" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277239 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277242 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277245 2571 flags.go:64] FLAG: --seccomp-default="false" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277247 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277250 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277253 2571 flags.go:64] FLAG: 
--storage-driver-db="cadvisor" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277257 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277260 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 20:05:59.281984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277262 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 20:05:59.282628 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277265 2571 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 20:05:59.282628 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277268 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 20:05:59.282628 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277271 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 20:05:59.282628 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277274 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 20:05:59.282628 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277277 2571 flags.go:64] FLAG: --system-cgroups="" Apr 20 20:05:59.282628 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277280 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 20:05:59.282628 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277286 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 20:05:59.282628 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277289 2571 flags.go:64] FLAG: --tls-cert-file="" Apr 20 20:05:59.282628 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277292 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 20:05:59.282628 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277296 2571 flags.go:64] FLAG: --tls-min-version="" Apr 20 20:05:59.282628 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277299 2571 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 20:05:59.282628 ip-10-0-135-184 
kubenswrapper[2571]: I0420 20:05:59.277302 2571 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 20:05:59.282628 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277305 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 20:05:59.282628 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277309 2571 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 20:05:59.282628 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277312 2571 flags.go:64] FLAG: --v="2" Apr 20 20:05:59.282628 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277316 2571 flags.go:64] FLAG: --version="false" Apr 20 20:05:59.282628 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277320 2571 flags.go:64] FLAG: --vmodule="" Apr 20 20:05:59.282628 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277324 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 20:05:59.282628 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277328 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 20:05:59.282628 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277413 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 20:05:59.282628 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277432 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 20:05:59.282628 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277436 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 20:05:59.282628 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277439 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 20:05:59.282628 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277442 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 20:05:59.283277 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277444 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 
20:05:59.283277 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277447 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 20:05:59.283277 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277450 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 20:05:59.283277 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277453 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 20:05:59.283277 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277455 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 20:05:59.283277 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277458 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 20:05:59.283277 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277461 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 20:05:59.283277 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277464 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 20:05:59.283277 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277468 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 20:05:59.283277 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277472 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 20:05:59.283277 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277475 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 20:05:59.283277 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277479 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 20:05:59.283277 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277483 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 20:05:59.283277 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277485 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 20 20:05:59.283277 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277488 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 20:05:59.283277 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277491 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 20:05:59.283277 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277493 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 20:05:59.283277 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277496 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 20:05:59.283277 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277499 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 20:05:59.283783 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277501 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 20:05:59.283783 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277504 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 20:05:59.283783 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277507 2571 feature_gate.go:328] 
unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 20:05:59.283783 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277510 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 20:05:59.283783 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277513 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 20:05:59.283783 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277515 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 20:05:59.283783 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277518 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 20:05:59.283783 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277521 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 20:05:59.283783 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277524 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 20:05:59.283783 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277529 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 20:05:59.283783 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277533 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 20:05:59.283783 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277536 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 20:05:59.283783 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277538 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 20:05:59.283783 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277541 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 20:05:59.283783 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277543 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 20:05:59.283783 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277546 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 20:05:59.283783 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277549 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 20:05:59.283783 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277551 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 20:05:59.283783 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277554 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 20:05:59.284259 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277557 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 20:05:59.284259 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277560 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 20:05:59.284259 ip-10-0-135-184 kubenswrapper[2571]: W0420 
20:05:59.277562 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 20:05:59.284259 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277565 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 20:05:59.284259 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277568 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 20:05:59.284259 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277570 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 20:05:59.284259 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277574 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 20:05:59.284259 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277576 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 20:05:59.284259 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277579 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 20:05:59.284259 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277581 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 20:05:59.284259 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277584 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 20:05:59.284259 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277587 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 20:05:59.284259 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277589 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 20:05:59.284259 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277592 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 20:05:59.284259 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277594 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 
20:05:59.284259 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277601 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 20:05:59.284259 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277604 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 20:05:59.284259 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277607 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 20:05:59.284259 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277609 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 20:05:59.284259 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277612 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 20:05:59.284785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277614 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 20:05:59.284785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277617 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 20:05:59.284785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277621 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 20:05:59.284785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277624 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 20:05:59.284785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277626 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 20:05:59.284785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277629 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 20:05:59.284785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277632 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 20:05:59.284785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277635 2571 
feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:59.284785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277637 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:59.284785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277640 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:05:59.284785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277642 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:59.284785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277646 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:05:59.284785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277648 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:59.284785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277651 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:59.284785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277654 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:05:59.284785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277656 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:59.284785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277659 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:05:59.284785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277662 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:05:59.284785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277665 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:59.284785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277668 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:59.285278 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277671 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:05:59.285278 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277674 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:05:59.285278 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.277676 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:59.285278 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.277688 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 20:05:59.285278 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.284082 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 20:05:59.285278 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.284098 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 20:05:59.285278 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284157 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:05:59.285278 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284164 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:05:59.285278 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284167 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:05:59.285278 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284171 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:05:59.285278 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284174 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:59.285278 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284177 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:05:59.285278 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284180 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:05:59.285278 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284183 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:05:59.285278 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284187 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:05:59.285278 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284190 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:05:59.285785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284193 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:05:59.285785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284196 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:05:59.285785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284199 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:59.285785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284201 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:05:59.285785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284204 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:05:59.285785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284207 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:05:59.285785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284209 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:05:59.285785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284212 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:59.285785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284215 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:05:59.285785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284218 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:05:59.285785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284221 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:05:59.285785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284223 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:05:59.285785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284226 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:05:59.285785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284229 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:05:59.285785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284231 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:59.285785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284243 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:05:59.285785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284246 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:05:59.285785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284248 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:05:59.285785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284251 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:05:59.285785 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284254 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:05:59.286306 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284256 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:05:59.286306 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284260 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:05:59.286306 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284264 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:05:59.286306 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284266 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:59.286306 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284269 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:05:59.286306 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284272 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:05:59.286306 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284275 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:05:59.286306 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284277 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:05:59.286306 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284280 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:05:59.286306 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284283 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:05:59.286306 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284285 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:05:59.286306 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284288 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:05:59.286306 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284291 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:05:59.286306 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284293 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:05:59.286306 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284296 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:05:59.286306 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284299 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:05:59.286306 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284302 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:05:59.286306 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284304 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:05:59.286306 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284307 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:05:59.286306 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284309 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:05:59.286814 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284312 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:05:59.286814 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284314 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:05:59.286814 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284317 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:05:59.286814 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284320 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:05:59.286814 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284322 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:05:59.286814 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284325 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:05:59.286814 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284327 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:05:59.286814 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284330 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:05:59.286814 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284333 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:05:59.286814 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284336 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:59.286814 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284338 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:05:59.286814 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284343 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:05:59.286814 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284346 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:59.286814 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284350 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:05:59.286814 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284353 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:05:59.286814 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284355 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:59.286814 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284358 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:05:59.286814 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284360 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:05:59.286814 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284363 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:05:59.287292 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284365 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:05:59.287292 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284368 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:59.287292 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284370 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:59.287292 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284373 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:05:59.287292 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284376 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:05:59.287292 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284378 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:05:59.287292 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284381 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:05:59.287292 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284384 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:05:59.287292 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284386 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:05:59.287292 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284389 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:05:59.287292 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284391 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:59.287292 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284394 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:05:59.287292 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284396 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:05:59.287292 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284399 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:05:59.287292 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284402 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:05:59.287292 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284406 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:05:59.287292 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284410 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:05:59.287732 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.284415 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 20:05:59.287732 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284527 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:05:59.287732 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284533 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:05:59.287732 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284535 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:05:59.287732 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284539 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:05:59.287732 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284542 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:05:59.287732 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284545 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:59.287732 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284548 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:05:59.287732 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284550 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:05:59.287732 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284553 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:05:59.287732 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284556 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:05:59.287732 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284559 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:05:59.287732 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284562 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:05:59.287732 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284564 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:59.287732 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284567 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:05:59.287732 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284570 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:05:59.288139 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284572 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:05:59.288139 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284575 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:05:59.288139 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284577 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:05:59.288139 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284580 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:05:59.288139 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284582 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:05:59.288139 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284587 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:05:59.288139 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284591 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:05:59.288139 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284594 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:05:59.288139 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284597 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:59.288139 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284599 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:05:59.288139 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284602 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:05:59.288139 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284605 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:05:59.288139 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284608 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:05:59.288139 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284611 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:05:59.288139 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284613 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:05:59.288139 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284616 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:05:59.288139 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284619 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:05:59.288139 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284622 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:05:59.288139 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284626 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:05:59.288623 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284629 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:05:59.288623 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284631 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:59.288623 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284634 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:05:59.288623 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284637 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:05:59.288623 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284640 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:05:59.288623 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284643 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:05:59.288623 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284645 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:05:59.288623 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284648 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:05:59.288623 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284651 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:05:59.288623 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284654 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:59.288623 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284656 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:05:59.288623 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284659 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:59.288623 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284662 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:05:59.288623 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284664 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:59.288623 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284667 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:05:59.288623 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284669 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:05:59.288623 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284672 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:05:59.288623 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284674 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:05:59.288623 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284677 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:05:59.288623 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284679 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:05:59.289117 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284682 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:05:59.289117 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284685 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:05:59.289117 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284687 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:05:59.289117 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284690 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:05:59.289117 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284692 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:05:59.289117 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284695 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:59.289117 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284697 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:05:59.289117 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284700 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:05:59.289117 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284703 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:05:59.289117 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284705 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:59.289117 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284708 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:05:59.289117 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284711 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:05:59.289117 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284714 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:05:59.289117 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284716 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:59.289117 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284719 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:05:59.289117 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284721 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:05:59.289117 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284729 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:05:59.289117 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284732 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:05:59.289117 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284735 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:05:59.289117 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284738 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:05:59.289616 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284740 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:05:59.289616 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284743 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:05:59.289616 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284745 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:05:59.289616 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284748 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:05:59.289616 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284751 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:05:59.289616 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284754 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:05:59.289616 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284756 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:05:59.289616 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284759 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:59.289616 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284762 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:05:59.289616 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284764 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:05:59.289616 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284767 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:05:59.289616 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:05:59.284769 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:05:59.289616 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.284774 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 20:05:59.289616 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.285940 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 20:05:59.289616 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.287806 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 20:05:59.289616 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.288733 2571 server.go:1019] "Starting client certificate rotation"
Apr 20 20:05:59.290041 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.288833 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 20:05:59.290041 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.288872 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 20:05:59.311177 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.311158 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 20:05:59.316901 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.316884 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 20:05:59.329213 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.329188 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 20 20:05:59.335166 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.335148 2571 log.go:25] "Validated CRI v1 image API"
Apr 20 20:05:59.337045 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.337018 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 20:05:59.339100 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.339082 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 20:05:59.341045 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.341023 2571 fs.go:135] Filesystem UUIDs: map[6c279b38-0dd1-496b-8d36-89e4ff4b7d5e:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 a02ac697-f904-42ad-b4c2-e7205cc8fdc6:/dev/nvme0n1p4]
Apr 20 20:05:59.341133 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.341043 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 20:05:59.346551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.346435 2571 manager.go:217] Machine: {Timestamp:2026-04-20 20:05:59.344652615 +0000 UTC m=+0.384142152 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3129634 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec219426dec2adca52909815ebc9dcd7 SystemUUID:ec219426-dec2-adca-5290-9815ebc9dcd7 BootID:ce1fb4ce-c2dd-4567-802c-918208ff9d9c Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:1a:c5:a4:41:ef Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:1a:c5:a4:41:ef Speed:0 Mtu:9001} {Name:ovs-system MacAddress:76:99:30:19:08:01 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 20:05:59.346551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.346538 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 20:05:59.346729 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.346636 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 20:05:59.347667 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.347643 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 20:05:59.347830 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.347669 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-184.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 20:05:59.347907 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.347844 2571 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 20:05:59.347907 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.347855 2571 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 20:05:59.347907 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.347880
2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 20:05:59.348483 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.348471 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 20:05:59.349187 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.349175 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 20 20:05:59.349314 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.349303 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 20:05:59.351453 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.351441 2571 kubelet.go:491] "Attempting to sync node with API server" Apr 20 20:05:59.351523 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.351465 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 20:05:59.351523 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.351483 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 20:05:59.351523 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.351496 2571 kubelet.go:397] "Adding apiserver pod source" Apr 20 20:05:59.351523 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.351511 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 20 20:05:59.352457 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.352445 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 20:05:59.352519 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.352467 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 20:05:59.355592 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.355577 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 20:05:59.356783 ip-10-0-135-184 
kubenswrapper[2571]: I0420 20:05:59.356771 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 20:05:59.358509 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.358490 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 20:05:59.358597 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.358513 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 20:05:59.358597 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.358522 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 20:05:59.358597 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.358528 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 20:05:59.358597 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.358533 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 20:05:59.358597 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.358538 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 20:05:59.358597 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.358544 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 20:05:59.358597 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.358549 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 20:05:59.358597 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.358559 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 20:05:59.358597 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.358566 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 20:05:59.358597 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.358585 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 
20:05:59.358597 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.358594 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 20:05:59.359480 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.359465 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 20:05:59.359480 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.359475 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 20:05:59.361877 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.361860 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-42qxs" Apr 20 20:05:59.363636 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.363623 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 20:05:59.363693 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.363658 2571 server.go:1295] "Started kubelet" Apr 20 20:05:59.363869 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.363815 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-184.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 20:05:59.363953 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.363836 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 20:05:59.364006 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.363960 2571 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 20:05:59.364588 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:05:59.364541 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-184.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 20:05:59.364588 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.364558 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 20:05:59.364714 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:05:59.364570 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 20:05:59.364619 ip-10-0-135-184 systemd[1]: Started Kubernetes Kubelet. Apr 20 20:05:59.364872 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.364855 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 20:05:59.367835 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.367819 2571 server.go:317] "Adding debug handlers to kubelet server" Apr 20 20:05:59.370290 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.370274 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 20:05:59.370491 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.370273 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 20:05:59.371067 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:05:59.370059 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-184.ec2.internal.18a8296346c76dad default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-184.ec2.internal,UID:ip-10-0-135-184.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-184.ec2.internal,},FirstTimestamp:2026-04-20 20:05:59.363636653 +0000 UTC m=+0.403126191,LastTimestamp:2026-04-20 20:05:59.363636653 +0000 UTC m=+0.403126191,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-184.ec2.internal,}" Apr 20 20:05:59.372476 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.372139 2571 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 20:05:59.372559 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.372481 2571 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 20:05:59.372814 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:05:59.372796 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-184.ec2.internal\" not found" Apr 20 20:05:59.372855 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.372808 2571 reconstruct.go:97] "Volume reconstruction finished" Apr 20 20:05:59.372855 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.372824 2571 reconciler.go:26] "Reconciler: start to sync state" Apr 20 20:05:59.373368 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.373352 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 20:05:59.374474 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.374455 2571 factory.go:153] Registering CRI-O factory Apr 20 20:05:59.374564 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.374486 2571 factory.go:223] Registration of the crio container factory successfully Apr 20 20:05:59.374564 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.374491 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-42qxs" Apr 20 20:05:59.374564 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.374556 2571 factory.go:221] Registration of the containerd container factory 
failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 20:05:59.374564 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.374566 2571 factory.go:55] Registering systemd factory Apr 20 20:05:59.374755 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.374573 2571 factory.go:223] Registration of the systemd container factory successfully Apr 20 20:05:59.374755 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.374603 2571 factory.go:103] Registering Raw factory Apr 20 20:05:59.374755 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.374616 2571 manager.go:1196] Started watching for new ooms in manager Apr 20 20:05:59.375775 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.375758 2571 manager.go:319] Starting recovery of all containers Apr 20 20:05:59.380649 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:05:59.380486 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 20 20:05:59.380649 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:05:59.380524 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-135-184.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 20 20:05:59.380802 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:05:59.380685 2571 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 20:05:59.388517 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.388502 2571 manager.go:324] Recovery completed Apr 20 20:05:59.392219 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.392206 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:59.394592 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.394577 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-184.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:59.394685 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.394604 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-184.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:59.394685 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.394623 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-184.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:59.395009 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.394996 2571 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 20:05:59.395055 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.395011 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 20:05:59.395055 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.395028 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 20 20:05:59.396959 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.396948 2571 policy_none.go:49] "None policy: Start" Apr 20 20:05:59.396992 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.396964 2571 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 20:05:59.396992 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.396973 2571 state_mem.go:35] "Initializing new in-memory state store" Apr 20 20:05:59.438992 ip-10-0-135-184 kubenswrapper[2571]: I0420 
20:05:59.438975 2571 manager.go:341] "Starting Device Plugin manager" Apr 20 20:05:59.439100 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:05:59.439013 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 20:05:59.439100 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.439026 2571 server.go:85] "Starting device plugin registration server" Apr 20 20:05:59.439265 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.439253 2571 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 20:05:59.439317 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.439269 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 20:05:59.439360 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.439345 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 20:05:59.439479 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.439467 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 20:05:59.439479 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.439479 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 20:05:59.440075 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:05:59.440057 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 20:05:59.440136 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:05:59.440098 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-184.ec2.internal\" not found" Apr 20 20:05:59.521173 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.521104 2571 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 20 20:05:59.523238 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.522400 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 20:05:59.523238 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.522443 2571 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 20:05:59.523238 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.522461 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 20 20:05:59.523238 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.522467 2571 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 20:05:59.523238 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:05:59.522506 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 20:05:59.524708 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.524689 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:05:59.539401 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.539383 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:59.540270 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.540252 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-184.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:59.540342 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.540282 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-184.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:59.540342 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.540292 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-184.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:59.540342 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.540313 2571 
kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-184.ec2.internal" Apr 20 20:05:59.546579 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.546565 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-184.ec2.internal" Apr 20 20:05:59.546632 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:05:59.546586 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-184.ec2.internal\": node \"ip-10-0-135-184.ec2.internal\" not found" Apr 20 20:05:59.564814 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:05:59.564792 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-184.ec2.internal\" not found" Apr 20 20:05:59.622923 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.622895 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-135-184.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-184.ec2.internal"] Apr 20 20:05:59.622992 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.622960 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:59.625813 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.625759 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-184.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:59.626093 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.626082 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-184.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:59.626132 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.626098 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-184.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:59.628285 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.628273 2571 kubelet_node_status.go:413] "Setting 
node annotation to enable volume controller attach/detach" Apr 20 20:05:59.628473 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.628453 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-184.ec2.internal" Apr 20 20:05:59.628524 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.628487 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:59.629020 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.629005 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-184.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:59.629070 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.629032 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-184.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:59.629070 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.629042 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-184.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:59.629139 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.629011 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-184.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:59.629139 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.629110 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-184.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:59.629139 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.629120 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-184.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:59.631263 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.631246 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-184.ec2.internal" Apr 20 20:05:59.631311 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.631281 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:59.631889 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.631876 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-184.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:59.631889 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.631900 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-184.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:59.631977 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.631913 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-184.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:59.658982 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:05:59.658960 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-184.ec2.internal\" not found" node="ip-10-0-135-184.ec2.internal" Apr 20 20:05:59.663198 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:05:59.663183 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-184.ec2.internal\" not found" node="ip-10-0-135-184.ec2.internal" Apr 20 20:05:59.665257 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:05:59.665242 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-184.ec2.internal\" not found" Apr 20 20:05:59.674886 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.674864 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0fdd1067d66eaa5bc876fed7618c0cd3-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-135-184.ec2.internal\" (UID: \"0fdd1067d66eaa5bc876fed7618c0cd3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-184.ec2.internal" Apr 20 20:05:59.674950 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.674888 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fdd1067d66eaa5bc876fed7618c0cd3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-184.ec2.internal\" (UID: \"0fdd1067d66eaa5bc876fed7618c0cd3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-184.ec2.internal" Apr 20 20:05:59.674950 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.674905 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d93550a1d0c6abebbeb4587739ed181c-config\") pod \"kube-apiserver-proxy-ip-10-0-135-184.ec2.internal\" (UID: \"d93550a1d0c6abebbeb4587739ed181c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-184.ec2.internal" Apr 20 20:05:59.766327 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:05:59.766305 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-184.ec2.internal\" not found" Apr 20 20:05:59.775558 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.775505 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0fdd1067d66eaa5bc876fed7618c0cd3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-184.ec2.internal\" (UID: \"0fdd1067d66eaa5bc876fed7618c0cd3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-184.ec2.internal" Apr 20 20:05:59.775558 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.775538 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/0fdd1067d66eaa5bc876fed7618c0cd3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-184.ec2.internal\" (UID: \"0fdd1067d66eaa5bc876fed7618c0cd3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-184.ec2.internal" Apr 20 20:05:59.775689 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.775562 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d93550a1d0c6abebbeb4587739ed181c-config\") pod \"kube-apiserver-proxy-ip-10-0-135-184.ec2.internal\" (UID: \"d93550a1d0c6abebbeb4587739ed181c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-184.ec2.internal" Apr 20 20:05:59.775689 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.775608 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d93550a1d0c6abebbeb4587739ed181c-config\") pod \"kube-apiserver-proxy-ip-10-0-135-184.ec2.internal\" (UID: \"d93550a1d0c6abebbeb4587739ed181c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-184.ec2.internal" Apr 20 20:05:59.775689 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.775610 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0fdd1067d66eaa5bc876fed7618c0cd3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-184.ec2.internal\" (UID: \"0fdd1067d66eaa5bc876fed7618c0cd3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-184.ec2.internal" Apr 20 20:05:59.775689 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.775611 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fdd1067d66eaa5bc876fed7618c0cd3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-184.ec2.internal\" (UID: \"0fdd1067d66eaa5bc876fed7618c0cd3\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-184.ec2.internal"
Apr 20 20:05:59.866793 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:05:59.866770 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-184.ec2.internal\" not found"
Apr 20 20:05:59.960410 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.960370 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-184.ec2.internal"
Apr 20 20:05:59.965885 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:05:59.965863 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-184.ec2.internal"
Apr 20 20:05:59.966923 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:05:59.966908 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-184.ec2.internal\" not found"
Apr 20 20:06:00.067891 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:00.067814 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-184.ec2.internal\" not found"
Apr 20 20:06:00.168477 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:00.168447 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-184.ec2.internal\" not found"
Apr 20 20:06:00.209557 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.209529 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:06:00.271086 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.271062 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-184.ec2.internal"
Apr 20 20:06:00.280510 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.280490 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 20:06:00.282053 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.282041 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-184.ec2.internal"
Apr 20 20:06:00.288753 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.288737 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 20:06:00.288843 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.288831 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 20:06:00.288906 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:00.288887 2571 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://a2518cc3fa8d24cac9d060dd995cf11a-800c2023e1f4abc8.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": read tcp 10.0.135.184:38322->100.31.25.254:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-184.ec2.internal"
Apr 20 20:06:00.288942 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.288887 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 20:06:00.352071 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.351994 2571 apiserver.go:52] "Watching apiserver"
Apr 20 20:06:00.369993 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.369975 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 20 20:06:00.370616 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.370601 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 20:06:00.372138 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.372116 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-135-184.ec2.internal","openshift-cluster-node-tuning-operator/tuned-89gqs","openshift-dns/node-resolver-7bzrm","openshift-image-registry/node-ca-ntdcl","openshift-multus/multus-24vt9","openshift-network-diagnostics/network-check-target-9bjr7","openshift-ovn-kubernetes/ovnkube-node-kldp5","kube-system/konnectivity-agent-mckzb","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-184.ec2.internal","openshift-multus/multus-additional-cni-plugins-5g757","openshift-multus/network-metrics-daemon-9sbrz","openshift-network-operator/iptables-alerter-rm9m6"]
Apr 20 20:06:00.374918 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.374902 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.377028 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.377012 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.377821 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.377798 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 20:06:00.377925 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.377850 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-9vxnk\""
Apr 20 20:06:00.378825 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.378802 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 20:00:59 +0000 UTC" deadline="2028-01-16 06:34:02.71391537 +0000 UTC"
Apr 20 20:06:00.378874 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.378824 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15250h28m2.335093027s"
Apr 20 20:06:00.378874 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.378848 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-host-cni-netd\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.378935 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.378872 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-node-log\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.378935 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.378891 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.378935 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.378910 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/997e9539-5288-4af5-92f4-55d8ccefbbf7-ovn-node-metrics-cert\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.378935 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.378927 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/997e9539-5288-4af5-92f4-55d8ccefbbf7-ovnkube-script-lib\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.379058 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.378942 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-log-socket\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.379058 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.378977 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-host-kubelet\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.379058 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.379009 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-host-slash\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.379058 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.379045 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/997e9539-5288-4af5-92f4-55d8ccefbbf7-env-overrides\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.379174 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.379071 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dl2k\" (UniqueName: \"kubernetes.io/projected/997e9539-5288-4af5-92f4-55d8ccefbbf7-kube-api-access-6dl2k\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.379174 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.379118 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-host-run-netns\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.379174 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.379147 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-run-systemd\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.379174 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.379150 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7bzrm"
Apr 20 20:06:00.379174 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.379163 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-host-run-ovn-kubernetes\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.379340 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.379184 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/997e9539-5288-4af5-92f4-55d8ccefbbf7-ovnkube-config\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.379340 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.379211 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-systemd-units\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.379340 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.379240 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-var-lib-openvswitch\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.379340 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.379264 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-etc-openvswitch\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.379340 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.379301 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-run-openvswitch\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.379340 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.379336 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-run-ovn\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.379572 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.379361 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-host-cni-bin\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.379657 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.379638 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 20 20:06:00.379775 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.379729 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 20 20:06:00.379775 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.379742 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 20 20:06:00.379775 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.379753 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 20 20:06:00.379932 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.379809 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 20 20:06:00.380189 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.380176 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:06:00.380911 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.380874 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 20 20:06:00.382912 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.382888 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 20 20:06:00.383016 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.383002 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8zqck\""
Apr 20 20:06:00.383497 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.383288 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-wq42t\""
Apr 20 20:06:00.383497 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.383309 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 20 20:06:00.384027 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.384010 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ntdcl"
Apr 20 20:06:00.384117 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.384101 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.386790 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.386773 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9bjr7"
Apr 20 20:06:00.386866 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.386839 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 20 20:06:00.386866 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:00.386841 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9bjr7" podUID="800b4dad-a669-433c-8963-4c9f630913b5"
Apr 20 20:06:00.386866 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.386861 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 20 20:06:00.387154 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.387141 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 20 20:06:00.387280 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.387269 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-bj44p\""
Apr 20 20:06:00.387338 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.387283 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 20 20:06:00.387387 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.387336 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 20 20:06:00.387387 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.387356 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 20 20:06:00.388634 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.388617 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-tpnm4\""
Apr 20 20:06:00.388682 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.388620 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 20 20:06:00.388862 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.388848 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mckzb"
Apr 20 20:06:00.389237 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.389223 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 20:06:00.391070 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.391053 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5"
Apr 20 20:06:00.392573 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.392552 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 20 20:06:00.392656 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.392626 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-fqqdn\""
Apr 20 20:06:00.393176 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.392949 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 20 20:06:00.393324 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.393305 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5g757"
Apr 20 20:06:00.394077 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.394058 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 20 20:06:00.394077 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.394072 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-vxb9h\""
Apr 20 20:06:00.394165 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.394151 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 20 20:06:00.395300 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.395287 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 20 20:06:00.395631 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.395617 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:06:00.395711 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:00.395661 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9sbrz" podUID="012dcd86-26f0-4115-bd86-d5066c900541"
Apr 20 20:06:00.397546 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.397529 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 20 20:06:00.397632 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.397581 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-gbmxj\""
Apr 20 20:06:00.397632 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.397594 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 20 20:06:00.397789 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.397775 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rm9m6"
Apr 20 20:06:00.401674 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.401654 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:06:00.401836 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.401822 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 20:06:00.402016 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.402005 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 20:06:00.402415 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.402403 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ffztq\""
Apr 20 20:06:00.411896 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.411880 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-xjcnx"
Apr 20 20:06:00.420208 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.420191 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-xjcnx"
Apr 20 20:06:00.474295 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.474276 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 20:06:00.480075 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480055 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/997e9539-5288-4af5-92f4-55d8ccefbbf7-env-overrides\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.480271 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480083 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dl2k\" (UniqueName: \"kubernetes.io/projected/997e9539-5288-4af5-92f4-55d8ccefbbf7-kube-api-access-6dl2k\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.480271 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480104 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/43405a48-098c-49ef-95e3-3544654522ad-serviceca\") pod \"node-ca-ntdcl\" (UID: \"43405a48-098c-49ef-95e3-3544654522ad\") " pod="openshift-image-registry/node-ca-ntdcl"
Apr 20 20:06:00.480271 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480119 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7cc9261c-1baf-4d71-aae3-b734d559681b-multus-daemon-config\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.480271 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480162 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-etc-kubernetes\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.480271 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480191 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7774b4c6-2299-4070-b786-73a21e70389b-sys-fs\") pod \"aws-ebs-csi-driver-node-xdlm5\" (UID: \"7774b4c6-2299-4070-b786-73a21e70389b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5"
Apr 20 20:06:00.480271 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480209 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4e90b560-013d-4eb3-83bf-d19971d4fd0c-cni-binary-copy\") pod \"multus-additional-cni-plugins-5g757\" (UID: \"4e90b560-013d-4eb3-83bf-d19971d4fd0c\") " pod="openshift-multus/multus-additional-cni-plugins-5g757"
Apr 20 20:06:00.480271 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480239 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2940fed8-94a7-4975-8584-3fcd4e6a7933-host-slash\") pod \"iptables-alerter-rm9m6\" (UID: \"2940fed8-94a7-4975-8584-3fcd4e6a7933\") " pod="openshift-network-operator/iptables-alerter-rm9m6"
Apr 20 20:06:00.480603 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480281 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/997e9539-5288-4af5-92f4-55d8ccefbbf7-ovnkube-config\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.480603 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480308 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-etc-sysctl-conf\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.480603 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480353 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-host\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.480603 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480392 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7774b4c6-2299-4070-b786-73a21e70389b-etc-selinux\") pod \"aws-ebs-csi-driver-node-xdlm5\" (UID: \"7774b4c6-2299-4070-b786-73a21e70389b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5"
Apr 20 20:06:00.480603 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480448 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjwgp\" (UniqueName: \"kubernetes.io/projected/7774b4c6-2299-4070-b786-73a21e70389b-kube-api-access-qjwgp\") pod \"aws-ebs-csi-driver-node-xdlm5\" (UID: \"7774b4c6-2299-4070-b786-73a21e70389b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5"
Apr 20 20:06:00.480603 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480479 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4e90b560-013d-4eb3-83bf-d19971d4fd0c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5g757\" (UID: \"4e90b560-013d-4eb3-83bf-d19971d4fd0c\") " pod="openshift-multus/multus-additional-cni-plugins-5g757"
Apr 20 20:06:00.480603 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480510 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-host-cni-bin\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.480603 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480574 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/997e9539-5288-4af5-92f4-55d8ccefbbf7-env-overrides\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.480949 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480651 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-host-cni-netd\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.480949 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480659 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-host-cni-bin\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.480949 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480713 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn942\" (UniqueName: \"kubernetes.io/projected/07ec338a-d16f-4d81-9472-f216291c9dba-kube-api-access-xn942\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.480949 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480743 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-host-cni-netd\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.480949 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480750 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/997e9539-5288-4af5-92f4-55d8ccefbbf7-ovnkube-config\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.480949 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480742 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-cnibin\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.480949 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480796 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7cc9261c-1baf-4d71-aae3-b734d559681b-cni-binary-copy\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.480949 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480813 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-sys\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.480949 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480832 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-host-run-netns\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.480949 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480851 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-host-var-lib-cni-bin\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.480949 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480866 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzxcj\" (UniqueName: \"kubernetes.io/projected/800b4dad-a669-433c-8963-4c9f630913b5-kube-api-access-rzxcj\") pod \"network-check-target-9bjr7\" (UID: \"800b4dad-a669-433c-8963-4c9f630913b5\") " pod="openshift-network-diagnostics/network-check-target-9bjr7"
Apr 20 20:06:00.480949 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480882 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-etc-sysconfig\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.480949 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480902 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-lib-modules\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.480949 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480921 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-host-run-netns\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.481464 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480964 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-host-run-netns\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.481464 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.480979 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-host-run-ovn-kubernetes\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.481464 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481019 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-host-run-ovn-kubernetes\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.481464 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481018 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/43405a48-098c-49ef-95e3-3544654522ad-host\") pod \"node-ca-ntdcl\" (UID: \"43405a48-098c-49ef-95e3-3544654522ad\") " pod="openshift-image-registry/node-ca-ntdcl" Apr 20 20:06:00.481464 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481048 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-system-cni-dir\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9" Apr 20 20:06:00.481464 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481073 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-multus-conf-dir\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9" Apr 20 20:06:00.481464 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481107 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7774b4c6-2299-4070-b786-73a21e70389b-registration-dir\") pod \"aws-ebs-csi-driver-node-xdlm5\" (UID: \"7774b4c6-2299-4070-b786-73a21e70389b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5" Apr 20 20:06:00.481464 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481138 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-run-openvswitch\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.481464 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481161 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-run-ovn\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.481464 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481182 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07ec338a-d16f-4d81-9472-f216291c9dba-tmp\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs" Apr 20 20:06:00.481464 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481205 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/37568615-5b75-4d85-aad5-7bfdbb676856-agent-certs\") pod \"konnectivity-agent-mckzb\" (UID: \"37568615-5b75-4d85-aad5-7bfdbb676856\") " pod="kube-system/konnectivity-agent-mckzb" Apr 20 20:06:00.481464 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481214 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-run-ovn\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.481464 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481218 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-run-openvswitch\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.481464 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481240 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7774b4c6-2299-4070-b786-73a21e70389b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xdlm5\" (UID: \"7774b4c6-2299-4070-b786-73a21e70389b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5" Apr 20 20:06:00.481464 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481266 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs\") pod \"network-metrics-daemon-9sbrz\" (UID: \"012dcd86-26f0-4115-bd86-d5066c900541\") " pod="openshift-multus/network-metrics-daemon-9sbrz" Apr 20 20:06:00.481464 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481290 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-node-log\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.481464 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481314 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.482149 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481336 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-var-lib-kubelet\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs" Apr 20 20:06:00.482149 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481343 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-node-log\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.482149 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481363 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5qqp\" (UniqueName: \"kubernetes.io/projected/e8fae2ab-f747-4b27-b9a3-55be9806fb45-kube-api-access-r5qqp\") pod \"node-resolver-7bzrm\" (UID: \"e8fae2ab-f747-4b27-b9a3-55be9806fb45\") " pod="openshift-dns/node-resolver-7bzrm" Apr 20 20:06:00.482149 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481383 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-log-socket\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.482149 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481393 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-host-var-lib-cni-networks-ovn-kubernetes\") 
pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.482149 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481446 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-host-slash\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.482149 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481454 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-log-socket\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.482149 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481489 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-etc-modprobe-d\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs" Apr 20 20:06:00.482149 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481511 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-run\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs" Apr 20 20:06:00.482149 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481532 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-os-release\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9" Apr 20 20:06:00.482149 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481573 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7774b4c6-2299-4070-b786-73a21e70389b-device-dir\") pod \"aws-ebs-csi-driver-node-xdlm5\" (UID: \"7774b4c6-2299-4070-b786-73a21e70389b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5" Apr 20 20:06:00.482149 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481547 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-host-slash\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.482149 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481594 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4e90b560-013d-4eb3-83bf-d19971d4fd0c-cnibin\") pod \"multus-additional-cni-plugins-5g757\" (UID: \"4e90b560-013d-4eb3-83bf-d19971d4fd0c\") " pod="openshift-multus/multus-additional-cni-plugins-5g757" Apr 20 20:06:00.482149 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481621 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2940fed8-94a7-4975-8584-3fcd4e6a7933-iptables-alerter-script\") pod \"iptables-alerter-rm9m6\" (UID: \"2940fed8-94a7-4975-8584-3fcd4e6a7933\") " pod="openshift-network-operator/iptables-alerter-rm9m6" Apr 20 20:06:00.482149 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481645 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-run-systemd\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.482149 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481667 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-host-run-k8s-cni-cncf-io\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9" Apr 20 20:06:00.482149 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481690 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-host-var-lib-kubelet\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9" Apr 20 20:06:00.482831 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481690 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-run-systemd\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.482831 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481712 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e8fae2ab-f747-4b27-b9a3-55be9806fb45-hosts-file\") pod \"node-resolver-7bzrm\" (UID: \"e8fae2ab-f747-4b27-b9a3-55be9806fb45\") " pod="openshift-dns/node-resolver-7bzrm" Apr 20 20:06:00.482831 ip-10-0-135-184 kubenswrapper[2571]: 
I0420 20:06:00.481732 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e8fae2ab-f747-4b27-b9a3-55be9806fb45-tmp-dir\") pod \"node-resolver-7bzrm\" (UID: \"e8fae2ab-f747-4b27-b9a3-55be9806fb45\") " pod="openshift-dns/node-resolver-7bzrm" Apr 20 20:06:00.482831 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481756 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/997e9539-5288-4af5-92f4-55d8ccefbbf7-ovn-node-metrics-cert\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.482831 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481778 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/997e9539-5288-4af5-92f4-55d8ccefbbf7-ovnkube-script-lib\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.482831 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481804 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jsnh\" (UniqueName: \"kubernetes.io/projected/43405a48-098c-49ef-95e3-3544654522ad-kube-api-access-7jsnh\") pod \"node-ca-ntdcl\" (UID: \"43405a48-098c-49ef-95e3-3544654522ad\") " pod="openshift-image-registry/node-ca-ntdcl" Apr 20 20:06:00.482831 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481843 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-etc-sysctl-d\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " 
pod="openshift-cluster-node-tuning-operator/tuned-89gqs" Apr 20 20:06:00.482831 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481886 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/07ec338a-d16f-4d81-9472-f216291c9dba-etc-tuned\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs" Apr 20 20:06:00.482831 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481908 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-multus-cni-dir\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9" Apr 20 20:06:00.482831 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481941 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/37568615-5b75-4d85-aad5-7bfdbb676856-konnectivity-ca\") pod \"konnectivity-agent-mckzb\" (UID: \"37568615-5b75-4d85-aad5-7bfdbb676856\") " pod="kube-system/konnectivity-agent-mckzb" Apr 20 20:06:00.482831 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.481992 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4e90b560-013d-4eb3-83bf-d19971d4fd0c-os-release\") pod \"multus-additional-cni-plugins-5g757\" (UID: \"4e90b560-013d-4eb3-83bf-d19971d4fd0c\") " pod="openshift-multus/multus-additional-cni-plugins-5g757" Apr 20 20:06:00.482831 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.482017 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-host-kubelet\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.482831 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.482033 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-etc-kubernetes\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs" Apr 20 20:06:00.482831 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.482062 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-etc-systemd\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs" Apr 20 20:06:00.482831 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.482078 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-multus-socket-dir-parent\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9" Apr 20 20:06:00.482831 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.482076 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-host-kubelet\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.482831 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.482099 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-hostroot\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9" Apr 20 20:06:00.483333 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.482122 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4e90b560-013d-4eb3-83bf-d19971d4fd0c-system-cni-dir\") pod \"multus-additional-cni-plugins-5g757\" (UID: \"4e90b560-013d-4eb3-83bf-d19971d4fd0c\") " pod="openshift-multus/multus-additional-cni-plugins-5g757" Apr 20 20:06:00.483333 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.482145 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 20:06:00.483333 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.482166 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2dqs\" (UniqueName: \"kubernetes.io/projected/012dcd86-26f0-4115-bd86-d5066c900541-kube-api-access-p2dqs\") pod \"network-metrics-daemon-9sbrz\" (UID: \"012dcd86-26f0-4115-bd86-d5066c900541\") " pod="openshift-multus/network-metrics-daemon-9sbrz" Apr 20 20:06:00.483333 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.482238 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-systemd-units\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.483333 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.482256 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-var-lib-openvswitch\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.483333 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.482271 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-etc-openvswitch\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.483333 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.482290 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-host-var-lib-cni-multus\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9" Apr 20 20:06:00.483333 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.482333 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-var-lib-openvswitch\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.483333 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.482341 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8j8n\" (UniqueName: \"kubernetes.io/projected/7cc9261c-1baf-4d71-aae3-b734d559681b-kube-api-access-p8j8n\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9" Apr 20 20:06:00.483333 ip-10-0-135-184 
kubenswrapper[2571]: I0420 20:06:00.482354 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-systemd-units\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.483333 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.482354 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/997e9539-5288-4af5-92f4-55d8ccefbbf7-etc-openvswitch\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.483333 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.482377 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj8tq\" (UniqueName: \"kubernetes.io/projected/4e90b560-013d-4eb3-83bf-d19971d4fd0c-kube-api-access-dj8tq\") pod \"multus-additional-cni-plugins-5g757\" (UID: \"4e90b560-013d-4eb3-83bf-d19971d4fd0c\") " pod="openshift-multus/multus-additional-cni-plugins-5g757" Apr 20 20:06:00.483333 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.482399 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-host-run-multus-certs\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9" Apr 20 20:06:00.483333 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.482413 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7774b4c6-2299-4070-b786-73a21e70389b-socket-dir\") pod \"aws-ebs-csi-driver-node-xdlm5\" (UID: 
\"7774b4c6-2299-4070-b786-73a21e70389b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5"
Apr 20 20:06:00.483333 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.482458 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4e90b560-013d-4eb3-83bf-d19971d4fd0c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5g757\" (UID: \"4e90b560-013d-4eb3-83bf-d19971d4fd0c\") " pod="openshift-multus/multus-additional-cni-plugins-5g757"
Apr 20 20:06:00.483333 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.482478 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4e90b560-013d-4eb3-83bf-d19971d4fd0c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5g757\" (UID: \"4e90b560-013d-4eb3-83bf-d19971d4fd0c\") " pod="openshift-multus/multus-additional-cni-plugins-5g757"
Apr 20 20:06:00.483333 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.482502 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzfwl\" (UniqueName: \"kubernetes.io/projected/2940fed8-94a7-4975-8584-3fcd4e6a7933-kube-api-access-vzfwl\") pod \"iptables-alerter-rm9m6\" (UID: \"2940fed8-94a7-4975-8584-3fcd4e6a7933\") " pod="openshift-network-operator/iptables-alerter-rm9m6"
Apr 20 20:06:00.484062 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.482810 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/997e9539-5288-4af5-92f4-55d8ccefbbf7-ovnkube-script-lib\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.485774 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.485756 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/997e9539-5288-4af5-92f4-55d8ccefbbf7-ovn-node-metrics-cert\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.488955 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.488931 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dl2k\" (UniqueName: \"kubernetes.io/projected/997e9539-5288-4af5-92f4-55d8ccefbbf7-kube-api-access-6dl2k\") pod \"ovnkube-node-kldp5\" (UID: \"997e9539-5288-4af5-92f4-55d8ccefbbf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:00.583085 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583051 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzxcj\" (UniqueName: \"kubernetes.io/projected/800b4dad-a669-433c-8963-4c9f630913b5-kube-api-access-rzxcj\") pod \"network-check-target-9bjr7\" (UID: \"800b4dad-a669-433c-8963-4c9f630913b5\") " pod="openshift-network-diagnostics/network-check-target-9bjr7"
Apr 20 20:06:00.583181 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583093 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-etc-sysconfig\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.583181 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583109 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-lib-modules\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.583181 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583126 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/43405a48-098c-49ef-95e3-3544654522ad-host\") pod \"node-ca-ntdcl\" (UID: \"43405a48-098c-49ef-95e3-3544654522ad\") " pod="openshift-image-registry/node-ca-ntdcl"
Apr 20 20:06:00.583181 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583142 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-system-cni-dir\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.583181 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583166 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-multus-conf-dir\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.583408 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583189 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7774b4c6-2299-4070-b786-73a21e70389b-registration-dir\") pod \"aws-ebs-csi-driver-node-xdlm5\" (UID: \"7774b4c6-2299-4070-b786-73a21e70389b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5"
Apr 20 20:06:00.583408 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583215 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07ec338a-d16f-4d81-9472-f216291c9dba-tmp\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.583408 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583211 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/43405a48-098c-49ef-95e3-3544654522ad-host\") pod \"node-ca-ntdcl\" (UID: \"43405a48-098c-49ef-95e3-3544654522ad\") " pod="openshift-image-registry/node-ca-ntdcl"
Apr 20 20:06:00.583408 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583230 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-etc-sysconfig\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.583408 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583249 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-lib-modules\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.583408 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583264 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-multus-conf-dir\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.583408 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583298 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7774b4c6-2299-4070-b786-73a21e70389b-registration-dir\") pod \"aws-ebs-csi-driver-node-xdlm5\" (UID: \"7774b4c6-2299-4070-b786-73a21e70389b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5"
Apr 20 20:06:00.583408 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583252 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-system-cni-dir\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.583408 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583239 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/37568615-5b75-4d85-aad5-7bfdbb676856-agent-certs\") pod \"konnectivity-agent-mckzb\" (UID: \"37568615-5b75-4d85-aad5-7bfdbb676856\") " pod="kube-system/konnectivity-agent-mckzb"
Apr 20 20:06:00.583408 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583365 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7774b4c6-2299-4070-b786-73a21e70389b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xdlm5\" (UID: \"7774b4c6-2299-4070-b786-73a21e70389b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5"
Apr 20 20:06:00.583408 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583383 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs\") pod \"network-metrics-daemon-9sbrz\" (UID: \"012dcd86-26f0-4115-bd86-d5066c900541\") " pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:06:00.583408 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583406 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-var-lib-kubelet\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.583408 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583441 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5qqp\" (UniqueName: \"kubernetes.io/projected/e8fae2ab-f747-4b27-b9a3-55be9806fb45-kube-api-access-r5qqp\") pod \"node-resolver-7bzrm\" (UID: \"e8fae2ab-f747-4b27-b9a3-55be9806fb45\") " pod="openshift-dns/node-resolver-7bzrm"
Apr 20 20:06:00.583947 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583494 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-var-lib-kubelet\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.583947 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:00.583511 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:00.583947 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583521 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7774b4c6-2299-4070-b786-73a21e70389b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xdlm5\" (UID: \"7774b4c6-2299-4070-b786-73a21e70389b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5"
Apr 20 20:06:00.583947 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583533 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-etc-modprobe-d\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.583947 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583559 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-run\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.583947 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:00.583593 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs podName:012dcd86-26f0-4115-bd86-d5066c900541 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:01.083554577 +0000 UTC m=+2.123044104 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs") pod "network-metrics-daemon-9sbrz" (UID: "012dcd86-26f0-4115-bd86-d5066c900541") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:00.583947 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583617 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-etc-modprobe-d\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.583947 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583596 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-run\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.583947 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583660 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-os-release\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.583947 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583696 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7774b4c6-2299-4070-b786-73a21e70389b-device-dir\") pod \"aws-ebs-csi-driver-node-xdlm5\" (UID: \"7774b4c6-2299-4070-b786-73a21e70389b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5"
Apr 20 20:06:00.583947 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583718 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4e90b560-013d-4eb3-83bf-d19971d4fd0c-cnibin\") pod \"multus-additional-cni-plugins-5g757\" (UID: \"4e90b560-013d-4eb3-83bf-d19971d4fd0c\") " pod="openshift-multus/multus-additional-cni-plugins-5g757"
Apr 20 20:06:00.583947 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583748 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2940fed8-94a7-4975-8584-3fcd4e6a7933-iptables-alerter-script\") pod \"iptables-alerter-rm9m6\" (UID: \"2940fed8-94a7-4975-8584-3fcd4e6a7933\") " pod="openshift-network-operator/iptables-alerter-rm9m6"
Apr 20 20:06:00.583947 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583758 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-os-release\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.583947 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583774 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-host-run-k8s-cni-cncf-io\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.583947 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583778 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4e90b560-013d-4eb3-83bf-d19971d4fd0c-cnibin\") pod \"multus-additional-cni-plugins-5g757\" (UID: \"4e90b560-013d-4eb3-83bf-d19971d4fd0c\") " pod="openshift-multus/multus-additional-cni-plugins-5g757"
Apr 20 20:06:00.583947 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583759 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7774b4c6-2299-4070-b786-73a21e70389b-device-dir\") pod \"aws-ebs-csi-driver-node-xdlm5\" (UID: \"7774b4c6-2299-4070-b786-73a21e70389b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5"
Apr 20 20:06:00.583947 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583812 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-host-run-k8s-cni-cncf-io\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.584551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583844 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-host-var-lib-kubelet\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.584551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583869 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e8fae2ab-f747-4b27-b9a3-55be9806fb45-hosts-file\") pod \"node-resolver-7bzrm\" (UID: \"e8fae2ab-f747-4b27-b9a3-55be9806fb45\") " pod="openshift-dns/node-resolver-7bzrm"
Apr 20 20:06:00.584551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583891 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e8fae2ab-f747-4b27-b9a3-55be9806fb45-tmp-dir\") pod \"node-resolver-7bzrm\" (UID: \"e8fae2ab-f747-4b27-b9a3-55be9806fb45\") " pod="openshift-dns/node-resolver-7bzrm"
Apr 20 20:06:00.584551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583918 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jsnh\" (UniqueName: \"kubernetes.io/projected/43405a48-098c-49ef-95e3-3544654522ad-kube-api-access-7jsnh\") pod \"node-ca-ntdcl\" (UID: \"43405a48-098c-49ef-95e3-3544654522ad\") " pod="openshift-image-registry/node-ca-ntdcl"
Apr 20 20:06:00.584551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583942 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-etc-sysctl-d\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.584551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.583943 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-host-var-lib-kubelet\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.584551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584221 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-etc-sysctl-d\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.584551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584262 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/07ec338a-d16f-4d81-9472-f216291c9dba-etc-tuned\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.584551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584266 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2940fed8-94a7-4975-8584-3fcd4e6a7933-iptables-alerter-script\") pod \"iptables-alerter-rm9m6\" (UID: \"2940fed8-94a7-4975-8584-3fcd4e6a7933\") " pod="openshift-network-operator/iptables-alerter-rm9m6"
Apr 20 20:06:00.584551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584287 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-multus-cni-dir\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.584551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584311 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/37568615-5b75-4d85-aad5-7bfdbb676856-konnectivity-ca\") pod \"konnectivity-agent-mckzb\" (UID: \"37568615-5b75-4d85-aad5-7bfdbb676856\") " pod="kube-system/konnectivity-agent-mckzb"
Apr 20 20:06:00.584551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584332 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4e90b560-013d-4eb3-83bf-d19971d4fd0c-os-release\") pod \"multus-additional-cni-plugins-5g757\" (UID: \"4e90b560-013d-4eb3-83bf-d19971d4fd0c\") " pod="openshift-multus/multus-additional-cni-plugins-5g757"
Apr 20 20:06:00.584551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584356 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-etc-kubernetes\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.584551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584364 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e8fae2ab-f747-4b27-b9a3-55be9806fb45-hosts-file\") pod \"node-resolver-7bzrm\" (UID: \"e8fae2ab-f747-4b27-b9a3-55be9806fb45\") " pod="openshift-dns/node-resolver-7bzrm"
Apr 20 20:06:00.584551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584379 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-etc-systemd\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.584551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584402 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-multus-socket-dir-parent\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.584551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584459 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-hostroot\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.584551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584485 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4e90b560-013d-4eb3-83bf-d19971d4fd0c-system-cni-dir\") pod \"multus-additional-cni-plugins-5g757\" (UID: \"4e90b560-013d-4eb3-83bf-d19971d4fd0c\") " pod="openshift-multus/multus-additional-cni-plugins-5g757"
Apr 20 20:06:00.585383 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584545 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4e90b560-013d-4eb3-83bf-d19971d4fd0c-os-release\") pod \"multus-additional-cni-plugins-5g757\" (UID: \"4e90b560-013d-4eb3-83bf-d19971d4fd0c\") " pod="openshift-multus/multus-additional-cni-plugins-5g757"
Apr 20 20:06:00.585383 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584559 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-multus-cni-dir\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.585383 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584550 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-etc-systemd\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.585383 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584574 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p2dqs\" (UniqueName: \"kubernetes.io/projected/012dcd86-26f0-4115-bd86-d5066c900541-kube-api-access-p2dqs\") pod \"network-metrics-daemon-9sbrz\" (UID: \"012dcd86-26f0-4115-bd86-d5066c900541\") " pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:06:00.585383 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584610 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-host-var-lib-cni-multus\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.585383 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584606 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-etc-kubernetes\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.585383 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584575 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4e90b560-013d-4eb3-83bf-d19971d4fd0c-system-cni-dir\") pod \"multus-additional-cni-plugins-5g757\" (UID: \"4e90b560-013d-4eb3-83bf-d19971d4fd0c\") " pod="openshift-multus/multus-additional-cni-plugins-5g757"
Apr 20 20:06:00.585383 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584628 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e8fae2ab-f747-4b27-b9a3-55be9806fb45-tmp-dir\") pod \"node-resolver-7bzrm\" (UID: \"e8fae2ab-f747-4b27-b9a3-55be9806fb45\") " pod="openshift-dns/node-resolver-7bzrm"
Apr 20 20:06:00.585383 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584636 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8j8n\" (UniqueName: \"kubernetes.io/projected/7cc9261c-1baf-4d71-aae3-b734d559681b-kube-api-access-p8j8n\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.585383 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584611 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-hostroot\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.585383 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584672 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dj8tq\" (UniqueName: \"kubernetes.io/projected/4e90b560-013d-4eb3-83bf-d19971d4fd0c-kube-api-access-dj8tq\") pod \"multus-additional-cni-plugins-5g757\" (UID: \"4e90b560-013d-4eb3-83bf-d19971d4fd0c\") " pod="openshift-multus/multus-additional-cni-plugins-5g757"
Apr 20 20:06:00.585383 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584699 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-host-run-multus-certs\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.585383 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584721 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7774b4c6-2299-4070-b786-73a21e70389b-socket-dir\") pod \"aws-ebs-csi-driver-node-xdlm5\" (UID: \"7774b4c6-2299-4070-b786-73a21e70389b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5"
Apr 20 20:06:00.585383 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584745 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4e90b560-013d-4eb3-83bf-d19971d4fd0c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5g757\" (UID: \"4e90b560-013d-4eb3-83bf-d19971d4fd0c\") " pod="openshift-multus/multus-additional-cni-plugins-5g757"
Apr 20 20:06:00.585383 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584772 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4e90b560-013d-4eb3-83bf-d19971d4fd0c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5g757\" (UID: \"4e90b560-013d-4eb3-83bf-d19971d4fd0c\") " pod="openshift-multus/multus-additional-cni-plugins-5g757"
Apr 20 20:06:00.585383 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584798 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzfwl\" (UniqueName: \"kubernetes.io/projected/2940fed8-94a7-4975-8584-3fcd4e6a7933-kube-api-access-vzfwl\") pod \"iptables-alerter-rm9m6\" (UID: \"2940fed8-94a7-4975-8584-3fcd4e6a7933\") " pod="openshift-network-operator/iptables-alerter-rm9m6"
Apr 20 20:06:00.585383 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584906 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-host-var-lib-cni-multus\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.586565 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584916 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/43405a48-098c-49ef-95e3-3544654522ad-serviceca\") pod \"node-ca-ntdcl\" (UID: \"43405a48-098c-49ef-95e3-3544654522ad\") " pod="openshift-image-registry/node-ca-ntdcl"
Apr 20 20:06:00.586565 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.584936 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-multus-socket-dir-parent\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.586565 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585023 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7774b4c6-2299-4070-b786-73a21e70389b-socket-dir\") pod \"aws-ebs-csi-driver-node-xdlm5\" (UID: \"7774b4c6-2299-4070-b786-73a21e70389b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5"
Apr 20 20:06:00.586565 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585057 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7cc9261c-1baf-4d71-aae3-b734d559681b-multus-daemon-config\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.586565 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585070 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4e90b560-013d-4eb3-83bf-d19971d4fd0c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5g757\" (UID: \"4e90b560-013d-4eb3-83bf-d19971d4fd0c\") " pod="openshift-multus/multus-additional-cni-plugins-5g757"
Apr 20 20:06:00.586565 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585084 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-etc-kubernetes\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.586565 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585100 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/37568615-5b75-4d85-aad5-7bfdbb676856-konnectivity-ca\") pod \"konnectivity-agent-mckzb\" (UID: \"37568615-5b75-4d85-aad5-7bfdbb676856\") " pod="kube-system/konnectivity-agent-mckzb"
Apr 20 20:06:00.586565 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585110 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7774b4c6-2299-4070-b786-73a21e70389b-sys-fs\") pod \"aws-ebs-csi-driver-node-xdlm5\" (UID: \"7774b4c6-2299-4070-b786-73a21e70389b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5"
Apr 20 20:06:00.586565 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585136 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4e90b560-013d-4eb3-83bf-d19971d4fd0c-cni-binary-copy\") pod \"multus-additional-cni-plugins-5g757\" (UID: \"4e90b560-013d-4eb3-83bf-d19971d4fd0c\") " pod="openshift-multus/multus-additional-cni-plugins-5g757"
Apr 20 20:06:00.586565 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585181 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2940fed8-94a7-4975-8584-3fcd4e6a7933-host-slash\") pod \"iptables-alerter-rm9m6\" (UID: \"2940fed8-94a7-4975-8584-3fcd4e6a7933\") " pod="openshift-network-operator/iptables-alerter-rm9m6"
Apr 20 20:06:00.586565 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585208 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-etc-sysctl-conf\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.586565 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585210 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-host-run-multus-certs\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.586565 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585233 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-host\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.586565 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585257 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7774b4c6-2299-4070-b786-73a21e70389b-etc-selinux\") pod \"aws-ebs-csi-driver-node-xdlm5\" (UID: \"7774b4c6-2299-4070-b786-73a21e70389b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5"
Apr 20 20:06:00.586565 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585297 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjwgp\" (UniqueName: \"kubernetes.io/projected/7774b4c6-2299-4070-b786-73a21e70389b-kube-api-access-qjwgp\") pod \"aws-ebs-csi-driver-node-xdlm5\" (UID: \"7774b4c6-2299-4070-b786-73a21e70389b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5"
Apr 20 20:06:00.586565 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585323 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/43405a48-098c-49ef-95e3-3544654522ad-serviceca\") pod \"node-ca-ntdcl\" (UID: \"43405a48-098c-49ef-95e3-3544654522ad\") " pod="openshift-image-registry/node-ca-ntdcl"
Apr 20 20:06:00.586565 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585327 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4e90b560-013d-4eb3-83bf-d19971d4fd0c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5g757\" (UID: \"4e90b560-013d-4eb3-83bf-d19971d4fd0c\") " pod="openshift-multus/multus-additional-cni-plugins-5g757"
Apr 20 20:06:00.587355 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585369 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xn942\" (UniqueName: \"kubernetes.io/projected/07ec338a-d16f-4d81-9472-f216291c9dba-kube-api-access-xn942\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.587355 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585396 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-cnibin\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.587355 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585469 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7cc9261c-1baf-4d71-aae3-b734d559681b-cni-binary-copy\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.587355 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585496 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-sys\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs"
Apr 20 20:06:00.587355 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585521 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-host-run-netns\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.587355 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585551 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-host-var-lib-cni-bin\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.587355 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585625 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-host-var-lib-cni-bin\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.587355 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585678 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7cc9261c-1baf-4d71-aae3-b734d559681b-multus-daemon-config\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9"
Apr 20 20:06:00.587355 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585743 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-etc-kubernetes\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") "
pod="openshift-multus/multus-24vt9" Apr 20 20:06:00.587355 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585747 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-cnibin\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9" Apr 20 20:06:00.587355 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585842 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-sys\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs" Apr 20 20:06:00.587355 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585877 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07ec338a-d16f-4d81-9472-f216291c9dba-tmp\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs" Apr 20 20:06:00.587355 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585896 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7cc9261c-1baf-4d71-aae3-b734d559681b-host-run-netns\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9" Apr 20 20:06:00.587355 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585914 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4e90b560-013d-4eb3-83bf-d19971d4fd0c-cni-binary-copy\") pod \"multus-additional-cni-plugins-5g757\" (UID: \"4e90b560-013d-4eb3-83bf-d19971d4fd0c\") " pod="openshift-multus/multus-additional-cni-plugins-5g757" Apr 20 20:06:00.587355 ip-10-0-135-184 kubenswrapper[2571]: 
I0420 20:06:00.585883 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4e90b560-013d-4eb3-83bf-d19971d4fd0c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5g757\" (UID: \"4e90b560-013d-4eb3-83bf-d19971d4fd0c\") " pod="openshift-multus/multus-additional-cni-plugins-5g757" Apr 20 20:06:00.587355 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585932 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-host\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs" Apr 20 20:06:00.587355 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585915 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4e90b560-013d-4eb3-83bf-d19971d4fd0c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5g757\" (UID: \"4e90b560-013d-4eb3-83bf-d19971d4fd0c\") " pod="openshift-multus/multus-additional-cni-plugins-5g757" Apr 20 20:06:00.587355 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585947 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7774b4c6-2299-4070-b786-73a21e70389b-etc-selinux\") pod \"aws-ebs-csi-driver-node-xdlm5\" (UID: \"7774b4c6-2299-4070-b786-73a21e70389b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5" Apr 20 20:06:00.587968 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585940 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2940fed8-94a7-4975-8584-3fcd4e6a7933-host-slash\") pod \"iptables-alerter-rm9m6\" (UID: \"2940fed8-94a7-4975-8584-3fcd4e6a7933\") " 
pod="openshift-network-operator/iptables-alerter-rm9m6" Apr 20 20:06:00.587968 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.585982 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7774b4c6-2299-4070-b786-73a21e70389b-sys-fs\") pod \"aws-ebs-csi-driver-node-xdlm5\" (UID: \"7774b4c6-2299-4070-b786-73a21e70389b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5" Apr 20 20:06:00.587968 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.586030 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/07ec338a-d16f-4d81-9472-f216291c9dba-etc-sysctl-conf\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs" Apr 20 20:06:00.587968 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.586090 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/37568615-5b75-4d85-aad5-7bfdbb676856-agent-certs\") pod \"konnectivity-agent-mckzb\" (UID: \"37568615-5b75-4d85-aad5-7bfdbb676856\") " pod="kube-system/konnectivity-agent-mckzb" Apr 20 20:06:00.587968 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.586162 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7cc9261c-1baf-4d71-aae3-b734d559681b-cni-binary-copy\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9" Apr 20 20:06:00.587968 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.587736 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/07ec338a-d16f-4d81-9472-f216291c9dba-etc-tuned\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " 
pod="openshift-cluster-node-tuning-operator/tuned-89gqs" Apr 20 20:06:00.590774 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:00.590749 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:06:00.590774 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:00.590769 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:06:00.590774 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:00.590778 2571 projected.go:194] Error preparing data for projected volume kube-api-access-rzxcj for pod openshift-network-diagnostics/network-check-target-9bjr7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:06:00.590972 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:00.590850 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/800b4dad-a669-433c-8963-4c9f630913b5-kube-api-access-rzxcj podName:800b4dad-a669-433c-8963-4c9f630913b5 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:01.090832829 +0000 UTC m=+2.130322377 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rzxcj" (UniqueName: "kubernetes.io/projected/800b4dad-a669-433c-8963-4c9f630913b5-kube-api-access-rzxcj") pod "network-check-target-9bjr7" (UID: "800b4dad-a669-433c-8963-4c9f630913b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:06:00.592956 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.592934 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5qqp\" (UniqueName: \"kubernetes.io/projected/e8fae2ab-f747-4b27-b9a3-55be9806fb45-kube-api-access-r5qqp\") pod \"node-resolver-7bzrm\" (UID: \"e8fae2ab-f747-4b27-b9a3-55be9806fb45\") " pod="openshift-dns/node-resolver-7bzrm" Apr 20 20:06:00.593809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.593774 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8j8n\" (UniqueName: \"kubernetes.io/projected/7cc9261c-1baf-4d71-aae3-b734d559681b-kube-api-access-p8j8n\") pod \"multus-24vt9\" (UID: \"7cc9261c-1baf-4d71-aae3-b734d559681b\") " pod="openshift-multus/multus-24vt9" Apr 20 20:06:00.594218 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.594197 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzfwl\" (UniqueName: \"kubernetes.io/projected/2940fed8-94a7-4975-8584-3fcd4e6a7933-kube-api-access-vzfwl\") pod \"iptables-alerter-rm9m6\" (UID: \"2940fed8-94a7-4975-8584-3fcd4e6a7933\") " pod="openshift-network-operator/iptables-alerter-rm9m6" Apr 20 20:06:00.594740 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.594719 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jsnh\" (UniqueName: \"kubernetes.io/projected/43405a48-098c-49ef-95e3-3544654522ad-kube-api-access-7jsnh\") pod \"node-ca-ntdcl\" (UID: \"43405a48-098c-49ef-95e3-3544654522ad\") " 
pod="openshift-image-registry/node-ca-ntdcl" Apr 20 20:06:00.594823 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.594769 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2dqs\" (UniqueName: \"kubernetes.io/projected/012dcd86-26f0-4115-bd86-d5066c900541-kube-api-access-p2dqs\") pod \"network-metrics-daemon-9sbrz\" (UID: \"012dcd86-26f0-4115-bd86-d5066c900541\") " pod="openshift-multus/network-metrics-daemon-9sbrz" Apr 20 20:06:00.595070 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.595056 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn942\" (UniqueName: \"kubernetes.io/projected/07ec338a-d16f-4d81-9472-f216291c9dba-kube-api-access-xn942\") pod \"tuned-89gqs\" (UID: \"07ec338a-d16f-4d81-9472-f216291c9dba\") " pod="openshift-cluster-node-tuning-operator/tuned-89gqs" Apr 20 20:06:00.595105 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.595091 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj8tq\" (UniqueName: \"kubernetes.io/projected/4e90b560-013d-4eb3-83bf-d19971d4fd0c-kube-api-access-dj8tq\") pod \"multus-additional-cni-plugins-5g757\" (UID: \"4e90b560-013d-4eb3-83bf-d19971d4fd0c\") " pod="openshift-multus/multus-additional-cni-plugins-5g757" Apr 20 20:06:00.595149 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.595118 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjwgp\" (UniqueName: \"kubernetes.io/projected/7774b4c6-2299-4070-b786-73a21e70389b-kube-api-access-qjwgp\") pod \"aws-ebs-csi-driver-node-xdlm5\" (UID: \"7774b4c6-2299-4070-b786-73a21e70389b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5" Apr 20 20:06:00.608085 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:06:00.608036 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd93550a1d0c6abebbeb4587739ed181c.slice/crio-66890ca15b3f7f01938d6c7975c3c40b17658eaff21dcc3300c35a7a36c6e0dc WatchSource:0}: Error finding container 66890ca15b3f7f01938d6c7975c3c40b17658eaff21dcc3300c35a7a36c6e0dc: Status 404 returned error can't find the container with id 66890ca15b3f7f01938d6c7975c3c40b17658eaff21dcc3300c35a7a36c6e0dc Apr 20 20:06:00.608353 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:06:00.608334 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fdd1067d66eaa5bc876fed7618c0cd3.slice/crio-a5f43e6fc2d4af95ebe4869260c00e31486e0d0646608d7a61d6710f58ed9066 WatchSource:0}: Error finding container a5f43e6fc2d4af95ebe4869260c00e31486e0d0646608d7a61d6710f58ed9066: Status 404 returned error can't find the container with id a5f43e6fc2d4af95ebe4869260c00e31486e0d0646608d7a61d6710f58ed9066 Apr 20 20:06:00.612700 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.612678 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:06:00.683184 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.683161 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:06:00.709464 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.709443 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:06:00.715344 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:06:00.715324 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod997e9539_5288_4af5_92f4_55d8ccefbbf7.slice/crio-0937af52354ad6cbb71bd3f7c1a6fd2c4f6c15ec5acb9a57676fecf512ad9544 WatchSource:0}: Error finding container 0937af52354ad6cbb71bd3f7c1a6fd2c4f6c15ec5acb9a57676fecf512ad9544: Status 404 returned error can't find the container with id 0937af52354ad6cbb71bd3f7c1a6fd2c4f6c15ec5acb9a57676fecf512ad9544 Apr 20 20:06:00.729366 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.729345 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-89gqs" Apr 20 20:06:00.735124 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:06:00.735103 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07ec338a_d16f_4d81_9472_f216291c9dba.slice/crio-6708ae5855bb409ba9c1e337bb21ae3cfbf6d3af0f4b8805910410ea4007fc62 WatchSource:0}: Error finding container 6708ae5855bb409ba9c1e337bb21ae3cfbf6d3af0f4b8805910410ea4007fc62: Status 404 returned error can't find the container with id 6708ae5855bb409ba9c1e337bb21ae3cfbf6d3af0f4b8805910410ea4007fc62 Apr 20 20:06:00.751779 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.751759 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-7bzrm" Apr 20 20:06:00.757229 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:06:00.757205 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8fae2ab_f747_4b27_b9a3_55be9806fb45.slice/crio-b4799fc39506b6aeccf202cf71087fdec67e5dc08393869ad4b821cee4b711a3 WatchSource:0}: Error finding container b4799fc39506b6aeccf202cf71087fdec67e5dc08393869ad4b821cee4b711a3: Status 404 returned error can't find the container with id b4799fc39506b6aeccf202cf71087fdec67e5dc08393869ad4b821cee4b711a3 Apr 20 20:06:00.774256 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.774239 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ntdcl" Apr 20 20:06:00.779563 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:06:00.779547 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43405a48_098c_49ef_95e3_3544654522ad.slice/crio-b9bfd4a155cac0edc7be29bd75386a6ec109eaab96ee45da37ed850c5d779926 WatchSource:0}: Error finding container b9bfd4a155cac0edc7be29bd75386a6ec109eaab96ee45da37ed850c5d779926: Status 404 returned error can't find the container with id b9bfd4a155cac0edc7be29bd75386a6ec109eaab96ee45da37ed850c5d779926 Apr 20 20:06:00.786847 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.786830 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-24vt9" Apr 20 20:06:00.791980 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:06:00.791961 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cc9261c_1baf_4d71_aae3_b734d559681b.slice/crio-7a2220907806d747893de6ec68c0c2c55618df110ec9350bd349a3edc10a8a95 WatchSource:0}: Error finding container 7a2220907806d747893de6ec68c0c2c55618df110ec9350bd349a3edc10a8a95: Status 404 returned error can't find the container with id 7a2220907806d747893de6ec68c0c2c55618df110ec9350bd349a3edc10a8a95 Apr 20 20:06:00.809200 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.809182 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mckzb" Apr 20 20:06:00.814847 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.814824 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5" Apr 20 20:06:00.815312 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:06:00.815265 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37568615_5b75_4d85_aad5_7bfdbb676856.slice/crio-53416ae925ed680cffddecb79567db422c962f0cd1435c573060e289344f9d32 WatchSource:0}: Error finding container 53416ae925ed680cffddecb79567db422c962f0cd1435c573060e289344f9d32: Status 404 returned error can't find the container with id 53416ae925ed680cffddecb79567db422c962f0cd1435c573060e289344f9d32 Apr 20 20:06:00.820549 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:06:00.820526 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7774b4c6_2299_4070_b786_73a21e70389b.slice/crio-a3ec2cff4fa99bc84e4ad034e72eac9e00bf0699536f2b79b9d3d064cc3db6d0 WatchSource:0}: Error finding container 
a3ec2cff4fa99bc84e4ad034e72eac9e00bf0699536f2b79b9d3d064cc3db6d0: Status 404 returned error can't find the container with id a3ec2cff4fa99bc84e4ad034e72eac9e00bf0699536f2b79b9d3d064cc3db6d0 Apr 20 20:06:00.837948 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.837928 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5g757" Apr 20 20:06:00.843030 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.843014 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rm9m6" Apr 20 20:06:00.843238 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:06:00.843219 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e90b560_013d_4eb3_83bf_d19971d4fd0c.slice/crio-c6b56c6a69df7adf60de42fe7740d5d594da78112a86373199f57d53608ac2e2 WatchSource:0}: Error finding container c6b56c6a69df7adf60de42fe7740d5d594da78112a86373199f57d53608ac2e2: Status 404 returned error can't find the container with id c6b56c6a69df7adf60de42fe7740d5d594da78112a86373199f57d53608ac2e2 Apr 20 20:06:00.848463 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:06:00.848444 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2940fed8_94a7_4975_8584_3fcd4e6a7933.slice/crio-ef716f9ecff14484ecd0d6d6b8ce24664dbd036a203bc856600ca88b3ee49de0 WatchSource:0}: Error finding container ef716f9ecff14484ecd0d6d6b8ce24664dbd036a203bc856600ca88b3ee49de0: Status 404 returned error can't find the container with id ef716f9ecff14484ecd0d6d6b8ce24664dbd036a203bc856600ca88b3ee49de0 Apr 20 20:06:00.892985 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:00.892891 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:06:01.089719 ip-10-0-135-184 kubenswrapper[2571]: I0420 
20:06:01.089692 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs\") pod \"network-metrics-daemon-9sbrz\" (UID: \"012dcd86-26f0-4115-bd86-d5066c900541\") " pod="openshift-multus/network-metrics-daemon-9sbrz" Apr 20 20:06:01.089861 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:01.089805 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:06:01.089906 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:01.089896 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs podName:012dcd86-26f0-4115-bd86-d5066c900541 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:02.089880447 +0000 UTC m=+3.129369982 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs") pod "network-metrics-daemon-9sbrz" (UID: "012dcd86-26f0-4115-bd86-d5066c900541") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:06:01.190455 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:01.190359 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzxcj\" (UniqueName: \"kubernetes.io/projected/800b4dad-a669-433c-8963-4c9f630913b5-kube-api-access-rzxcj\") pod \"network-check-target-9bjr7\" (UID: \"800b4dad-a669-433c-8963-4c9f630913b5\") " pod="openshift-network-diagnostics/network-check-target-9bjr7" Apr 20 20:06:01.190625 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:01.190562 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:06:01.190625 ip-10-0-135-184 
kubenswrapper[2571]: E0420 20:06:01.190591 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:06:01.190625 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:01.190603 2571 projected.go:194] Error preparing data for projected volume kube-api-access-rzxcj for pod openshift-network-diagnostics/network-check-target-9bjr7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:06:01.190779 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:01.190657 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/800b4dad-a669-433c-8963-4c9f630913b5-kube-api-access-rzxcj podName:800b4dad-a669-433c-8963-4c9f630913b5 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:02.190639005 +0000 UTC m=+3.230128543 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rzxcj" (UniqueName: "kubernetes.io/projected/800b4dad-a669-433c-8963-4c9f630913b5-kube-api-access-rzxcj") pod "network-check-target-9bjr7" (UID: "800b4dad-a669-433c-8963-4c9f630913b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:06:01.422051 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:01.421955 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 20:01:00 +0000 UTC" deadline="2027-09-27 16:30:10.675211963 +0000 UTC" Apr 20 20:06:01.422051 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:01.421991 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12596h24m9.25322429s" Apr 20 20:06:01.445185 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:01.444869 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:06:01.540314 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:01.540254 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7bzrm" event={"ID":"e8fae2ab-f747-4b27-b9a3-55be9806fb45","Type":"ContainerStarted","Data":"b4799fc39506b6aeccf202cf71087fdec67e5dc08393869ad4b821cee4b711a3"} Apr 20 20:06:01.550712 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:01.550653 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-89gqs" event={"ID":"07ec338a-d16f-4d81-9472-f216291c9dba","Type":"ContainerStarted","Data":"6708ae5855bb409ba9c1e337bb21ae3cfbf6d3af0f4b8805910410ea4007fc62"} Apr 20 20:06:01.553370 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:01.553320 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" 
event={"ID":"997e9539-5288-4af5-92f4-55d8ccefbbf7","Type":"ContainerStarted","Data":"0937af52354ad6cbb71bd3f7c1a6fd2c4f6c15ec5acb9a57676fecf512ad9544"}
Apr 20 20:06:01.561265 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:01.561237 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-184.ec2.internal" event={"ID":"0fdd1067d66eaa5bc876fed7618c0cd3","Type":"ContainerStarted","Data":"a5f43e6fc2d4af95ebe4869260c00e31486e0d0646608d7a61d6710f58ed9066"}
Apr 20 20:06:01.572040 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:01.572013 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rm9m6" event={"ID":"2940fed8-94a7-4975-8584-3fcd4e6a7933","Type":"ContainerStarted","Data":"ef716f9ecff14484ecd0d6d6b8ce24664dbd036a203bc856600ca88b3ee49de0"}
Apr 20 20:06:01.589956 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:01.589924 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-24vt9" event={"ID":"7cc9261c-1baf-4d71-aae3-b734d559681b","Type":"ContainerStarted","Data":"7a2220907806d747893de6ec68c0c2c55618df110ec9350bd349a3edc10a8a95"}
Apr 20 20:06:01.597931 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:01.597892 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ntdcl" event={"ID":"43405a48-098c-49ef-95e3-3544654522ad","Type":"ContainerStarted","Data":"b9bfd4a155cac0edc7be29bd75386a6ec109eaab96ee45da37ed850c5d779926"}
Apr 20 20:06:01.604704 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:01.604675 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-184.ec2.internal" event={"ID":"d93550a1d0c6abebbeb4587739ed181c","Type":"ContainerStarted","Data":"66890ca15b3f7f01938d6c7975c3c40b17658eaff21dcc3300c35a7a36c6e0dc"}
Apr 20 20:06:01.613128 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:01.613095 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5g757" event={"ID":"4e90b560-013d-4eb3-83bf-d19971d4fd0c","Type":"ContainerStarted","Data":"c6b56c6a69df7adf60de42fe7740d5d594da78112a86373199f57d53608ac2e2"}
Apr 20 20:06:01.639656 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:01.639615 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5" event={"ID":"7774b4c6-2299-4070-b786-73a21e70389b","Type":"ContainerStarted","Data":"a3ec2cff4fa99bc84e4ad034e72eac9e00bf0699536f2b79b9d3d064cc3db6d0"}
Apr 20 20:06:01.661767 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:01.661732 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mckzb" event={"ID":"37568615-5b75-4d85-aad5-7bfdbb676856","Type":"ContainerStarted","Data":"53416ae925ed680cffddecb79567db422c962f0cd1435c573060e289344f9d32"}
Apr 20 20:06:02.098860 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:02.098232 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs\") pod \"network-metrics-daemon-9sbrz\" (UID: \"012dcd86-26f0-4115-bd86-d5066c900541\") " pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:06:02.098860 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:02.098395 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:02.098860 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:02.098475 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs podName:012dcd86-26f0-4115-bd86-d5066c900541 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:04.098455697 +0000 UTC m=+5.137945226 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs") pod "network-metrics-daemon-9sbrz" (UID: "012dcd86-26f0-4115-bd86-d5066c900541") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:02.199443 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:02.199181 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzxcj\" (UniqueName: \"kubernetes.io/projected/800b4dad-a669-433c-8963-4c9f630913b5-kube-api-access-rzxcj\") pod \"network-check-target-9bjr7\" (UID: \"800b4dad-a669-433c-8963-4c9f630913b5\") " pod="openshift-network-diagnostics/network-check-target-9bjr7"
Apr 20 20:06:02.199624 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:02.199438 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 20:06:02.199624 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:02.199469 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 20:06:02.199624 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:02.199481 2571 projected.go:194] Error preparing data for projected volume kube-api-access-rzxcj for pod openshift-network-diagnostics/network-check-target-9bjr7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:02.199624 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:02.199543 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/800b4dad-a669-433c-8963-4c9f630913b5-kube-api-access-rzxcj podName:800b4dad-a669-433c-8963-4c9f630913b5 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:04.199524661 +0000 UTC m=+5.239014205 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-rzxcj" (UniqueName: "kubernetes.io/projected/800b4dad-a669-433c-8963-4c9f630913b5-kube-api-access-rzxcj") pod "network-check-target-9bjr7" (UID: "800b4dad-a669-433c-8963-4c9f630913b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:02.422653 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:02.422518 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 20:01:00 +0000 UTC" deadline="2027-11-29 19:47:16.26260651 +0000 UTC"
Apr 20 20:06:02.422653 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:02.422555 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14111h41m13.84005477s"
Apr 20 20:06:02.523914 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:02.523204 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9bjr7"
Apr 20 20:06:02.523914 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:02.523318 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9bjr7" podUID="800b4dad-a669-433c-8963-4c9f630913b5"
Apr 20 20:06:02.523914 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:02.523758 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:06:02.523914 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:02.523869 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9sbrz" podUID="012dcd86-26f0-4115-bd86-d5066c900541"
Apr 20 20:06:03.237754 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:03.237510 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:06:04.116494 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:04.116454 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs\") pod \"network-metrics-daemon-9sbrz\" (UID: \"012dcd86-26f0-4115-bd86-d5066c900541\") " pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:06:04.116957 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:04.116604 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:04.116957 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:04.116665 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs podName:012dcd86-26f0-4115-bd86-d5066c900541 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:08.116646437 +0000 UTC m=+9.156135964 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs") pod "network-metrics-daemon-9sbrz" (UID: "012dcd86-26f0-4115-bd86-d5066c900541") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:04.216993 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:04.216953 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzxcj\" (UniqueName: \"kubernetes.io/projected/800b4dad-a669-433c-8963-4c9f630913b5-kube-api-access-rzxcj\") pod \"network-check-target-9bjr7\" (UID: \"800b4dad-a669-433c-8963-4c9f630913b5\") " pod="openshift-network-diagnostics/network-check-target-9bjr7"
Apr 20 20:06:04.217192 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:04.217171 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 20:06:04.217254 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:04.217200 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 20:06:04.217254 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:04.217213 2571 projected.go:194] Error preparing data for projected volume kube-api-access-rzxcj for pod openshift-network-diagnostics/network-check-target-9bjr7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:04.217341 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:04.217275 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/800b4dad-a669-433c-8963-4c9f630913b5-kube-api-access-rzxcj podName:800b4dad-a669-433c-8963-4c9f630913b5 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:08.217253827 +0000 UTC m=+9.256743375 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-rzxcj" (UniqueName: "kubernetes.io/projected/800b4dad-a669-433c-8963-4c9f630913b5-kube-api-access-rzxcj") pod "network-check-target-9bjr7" (UID: "800b4dad-a669-433c-8963-4c9f630913b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:04.523502 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:04.523407 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9bjr7"
Apr 20 20:06:04.523676 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:04.523563 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9bjr7" podUID="800b4dad-a669-433c-8963-4c9f630913b5"
Apr 20 20:06:04.523676 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:04.523587 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:06:04.523796 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:04.523698 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9sbrz" podUID="012dcd86-26f0-4115-bd86-d5066c900541"
Apr 20 20:06:05.946838 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:05.946801 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-kk6nn"]
Apr 20 20:06:05.952465 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:05.952437 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kk6nn"
Apr 20 20:06:05.952619 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:05.952521 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kk6nn" podUID="5266cc03-b601-4ec1-a024-fac19615b5da"
Apr 20 20:06:06.033856 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:06.033628 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5266cc03-b601-4ec1-a024-fac19615b5da-kubelet-config\") pod \"global-pull-secret-syncer-kk6nn\" (UID: \"5266cc03-b601-4ec1-a024-fac19615b5da\") " pod="kube-system/global-pull-secret-syncer-kk6nn"
Apr 20 20:06:06.033856 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:06.033711 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5266cc03-b601-4ec1-a024-fac19615b5da-original-pull-secret\") pod \"global-pull-secret-syncer-kk6nn\" (UID: \"5266cc03-b601-4ec1-a024-fac19615b5da\") " pod="kube-system/global-pull-secret-syncer-kk6nn"
Apr 20 20:06:06.033856 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:06.033755 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5266cc03-b601-4ec1-a024-fac19615b5da-dbus\") pod \"global-pull-secret-syncer-kk6nn\" (UID: \"5266cc03-b601-4ec1-a024-fac19615b5da\") " pod="kube-system/global-pull-secret-syncer-kk6nn"
Apr 20 20:06:06.134257 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:06.134213 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5266cc03-b601-4ec1-a024-fac19615b5da-kubelet-config\") pod \"global-pull-secret-syncer-kk6nn\" (UID: \"5266cc03-b601-4ec1-a024-fac19615b5da\") " pod="kube-system/global-pull-secret-syncer-kk6nn"
Apr 20 20:06:06.134440 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:06.134293 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5266cc03-b601-4ec1-a024-fac19615b5da-original-pull-secret\") pod \"global-pull-secret-syncer-kk6nn\" (UID: \"5266cc03-b601-4ec1-a024-fac19615b5da\") " pod="kube-system/global-pull-secret-syncer-kk6nn"
Apr 20 20:06:06.134440 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:06.134344 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5266cc03-b601-4ec1-a024-fac19615b5da-dbus\") pod \"global-pull-secret-syncer-kk6nn\" (UID: \"5266cc03-b601-4ec1-a024-fac19615b5da\") " pod="kube-system/global-pull-secret-syncer-kk6nn"
Apr 20 20:06:06.134608 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:06.134586 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5266cc03-b601-4ec1-a024-fac19615b5da-dbus\") pod \"global-pull-secret-syncer-kk6nn\" (UID: \"5266cc03-b601-4ec1-a024-fac19615b5da\") " pod="kube-system/global-pull-secret-syncer-kk6nn"
Apr 20 20:06:06.134679 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:06.134663 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5266cc03-b601-4ec1-a024-fac19615b5da-kubelet-config\") pod \"global-pull-secret-syncer-kk6nn\" (UID: \"5266cc03-b601-4ec1-a024-fac19615b5da\") " pod="kube-system/global-pull-secret-syncer-kk6nn"
Apr 20 20:06:06.134785 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:06.134769 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 20:06:06.134850 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:06.134836 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5266cc03-b601-4ec1-a024-fac19615b5da-original-pull-secret podName:5266cc03-b601-4ec1-a024-fac19615b5da nodeName:}" failed. No retries permitted until 2026-04-20 20:06:06.634816155 +0000 UTC m=+7.674305697 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5266cc03-b601-4ec1-a024-fac19615b5da-original-pull-secret") pod "global-pull-secret-syncer-kk6nn" (UID: "5266cc03-b601-4ec1-a024-fac19615b5da") : object "kube-system"/"original-pull-secret" not registered
Apr 20 20:06:06.524329 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:06.523677 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9bjr7"
Apr 20 20:06:06.524329 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:06.523795 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9bjr7" podUID="800b4dad-a669-433c-8963-4c9f630913b5"
Apr 20 20:06:06.524329 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:06.524183 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:06:06.524329 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:06.524273 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9sbrz" podUID="012dcd86-26f0-4115-bd86-d5066c900541"
Apr 20 20:06:06.638861 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:06.638823 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5266cc03-b601-4ec1-a024-fac19615b5da-original-pull-secret\") pod \"global-pull-secret-syncer-kk6nn\" (UID: \"5266cc03-b601-4ec1-a024-fac19615b5da\") " pod="kube-system/global-pull-secret-syncer-kk6nn"
Apr 20 20:06:06.639033 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:06.638994 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 20:06:06.639086 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:06.639061 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5266cc03-b601-4ec1-a024-fac19615b5da-original-pull-secret podName:5266cc03-b601-4ec1-a024-fac19615b5da nodeName:}" failed. No retries permitted until 2026-04-20 20:06:07.639045342 +0000 UTC m=+8.678534889 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5266cc03-b601-4ec1-a024-fac19615b5da-original-pull-secret") pod "global-pull-secret-syncer-kk6nn" (UID: "5266cc03-b601-4ec1-a024-fac19615b5da") : object "kube-system"/"original-pull-secret" not registered
Apr 20 20:06:07.526592 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:07.526562 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kk6nn"
Apr 20 20:06:07.527035 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:07.526683 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kk6nn" podUID="5266cc03-b601-4ec1-a024-fac19615b5da"
Apr 20 20:06:07.648379 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:07.648343 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5266cc03-b601-4ec1-a024-fac19615b5da-original-pull-secret\") pod \"global-pull-secret-syncer-kk6nn\" (UID: \"5266cc03-b601-4ec1-a024-fac19615b5da\") " pod="kube-system/global-pull-secret-syncer-kk6nn"
Apr 20 20:06:07.648611 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:07.648508 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 20:06:07.648611 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:07.648559 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5266cc03-b601-4ec1-a024-fac19615b5da-original-pull-secret podName:5266cc03-b601-4ec1-a024-fac19615b5da nodeName:}" failed. No retries permitted until 2026-04-20 20:06:09.648545775 +0000 UTC m=+10.688035300 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5266cc03-b601-4ec1-a024-fac19615b5da-original-pull-secret") pod "global-pull-secret-syncer-kk6nn" (UID: "5266cc03-b601-4ec1-a024-fac19615b5da") : object "kube-system"/"original-pull-secret" not registered
Apr 20 20:06:08.152076 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:08.152039 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs\") pod \"network-metrics-daemon-9sbrz\" (UID: \"012dcd86-26f0-4115-bd86-d5066c900541\") " pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:06:08.152354 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:08.152180 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:08.152354 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:08.152245 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs podName:012dcd86-26f0-4115-bd86-d5066c900541 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:16.152226496 +0000 UTC m=+17.191716025 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs") pod "network-metrics-daemon-9sbrz" (UID: "012dcd86-26f0-4115-bd86-d5066c900541") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:08.253382 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:08.253330 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzxcj\" (UniqueName: \"kubernetes.io/projected/800b4dad-a669-433c-8963-4c9f630913b5-kube-api-access-rzxcj\") pod \"network-check-target-9bjr7\" (UID: \"800b4dad-a669-433c-8963-4c9f630913b5\") " pod="openshift-network-diagnostics/network-check-target-9bjr7"
Apr 20 20:06:08.253579 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:08.253549 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 20:06:08.253579 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:08.253568 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 20:06:08.253692 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:08.253582 2571 projected.go:194] Error preparing data for projected volume kube-api-access-rzxcj for pod openshift-network-diagnostics/network-check-target-9bjr7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:08.253692 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:08.253637 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/800b4dad-a669-433c-8963-4c9f630913b5-kube-api-access-rzxcj podName:800b4dad-a669-433c-8963-4c9f630913b5 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:16.25361927 +0000 UTC m=+17.293108799 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-rzxcj" (UniqueName: "kubernetes.io/projected/800b4dad-a669-433c-8963-4c9f630913b5-kube-api-access-rzxcj") pod "network-check-target-9bjr7" (UID: "800b4dad-a669-433c-8963-4c9f630913b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:08.523736 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:08.523679 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9bjr7"
Apr 20 20:06:08.523968 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:08.523825 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9bjr7" podUID="800b4dad-a669-433c-8963-4c9f630913b5"
Apr 20 20:06:08.524175 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:08.523681 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:06:08.524301 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:08.524278 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9sbrz" podUID="012dcd86-26f0-4115-bd86-d5066c900541"
Apr 20 20:06:09.523584 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:09.523559 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kk6nn"
Apr 20 20:06:09.524012 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:09.523650 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kk6nn" podUID="5266cc03-b601-4ec1-a024-fac19615b5da"
Apr 20 20:06:09.664525 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:09.664488 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5266cc03-b601-4ec1-a024-fac19615b5da-original-pull-secret\") pod \"global-pull-secret-syncer-kk6nn\" (UID: \"5266cc03-b601-4ec1-a024-fac19615b5da\") " pod="kube-system/global-pull-secret-syncer-kk6nn"
Apr 20 20:06:09.664709 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:09.664633 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 20:06:09.664709 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:09.664691 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5266cc03-b601-4ec1-a024-fac19615b5da-original-pull-secret podName:5266cc03-b601-4ec1-a024-fac19615b5da nodeName:}" failed. No retries permitted until 2026-04-20 20:06:13.664672914 +0000 UTC m=+14.704162442 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5266cc03-b601-4ec1-a024-fac19615b5da-original-pull-secret") pod "global-pull-secret-syncer-kk6nn" (UID: "5266cc03-b601-4ec1-a024-fac19615b5da") : object "kube-system"/"original-pull-secret" not registered
Apr 20 20:06:10.523381 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:10.523342 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9bjr7"
Apr 20 20:06:10.523568 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:10.523349 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:06:10.523568 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:10.523492 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9bjr7" podUID="800b4dad-a669-433c-8963-4c9f630913b5"
Apr 20 20:06:10.523568 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:10.523550 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9sbrz" podUID="012dcd86-26f0-4115-bd86-d5066c900541"
Apr 20 20:06:11.523099 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:11.523064 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kk6nn"
Apr 20 20:06:11.523289 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:11.523182 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kk6nn" podUID="5266cc03-b601-4ec1-a024-fac19615b5da"
Apr 20 20:06:12.523594 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:12.523546 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:06:12.524019 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:12.523546 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9bjr7"
Apr 20 20:06:12.524019 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:12.523698 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9sbrz" podUID="012dcd86-26f0-4115-bd86-d5066c900541"
Apr 20 20:06:12.524019 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:12.523758 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9bjr7" podUID="800b4dad-a669-433c-8963-4c9f630913b5"
Apr 20 20:06:13.522949 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:13.522913 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kk6nn"
Apr 20 20:06:13.523134 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:13.523045 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kk6nn" podUID="5266cc03-b601-4ec1-a024-fac19615b5da"
Apr 20 20:06:13.693213 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:13.693176 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5266cc03-b601-4ec1-a024-fac19615b5da-original-pull-secret\") pod \"global-pull-secret-syncer-kk6nn\" (UID: \"5266cc03-b601-4ec1-a024-fac19615b5da\") " pod="kube-system/global-pull-secret-syncer-kk6nn"
Apr 20 20:06:13.693627 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:13.693322 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 20:06:13.693627 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:13.693404 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5266cc03-b601-4ec1-a024-fac19615b5da-original-pull-secret podName:5266cc03-b601-4ec1-a024-fac19615b5da nodeName:}" failed. No retries permitted until 2026-04-20 20:06:21.693373033 +0000 UTC m=+22.732862585 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5266cc03-b601-4ec1-a024-fac19615b5da-original-pull-secret") pod "global-pull-secret-syncer-kk6nn" (UID: "5266cc03-b601-4ec1-a024-fac19615b5da") : object "kube-system"/"original-pull-secret" not registered
Apr 20 20:06:14.522758 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:14.522728 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9bjr7"
Apr 20 20:06:14.522758 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:14.522742 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:06:14.522945 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:14.522818 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9bjr7" podUID="800b4dad-a669-433c-8963-4c9f630913b5"
Apr 20 20:06:14.522945 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:14.522926 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9sbrz" podUID="012dcd86-26f0-4115-bd86-d5066c900541"
Apr 20 20:06:15.523362 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:15.523327 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kk6nn"
Apr 20 20:06:15.523832 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:15.523469 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kk6nn" podUID="5266cc03-b601-4ec1-a024-fac19615b5da"
Apr 20 20:06:16.212147 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:16.212116 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs\") pod \"network-metrics-daemon-9sbrz\" (UID: \"012dcd86-26f0-4115-bd86-d5066c900541\") " pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:06:16.212334 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:16.212245 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:16.212334 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:16.212301 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs podName:012dcd86-26f0-4115-bd86-d5066c900541 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:32.212287836 +0000 UTC m=+33.251777360 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs") pod "network-metrics-daemon-9sbrz" (UID: "012dcd86-26f0-4115-bd86-d5066c900541") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:16.312945 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:16.312912 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzxcj\" (UniqueName: \"kubernetes.io/projected/800b4dad-a669-433c-8963-4c9f630913b5-kube-api-access-rzxcj\") pod \"network-check-target-9bjr7\" (UID: \"800b4dad-a669-433c-8963-4c9f630913b5\") " pod="openshift-network-diagnostics/network-check-target-9bjr7"
Apr 20 20:06:16.313141 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:16.313121 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 20:06:16.313206 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:16.313146 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 20:06:16.313206 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:16.313162 2571 projected.go:194] Error preparing data for projected volume kube-api-access-rzxcj for pod openshift-network-diagnostics/network-check-target-9bjr7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:16.313315 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:16.313225 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/800b4dad-a669-433c-8963-4c9f630913b5-kube-api-access-rzxcj podName:800b4dad-a669-433c-8963-4c9f630913b5 nodeName:}" failed.
No retries permitted until 2026-04-20 20:06:32.313204606 +0000 UTC m=+33.352694133 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-rzxcj" (UniqueName: "kubernetes.io/projected/800b4dad-a669-433c-8963-4c9f630913b5-kube-api-access-rzxcj") pod "network-check-target-9bjr7" (UID: "800b4dad-a669-433c-8963-4c9f630913b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:06:16.523259 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:16.523219 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9bjr7" Apr 20 20:06:16.523419 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:16.523233 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9sbrz" Apr 20 20:06:16.523419 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:16.523362 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9bjr7" podUID="800b4dad-a669-433c-8963-4c9f630913b5" Apr 20 20:06:16.523829 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:16.523446 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9sbrz" podUID="012dcd86-26f0-4115-bd86-d5066c900541" Apr 20 20:06:17.523690 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:17.523645 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kk6nn" Apr 20 20:06:17.524097 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:17.523761 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kk6nn" podUID="5266cc03-b601-4ec1-a024-fac19615b5da" Apr 20 20:06:18.523530 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:18.523492 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9bjr7" Apr 20 20:06:18.523720 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:18.523504 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9sbrz" Apr 20 20:06:18.523720 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:18.523608 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9bjr7" podUID="800b4dad-a669-433c-8963-4c9f630913b5" Apr 20 20:06:18.523720 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:18.523698 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9sbrz" podUID="012dcd86-26f0-4115-bd86-d5066c900541" Apr 20 20:06:19.526783 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:19.526760 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kk6nn" Apr 20 20:06:19.527069 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:19.526858 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kk6nn" podUID="5266cc03-b601-4ec1-a024-fac19615b5da" Apr 20 20:06:19.696494 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:19.696050 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-89gqs" event={"ID":"07ec338a-d16f-4d81-9472-f216291c9dba","Type":"ContainerStarted","Data":"3c7133f2d26a239b3edd75ce5efe08e177ece40232270487257d31713390d960"} Apr 20 20:06:19.700020 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:19.699996 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kldp5_997e9539-5288-4af5-92f4-55d8ccefbbf7/ovn-acl-logging/0.log" Apr 20 20:06:19.700362 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:19.700333 2571 generic.go:358] "Generic (PLEG): container finished" podID="997e9539-5288-4af5-92f4-55d8ccefbbf7" containerID="b0cc855ab31d030b3941ea81d2d6aa8eacf2bee5b4af1c07ff53d039bc5cd260" exitCode=1 Apr 20 20:06:19.700543 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:19.700417 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" event={"ID":"997e9539-5288-4af5-92f4-55d8ccefbbf7","Type":"ContainerStarted","Data":"dfd044e9333f93c289028806f1a84fc1cc2d7eb42463f15c8c243c04d17c3a8f"} Apr 20 20:06:19.700604 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:19.700556 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" event={"ID":"997e9539-5288-4af5-92f4-55d8ccefbbf7","Type":"ContainerDied","Data":"b0cc855ab31d030b3941ea81d2d6aa8eacf2bee5b4af1c07ff53d039bc5cd260"} Apr 20 20:06:19.700604 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:19.700571 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" event={"ID":"997e9539-5288-4af5-92f4-55d8ccefbbf7","Type":"ContainerStarted","Data":"e44fe9596bdc635dcab94926416003e11b517d5e6451c389452c8601b44d2a10"} Apr 20 
20:06:19.702930 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:19.702897 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-184.ec2.internal" event={"ID":"d93550a1d0c6abebbeb4587739ed181c","Type":"ContainerStarted","Data":"5f151fb7258b636b0a89216320cdd55fde4d2a6ff0a8fefd90cda90201d78157"} Apr 20 20:06:19.714112 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:19.714062 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-89gqs" podStartSLOduration=2.274987404 podStartE2EDuration="20.714051889s" podCreationTimestamp="2026-04-20 20:05:59 +0000 UTC" firstStartedPulling="2026-04-20 20:06:00.73661194 +0000 UTC m=+1.776101467" lastFinishedPulling="2026-04-20 20:06:19.175676427 +0000 UTC m=+20.215165952" observedRunningTime="2026-04-20 20:06:19.7139623 +0000 UTC m=+20.753451845" watchObservedRunningTime="2026-04-20 20:06:19.714051889 +0000 UTC m=+20.753541434" Apr 20 20:06:19.727916 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:19.727866 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-184.ec2.internal" podStartSLOduration=19.727850575 podStartE2EDuration="19.727850575s" podCreationTimestamp="2026-04-20 20:06:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:06:19.727661842 +0000 UTC m=+20.767151388" watchObservedRunningTime="2026-04-20 20:06:19.727850575 +0000 UTC m=+20.767340121" Apr 20 20:06:20.522931 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:20.522902 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9bjr7" Apr 20 20:06:20.523019 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:20.522906 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9sbrz" Apr 20 20:06:20.523084 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:20.523068 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9sbrz" podUID="012dcd86-26f0-4115-bd86-d5066c900541" Apr 20 20:06:20.523118 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:20.522990 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9bjr7" podUID="800b4dad-a669-433c-8963-4c9f630913b5" Apr 20 20:06:20.706530 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:20.706485 2571 generic.go:358] "Generic (PLEG): container finished" podID="4e90b560-013d-4eb3-83bf-d19971d4fd0c" containerID="8f4edd8077aab6a309ce185713d585f85f15eef882f091bd297cb4309b2edb09" exitCode=0 Apr 20 20:06:20.707282 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:20.706581 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5g757" event={"ID":"4e90b560-013d-4eb3-83bf-d19971d4fd0c","Type":"ContainerDied","Data":"8f4edd8077aab6a309ce185713d585f85f15eef882f091bd297cb4309b2edb09"} Apr 20 20:06:20.707955 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:20.707932 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5" 
event={"ID":"7774b4c6-2299-4070-b786-73a21e70389b","Type":"ContainerStarted","Data":"c548edd59230e2bdbd519cae80c2aae8bc866472062bd346559d4ffc4e781a34"} Apr 20 20:06:20.709844 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:20.709704 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mckzb" event={"ID":"37568615-5b75-4d85-aad5-7bfdbb676856","Type":"ContainerStarted","Data":"660a8e2f98e54dac8c96be700ae8d2129ecfb5eb576cd80da9565e585749a2c3"} Apr 20 20:06:20.711490 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:20.711458 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7bzrm" event={"ID":"e8fae2ab-f747-4b27-b9a3-55be9806fb45","Type":"ContainerStarted","Data":"c3f7b844fb80ec245658ffc13215dc4a58c4d837f8b78c53535208f0ec2d4b63"} Apr 20 20:06:20.714200 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:20.714184 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kldp5_997e9539-5288-4af5-92f4-55d8ccefbbf7/ovn-acl-logging/0.log" Apr 20 20:06:20.714496 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:20.714459 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" event={"ID":"997e9539-5288-4af5-92f4-55d8ccefbbf7","Type":"ContainerStarted","Data":"cc351250549f41c3ba288958e43646fb7911b8510308619d5eab8b47f46764e1"} Apr 20 20:06:20.714566 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:20.714504 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" event={"ID":"997e9539-5288-4af5-92f4-55d8ccefbbf7","Type":"ContainerStarted","Data":"ab74e9b033cdb502c1bbb237281a03277080030b2dcad14f21955182a7688340"} Apr 20 20:06:20.714566 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:20.714526 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" 
event={"ID":"997e9539-5288-4af5-92f4-55d8ccefbbf7","Type":"ContainerStarted","Data":"9b16d2dc4a636424e1531f3cca31ed82d5c13efa5c5591a2ab2768080aa7ad09"} Apr 20 20:06:20.715715 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:20.715695 2571 generic.go:358] "Generic (PLEG): container finished" podID="0fdd1067d66eaa5bc876fed7618c0cd3" containerID="0cf44b27be3aa9566ee89938dca5ef21295844e0b259dee9d3385a17b686228e" exitCode=0 Apr 20 20:06:20.715772 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:20.715749 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-184.ec2.internal" event={"ID":"0fdd1067d66eaa5bc876fed7618c0cd3","Type":"ContainerDied","Data":"0cf44b27be3aa9566ee89938dca5ef21295844e0b259dee9d3385a17b686228e"} Apr 20 20:06:20.716997 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:20.716972 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rm9m6" event={"ID":"2940fed8-94a7-4975-8584-3fcd4e6a7933","Type":"ContainerStarted","Data":"5410ecc0652a45ec053398b68d7b9c3343ce211edc2fb7d01561161c91221514"} Apr 20 20:06:20.722649 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:20.722626 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-24vt9" event={"ID":"7cc9261c-1baf-4d71-aae3-b734d559681b","Type":"ContainerStarted","Data":"bc625708bbecc2142f7fd13be3ba673fc0565902d813969ba67e2fb216b319ac"} Apr 20 20:06:20.723930 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:20.723884 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ntdcl" event={"ID":"43405a48-098c-49ef-95e3-3544654522ad","Type":"ContainerStarted","Data":"38af4fc24f22455c83fb9f4d308306cac344e638745d007d9e72f3358e4d8615"} Apr 20 20:06:20.746806 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:20.746738 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ntdcl" 
podStartSLOduration=3.355886347 podStartE2EDuration="21.746727519s" podCreationTimestamp="2026-04-20 20:05:59 +0000 UTC" firstStartedPulling="2026-04-20 20:06:00.782404753 +0000 UTC m=+1.821894276" lastFinishedPulling="2026-04-20 20:06:19.173245915 +0000 UTC m=+20.212735448" observedRunningTime="2026-04-20 20:06:20.746665993 +0000 UTC m=+21.786155539" watchObservedRunningTime="2026-04-20 20:06:20.746727519 +0000 UTC m=+21.786217064" Apr 20 20:06:20.764508 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:20.764462 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-24vt9" podStartSLOduration=2.908179879 podStartE2EDuration="21.764450608s" podCreationTimestamp="2026-04-20 20:05:59 +0000 UTC" firstStartedPulling="2026-04-20 20:06:00.79330948 +0000 UTC m=+1.832799005" lastFinishedPulling="2026-04-20 20:06:19.649580202 +0000 UTC m=+20.689069734" observedRunningTime="2026-04-20 20:06:20.764198148 +0000 UTC m=+21.803687693" watchObservedRunningTime="2026-04-20 20:06:20.764450608 +0000 UTC m=+21.803940158" Apr 20 20:06:20.778359 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:20.778316 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-mckzb" podStartSLOduration=3.42258972 podStartE2EDuration="21.778305294s" podCreationTimestamp="2026-04-20 20:05:59 +0000 UTC" firstStartedPulling="2026-04-20 20:06:00.817670865 +0000 UTC m=+1.857160394" lastFinishedPulling="2026-04-20 20:06:19.173386443 +0000 UTC m=+20.212875968" observedRunningTime="2026-04-20 20:06:20.777790649 +0000 UTC m=+21.817280196" watchObservedRunningTime="2026-04-20 20:06:20.778305294 +0000 UTC m=+21.817794839" Apr 20 20:06:20.807870 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:20.807825 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7bzrm" podStartSLOduration=3.393342525 podStartE2EDuration="21.807813915s" podCreationTimestamp="2026-04-20 
20:05:59 +0000 UTC" firstStartedPulling="2026-04-20 20:06:00.75891263 +0000 UTC m=+1.798402156" lastFinishedPulling="2026-04-20 20:06:19.173384007 +0000 UTC m=+20.212873546" observedRunningTime="2026-04-20 20:06:20.807677535 +0000 UTC m=+21.847167083" watchObservedRunningTime="2026-04-20 20:06:20.807813915 +0000 UTC m=+21.847303461" Apr 20 20:06:20.823465 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:20.823403 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rm9m6" podStartSLOduration=3.498519713 podStartE2EDuration="21.823388506s" podCreationTimestamp="2026-04-20 20:05:59 +0000 UTC" firstStartedPulling="2026-04-20 20:06:00.849762406 +0000 UTC m=+1.889251934" lastFinishedPulling="2026-04-20 20:06:19.174631195 +0000 UTC m=+20.214120727" observedRunningTime="2026-04-20 20:06:20.822796715 +0000 UTC m=+21.862286260" watchObservedRunningTime="2026-04-20 20:06:20.823388506 +0000 UTC m=+21.862878054" Apr 20 20:06:21.080655 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:21.080504 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 20:06:21.455899 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:21.455745 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T20:06:21.080650793Z","UUID":"c66d1f54-bf02-42af-a6e8-28fb62afe206","Handler":null,"Name":"","Endpoint":""} Apr 20 20:06:21.457773 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:21.457746 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 20:06:21.457773 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:21.457776 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin 
with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 20:06:21.526792 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:21.526759 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kk6nn" Apr 20 20:06:21.526955 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:21.526868 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kk6nn" podUID="5266cc03-b601-4ec1-a024-fac19615b5da" Apr 20 20:06:21.727384 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:21.727306 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5" event={"ID":"7774b4c6-2299-4070-b786-73a21e70389b","Type":"ContainerStarted","Data":"90f2e58389178c63169243756ee300b65c782fae9e84ab86f1abcd1e975cf634"} Apr 20 20:06:21.729249 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:21.729189 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-184.ec2.internal" event={"ID":"0fdd1067d66eaa5bc876fed7618c0cd3","Type":"ContainerStarted","Data":"c0cddd67ccc8711ed7a84533f006923816d3aa9627ab8439d726028e31fd5fc4"} Apr 20 20:06:21.752855 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:21.752821 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5266cc03-b601-4ec1-a024-fac19615b5da-original-pull-secret\") pod \"global-pull-secret-syncer-kk6nn\" (UID: \"5266cc03-b601-4ec1-a024-fac19615b5da\") " pod="kube-system/global-pull-secret-syncer-kk6nn" Apr 20 20:06:21.752991 ip-10-0-135-184 kubenswrapper[2571]: E0420 
20:06:21.752973 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:06:21.753177 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:21.753036 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5266cc03-b601-4ec1-a024-fac19615b5da-original-pull-secret podName:5266cc03-b601-4ec1-a024-fac19615b5da nodeName:}" failed. No retries permitted until 2026-04-20 20:06:37.753019896 +0000 UTC m=+38.792509423 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5266cc03-b601-4ec1-a024-fac19615b5da-original-pull-secret") pod "global-pull-secret-syncer-kk6nn" (UID: "5266cc03-b601-4ec1-a024-fac19615b5da") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:06:22.523237 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:22.523202 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9sbrz" Apr 20 20:06:22.523418 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:22.523329 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9sbrz" podUID="012dcd86-26f0-4115-bd86-d5066c900541" Apr 20 20:06:22.523418 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:22.523361 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9bjr7" Apr 20 20:06:22.523568 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:22.523438 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9bjr7" podUID="800b4dad-a669-433c-8963-4c9f630913b5" Apr 20 20:06:22.537155 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:22.537123 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-mckzb" Apr 20 20:06:22.537805 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:22.537782 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-mckzb" Apr 20 20:06:22.551437 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:22.551388 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-184.ec2.internal" podStartSLOduration=22.551370039 podStartE2EDuration="22.551370039s" podCreationTimestamp="2026-04-20 20:06:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:06:21.74364197 +0000 UTC m=+22.783131514" watchObservedRunningTime="2026-04-20 20:06:22.551370039 +0000 UTC m=+23.590859566" Apr 20 20:06:22.733782 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:22.733754 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kldp5_997e9539-5288-4af5-92f4-55d8ccefbbf7/ovn-acl-logging/0.log" Apr 20 20:06:22.734603 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:22.734569 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" event={"ID":"997e9539-5288-4af5-92f4-55d8ccefbbf7","Type":"ContainerStarted","Data":"5bb639682440823d546d9ac5ebdf215ed741c4010cfbcc5f0ddb5f8c7fd9318b"} Apr 20 20:06:22.736581 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:22.736556 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5" event={"ID":"7774b4c6-2299-4070-b786-73a21e70389b","Type":"ContainerStarted","Data":"17b59385257144370d33ebd52f59e50ce3944b44e96532eb40ffb436b9faf23c"} Apr 20 20:06:22.737134 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:22.737103 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-mckzb" Apr 20 20:06:22.737449 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:22.737414 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-mckzb" Apr 20 20:06:22.756283 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:22.756234 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdlm5" podStartSLOduration=2.6546591040000003 podStartE2EDuration="23.756223444s" podCreationTimestamp="2026-04-20 20:05:59 +0000 UTC" firstStartedPulling="2026-04-20 20:06:00.821954301 +0000 UTC m=+1.861443825" lastFinishedPulling="2026-04-20 20:06:21.923518627 +0000 UTC m=+22.963008165" observedRunningTime="2026-04-20 20:06:22.755907685 +0000 UTC m=+23.795397233" watchObservedRunningTime="2026-04-20 20:06:22.756223444 +0000 UTC m=+23.795712991" Apr 20 20:06:23.522880 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:23.522843 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-kk6nn"
Apr 20 20:06:23.523055 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:23.522992 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kk6nn" podUID="5266cc03-b601-4ec1-a024-fac19615b5da"
Apr 20 20:06:24.523514 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:24.523481 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:06:24.524065 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:24.523482 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9bjr7"
Apr 20 20:06:24.524065 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:24.523600 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9sbrz" podUID="012dcd86-26f0-4115-bd86-d5066c900541"
Apr 20 20:06:24.524065 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:24.523679 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9bjr7" podUID="800b4dad-a669-433c-8963-4c9f630913b5"
Apr 20 20:06:25.523094 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:25.522919 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kk6nn"
Apr 20 20:06:25.523251 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:25.523188 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kk6nn" podUID="5266cc03-b601-4ec1-a024-fac19615b5da"
Apr 20 20:06:25.743701 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:25.743669 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kldp5_997e9539-5288-4af5-92f4-55d8ccefbbf7/ovn-acl-logging/0.log"
Apr 20 20:06:25.744314 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:25.743976 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" event={"ID":"997e9539-5288-4af5-92f4-55d8ccefbbf7","Type":"ContainerStarted","Data":"b5fa441231ce85f100c4eb0dc1a2851170ecf781e0559d6953a0d11f0c3c4189"}
Apr 20 20:06:25.744314 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:25.744250 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:25.744314 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:25.744275 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:25.744494 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:25.744462 2571 scope.go:117] "RemoveContainer" containerID="b0cc855ab31d030b3941ea81d2d6aa8eacf2bee5b4af1c07ff53d039bc5cd260"
Apr 20 20:06:25.745707 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:25.745686 2571 generic.go:358] "Generic (PLEG): container finished" podID="4e90b560-013d-4eb3-83bf-d19971d4fd0c" containerID="9ae19cb8aa05c6dcb48d3af2daa5675b8069aa1b5f3159939422f6776029cab2" exitCode=0
Apr 20 20:06:25.745795 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:25.745728 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5g757" event={"ID":"4e90b560-013d-4eb3-83bf-d19971d4fd0c","Type":"ContainerDied","Data":"9ae19cb8aa05c6dcb48d3af2daa5675b8069aa1b5f3159939422f6776029cab2"}
Apr 20 20:06:25.760917 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:25.760892 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:25.761054 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:25.761025 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:26.523635 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:26.523479 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:06:26.523750 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:26.523479 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9bjr7"
Apr 20 20:06:26.523807 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:26.523750 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9sbrz" podUID="012dcd86-26f0-4115-bd86-d5066c900541"
Apr 20 20:06:26.523871 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:26.523847 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9bjr7" podUID="800b4dad-a669-433c-8963-4c9f630913b5"
Apr 20 20:06:26.753133 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:26.753109 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kldp5_997e9539-5288-4af5-92f4-55d8ccefbbf7/ovn-acl-logging/0.log"
Apr 20 20:06:26.753715 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:26.753519 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" event={"ID":"997e9539-5288-4af5-92f4-55d8ccefbbf7","Type":"ContainerStarted","Data":"4a12631bd709eb42b3c9a0fd76b4e4d864c9fcb6624742119728e95d2f8f4ef0"}
Apr 20 20:06:26.753715 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:26.753708 2571 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 20 20:06:26.754896 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:26.754878 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9sbrz"]
Apr 20 20:06:26.755495 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:26.755476 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kk6nn"]
Apr 20 20:06:26.755600 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:26.755559 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kk6nn"
Apr 20 20:06:26.755664 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:26.755647 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kk6nn" podUID="5266cc03-b601-4ec1-a024-fac19615b5da"
Apr 20 20:06:26.756899 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:26.756882 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9bjr7"]
Apr 20 20:06:26.756998 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:26.756904 2571 generic.go:358] "Generic (PLEG): container finished" podID="4e90b560-013d-4eb3-83bf-d19971d4fd0c" containerID="e3317a0988150b85e40bcf67aa82c82f6cc8ad7479a265d26aab2ba2e749f8de" exitCode=0
Apr 20 20:06:26.756998 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:26.756935 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5g757" event={"ID":"4e90b560-013d-4eb3-83bf-d19971d4fd0c","Type":"ContainerDied","Data":"e3317a0988150b85e40bcf67aa82c82f6cc8ad7479a265d26aab2ba2e749f8de"}
Apr 20 20:06:26.756998 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:26.756983 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:06:26.757096 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:26.757000 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9bjr7"
Apr 20 20:06:26.757096 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:26.757073 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9sbrz" podUID="012dcd86-26f0-4115-bd86-d5066c900541"
Apr 20 20:06:26.757216 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:26.757167 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9bjr7" podUID="800b4dad-a669-433c-8963-4c9f630913b5"
Apr 20 20:06:26.782394 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:26.782341 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" podStartSLOduration=9.199074761 podStartE2EDuration="27.782327558s" podCreationTimestamp="2026-04-20 20:05:59 +0000 UTC" firstStartedPulling="2026-04-20 20:06:00.716925034 +0000 UTC m=+1.756414559" lastFinishedPulling="2026-04-20 20:06:19.300177831 +0000 UTC m=+20.339667356" observedRunningTime="2026-04-20 20:06:26.782051971 +0000 UTC m=+27.821541517" watchObservedRunningTime="2026-04-20 20:06:26.782327558 +0000 UTC m=+27.821817101"
Apr 20 20:06:27.760587 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:27.760558 2571 generic.go:358] "Generic (PLEG): container finished" podID="4e90b560-013d-4eb3-83bf-d19971d4fd0c" containerID="172299d04845dfa60fb3ffbf6bfc822ff4b6e6eb1fa3d1cc0d1a49bbc1d972c6" exitCode=0
Apr 20 20:06:27.761007 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:27.760649 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5g757" event={"ID":"4e90b560-013d-4eb3-83bf-d19971d4fd0c","Type":"ContainerDied","Data":"172299d04845dfa60fb3ffbf6bfc822ff4b6e6eb1fa3d1cc0d1a49bbc1d972c6"}
Apr 20 20:06:27.761007 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:27.760843 2571 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 20 20:06:28.522995 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:28.522969 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9bjr7"
Apr 20 20:06:28.523193 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:28.523000 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kk6nn"
Apr 20 20:06:28.523193 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:28.523067 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:06:28.523296 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:28.523187 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kk6nn" podUID="5266cc03-b601-4ec1-a024-fac19615b5da"
Apr 20 20:06:28.523354 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:28.523322 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9bjr7" podUID="800b4dad-a669-433c-8963-4c9f630913b5"
Apr 20 20:06:28.523466 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:28.523442 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9sbrz" podUID="012dcd86-26f0-4115-bd86-d5066c900541"
Apr 20 20:06:30.522883 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:30.522844 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kk6nn"
Apr 20 20:06:30.522883 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:30.522880 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9bjr7"
Apr 20 20:06:30.523553 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:30.522880 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:06:30.523553 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:30.522972 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kk6nn" podUID="5266cc03-b601-4ec1-a024-fac19615b5da"
Apr 20 20:06:30.523553 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:30.523072 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9bjr7" podUID="800b4dad-a669-433c-8963-4c9f630913b5"
Apr 20 20:06:30.523553 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:30.523172 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9sbrz" podUID="012dcd86-26f0-4115-bd86-d5066c900541"
Apr 20 20:06:30.640648 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:30.640609 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kldp5"
Apr 20 20:06:30.640880 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:30.640862 2571 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 20 20:06:30.657062 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:30.656827 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" podUID="997e9539-5288-4af5-92f4-55d8ccefbbf7" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 20 20:06:30.667211 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:30.667146 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" podUID="997e9539-5288-4af5-92f4-55d8ccefbbf7" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 20 20:06:32.230542 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.230509 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs\") pod \"network-metrics-daemon-9sbrz\" (UID: \"012dcd86-26f0-4115-bd86-d5066c900541\") " pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:06:32.230901 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:32.230625 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:32.230901 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:32.230681 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs podName:012dcd86-26f0-4115-bd86-d5066c900541 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:04.230666336 +0000 UTC m=+65.270155860 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs") pod "network-metrics-daemon-9sbrz" (UID: "012dcd86-26f0-4115-bd86-d5066c900541") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:32.274171 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.274140 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-184.ec2.internal" event="NodeReady"
Apr 20 20:06:32.274330 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.274283 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 20 20:06:32.316720 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.316653 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-n8mvh"]
Apr 20 20:06:32.331457 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.330878 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzxcj\" (UniqueName: \"kubernetes.io/projected/800b4dad-a669-433c-8963-4c9f630913b5-kube-api-access-rzxcj\") pod \"network-check-target-9bjr7\" (UID: \"800b4dad-a669-433c-8963-4c9f630913b5\") " pod="openshift-network-diagnostics/network-check-target-9bjr7"
Apr 20 20:06:32.331457 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:32.331046 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 20:06:32.331457 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:32.331064 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 20:06:32.331457 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:32.331076 2571 projected.go:194] Error preparing data for projected volume kube-api-access-rzxcj for pod openshift-network-diagnostics/network-check-target-9bjr7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:32.331457 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:32.331128 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/800b4dad-a669-433c-8963-4c9f630913b5-kube-api-access-rzxcj podName:800b4dad-a669-433c-8963-4c9f630913b5 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:04.331111634 +0000 UTC m=+65.370601174 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-rzxcj" (UniqueName: "kubernetes.io/projected/800b4dad-a669-433c-8963-4c9f630913b5-kube-api-access-rzxcj") pod "network-check-target-9bjr7" (UID: "800b4dad-a669-433c-8963-4c9f630913b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:32.343998 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.343971 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-n74j9"]
Apr 20 20:06:32.344167 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.344147 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-n8mvh"
Apr 20 20:06:32.346629 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.346602 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 20 20:06:32.346762 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.346613 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vhk8x\""
Apr 20 20:06:32.346762 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.346690 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 20 20:06:32.358991 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.358970 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-n8mvh"]
Apr 20 20:06:32.358991 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.358993 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-n74j9"]
Apr 20 20:06:32.359126 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.359075 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-n74j9"
Apr 20 20:06:32.363054 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.361736 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xwtbs\""
Apr 20 20:06:32.363054 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.361736 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 20 20:06:32.363054 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.361737 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 20 20:06:32.363054 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.361736 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 20 20:06:32.432053 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.432020 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh7jd\" (UniqueName: \"kubernetes.io/projected/6b4f78da-4307-441b-8a96-07e157e132e8-kube-api-access-jh7jd\") pod \"ingress-canary-n74j9\" (UID: \"6b4f78da-4307-441b-8a96-07e157e132e8\") " pod="openshift-ingress-canary/ingress-canary-n74j9"
Apr 20 20:06:32.432219 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.432088 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls\") pod \"dns-default-n8mvh\" (UID: \"e50d24aa-c678-44f8-ba98-19a30a72720c\") " pod="openshift-dns/dns-default-n8mvh"
Apr 20 20:06:32.432219 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.432136 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sskms\" (UniqueName: \"kubernetes.io/projected/e50d24aa-c678-44f8-ba98-19a30a72720c-kube-api-access-sskms\") pod \"dns-default-n8mvh\" (UID: \"e50d24aa-c678-44f8-ba98-19a30a72720c\") " pod="openshift-dns/dns-default-n8mvh"
Apr 20 20:06:32.432219 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.432159 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert\") pod \"ingress-canary-n74j9\" (UID: \"6b4f78da-4307-441b-8a96-07e157e132e8\") " pod="openshift-ingress-canary/ingress-canary-n74j9"
Apr 20 20:06:32.432350 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.432230 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e50d24aa-c678-44f8-ba98-19a30a72720c-config-volume\") pod \"dns-default-n8mvh\" (UID: \"e50d24aa-c678-44f8-ba98-19a30a72720c\") " pod="openshift-dns/dns-default-n8mvh"
Apr 20 20:06:32.432350 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.432297 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e50d24aa-c678-44f8-ba98-19a30a72720c-tmp-dir\") pod \"dns-default-n8mvh\" (UID: \"e50d24aa-c678-44f8-ba98-19a30a72720c\") " pod="openshift-dns/dns-default-n8mvh"
Apr 20 20:06:32.523732 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.523691 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kk6nn"
Apr 20 20:06:32.523942 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.523709 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9bjr7"
Apr 20 20:06:32.523942 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.523878 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:06:32.526813 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.526667 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 20:06:32.526813 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.526678 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 20:06:32.526813 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.526680 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-9pkw7\""
Apr 20 20:06:32.526813 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.526756 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 20:06:32.526813 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.526796 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 20:06:32.527122 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.527028 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-4tpsm\""
Apr 20 20:06:32.533155 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.533132 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e50d24aa-c678-44f8-ba98-19a30a72720c-tmp-dir\") pod \"dns-default-n8mvh\" (UID: \"e50d24aa-c678-44f8-ba98-19a30a72720c\") " pod="openshift-dns/dns-default-n8mvh"
Apr 20 20:06:32.533270 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.533166 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jh7jd\" (UniqueName: \"kubernetes.io/projected/6b4f78da-4307-441b-8a96-07e157e132e8-kube-api-access-jh7jd\") pod \"ingress-canary-n74j9\" (UID: \"6b4f78da-4307-441b-8a96-07e157e132e8\") " pod="openshift-ingress-canary/ingress-canary-n74j9"
Apr 20 20:06:32.533270 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.533209 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls\") pod \"dns-default-n8mvh\" (UID: \"e50d24aa-c678-44f8-ba98-19a30a72720c\") " pod="openshift-dns/dns-default-n8mvh"
Apr 20 20:06:32.533270 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.533262 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sskms\" (UniqueName: \"kubernetes.io/projected/e50d24aa-c678-44f8-ba98-19a30a72720c-kube-api-access-sskms\") pod \"dns-default-n8mvh\" (UID: \"e50d24aa-c678-44f8-ba98-19a30a72720c\") " pod="openshift-dns/dns-default-n8mvh"
Apr 20 20:06:32.533455 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.533290 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert\") pod \"ingress-canary-n74j9\" (UID: \"6b4f78da-4307-441b-8a96-07e157e132e8\") " pod="openshift-ingress-canary/ingress-canary-n74j9"
Apr 20 20:06:32.533455 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.533313 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e50d24aa-c678-44f8-ba98-19a30a72720c-config-volume\") pod \"dns-default-n8mvh\" (UID: \"e50d24aa-c678-44f8-ba98-19a30a72720c\") " pod="openshift-dns/dns-default-n8mvh"
Apr 20 20:06:32.533588 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.533565 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e50d24aa-c678-44f8-ba98-19a30a72720c-tmp-dir\") pod \"dns-default-n8mvh\" (UID: \"e50d24aa-c678-44f8-ba98-19a30a72720c\") " pod="openshift-dns/dns-default-n8mvh"
Apr 20 20:06:32.533695 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:32.533659 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:06:32.533759 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:32.533715 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls podName:e50d24aa-c678-44f8-ba98-19a30a72720c nodeName:}" failed. No retries permitted until 2026-04-20 20:06:33.033697169 +0000 UTC m=+34.073186709 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls") pod "dns-default-n8mvh" (UID: "e50d24aa-c678-44f8-ba98-19a30a72720c") : secret "dns-default-metrics-tls" not found
Apr 20 20:06:32.533826 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:32.533804 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:06:32.533922 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:32.533907 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert podName:6b4f78da-4307-441b-8a96-07e157e132e8 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:33.033890518 +0000 UTC m=+34.073380048 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert") pod "ingress-canary-n74j9" (UID: "6b4f78da-4307-441b-8a96-07e157e132e8") : secret "canary-serving-cert" not found
Apr 20 20:06:32.534015 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.533955 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e50d24aa-c678-44f8-ba98-19a30a72720c-config-volume\") pod \"dns-default-n8mvh\" (UID: \"e50d24aa-c678-44f8-ba98-19a30a72720c\") " pod="openshift-dns/dns-default-n8mvh"
Apr 20 20:06:32.544453 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.544411 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sskms\" (UniqueName: \"kubernetes.io/projected/e50d24aa-c678-44f8-ba98-19a30a72720c-kube-api-access-sskms\") pod \"dns-default-n8mvh\" (UID: \"e50d24aa-c678-44f8-ba98-19a30a72720c\") " pod="openshift-dns/dns-default-n8mvh"
Apr 20 20:06:32.544820 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:32.544796 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh7jd\" (UniqueName: \"kubernetes.io/projected/6b4f78da-4307-441b-8a96-07e157e132e8-kube-api-access-jh7jd\") pod \"ingress-canary-n74j9\" (UID: \"6b4f78da-4307-441b-8a96-07e157e132e8\") " pod="openshift-ingress-canary/ingress-canary-n74j9"
Apr 20 20:06:33.037563 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:33.037519 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls\") pod \"dns-default-n8mvh\" (UID: \"e50d24aa-c678-44f8-ba98-19a30a72720c\") " pod="openshift-dns/dns-default-n8mvh"
Apr 20 20:06:33.037777 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:33.037620 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert\") pod \"ingress-canary-n74j9\" (UID: \"6b4f78da-4307-441b-8a96-07e157e132e8\") " pod="openshift-ingress-canary/ingress-canary-n74j9"
Apr 20 20:06:33.037777 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:33.037681 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:06:33.037777 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:33.037726 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:06:33.037777 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:33.037745 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls podName:e50d24aa-c678-44f8-ba98-19a30a72720c nodeName:}" failed. No retries permitted until 2026-04-20 20:06:34.037729479 +0000 UTC m=+35.077219003 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls") pod "dns-default-n8mvh" (UID: "e50d24aa-c678-44f8-ba98-19a30a72720c") : secret "dns-default-metrics-tls" not found
Apr 20 20:06:33.037777 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:33.037777 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert podName:6b4f78da-4307-441b-8a96-07e157e132e8 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:34.037758154 +0000 UTC m=+35.077247683 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert") pod "ingress-canary-n74j9" (UID: "6b4f78da-4307-441b-8a96-07e157e132e8") : secret "canary-serving-cert" not found
Apr 20 20:06:34.044997 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:34.044958 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls\") pod \"dns-default-n8mvh\" (UID: \"e50d24aa-c678-44f8-ba98-19a30a72720c\") " pod="openshift-dns/dns-default-n8mvh"
Apr 20 20:06:34.045461 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:34.045021 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert\") pod \"ingress-canary-n74j9\" (UID: \"6b4f78da-4307-441b-8a96-07e157e132e8\") " pod="openshift-ingress-canary/ingress-canary-n74j9"
Apr 20 20:06:34.045461 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:34.045097 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:06:34.045461 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:34.045163 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls podName:e50d24aa-c678-44f8-ba98-19a30a72720c nodeName:}" failed. No retries permitted until 2026-04-20 20:06:36.04514768 +0000 UTC m=+37.084637205 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls") pod "dns-default-n8mvh" (UID: "e50d24aa-c678-44f8-ba98-19a30a72720c") : secret "dns-default-metrics-tls" not found
Apr 20 20:06:34.045461 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:34.045103 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:06:34.045461 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:34.045216 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert podName:6b4f78da-4307-441b-8a96-07e157e132e8 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:36.045205139 +0000 UTC m=+37.084694663 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert") pod "ingress-canary-n74j9" (UID: "6b4f78da-4307-441b-8a96-07e157e132e8") : secret "canary-serving-cert" not found
Apr 20 20:06:34.780376 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:34.780344 2571 generic.go:358] "Generic (PLEG): container finished" podID="4e90b560-013d-4eb3-83bf-d19971d4fd0c" containerID="e656482aaf92b503ecdcd250de6b5011210f71e8a22c026889a95cfba285b55f" exitCode=0
Apr 20 20:06:34.780583 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:34.780407 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5g757" event={"ID":"4e90b560-013d-4eb3-83bf-d19971d4fd0c","Type":"ContainerDied","Data":"e656482aaf92b503ecdcd250de6b5011210f71e8a22c026889a95cfba285b55f"}
Apr 20 20:06:35.784271 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:35.784244 2571 generic.go:358] "Generic (PLEG): container finished" podID="4e90b560-013d-4eb3-83bf-d19971d4fd0c" containerID="9d9edc66dda691a84a8ab54640a22aa35c39bf4c5e8a02a461bdbff62dd05657" exitCode=0
Apr 20 20:06:35.784622 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:35.784303 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5g757" event={"ID":"4e90b560-013d-4eb3-83bf-d19971d4fd0c","Type":"ContainerDied","Data":"9d9edc66dda691a84a8ab54640a22aa35c39bf4c5e8a02a461bdbff62dd05657"}
Apr 20 20:06:36.060398 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:36.060135 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert\") pod \"ingress-canary-n74j9\" (UID: \"6b4f78da-4307-441b-8a96-07e157e132e8\") " pod="openshift-ingress-canary/ingress-canary-n74j9"
Apr 20 20:06:36.060572 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:36.060282 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:06:36.060572 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:36.060415 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls\") pod \"dns-default-n8mvh\" (UID: \"e50d24aa-c678-44f8-ba98-19a30a72720c\") " pod="openshift-dns/dns-default-n8mvh"
Apr 20 20:06:36.060572 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:36.060495 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert podName:6b4f78da-4307-441b-8a96-07e157e132e8 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:40.060476158 +0000 UTC m=+41.099965683 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert") pod "ingress-canary-n74j9" (UID: "6b4f78da-4307-441b-8a96-07e157e132e8") : secret "canary-serving-cert" not found Apr 20 20:06:36.060572 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:36.060555 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:36.060747 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:36.060608 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls podName:e50d24aa-c678-44f8-ba98-19a30a72720c nodeName:}" failed. No retries permitted until 2026-04-20 20:06:40.060592765 +0000 UTC m=+41.100082309 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls") pod "dns-default-n8mvh" (UID: "e50d24aa-c678-44f8-ba98-19a30a72720c") : secret "dns-default-metrics-tls" not found Apr 20 20:06:36.789242 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:36.789205 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5g757" event={"ID":"4e90b560-013d-4eb3-83bf-d19971d4fd0c","Type":"ContainerStarted","Data":"874dc881b43ea07c10a750b4be59434194739a14839ce43ade0d52377679df91"} Apr 20 20:06:36.814532 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:36.814488 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5g757" podStartSLOduration=5.041135592 podStartE2EDuration="37.814474552s" podCreationTimestamp="2026-04-20 20:05:59 +0000 UTC" firstStartedPulling="2026-04-20 20:06:00.845589842 +0000 UTC m=+1.885079366" lastFinishedPulling="2026-04-20 20:06:33.618928788 +0000 UTC m=+34.658418326" observedRunningTime="2026-04-20 20:06:36.812267676 +0000 UTC 
m=+37.851757223" watchObservedRunningTime="2026-04-20 20:06:36.814474552 +0000 UTC m=+37.853964140" Apr 20 20:06:37.774561 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:37.774523 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5266cc03-b601-4ec1-a024-fac19615b5da-original-pull-secret\") pod \"global-pull-secret-syncer-kk6nn\" (UID: \"5266cc03-b601-4ec1-a024-fac19615b5da\") " pod="kube-system/global-pull-secret-syncer-kk6nn" Apr 20 20:06:37.777927 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:37.777903 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5266cc03-b601-4ec1-a024-fac19615b5da-original-pull-secret\") pod \"global-pull-secret-syncer-kk6nn\" (UID: \"5266cc03-b601-4ec1-a024-fac19615b5da\") " pod="kube-system/global-pull-secret-syncer-kk6nn" Apr 20 20:06:37.934974 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:37.934940 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-kk6nn" Apr 20 20:06:38.099779 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:38.099743 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kk6nn"] Apr 20 20:06:38.104638 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:06:38.104608 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5266cc03_b601_4ec1_a024_fac19615b5da.slice/crio-007166f8e9f2152d59515a425e2a83b65c9e44058c98cf2a0ee283eb2fcae770 WatchSource:0}: Error finding container 007166f8e9f2152d59515a425e2a83b65c9e44058c98cf2a0ee283eb2fcae770: Status 404 returned error can't find the container with id 007166f8e9f2152d59515a425e2a83b65c9e44058c98cf2a0ee283eb2fcae770 Apr 20 20:06:38.794184 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:38.794139 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kk6nn" event={"ID":"5266cc03-b601-4ec1-a024-fac19615b5da","Type":"ContainerStarted","Data":"007166f8e9f2152d59515a425e2a83b65c9e44058c98cf2a0ee283eb2fcae770"} Apr 20 20:06:40.092364 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:40.092325 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls\") pod \"dns-default-n8mvh\" (UID: \"e50d24aa-c678-44f8-ba98-19a30a72720c\") " pod="openshift-dns/dns-default-n8mvh" Apr 20 20:06:40.092806 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:40.092413 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert\") pod \"ingress-canary-n74j9\" (UID: \"6b4f78da-4307-441b-8a96-07e157e132e8\") " pod="openshift-ingress-canary/ingress-canary-n74j9" Apr 20 20:06:40.092806 ip-10-0-135-184 kubenswrapper[2571]: E0420 
20:06:40.092505 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:40.092806 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:40.092514 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:40.092806 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:40.092597 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert podName:6b4f78da-4307-441b-8a96-07e157e132e8 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:48.092576119 +0000 UTC m=+49.132065644 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert") pod "ingress-canary-n74j9" (UID: "6b4f78da-4307-441b-8a96-07e157e132e8") : secret "canary-serving-cert" not found Apr 20 20:06:40.092806 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:40.092616 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls podName:e50d24aa-c678-44f8-ba98-19a30a72720c nodeName:}" failed. No retries permitted until 2026-04-20 20:06:48.092606555 +0000 UTC m=+49.132096078 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls") pod "dns-default-n8mvh" (UID: "e50d24aa-c678-44f8-ba98-19a30a72720c") : secret "dns-default-metrics-tls" not found Apr 20 20:06:42.803645 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:42.803608 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kk6nn" event={"ID":"5266cc03-b601-4ec1-a024-fac19615b5da","Type":"ContainerStarted","Data":"9329f27500db456e5f95addc2a925a18ebe7f2e1e20345ae8a5a049cd9faf577"} Apr 20 20:06:42.818062 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:42.818019 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-kk6nn" podStartSLOduration=33.738727502 podStartE2EDuration="37.818006268s" podCreationTimestamp="2026-04-20 20:06:05 +0000 UTC" firstStartedPulling="2026-04-20 20:06:38.10624827 +0000 UTC m=+39.145737794" lastFinishedPulling="2026-04-20 20:06:42.185527036 +0000 UTC m=+43.225016560" observedRunningTime="2026-04-20 20:06:42.817598101 +0000 UTC m=+43.857087659" watchObservedRunningTime="2026-04-20 20:06:42.818006268 +0000 UTC m=+43.857495814" Apr 20 20:06:48.141619 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:48.141580 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert\") pod \"ingress-canary-n74j9\" (UID: \"6b4f78da-4307-441b-8a96-07e157e132e8\") " pod="openshift-ingress-canary/ingress-canary-n74j9" Apr 20 20:06:48.142051 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:06:48.141650 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls\") pod \"dns-default-n8mvh\" (UID: \"e50d24aa-c678-44f8-ba98-19a30a72720c\") " 
pod="openshift-dns/dns-default-n8mvh" Apr 20 20:06:48.142051 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:48.141731 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:48.142051 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:48.141785 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:48.142051 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:48.141794 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert podName:6b4f78da-4307-441b-8a96-07e157e132e8 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:04.141777083 +0000 UTC m=+65.181266607 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert") pod "ingress-canary-n74j9" (UID: "6b4f78da-4307-441b-8a96-07e157e132e8") : secret "canary-serving-cert" not found Apr 20 20:06:48.142051 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:06:48.141848 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls podName:e50d24aa-c678-44f8-ba98-19a30a72720c nodeName:}" failed. No retries permitted until 2026-04-20 20:07:04.141830231 +0000 UTC m=+65.181319759 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls") pod "dns-default-n8mvh" (UID: "e50d24aa-c678-44f8-ba98-19a30a72720c") : secret "dns-default-metrics-tls" not found Apr 20 20:07:00.667054 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:07:00.667028 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kldp5" Apr 20 20:07:04.152635 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:07:04.152596 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert\") pod \"ingress-canary-n74j9\" (UID: \"6b4f78da-4307-441b-8a96-07e157e132e8\") " pod="openshift-ingress-canary/ingress-canary-n74j9" Apr 20 20:07:04.153008 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:07:04.152675 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls\") pod \"dns-default-n8mvh\" (UID: \"e50d24aa-c678-44f8-ba98-19a30a72720c\") " pod="openshift-dns/dns-default-n8mvh" Apr 20 20:07:04.153008 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:07:04.152751 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:07:04.153008 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:07:04.152771 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:07:04.153008 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:07:04.152834 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls podName:e50d24aa-c678-44f8-ba98-19a30a72720c nodeName:}" failed. 
No retries permitted until 2026-04-20 20:07:36.152814033 +0000 UTC m=+97.192303558 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls") pod "dns-default-n8mvh" (UID: "e50d24aa-c678-44f8-ba98-19a30a72720c") : secret "dns-default-metrics-tls" not found Apr 20 20:07:04.153008 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:07:04.152852 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert podName:6b4f78da-4307-441b-8a96-07e157e132e8 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:36.152843941 +0000 UTC m=+97.192333467 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert") pod "ingress-canary-n74j9" (UID: "6b4f78da-4307-441b-8a96-07e157e132e8") : secret "canary-serving-cert" not found Apr 20 20:07:04.253939 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:07:04.253905 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs\") pod \"network-metrics-daemon-9sbrz\" (UID: \"012dcd86-26f0-4115-bd86-d5066c900541\") " pod="openshift-multus/network-metrics-daemon-9sbrz" Apr 20 20:07:04.256785 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:07:04.256766 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 20:07:04.264262 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:07:04.264240 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 20:07:04.264346 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:07:04.264309 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs podName:012dcd86-26f0-4115-bd86-d5066c900541 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:08.264288882 +0000 UTC m=+129.303778407 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs") pod "network-metrics-daemon-9sbrz" (UID: "012dcd86-26f0-4115-bd86-d5066c900541") : secret "metrics-daemon-secret" not found Apr 20 20:07:04.354743 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:07:04.354714 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzxcj\" (UniqueName: \"kubernetes.io/projected/800b4dad-a669-433c-8963-4c9f630913b5-kube-api-access-rzxcj\") pod \"network-check-target-9bjr7\" (UID: \"800b4dad-a669-433c-8963-4c9f630913b5\") " pod="openshift-network-diagnostics/network-check-target-9bjr7" Apr 20 20:07:04.357654 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:07:04.357637 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 20:07:04.368113 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:07:04.368091 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 20:07:04.378593 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:07:04.378578 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzxcj\" (UniqueName: \"kubernetes.io/projected/800b4dad-a669-433c-8963-4c9f630913b5-kube-api-access-rzxcj\") pod \"network-check-target-9bjr7\" (UID: \"800b4dad-a669-433c-8963-4c9f630913b5\") " pod="openshift-network-diagnostics/network-check-target-9bjr7" Apr 20 20:07:04.645650 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:07:04.645627 2571 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-9pkw7\"" Apr 20 20:07:04.653697 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:07:04.653684 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9bjr7" Apr 20 20:07:04.780639 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:07:04.780610 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9bjr7"] Apr 20 20:07:04.783260 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:07:04.783226 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod800b4dad_a669_433c_8963_4c9f630913b5.slice/crio-70aa4cb3e1ad5566615b7f77c4927d2ed3ef4ee91f4368c104f24724ef741b27 WatchSource:0}: Error finding container 70aa4cb3e1ad5566615b7f77c4927d2ed3ef4ee91f4368c104f24724ef741b27: Status 404 returned error can't find the container with id 70aa4cb3e1ad5566615b7f77c4927d2ed3ef4ee91f4368c104f24724ef741b27 Apr 20 20:07:04.844143 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:07:04.844110 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9bjr7" event={"ID":"800b4dad-a669-433c-8963-4c9f630913b5","Type":"ContainerStarted","Data":"70aa4cb3e1ad5566615b7f77c4927d2ed3ef4ee91f4368c104f24724ef741b27"} Apr 20 20:07:07.852198 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:07:07.852161 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9bjr7" event={"ID":"800b4dad-a669-433c-8963-4c9f630913b5","Type":"ContainerStarted","Data":"cd7fc61b44845f25a3c641bdd7d7f754a61e8afa004bcbf765bcd36a9b5b56f2"} Apr 20 20:07:07.852677 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:07:07.852370 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-9bjr7" Apr 20 
20:07:07.868046 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:07:07.868004 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-9bjr7" podStartSLOduration=66.188770072 podStartE2EDuration="1m8.86799174s" podCreationTimestamp="2026-04-20 20:05:59 +0000 UTC" firstStartedPulling="2026-04-20 20:07:04.785151404 +0000 UTC m=+65.824640929" lastFinishedPulling="2026-04-20 20:07:07.464373071 +0000 UTC m=+68.503862597" observedRunningTime="2026-04-20 20:07:07.867658021 +0000 UTC m=+68.907147568" watchObservedRunningTime="2026-04-20 20:07:07.86799174 +0000 UTC m=+68.907481346" Apr 20 20:07:36.170378 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:07:36.170209 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert\") pod \"ingress-canary-n74j9\" (UID: \"6b4f78da-4307-441b-8a96-07e157e132e8\") " pod="openshift-ingress-canary/ingress-canary-n74j9" Apr 20 20:07:36.170378 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:07:36.170325 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls\") pod \"dns-default-n8mvh\" (UID: \"e50d24aa-c678-44f8-ba98-19a30a72720c\") " pod="openshift-dns/dns-default-n8mvh" Apr 20 20:07:36.170378 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:07:36.170356 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:07:36.170884 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:07:36.170443 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:07:36.170884 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:07:36.170466 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert podName:6b4f78da-4307-441b-8a96-07e157e132e8 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:40.170419546 +0000 UTC m=+161.209932592 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert") pod "ingress-canary-n74j9" (UID: "6b4f78da-4307-441b-8a96-07e157e132e8") : secret "canary-serving-cert" not found Apr 20 20:07:36.170884 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:07:36.170506 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls podName:e50d24aa-c678-44f8-ba98-19a30a72720c nodeName:}" failed. No retries permitted until 2026-04-20 20:08:40.170490476 +0000 UTC m=+161.209980000 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls") pod "dns-default-n8mvh" (UID: "e50d24aa-c678-44f8-ba98-19a30a72720c") : secret "dns-default-metrics-tls" not found Apr 20 20:07:38.856602 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:07:38.856568 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-9bjr7" Apr 20 20:08:08.290814 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:08.290757 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs\") pod \"network-metrics-daemon-9sbrz\" (UID: \"012dcd86-26f0-4115-bd86-d5066c900541\") " pod="openshift-multus/network-metrics-daemon-9sbrz" Apr 20 20:08:08.291241 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:08.290933 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 
20:08:08.291241 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:08.291017 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs podName:012dcd86-26f0-4115-bd86-d5066c900541 nodeName:}" failed. No retries permitted until 2026-04-20 20:10:10.29099966 +0000 UTC m=+251.330489184 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs") pod "network-metrics-daemon-9sbrz" (UID: "012dcd86-26f0-4115-bd86-d5066c900541") : secret "metrics-daemon-secret" not found Apr 20 20:08:09.816517 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:09.816487 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-8ftcl"] Apr 20 20:08:09.819167 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:09.819151 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-8ftcl" Apr 20 20:08:09.821813 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:09.821790 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 20 20:08:09.821962 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:09.821947 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 20:08:09.822016 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:09.821947 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 20:08:09.822073 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:09.821970 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-lmxxr\"" Apr 20 20:08:09.823195 ip-10-0-135-184 kubenswrapper[2571]: I0420 
20:08:09.823180 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 20 20:08:09.828264 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:09.828246 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-8ftcl"] Apr 20 20:08:09.828363 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:09.828333 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 20 20:08:09.899433 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:09.899401 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b604c7c-dbda-486e-9ca5-fd23ee10bc87-service-ca-bundle\") pod \"insights-operator-585dfdc468-8ftcl\" (UID: \"0b604c7c-dbda-486e-9ca5-fd23ee10bc87\") " pod="openshift-insights/insights-operator-585dfdc468-8ftcl" Apr 20 20:08:09.899579 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:09.899482 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b604c7c-dbda-486e-9ca5-fd23ee10bc87-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-8ftcl\" (UID: \"0b604c7c-dbda-486e-9ca5-fd23ee10bc87\") " pod="openshift-insights/insights-operator-585dfdc468-8ftcl" Apr 20 20:08:09.899579 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:09.899506 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwnms\" (UniqueName: \"kubernetes.io/projected/0b604c7c-dbda-486e-9ca5-fd23ee10bc87-kube-api-access-vwnms\") pod \"insights-operator-585dfdc468-8ftcl\" (UID: \"0b604c7c-dbda-486e-9ca5-fd23ee10bc87\") " pod="openshift-insights/insights-operator-585dfdc468-8ftcl" Apr 20 20:08:09.899579 ip-10-0-135-184 kubenswrapper[2571]: I0420 
20:08:09.899525 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0b604c7c-dbda-486e-9ca5-fd23ee10bc87-snapshots\") pod \"insights-operator-585dfdc468-8ftcl\" (UID: \"0b604c7c-dbda-486e-9ca5-fd23ee10bc87\") " pod="openshift-insights/insights-operator-585dfdc468-8ftcl" Apr 20 20:08:09.899579 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:09.899540 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b604c7c-dbda-486e-9ca5-fd23ee10bc87-serving-cert\") pod \"insights-operator-585dfdc468-8ftcl\" (UID: \"0b604c7c-dbda-486e-9ca5-fd23ee10bc87\") " pod="openshift-insights/insights-operator-585dfdc468-8ftcl" Apr 20 20:08:09.899579 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:09.899556 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0b604c7c-dbda-486e-9ca5-fd23ee10bc87-tmp\") pod \"insights-operator-585dfdc468-8ftcl\" (UID: \"0b604c7c-dbda-486e-9ca5-fd23ee10bc87\") " pod="openshift-insights/insights-operator-585dfdc468-8ftcl" Apr 20 20:08:09.919750 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:09.919721 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-6697d777bb-g2vc6"] Apr 20 20:08:09.922451 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:09.922415 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:09.925859 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:09.925842 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 20 20:08:09.926492 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:09.926477 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-2bjnj\"" Apr 20 20:08:09.926555 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:09.926493 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 20 20:08:09.926555 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:09.926483 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 20 20:08:09.926776 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:09.926759 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 20 20:08:09.927015 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:09.926997 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 20:08:09.927604 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:09.927583 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 20:08:09.938384 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:09.938364 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6697d777bb-g2vc6"] Apr 20 20:08:10.000834 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.000809 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/89fd0c53-c978-4418-a79a-46ee5bab209d-service-ca-bundle\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:10.000982 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.000842 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89fd0c53-c978-4418-a79a-46ee5bab209d-metrics-certs\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:10.000982 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.000872 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b604c7c-dbda-486e-9ca5-fd23ee10bc87-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-8ftcl\" (UID: \"0b604c7c-dbda-486e-9ca5-fd23ee10bc87\") " pod="openshift-insights/insights-operator-585dfdc468-8ftcl" Apr 20 20:08:10.000982 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.000895 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/89fd0c53-c978-4418-a79a-46ee5bab209d-stats-auth\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:10.000982 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.000957 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwnms\" (UniqueName: \"kubernetes.io/projected/0b604c7c-dbda-486e-9ca5-fd23ee10bc87-kube-api-access-vwnms\") pod \"insights-operator-585dfdc468-8ftcl\" (UID: \"0b604c7c-dbda-486e-9ca5-fd23ee10bc87\") " pod="openshift-insights/insights-operator-585dfdc468-8ftcl" Apr 20 20:08:10.001114 
ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.001013 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0b604c7c-dbda-486e-9ca5-fd23ee10bc87-snapshots\") pod \"insights-operator-585dfdc468-8ftcl\" (UID: \"0b604c7c-dbda-486e-9ca5-fd23ee10bc87\") " pod="openshift-insights/insights-operator-585dfdc468-8ftcl" Apr 20 20:08:10.001114 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.001044 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b604c7c-dbda-486e-9ca5-fd23ee10bc87-serving-cert\") pod \"insights-operator-585dfdc468-8ftcl\" (UID: \"0b604c7c-dbda-486e-9ca5-fd23ee10bc87\") " pod="openshift-insights/insights-operator-585dfdc468-8ftcl" Apr 20 20:08:10.001114 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.001064 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0b604c7c-dbda-486e-9ca5-fd23ee10bc87-tmp\") pod \"insights-operator-585dfdc468-8ftcl\" (UID: \"0b604c7c-dbda-486e-9ca5-fd23ee10bc87\") " pod="openshift-insights/insights-operator-585dfdc468-8ftcl" Apr 20 20:08:10.001114 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.001097 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/89fd0c53-c978-4418-a79a-46ee5bab209d-default-certificate\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:10.001265 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.001171 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b604c7c-dbda-486e-9ca5-fd23ee10bc87-service-ca-bundle\") pod \"insights-operator-585dfdc468-8ftcl\" 
(UID: \"0b604c7c-dbda-486e-9ca5-fd23ee10bc87\") " pod="openshift-insights/insights-operator-585dfdc468-8ftcl" Apr 20 20:08:10.001377 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.001275 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfjnk\" (UniqueName: \"kubernetes.io/projected/89fd0c53-c978-4418-a79a-46ee5bab209d-kube-api-access-vfjnk\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:10.001635 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.001607 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0b604c7c-dbda-486e-9ca5-fd23ee10bc87-tmp\") pod \"insights-operator-585dfdc468-8ftcl\" (UID: \"0b604c7c-dbda-486e-9ca5-fd23ee10bc87\") " pod="openshift-insights/insights-operator-585dfdc468-8ftcl" Apr 20 20:08:10.001746 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.001662 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0b604c7c-dbda-486e-9ca5-fd23ee10bc87-snapshots\") pod \"insights-operator-585dfdc468-8ftcl\" (UID: \"0b604c7c-dbda-486e-9ca5-fd23ee10bc87\") " pod="openshift-insights/insights-operator-585dfdc468-8ftcl" Apr 20 20:08:10.001790 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.001769 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b604c7c-dbda-486e-9ca5-fd23ee10bc87-service-ca-bundle\") pod \"insights-operator-585dfdc468-8ftcl\" (UID: \"0b604c7c-dbda-486e-9ca5-fd23ee10bc87\") " pod="openshift-insights/insights-operator-585dfdc468-8ftcl" Apr 20 20:08:10.001882 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.001861 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/0b604c7c-dbda-486e-9ca5-fd23ee10bc87-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-8ftcl\" (UID: \"0b604c7c-dbda-486e-9ca5-fd23ee10bc87\") " pod="openshift-insights/insights-operator-585dfdc468-8ftcl" Apr 20 20:08:10.003517 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.003493 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b604c7c-dbda-486e-9ca5-fd23ee10bc87-serving-cert\") pod \"insights-operator-585dfdc468-8ftcl\" (UID: \"0b604c7c-dbda-486e-9ca5-fd23ee10bc87\") " pod="openshift-insights/insights-operator-585dfdc468-8ftcl" Apr 20 20:08:10.020698 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.020669 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwnms\" (UniqueName: \"kubernetes.io/projected/0b604c7c-dbda-486e-9ca5-fd23ee10bc87-kube-api-access-vwnms\") pod \"insights-operator-585dfdc468-8ftcl\" (UID: \"0b604c7c-dbda-486e-9ca5-fd23ee10bc87\") " pod="openshift-insights/insights-operator-585dfdc468-8ftcl" Apr 20 20:08:10.028079 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.028059 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-c6bf74cfb-rqlgt"] Apr 20 20:08:10.031706 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.031631 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.036903 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.036874 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 20:08:10.037384 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.037365 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 20:08:10.037589 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.037570 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dkw75\"" Apr 20 20:08:10.038345 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.038328 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 20:08:10.044595 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.044573 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 20:08:10.049609 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.049589 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-c6bf74cfb-rqlgt"] Apr 20 20:08:10.102182 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.102119 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vfjnk\" (UniqueName: \"kubernetes.io/projected/89fd0c53-c978-4418-a79a-46ee5bab209d-kube-api-access-vfjnk\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:10.102182 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.102149 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/df27caca-27b2-4278-b3e1-119d6af3a947-ca-trust-extracted\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.102324 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.102183 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89fd0c53-c978-4418-a79a-46ee5bab209d-service-ca-bundle\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:10.102324 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.102208 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-bound-sa-token\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.102324 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.102246 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89fd0c53-c978-4418-a79a-46ee5bab209d-metrics-certs\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:10.102324 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.102269 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/89fd0c53-c978-4418-a79a-46ee5bab209d-stats-auth\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:10.102324 
ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.102285 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-registry-tls\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.102558 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:10.102322 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/89fd0c53-c978-4418-a79a-46ee5bab209d-service-ca-bundle podName:89fd0c53-c978-4418-a79a-46ee5bab209d nodeName:}" failed. No retries permitted until 2026-04-20 20:08:10.602300589 +0000 UTC m=+131.641790127 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/89fd0c53-c978-4418-a79a-46ee5bab209d-service-ca-bundle") pod "router-default-6697d777bb-g2vc6" (UID: "89fd0c53-c978-4418-a79a-46ee5bab209d") : configmap references non-existent config key: service-ca.crt Apr 20 20:08:10.102558 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:10.102356 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 20:08:10.102558 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:10.102410 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89fd0c53-c978-4418-a79a-46ee5bab209d-metrics-certs podName:89fd0c53-c978-4418-a79a-46ee5bab209d nodeName:}" failed. No retries permitted until 2026-04-20 20:08:10.602396982 +0000 UTC m=+131.641886510 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89fd0c53-c978-4418-a79a-46ee5bab209d-metrics-certs") pod "router-default-6697d777bb-g2vc6" (UID: "89fd0c53-c978-4418-a79a-46ee5bab209d") : secret "router-metrics-certs-default" not found Apr 20 20:08:10.102558 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.102475 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/df27caca-27b2-4278-b3e1-119d6af3a947-registry-certificates\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.102558 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.102506 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kqsg\" (UniqueName: \"kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-kube-api-access-6kqsg\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.102558 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.102549 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/89fd0c53-c978-4418-a79a-46ee5bab209d-default-certificate\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:10.102831 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.102570 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/df27caca-27b2-4278-b3e1-119d6af3a947-image-registry-private-configuration\") pod 
\"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.102831 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.102587 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df27caca-27b2-4278-b3e1-119d6af3a947-trusted-ca\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.102831 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.102615 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/df27caca-27b2-4278-b3e1-119d6af3a947-installation-pull-secrets\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.104852 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.104834 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/89fd0c53-c978-4418-a79a-46ee5bab209d-stats-auth\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:10.104930 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.104916 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/89fd0c53-c978-4418-a79a-46ee5bab209d-default-certificate\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:10.114856 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.114835 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vfjnk\" (UniqueName: \"kubernetes.io/projected/89fd0c53-c978-4418-a79a-46ee5bab209d-kube-api-access-vfjnk\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:10.128696 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.128664 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-8ftcl" Apr 20 20:08:10.203974 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.203944 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/df27caca-27b2-4278-b3e1-119d6af3a947-registry-certificates\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.204101 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.203983 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kqsg\" (UniqueName: \"kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-kube-api-access-6kqsg\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.204101 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.204024 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/df27caca-27b2-4278-b3e1-119d6af3a947-image-registry-private-configuration\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.204101 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.204047 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df27caca-27b2-4278-b3e1-119d6af3a947-trusted-ca\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.204101 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.204079 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/df27caca-27b2-4278-b3e1-119d6af3a947-installation-pull-secrets\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.204301 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.204137 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/df27caca-27b2-4278-b3e1-119d6af3a947-ca-trust-extracted\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.204301 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.204198 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-bound-sa-token\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.204301 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.204248 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-registry-tls\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " 
pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.204784 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:10.204539 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:08:10.204784 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:10.204559 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c6bf74cfb-rqlgt: secret "image-registry-tls" not found Apr 20 20:08:10.204784 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:10.204616 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-registry-tls podName:df27caca-27b2-4278-b3e1-119d6af3a947 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:10.704595729 +0000 UTC m=+131.744085253 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-registry-tls") pod "image-registry-c6bf74cfb-rqlgt" (UID: "df27caca-27b2-4278-b3e1-119d6af3a947") : secret "image-registry-tls" not found Apr 20 20:08:10.205351 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.205235 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/df27caca-27b2-4278-b3e1-119d6af3a947-ca-trust-extracted\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.205452 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.205378 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/df27caca-27b2-4278-b3e1-119d6af3a947-registry-certificates\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " 
pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.205619 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.205600 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df27caca-27b2-4278-b3e1-119d6af3a947-trusted-ca\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.207568 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.207546 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/df27caca-27b2-4278-b3e1-119d6af3a947-image-registry-private-configuration\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.207648 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.207571 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/df27caca-27b2-4278-b3e1-119d6af3a947-installation-pull-secrets\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.216239 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.216219 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-bound-sa-token\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.216818 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.216799 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kqsg\" (UniqueName: 
\"kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-kube-api-access-6kqsg\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.243706 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.243678 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-8ftcl"] Apr 20 20:08:10.246381 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:08:10.246357 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b604c7c_dbda_486e_9ca5_fd23ee10bc87.slice/crio-c553d943abf08d0fedad56d2ed920602a5a5024f9ced3d007f5419ccb2efdb02 WatchSource:0}: Error finding container c553d943abf08d0fedad56d2ed920602a5a5024f9ced3d007f5419ccb2efdb02: Status 404 returned error can't find the container with id c553d943abf08d0fedad56d2ed920602a5a5024f9ced3d007f5419ccb2efdb02 Apr 20 20:08:10.607778 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.607743 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89fd0c53-c978-4418-a79a-46ee5bab209d-service-ca-bundle\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:10.607964 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.607786 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89fd0c53-c978-4418-a79a-46ee5bab209d-metrics-certs\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:10.607964 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:10.607926 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/89fd0c53-c978-4418-a79a-46ee5bab209d-service-ca-bundle podName:89fd0c53-c978-4418-a79a-46ee5bab209d nodeName:}" failed. No retries permitted until 2026-04-20 20:08:11.607907989 +0000 UTC m=+132.647397536 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/89fd0c53-c978-4418-a79a-46ee5bab209d-service-ca-bundle") pod "router-default-6697d777bb-g2vc6" (UID: "89fd0c53-c978-4418-a79a-46ee5bab209d") : configmap references non-existent config key: service-ca.crt Apr 20 20:08:10.607964 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:10.607926 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 20:08:10.607964 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:10.607965 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89fd0c53-c978-4418-a79a-46ee5bab209d-metrics-certs podName:89fd0c53-c978-4418-a79a-46ee5bab209d nodeName:}" failed. No retries permitted until 2026-04-20 20:08:11.607956156 +0000 UTC m=+132.647445679 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89fd0c53-c978-4418-a79a-46ee5bab209d-metrics-certs") pod "router-default-6697d777bb-g2vc6" (UID: "89fd0c53-c978-4418-a79a-46ee5bab209d") : secret "router-metrics-certs-default" not found Apr 20 20:08:10.708642 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.708609 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-registry-tls\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:10.708804 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:10.708772 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:08:10.708804 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:10.708792 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c6bf74cfb-rqlgt: secret "image-registry-tls" not found Apr 20 20:08:10.708907 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:10.708875 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-registry-tls podName:df27caca-27b2-4278-b3e1-119d6af3a947 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:11.708851267 +0000 UTC m=+132.748340813 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-registry-tls") pod "image-registry-c6bf74cfb-rqlgt" (UID: "df27caca-27b2-4278-b3e1-119d6af3a947") : secret "image-registry-tls" not found Apr 20 20:08:10.975449 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:10.975344 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8ftcl" event={"ID":"0b604c7c-dbda-486e-9ca5-fd23ee10bc87","Type":"ContainerStarted","Data":"c553d943abf08d0fedad56d2ed920602a5a5024f9ced3d007f5419ccb2efdb02"} Apr 20 20:08:11.616364 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:11.616324 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89fd0c53-c978-4418-a79a-46ee5bab209d-service-ca-bundle\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:11.616561 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:11.616380 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89fd0c53-c978-4418-a79a-46ee5bab209d-metrics-certs\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:11.616561 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:11.616535 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/89fd0c53-c978-4418-a79a-46ee5bab209d-service-ca-bundle podName:89fd0c53-c978-4418-a79a-46ee5bab209d nodeName:}" failed. No retries permitted until 2026-04-20 20:08:13.616511068 +0000 UTC m=+134.656000592 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/89fd0c53-c978-4418-a79a-46ee5bab209d-service-ca-bundle") pod "router-default-6697d777bb-g2vc6" (UID: "89fd0c53-c978-4418-a79a-46ee5bab209d") : configmap references non-existent config key: service-ca.crt Apr 20 20:08:11.616561 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:11.616541 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 20:08:11.616720 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:11.616601 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89fd0c53-c978-4418-a79a-46ee5bab209d-metrics-certs podName:89fd0c53-c978-4418-a79a-46ee5bab209d nodeName:}" failed. No retries permitted until 2026-04-20 20:08:13.616585166 +0000 UTC m=+134.656074692 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89fd0c53-c978-4418-a79a-46ee5bab209d-metrics-certs") pod "router-default-6697d777bb-g2vc6" (UID: "89fd0c53-c978-4418-a79a-46ee5bab209d") : secret "router-metrics-certs-default" not found Apr 20 20:08:11.717061 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:11.717025 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-registry-tls\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:11.717252 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:11.717175 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:08:11.717252 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:11.717195 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-c6bf74cfb-rqlgt: secret "image-registry-tls" not found Apr 20 20:08:11.717371 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:11.717263 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-registry-tls podName:df27caca-27b2-4278-b3e1-119d6af3a947 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:13.717242228 +0000 UTC m=+134.756731755 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-registry-tls") pod "image-registry-c6bf74cfb-rqlgt" (UID: "df27caca-27b2-4278-b3e1-119d6af3a947") : secret "image-registry-tls" not found Apr 20 20:08:12.980383 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:12.980343 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8ftcl" event={"ID":"0b604c7c-dbda-486e-9ca5-fd23ee10bc87","Type":"ContainerStarted","Data":"b98e5d6e80a67d8fc5ef2e630a709532f553d959b066ee051b7a053ce6960d19"} Apr 20 20:08:13.630492 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:13.630450 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89fd0c53-c978-4418-a79a-46ee5bab209d-service-ca-bundle\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:13.630492 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:13.630494 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89fd0c53-c978-4418-a79a-46ee5bab209d-metrics-certs\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:13.630764 ip-10-0-135-184 
kubenswrapper[2571]: E0420 20:08:13.630603 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 20:08:13.630764 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:13.630646 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89fd0c53-c978-4418-a79a-46ee5bab209d-metrics-certs podName:89fd0c53-c978-4418-a79a-46ee5bab209d nodeName:}" failed. No retries permitted until 2026-04-20 20:08:17.630633163 +0000 UTC m=+138.670122687 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89fd0c53-c978-4418-a79a-46ee5bab209d-metrics-certs") pod "router-default-6697d777bb-g2vc6" (UID: "89fd0c53-c978-4418-a79a-46ee5bab209d") : secret "router-metrics-certs-default" not found Apr 20 20:08:13.630764 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:13.630659 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/89fd0c53-c978-4418-a79a-46ee5bab209d-service-ca-bundle podName:89fd0c53-c978-4418-a79a-46ee5bab209d nodeName:}" failed. No retries permitted until 2026-04-20 20:08:17.630652848 +0000 UTC m=+138.670142372 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/89fd0c53-c978-4418-a79a-46ee5bab209d-service-ca-bundle") pod "router-default-6697d777bb-g2vc6" (UID: "89fd0c53-c978-4418-a79a-46ee5bab209d") : configmap references non-existent config key: service-ca.crt Apr 20 20:08:13.732003 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:13.731963 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-registry-tls\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:13.732195 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:13.732090 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:08:13.732195 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:13.732107 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c6bf74cfb-rqlgt: secret "image-registry-tls" not found Apr 20 20:08:13.732195 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:13.732171 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-registry-tls podName:df27caca-27b2-4278-b3e1-119d6af3a947 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:17.732153646 +0000 UTC m=+138.771643186 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-registry-tls") pod "image-registry-c6bf74cfb-rqlgt" (UID: "df27caca-27b2-4278-b3e1-119d6af3a947") : secret "image-registry-tls" not found Apr 20 20:08:15.193954 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:15.193920 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7bzrm_e8fae2ab-f747-4b27-b9a3-55be9806fb45/dns-node-resolver/0.log" Apr 20 20:08:16.396576 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:16.396551 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ntdcl_43405a48-098c-49ef-95e3-3544654522ad/node-ca/0.log" Apr 20 20:08:17.074652 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:17.074602 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-8ftcl" podStartSLOduration=6.37488995 podStartE2EDuration="8.074587989s" podCreationTimestamp="2026-04-20 20:08:09 +0000 UTC" firstStartedPulling="2026-04-20 20:08:10.248110097 +0000 UTC m=+131.287599621" lastFinishedPulling="2026-04-20 20:08:11.947808115 +0000 UTC m=+132.987297660" observedRunningTime="2026-04-20 20:08:12.997248176 +0000 UTC m=+134.036737721" watchObservedRunningTime="2026-04-20 20:08:17.074587989 +0000 UTC m=+138.114077546" Apr 20 20:08:17.075279 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:17.075262 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-7wcw2"] Apr 20 20:08:17.078113 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:17.078098 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7wcw2" Apr 20 20:08:17.080621 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:17.080603 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-clpnq\"" Apr 20 20:08:17.081668 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:17.081651 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 20 20:08:17.081764 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:17.081653 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 20 20:08:17.085763 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:17.085734 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-7wcw2"] Apr 20 20:08:17.157574 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:17.157539 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k592w\" (UniqueName: \"kubernetes.io/projected/75cdcb83-e8e3-4d26-90d4-d4278f6d7672-kube-api-access-k592w\") pod \"migrator-74bb7799d9-7wcw2\" (UID: \"75cdcb83-e8e3-4d26-90d4-d4278f6d7672\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7wcw2" Apr 20 20:08:17.258132 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:17.258105 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k592w\" (UniqueName: \"kubernetes.io/projected/75cdcb83-e8e3-4d26-90d4-d4278f6d7672-kube-api-access-k592w\") pod \"migrator-74bb7799d9-7wcw2\" (UID: \"75cdcb83-e8e3-4d26-90d4-d4278f6d7672\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7wcw2" Apr 20 20:08:17.266669 ip-10-0-135-184 
kubenswrapper[2571]: I0420 20:08:17.266649 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k592w\" (UniqueName: \"kubernetes.io/projected/75cdcb83-e8e3-4d26-90d4-d4278f6d7672-kube-api-access-k592w\") pod \"migrator-74bb7799d9-7wcw2\" (UID: \"75cdcb83-e8e3-4d26-90d4-d4278f6d7672\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7wcw2" Apr 20 20:08:17.386816 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:17.386753 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7wcw2" Apr 20 20:08:17.502376 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:17.502351 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-7wcw2"] Apr 20 20:08:17.505147 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:08:17.505121 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75cdcb83_e8e3_4d26_90d4_d4278f6d7672.slice/crio-80fc639d0f7ff8e5aec8e9a88c31271774e07da1c87264f04ad434ac2dc16f29 WatchSource:0}: Error finding container 80fc639d0f7ff8e5aec8e9a88c31271774e07da1c87264f04ad434ac2dc16f29: Status 404 returned error can't find the container with id 80fc639d0f7ff8e5aec8e9a88c31271774e07da1c87264f04ad434ac2dc16f29 Apr 20 20:08:17.662579 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:17.662491 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89fd0c53-c978-4418-a79a-46ee5bab209d-service-ca-bundle\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:17.662579 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:17.662537 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89fd0c53-c978-4418-a79a-46ee5bab209d-metrics-certs\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:17.662803 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:17.662670 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/89fd0c53-c978-4418-a79a-46ee5bab209d-service-ca-bundle podName:89fd0c53-c978-4418-a79a-46ee5bab209d nodeName:}" failed. No retries permitted until 2026-04-20 20:08:25.66265042 +0000 UTC m=+146.702139947 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/89fd0c53-c978-4418-a79a-46ee5bab209d-service-ca-bundle") pod "router-default-6697d777bb-g2vc6" (UID: "89fd0c53-c978-4418-a79a-46ee5bab209d") : configmap references non-existent config key: service-ca.crt Apr 20 20:08:17.662803 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:17.662680 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 20:08:17.662803 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:17.662743 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89fd0c53-c978-4418-a79a-46ee5bab209d-metrics-certs podName:89fd0c53-c978-4418-a79a-46ee5bab209d nodeName:}" failed. No retries permitted until 2026-04-20 20:08:25.662717184 +0000 UTC m=+146.702206724 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89fd0c53-c978-4418-a79a-46ee5bab209d-metrics-certs") pod "router-default-6697d777bb-g2vc6" (UID: "89fd0c53-c978-4418-a79a-46ee5bab209d") : secret "router-metrics-certs-default" not found Apr 20 20:08:17.763139 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:17.763109 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-registry-tls\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:17.763292 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:17.763227 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:08:17.763292 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:17.763239 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c6bf74cfb-rqlgt: secret "image-registry-tls" not found Apr 20 20:08:17.763292 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:17.763286 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-registry-tls podName:df27caca-27b2-4278-b3e1-119d6af3a947 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:25.763273305 +0000 UTC m=+146.802762830 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-registry-tls") pod "image-registry-c6bf74cfb-rqlgt" (UID: "df27caca-27b2-4278-b3e1-119d6af3a947") : secret "image-registry-tls" not found Apr 20 20:08:17.990469 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:17.990381 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7wcw2" event={"ID":"75cdcb83-e8e3-4d26-90d4-d4278f6d7672","Type":"ContainerStarted","Data":"80fc639d0f7ff8e5aec8e9a88c31271774e07da1c87264f04ad434ac2dc16f29"} Apr 20 20:08:18.995121 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:18.995037 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7wcw2" event={"ID":"75cdcb83-e8e3-4d26-90d4-d4278f6d7672","Type":"ContainerStarted","Data":"52f4a26c286527561d0490f284a73a91a61cce5d0e29d3a3c54f9fb4c8ed7c93"} Apr 20 20:08:18.995121 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:18.995073 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7wcw2" event={"ID":"75cdcb83-e8e3-4d26-90d4-d4278f6d7672","Type":"ContainerStarted","Data":"768ff8e3834eba06ee48f27ee36ce618c3472b2c1efaca55dca2f36979c45b8c"} Apr 20 20:08:19.020178 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:19.020129 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7wcw2" podStartSLOduration=0.822452143 podStartE2EDuration="2.020115054s" podCreationTimestamp="2026-04-20 20:08:17 +0000 UTC" firstStartedPulling="2026-04-20 20:08:17.506877105 +0000 UTC m=+138.546366629" lastFinishedPulling="2026-04-20 20:08:18.704540012 +0000 UTC m=+139.744029540" observedRunningTime="2026-04-20 20:08:19.019618931 +0000 UTC m=+140.059108479" watchObservedRunningTime="2026-04-20 20:08:19.020115054 +0000 
UTC m=+140.059604599" Apr 20 20:08:25.720868 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:25.720829 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89fd0c53-c978-4418-a79a-46ee5bab209d-service-ca-bundle\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:25.720868 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:25.720871 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89fd0c53-c978-4418-a79a-46ee5bab209d-metrics-certs\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:25.721279 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:25.720992 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/89fd0c53-c978-4418-a79a-46ee5bab209d-service-ca-bundle podName:89fd0c53-c978-4418-a79a-46ee5bab209d nodeName:}" failed. No retries permitted until 2026-04-20 20:08:41.720973255 +0000 UTC m=+162.760462780 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/89fd0c53-c978-4418-a79a-46ee5bab209d-service-ca-bundle") pod "router-default-6697d777bb-g2vc6" (UID: "89fd0c53-c978-4418-a79a-46ee5bab209d") : configmap references non-existent config key: service-ca.crt Apr 20 20:08:25.723256 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:25.723239 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89fd0c53-c978-4418-a79a-46ee5bab209d-metrics-certs\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:25.822114 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:25.822074 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-registry-tls\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:25.828001 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:25.827964 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-registry-tls\") pod \"image-registry-c6bf74cfb-rqlgt\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:25.940929 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:25.940892 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:26.063801 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:26.063762 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-c6bf74cfb-rqlgt"] Apr 20 20:08:26.067911 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:08:26.067878 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf27caca_27b2_4278_b3e1_119d6af3a947.slice/crio-d22bac5702bc7f327b6e5152d2f6fcd3c80dc02b8c621212654e5687c632fbc5 WatchSource:0}: Error finding container d22bac5702bc7f327b6e5152d2f6fcd3c80dc02b8c621212654e5687c632fbc5: Status 404 returned error can't find the container with id d22bac5702bc7f327b6e5152d2f6fcd3c80dc02b8c621212654e5687c632fbc5 Apr 20 20:08:27.016488 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:27.016456 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" event={"ID":"df27caca-27b2-4278-b3e1-119d6af3a947","Type":"ContainerStarted","Data":"945e77343ece114d8cf8afc20c356abc7d9d289b77dfa8c30d7a81c76ee8e144"} Apr 20 20:08:27.016488 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:27.016492 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" event={"ID":"df27caca-27b2-4278-b3e1-119d6af3a947","Type":"ContainerStarted","Data":"d22bac5702bc7f327b6e5152d2f6fcd3c80dc02b8c621212654e5687c632fbc5"} Apr 20 20:08:27.016893 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:27.016578 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:08:27.043963 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:27.043914 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" 
podStartSLOduration=17.043900668 podStartE2EDuration="17.043900668s" podCreationTimestamp="2026-04-20 20:08:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:08:27.042672595 +0000 UTC m=+148.082162140" watchObservedRunningTime="2026-04-20 20:08:27.043900668 +0000 UTC m=+148.083390205" Apr 20 20:08:35.355134 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:35.355089 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-n8mvh" podUID="e50d24aa-c678-44f8-ba98-19a30a72720c" Apr 20 20:08:35.369272 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:35.369238 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-n74j9" podUID="6b4f78da-4307-441b-8a96-07e157e132e8" Apr 20 20:08:35.553991 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:35.553958 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-9sbrz" podUID="012dcd86-26f0-4115-bd86-d5066c900541" Apr 20 20:08:36.037781 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:36.037752 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-n8mvh" Apr 20 20:08:38.393377 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:38.393343 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-q2q25"] Apr 20 20:08:38.399230 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:38.399212 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-q2q25" Apr 20 20:08:38.403178 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:38.403152 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jp4bq\"" Apr 20 20:08:38.404499 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:38.404480 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 20:08:38.404577 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:38.404483 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 20:08:38.417577 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:38.417554 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-q2q25"] Apr 20 20:08:38.443164 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:38.443139 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-c6bf74cfb-rqlgt"] Apr 20 20:08:38.510945 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:38.510897 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/10dc391b-44ea-4dcb-8d49-9210e125ae69-crio-socket\") pod \"insights-runtime-extractor-q2q25\" (UID: \"10dc391b-44ea-4dcb-8d49-9210e125ae69\") " pod="openshift-insights/insights-runtime-extractor-q2q25" Apr 20 20:08:38.511139 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:38.510986 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/10dc391b-44ea-4dcb-8d49-9210e125ae69-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q2q25\" (UID: \"10dc391b-44ea-4dcb-8d49-9210e125ae69\") " 
pod="openshift-insights/insights-runtime-extractor-q2q25" Apr 20 20:08:38.511139 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:38.511057 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/10dc391b-44ea-4dcb-8d49-9210e125ae69-data-volume\") pod \"insights-runtime-extractor-q2q25\" (UID: \"10dc391b-44ea-4dcb-8d49-9210e125ae69\") " pod="openshift-insights/insights-runtime-extractor-q2q25" Apr 20 20:08:38.511139 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:38.511085 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/10dc391b-44ea-4dcb-8d49-9210e125ae69-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q2q25\" (UID: \"10dc391b-44ea-4dcb-8d49-9210e125ae69\") " pod="openshift-insights/insights-runtime-extractor-q2q25" Apr 20 20:08:38.511139 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:38.511108 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2d8z\" (UniqueName: \"kubernetes.io/projected/10dc391b-44ea-4dcb-8d49-9210e125ae69-kube-api-access-b2d8z\") pod \"insights-runtime-extractor-q2q25\" (UID: \"10dc391b-44ea-4dcb-8d49-9210e125ae69\") " pod="openshift-insights/insights-runtime-extractor-q2q25" Apr 20 20:08:38.611701 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:38.611667 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/10dc391b-44ea-4dcb-8d49-9210e125ae69-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q2q25\" (UID: \"10dc391b-44ea-4dcb-8d49-9210e125ae69\") " pod="openshift-insights/insights-runtime-extractor-q2q25" Apr 20 20:08:38.611869 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:38.611727 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/10dc391b-44ea-4dcb-8d49-9210e125ae69-data-volume\") pod \"insights-runtime-extractor-q2q25\" (UID: \"10dc391b-44ea-4dcb-8d49-9210e125ae69\") " pod="openshift-insights/insights-runtime-extractor-q2q25" Apr 20 20:08:38.611869 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:38.611745 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/10dc391b-44ea-4dcb-8d49-9210e125ae69-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q2q25\" (UID: \"10dc391b-44ea-4dcb-8d49-9210e125ae69\") " pod="openshift-insights/insights-runtime-extractor-q2q25" Apr 20 20:08:38.611869 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:38.611760 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2d8z\" (UniqueName: \"kubernetes.io/projected/10dc391b-44ea-4dcb-8d49-9210e125ae69-kube-api-access-b2d8z\") pod \"insights-runtime-extractor-q2q25\" (UID: \"10dc391b-44ea-4dcb-8d49-9210e125ae69\") " pod="openshift-insights/insights-runtime-extractor-q2q25" Apr 20 20:08:38.611869 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:38.611791 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/10dc391b-44ea-4dcb-8d49-9210e125ae69-crio-socket\") pod \"insights-runtime-extractor-q2q25\" (UID: \"10dc391b-44ea-4dcb-8d49-9210e125ae69\") " pod="openshift-insights/insights-runtime-extractor-q2q25" Apr 20 20:08:38.611869 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:38.611867 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/10dc391b-44ea-4dcb-8d49-9210e125ae69-crio-socket\") pod \"insights-runtime-extractor-q2q25\" (UID: \"10dc391b-44ea-4dcb-8d49-9210e125ae69\") " 
pod="openshift-insights/insights-runtime-extractor-q2q25" Apr 20 20:08:38.612316 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:38.612291 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/10dc391b-44ea-4dcb-8d49-9210e125ae69-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q2q25\" (UID: \"10dc391b-44ea-4dcb-8d49-9210e125ae69\") " pod="openshift-insights/insights-runtime-extractor-q2q25" Apr 20 20:08:38.612597 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:38.612582 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/10dc391b-44ea-4dcb-8d49-9210e125ae69-data-volume\") pod \"insights-runtime-extractor-q2q25\" (UID: \"10dc391b-44ea-4dcb-8d49-9210e125ae69\") " pod="openshift-insights/insights-runtime-extractor-q2q25" Apr 20 20:08:38.614169 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:38.614143 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/10dc391b-44ea-4dcb-8d49-9210e125ae69-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q2q25\" (UID: \"10dc391b-44ea-4dcb-8d49-9210e125ae69\") " pod="openshift-insights/insights-runtime-extractor-q2q25" Apr 20 20:08:38.627717 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:38.627696 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2d8z\" (UniqueName: \"kubernetes.io/projected/10dc391b-44ea-4dcb-8d49-9210e125ae69-kube-api-access-b2d8z\") pod \"insights-runtime-extractor-q2q25\" (UID: \"10dc391b-44ea-4dcb-8d49-9210e125ae69\") " pod="openshift-insights/insights-runtime-extractor-q2q25" Apr 20 20:08:38.708547 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:38.708490 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-q2q25" Apr 20 20:08:38.833766 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:38.833735 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-q2q25"] Apr 20 20:08:38.836753 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:08:38.836728 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10dc391b_44ea_4dcb_8d49_9210e125ae69.slice/crio-6bc78f1ac0aebad8aa84b171165dba9d8cc8afd9b1dab3892d9c767ab478d250 WatchSource:0}: Error finding container 6bc78f1ac0aebad8aa84b171165dba9d8cc8afd9b1dab3892d9c767ab478d250: Status 404 returned error can't find the container with id 6bc78f1ac0aebad8aa84b171165dba9d8cc8afd9b1dab3892d9c767ab478d250 Apr 20 20:08:39.046195 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:39.046166 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q2q25" event={"ID":"10dc391b-44ea-4dcb-8d49-9210e125ae69","Type":"ContainerStarted","Data":"c03456f7f2ca0cd2a32b0cafc31e27b23a439fc45618ab70cd546f3db489778c"} Apr 20 20:08:39.046195 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:39.046199 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q2q25" event={"ID":"10dc391b-44ea-4dcb-8d49-9210e125ae69","Type":"ContainerStarted","Data":"6bc78f1ac0aebad8aa84b171165dba9d8cc8afd9b1dab3892d9c767ab478d250"} Apr 20 20:08:40.050096 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:40.050061 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q2q25" event={"ID":"10dc391b-44ea-4dcb-8d49-9210e125ae69","Type":"ContainerStarted","Data":"0df18f2e383efb8697fa21bee4a7205ed5d891e7eee4368cd76ad4163e8ae1e2"} Apr 20 20:08:40.225544 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:40.225509 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls\") pod \"dns-default-n8mvh\" (UID: \"e50d24aa-c678-44f8-ba98-19a30a72720c\") " pod="openshift-dns/dns-default-n8mvh" Apr 20 20:08:40.225688 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:40.225559 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert\") pod \"ingress-canary-n74j9\" (UID: \"6b4f78da-4307-441b-8a96-07e157e132e8\") " pod="openshift-ingress-canary/ingress-canary-n74j9" Apr 20 20:08:40.227948 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:40.227918 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b4f78da-4307-441b-8a96-07e157e132e8-cert\") pod \"ingress-canary-n74j9\" (UID: \"6b4f78da-4307-441b-8a96-07e157e132e8\") " pod="openshift-ingress-canary/ingress-canary-n74j9" Apr 20 20:08:40.228058 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:40.227959 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e50d24aa-c678-44f8-ba98-19a30a72720c-metrics-tls\") pod \"dns-default-n8mvh\" (UID: \"e50d24aa-c678-44f8-ba98-19a30a72720c\") " pod="openshift-dns/dns-default-n8mvh" Apr 20 20:08:40.241765 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:40.241745 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vhk8x\"" Apr 20 20:08:40.249385 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:40.249341 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-n8mvh" Apr 20 20:08:40.384962 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:40.384932 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-n8mvh"] Apr 20 20:08:40.388079 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:08:40.388049 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode50d24aa_c678_44f8_ba98_19a30a72720c.slice/crio-1c0f323cffe7dbeb79d1272b578daea0a316029fbc25a028d092630b9e373f46 WatchSource:0}: Error finding container 1c0f323cffe7dbeb79d1272b578daea0a316029fbc25a028d092630b9e373f46: Status 404 returned error can't find the container with id 1c0f323cffe7dbeb79d1272b578daea0a316029fbc25a028d092630b9e373f46 Apr 20 20:08:41.053243 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:41.053205 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n8mvh" event={"ID":"e50d24aa-c678-44f8-ba98-19a30a72720c","Type":"ContainerStarted","Data":"1c0f323cffe7dbeb79d1272b578daea0a316029fbc25a028d092630b9e373f46"} Apr 20 20:08:41.737985 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:41.737928 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89fd0c53-c978-4418-a79a-46ee5bab209d-service-ca-bundle\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:41.738676 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:41.738655 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89fd0c53-c978-4418-a79a-46ee5bab209d-service-ca-bundle\") pod \"router-default-6697d777bb-g2vc6\" (UID: \"89fd0c53-c978-4418-a79a-46ee5bab209d\") " pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:42.030916 
ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:42.030891 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:42.059881 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:42.059787 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n8mvh" event={"ID":"e50d24aa-c678-44f8-ba98-19a30a72720c","Type":"ContainerStarted","Data":"4be34e6278894fd3d6aaff8cda30263ceb812f653088c342fa5c9ddd55f337da"} Apr 20 20:08:42.062048 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:42.061998 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q2q25" event={"ID":"10dc391b-44ea-4dcb-8d49-9210e125ae69","Type":"ContainerStarted","Data":"fd8321c8e9c91925222a2da3bb37d042d074d682b5b6fa91b2bfe8c9e06b2a7d"} Apr 20 20:08:42.168610 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:42.168554 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-q2q25" podStartSLOduration=2.017019318 podStartE2EDuration="4.168533149s" podCreationTimestamp="2026-04-20 20:08:38 +0000 UTC" firstStartedPulling="2026-04-20 20:08:38.904584629 +0000 UTC m=+159.944074156" lastFinishedPulling="2026-04-20 20:08:41.056098445 +0000 UTC m=+162.095587987" observedRunningTime="2026-04-20 20:08:42.084104375 +0000 UTC m=+163.123593924" watchObservedRunningTime="2026-04-20 20:08:42.168533149 +0000 UTC m=+163.208022696" Apr 20 20:08:42.169252 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:42.169234 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6697d777bb-g2vc6"] Apr 20 20:08:42.174618 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:08:42.174594 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89fd0c53_c978_4418_a79a_46ee5bab209d.slice/crio-fdf94089f0984329de2e030b647a89c4acc44f7d6fae1a81e3fb43f4a4bdb248 WatchSource:0}: Error finding container fdf94089f0984329de2e030b647a89c4acc44f7d6fae1a81e3fb43f4a4bdb248: Status 404 returned error can't find the container with id fdf94089f0984329de2e030b647a89c4acc44f7d6fae1a81e3fb43f4a4bdb248 Apr 20 20:08:43.066029 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:43.065993 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n8mvh" event={"ID":"e50d24aa-c678-44f8-ba98-19a30a72720c","Type":"ContainerStarted","Data":"bbadf924d19ac8be9bf9d486ac9c9b2767dd655d7f2063eeaf1fb272400445d0"} Apr 20 20:08:43.066502 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:43.066209 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-n8mvh" Apr 20 20:08:43.067345 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:43.067323 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6697d777bb-g2vc6" event={"ID":"89fd0c53-c978-4418-a79a-46ee5bab209d","Type":"ContainerStarted","Data":"553f30f8252c0252b7ba77b90b4f06c27b722d826531d453a460e436e3ae2c5a"} Apr 20 20:08:43.067449 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:43.067351 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6697d777bb-g2vc6" event={"ID":"89fd0c53-c978-4418-a79a-46ee5bab209d","Type":"ContainerStarted","Data":"fdf94089f0984329de2e030b647a89c4acc44f7d6fae1a81e3fb43f4a4bdb248"} Apr 20 20:08:43.083468 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:43.083406 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-n8mvh" podStartSLOduration=129.563211687 podStartE2EDuration="2m11.083394451s" podCreationTimestamp="2026-04-20 20:06:32 +0000 UTC" firstStartedPulling="2026-04-20 20:08:40.390288209 
+0000 UTC m=+161.429777737" lastFinishedPulling="2026-04-20 20:08:41.910470976 +0000 UTC m=+162.949960501" observedRunningTime="2026-04-20 20:08:43.082757955 +0000 UTC m=+164.122247503" watchObservedRunningTime="2026-04-20 20:08:43.083394451 +0000 UTC m=+164.122883996" Apr 20 20:08:43.101587 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:43.101551 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-6697d777bb-g2vc6" podStartSLOduration=34.101538718 podStartE2EDuration="34.101538718s" podCreationTimestamp="2026-04-20 20:08:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:08:43.100829023 +0000 UTC m=+164.140318569" watchObservedRunningTime="2026-04-20 20:08:43.101538718 +0000 UTC m=+164.141028309" Apr 20 20:08:44.031349 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:44.031312 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:44.034003 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:44.033980 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:44.070864 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:44.070829 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:44.071924 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:44.071903 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-6697d777bb-g2vc6" Apr 20 20:08:46.522982 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:46.522933 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-n74j9" Apr 20 20:08:46.525716 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:46.525695 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xwtbs\"" Apr 20 20:08:46.533849 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:46.533828 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-n74j9" Apr 20 20:08:46.671286 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:46.671260 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-n74j9"] Apr 20 20:08:46.674919 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:08:46.674894 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b4f78da_4307_441b_8a96_07e157e132e8.slice/crio-f618b2f12cfcb7c9b29906a73ea03bc47dc1981a1b4ca09fcb0a50067debfecc WatchSource:0}: Error finding container f618b2f12cfcb7c9b29906a73ea03bc47dc1981a1b4ca09fcb0a50067debfecc: Status 404 returned error can't find the container with id f618b2f12cfcb7c9b29906a73ea03bc47dc1981a1b4ca09fcb0a50067debfecc Apr 20 20:08:47.083301 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:47.083269 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-n74j9" event={"ID":"6b4f78da-4307-441b-8a96-07e157e132e8","Type":"ContainerStarted","Data":"f618b2f12cfcb7c9b29906a73ea03bc47dc1981a1b4ca09fcb0a50067debfecc"} Apr 20 20:08:47.557829 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:47.557794 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-g5v6w"] Apr 20 20:08:47.561478 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:47.561457 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-g5v6w" Apr 20 20:08:47.565533 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:47.565285 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 20:08:47.565533 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:47.565321 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 20:08:47.565533 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:47.565357 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 20:08:47.565533 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:47.565368 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-gzrrq\"" Apr 20 20:08:47.565533 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:47.565414 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 20 20:08:47.565533 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:47.565375 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 20 20:08:47.569950 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:47.569726 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-g5v6w"] Apr 20 20:08:47.581694 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:47.581667 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f9646e3a-c3e6-4243-9d07-190e48cfd49e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-g5v6w\" (UID: 
\"f9646e3a-c3e6-4243-9d07-190e48cfd49e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g5v6w" Apr 20 20:08:47.581819 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:47.581721 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f9646e3a-c3e6-4243-9d07-190e48cfd49e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-g5v6w\" (UID: \"f9646e3a-c3e6-4243-9d07-190e48cfd49e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g5v6w" Apr 20 20:08:47.581819 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:47.581764 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sv5h\" (UniqueName: \"kubernetes.io/projected/f9646e3a-c3e6-4243-9d07-190e48cfd49e-kube-api-access-5sv5h\") pod \"prometheus-operator-5676c8c784-g5v6w\" (UID: \"f9646e3a-c3e6-4243-9d07-190e48cfd49e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g5v6w" Apr 20 20:08:47.581933 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:47.581878 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f9646e3a-c3e6-4243-9d07-190e48cfd49e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-g5v6w\" (UID: \"f9646e3a-c3e6-4243-9d07-190e48cfd49e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g5v6w" Apr 20 20:08:47.683093 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:47.683050 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f9646e3a-c3e6-4243-9d07-190e48cfd49e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-g5v6w\" (UID: \"f9646e3a-c3e6-4243-9d07-190e48cfd49e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g5v6w" Apr 20 
20:08:47.683257 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:47.683103 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f9646e3a-c3e6-4243-9d07-190e48cfd49e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-g5v6w\" (UID: \"f9646e3a-c3e6-4243-9d07-190e48cfd49e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g5v6w" Apr 20 20:08:47.683257 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:47.683133 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f9646e3a-c3e6-4243-9d07-190e48cfd49e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-g5v6w\" (UID: \"f9646e3a-c3e6-4243-9d07-190e48cfd49e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g5v6w" Apr 20 20:08:47.683257 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:47.683176 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5sv5h\" (UniqueName: \"kubernetes.io/projected/f9646e3a-c3e6-4243-9d07-190e48cfd49e-kube-api-access-5sv5h\") pod \"prometheus-operator-5676c8c784-g5v6w\" (UID: \"f9646e3a-c3e6-4243-9d07-190e48cfd49e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g5v6w" Apr 20 20:08:47.683455 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:47.683303 2571 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 20 20:08:47.683455 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:47.683389 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9646e3a-c3e6-4243-9d07-190e48cfd49e-prometheus-operator-tls podName:f9646e3a-c3e6-4243-9d07-190e48cfd49e nodeName:}" failed. No retries permitted until 2026-04-20 20:08:48.183366856 +0000 UTC m=+169.222856384 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/f9646e3a-c3e6-4243-9d07-190e48cfd49e-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-g5v6w" (UID: "f9646e3a-c3e6-4243-9d07-190e48cfd49e") : secret "prometheus-operator-tls" not found Apr 20 20:08:47.683988 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:47.683924 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f9646e3a-c3e6-4243-9d07-190e48cfd49e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-g5v6w\" (UID: \"f9646e3a-c3e6-4243-9d07-190e48cfd49e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g5v6w" Apr 20 20:08:47.685934 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:47.685902 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f9646e3a-c3e6-4243-9d07-190e48cfd49e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-g5v6w\" (UID: \"f9646e3a-c3e6-4243-9d07-190e48cfd49e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g5v6w" Apr 20 20:08:47.693250 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:47.693228 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sv5h\" (UniqueName: \"kubernetes.io/projected/f9646e3a-c3e6-4243-9d07-190e48cfd49e-kube-api-access-5sv5h\") pod \"prometheus-operator-5676c8c784-g5v6w\" (UID: \"f9646e3a-c3e6-4243-9d07-190e48cfd49e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g5v6w" Apr 20 20:08:48.064261 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.064219 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-799fc4df48-mqvj9"] Apr 20 20:08:48.067698 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.067676 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-799fc4df48-mqvj9" Apr 20 20:08:48.070470 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.070448 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 20:08:48.070614 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.070501 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 20:08:48.070614 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.070454 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 20:08:48.070740 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.070444 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 20:08:48.071524 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.071503 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 20:08:48.071524 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.071520 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-qkvk8\"" Apr 20 20:08:48.071709 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.071564 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 20:08:48.071818 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.071804 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 20:08:48.076901 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.076879 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-799fc4df48-mqvj9"] Apr 20 20:08:48.087670 ip-10-0-135-184 kubenswrapper[2571]: I0420 
20:08:48.087642 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/90559697-e5dd-4c9e-b3e3-d63828850d61-console-oauth-config\") pod \"console-799fc4df48-mqvj9\" (UID: \"90559697-e5dd-4c9e-b3e3-d63828850d61\") " pod="openshift-console/console-799fc4df48-mqvj9" Apr 20 20:08:48.087795 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.087689 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/90559697-e5dd-4c9e-b3e3-d63828850d61-oauth-serving-cert\") pod \"console-799fc4df48-mqvj9\" (UID: \"90559697-e5dd-4c9e-b3e3-d63828850d61\") " pod="openshift-console/console-799fc4df48-mqvj9" Apr 20 20:08:48.087847 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.087824 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z546s\" (UniqueName: \"kubernetes.io/projected/90559697-e5dd-4c9e-b3e3-d63828850d61-kube-api-access-z546s\") pod \"console-799fc4df48-mqvj9\" (UID: \"90559697-e5dd-4c9e-b3e3-d63828850d61\") " pod="openshift-console/console-799fc4df48-mqvj9" Apr 20 20:08:48.087894 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.087863 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/90559697-e5dd-4c9e-b3e3-d63828850d61-console-config\") pod \"console-799fc4df48-mqvj9\" (UID: \"90559697-e5dd-4c9e-b3e3-d63828850d61\") " pod="openshift-console/console-799fc4df48-mqvj9" Apr 20 20:08:48.087894 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.087888 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90559697-e5dd-4c9e-b3e3-d63828850d61-service-ca\") pod \"console-799fc4df48-mqvj9\" (UID: 
\"90559697-e5dd-4c9e-b3e3-d63828850d61\") " pod="openshift-console/console-799fc4df48-mqvj9" Apr 20 20:08:48.087993 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.087922 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/90559697-e5dd-4c9e-b3e3-d63828850d61-console-serving-cert\") pod \"console-799fc4df48-mqvj9\" (UID: \"90559697-e5dd-4c9e-b3e3-d63828850d61\") " pod="openshift-console/console-799fc4df48-mqvj9" Apr 20 20:08:48.188737 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.188698 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/90559697-e5dd-4c9e-b3e3-d63828850d61-console-oauth-config\") pod \"console-799fc4df48-mqvj9\" (UID: \"90559697-e5dd-4c9e-b3e3-d63828850d61\") " pod="openshift-console/console-799fc4df48-mqvj9" Apr 20 20:08:48.188737 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.188741 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/90559697-e5dd-4c9e-b3e3-d63828850d61-oauth-serving-cert\") pod \"console-799fc4df48-mqvj9\" (UID: \"90559697-e5dd-4c9e-b3e3-d63828850d61\") " pod="openshift-console/console-799fc4df48-mqvj9" Apr 20 20:08:48.188980 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.188772 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f9646e3a-c3e6-4243-9d07-190e48cfd49e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-g5v6w\" (UID: \"f9646e3a-c3e6-4243-9d07-190e48cfd49e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g5v6w" Apr 20 20:08:48.188980 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.188832 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-z546s\" (UniqueName: \"kubernetes.io/projected/90559697-e5dd-4c9e-b3e3-d63828850d61-kube-api-access-z546s\") pod \"console-799fc4df48-mqvj9\" (UID: \"90559697-e5dd-4c9e-b3e3-d63828850d61\") " pod="openshift-console/console-799fc4df48-mqvj9" Apr 20 20:08:48.188980 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.188863 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/90559697-e5dd-4c9e-b3e3-d63828850d61-console-config\") pod \"console-799fc4df48-mqvj9\" (UID: \"90559697-e5dd-4c9e-b3e3-d63828850d61\") " pod="openshift-console/console-799fc4df48-mqvj9" Apr 20 20:08:48.189363 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.189217 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90559697-e5dd-4c9e-b3e3-d63828850d61-service-ca\") pod \"console-799fc4df48-mqvj9\" (UID: \"90559697-e5dd-4c9e-b3e3-d63828850d61\") " pod="openshift-console/console-799fc4df48-mqvj9" Apr 20 20:08:48.189363 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.189288 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/90559697-e5dd-4c9e-b3e3-d63828850d61-console-serving-cert\") pod \"console-799fc4df48-mqvj9\" (UID: \"90559697-e5dd-4c9e-b3e3-d63828850d61\") " pod="openshift-console/console-799fc4df48-mqvj9" Apr 20 20:08:48.189608 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.189561 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/90559697-e5dd-4c9e-b3e3-d63828850d61-oauth-serving-cert\") pod \"console-799fc4df48-mqvj9\" (UID: \"90559697-e5dd-4c9e-b3e3-d63828850d61\") " pod="openshift-console/console-799fc4df48-mqvj9" Apr 20 20:08:48.189608 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.189576 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/90559697-e5dd-4c9e-b3e3-d63828850d61-console-config\") pod \"console-799fc4df48-mqvj9\" (UID: \"90559697-e5dd-4c9e-b3e3-d63828850d61\") " pod="openshift-console/console-799fc4df48-mqvj9" Apr 20 20:08:48.189897 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.189872 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90559697-e5dd-4c9e-b3e3-d63828850d61-service-ca\") pod \"console-799fc4df48-mqvj9\" (UID: \"90559697-e5dd-4c9e-b3e3-d63828850d61\") " pod="openshift-console/console-799fc4df48-mqvj9" Apr 20 20:08:48.192108 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.192087 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f9646e3a-c3e6-4243-9d07-190e48cfd49e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-g5v6w\" (UID: \"f9646e3a-c3e6-4243-9d07-190e48cfd49e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g5v6w" Apr 20 20:08:48.192233 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.192122 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/90559697-e5dd-4c9e-b3e3-d63828850d61-console-serving-cert\") pod \"console-799fc4df48-mqvj9\" (UID: \"90559697-e5dd-4c9e-b3e3-d63828850d61\") " pod="openshift-console/console-799fc4df48-mqvj9" Apr 20 20:08:48.192233 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.192089 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/90559697-e5dd-4c9e-b3e3-d63828850d61-console-oauth-config\") pod \"console-799fc4df48-mqvj9\" (UID: \"90559697-e5dd-4c9e-b3e3-d63828850d61\") " pod="openshift-console/console-799fc4df48-mqvj9" Apr 20 20:08:48.197537 
ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.197516 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z546s\" (UniqueName: \"kubernetes.io/projected/90559697-e5dd-4c9e-b3e3-d63828850d61-kube-api-access-z546s\") pod \"console-799fc4df48-mqvj9\" (UID: \"90559697-e5dd-4c9e-b3e3-d63828850d61\") " pod="openshift-console/console-799fc4df48-mqvj9"
Apr 20 20:08:48.379691 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.379670 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-799fc4df48-mqvj9"
Apr 20 20:08:48.448217 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.448185 2571 patch_prober.go:28] interesting pod/image-registry-c6bf74cfb-rqlgt container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 20 20:08:48.448353 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.448245 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" podUID="df27caca-27b2-4278-b3e1-119d6af3a947" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:08:48.473586 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.473560 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-g5v6w"
Apr 20 20:08:48.500801 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.500769 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-799fc4df48-mqvj9"]
Apr 20 20:08:48.504174 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:08:48.504138 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90559697_e5dd_4c9e_b3e3_d63828850d61.slice/crio-c52cbe4e5d1dfb14fd448d11ccce461df1b25fc46c808367c71ca9a98531f0b1 WatchSource:0}: Error finding container c52cbe4e5d1dfb14fd448d11ccce461df1b25fc46c808367c71ca9a98531f0b1: Status 404 returned error can't find the container with id c52cbe4e5d1dfb14fd448d11ccce461df1b25fc46c808367c71ca9a98531f0b1
Apr 20 20:08:48.590863 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:48.590793 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-g5v6w"]
Apr 20 20:08:48.593683 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:08:48.593658 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9646e3a_c3e6_4243_9d07_190e48cfd49e.slice/crio-97ea1ecdbd37423e40be0a348f8c9a4551d1d2393f3903e18f771b5d107dc138 WatchSource:0}: Error finding container 97ea1ecdbd37423e40be0a348f8c9a4551d1d2393f3903e18f771b5d107dc138: Status 404 returned error can't find the container with id 97ea1ecdbd37423e40be0a348f8c9a4551d1d2393f3903e18f771b5d107dc138
Apr 20 20:08:49.093924 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:49.093886 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-n74j9" event={"ID":"6b4f78da-4307-441b-8a96-07e157e132e8","Type":"ContainerStarted","Data":"37c7c3b78a768ea9983b60e123e6c9761903bffa35bfc2278a298b742d6c6cdd"}
Apr 20 20:08:49.094990 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:49.094961 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-g5v6w" event={"ID":"f9646e3a-c3e6-4243-9d07-190e48cfd49e","Type":"ContainerStarted","Data":"97ea1ecdbd37423e40be0a348f8c9a4551d1d2393f3903e18f771b5d107dc138"}
Apr 20 20:08:49.096087 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:49.096065 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-799fc4df48-mqvj9" event={"ID":"90559697-e5dd-4c9e-b3e3-d63828850d61","Type":"ContainerStarted","Data":"c52cbe4e5d1dfb14fd448d11ccce461df1b25fc46c808367c71ca9a98531f0b1"}
Apr 20 20:08:49.115410 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:49.115104 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-n74j9" podStartSLOduration=135.474836376 podStartE2EDuration="2m17.115086713s" podCreationTimestamp="2026-04-20 20:06:32 +0000 UTC" firstStartedPulling="2026-04-20 20:08:46.676919115 +0000 UTC m=+167.716408638" lastFinishedPulling="2026-04-20 20:08:48.317169437 +0000 UTC m=+169.356658975" observedRunningTime="2026-04-20 20:08:49.114365723 +0000 UTC m=+170.153855270" watchObservedRunningTime="2026-04-20 20:08:49.115086713 +0000 UTC m=+170.154576262"
Apr 20 20:08:49.526886 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:49.526854 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:08:50.100703 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:50.100639 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-g5v6w" event={"ID":"f9646e3a-c3e6-4243-9d07-190e48cfd49e","Type":"ContainerStarted","Data":"455b6c4bc78ffe7ac3190277e7e48eb35212038a3533fd804e3b3d8d889091d3"}
Apr 20 20:08:50.100703 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:50.100679 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-g5v6w" event={"ID":"f9646e3a-c3e6-4243-9d07-190e48cfd49e","Type":"ContainerStarted","Data":"c15b91232ac302cf4e8acc1dc09d55bd9d4219787714d24c6e43a5157e9e245f"}
Apr 20 20:08:50.121624 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:50.120774 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-g5v6w" podStartSLOduration=1.819344041 podStartE2EDuration="3.120734139s" podCreationTimestamp="2026-04-20 20:08:47 +0000 UTC" firstStartedPulling="2026-04-20 20:08:48.595456434 +0000 UTC m=+169.634945958" lastFinishedPulling="2026-04-20 20:08:49.896846518 +0000 UTC m=+170.936336056" observedRunningTime="2026-04-20 20:08:50.118931911 +0000 UTC m=+171.158421459" watchObservedRunningTime="2026-04-20 20:08:50.120734139 +0000 UTC m=+171.160223687"
Apr 20 20:08:51.910255 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:51.910221 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-7vfx2"]
Apr 20 20:08:51.913412 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:51.913392 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-pw58h"]
Apr 20 20:08:51.913580 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:51.913562 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7vfx2"
Apr 20 20:08:51.916058 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:51.916037 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 20 20:08:51.916203 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:51.916063 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-9jtzp\""
Apr 20 20:08:51.916203 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:51.916065 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 20 20:08:51.916629 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:51.916611 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h"
Apr 20 20:08:51.919204 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:51.919183 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 20 20:08:51.920242 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:51.920112 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 20 20:08:51.920242 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:51.920116 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-pfjgc\""
Apr 20 20:08:51.920242 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:51.920144 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 20 20:08:51.923609 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:51.923579 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-7vfx2"]
Apr 20 20:08:51.927771 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:51.927713 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-pw58h"]
Apr 20 20:08:51.944685 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:51.944657 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-8s5lf"]
Apr 20 20:08:51.949281 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:51.949261 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:51.952266 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:51.952251 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 20 20:08:51.952266 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:51.952269 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7n892\""
Apr 20 20:08:51.952481 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:51.952460 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 20 20:08:51.952559 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:51.952509 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 20 20:08:52.024668 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.024634 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e48ea207-1b2d-460f-b13a-46721f677a63-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-7vfx2\" (UID: \"e48ea207-1b2d-460f-b13a-46721f677a63\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7vfx2"
Apr 20 20:08:52.024854 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.024675 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d422197f-777e-4035-a81e-8dcf213b4d0a-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-pw58h\" (UID: \"d422197f-777e-4035-a81e-8dcf213b4d0a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h"
Apr 20 20:08:52.024854 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.024705 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnzmk\" (UniqueName: \"kubernetes.io/projected/e48ea207-1b2d-460f-b13a-46721f677a63-kube-api-access-qnzmk\") pod \"openshift-state-metrics-9d44df66c-7vfx2\" (UID: \"e48ea207-1b2d-460f-b13a-46721f677a63\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7vfx2"
Apr 20 20:08:52.024854 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.024729 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d422197f-777e-4035-a81e-8dcf213b4d0a-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-pw58h\" (UID: \"d422197f-777e-4035-a81e-8dcf213b4d0a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h"
Apr 20 20:08:52.024854 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.024780 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5d9430a-17b7-4994-9576-0dd1247ec436-metrics-client-ca\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.024854 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.024831 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b5d9430a-17b7-4994-9576-0dd1247ec436-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.025047 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.024878 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e48ea207-1b2d-460f-b13a-46721f677a63-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-7vfx2\" (UID: \"e48ea207-1b2d-460f-b13a-46721f677a63\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7vfx2"
Apr 20 20:08:52.025047 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.024903 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b5d9430a-17b7-4994-9576-0dd1247ec436-node-exporter-textfile\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.025047 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.024919 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4pp5\" (UniqueName: \"kubernetes.io/projected/d422197f-777e-4035-a81e-8dcf213b4d0a-kube-api-access-z4pp5\") pod \"kube-state-metrics-69db897b98-pw58h\" (UID: \"d422197f-777e-4035-a81e-8dcf213b4d0a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h"
Apr 20 20:08:52.025047 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.024941 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b5d9430a-17b7-4994-9576-0dd1247ec436-node-exporter-accelerators-collector-config\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.025047 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.024973 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b5d9430a-17b7-4994-9576-0dd1247ec436-node-exporter-tls\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.025047 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.025002 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5d9430a-17b7-4994-9576-0dd1247ec436-sys\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.025277 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.025050 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgsvd\" (UniqueName: \"kubernetes.io/projected/b5d9430a-17b7-4994-9576-0dd1247ec436-kube-api-access-bgsvd\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.025277 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.025069 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e48ea207-1b2d-460f-b13a-46721f677a63-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-7vfx2\" (UID: \"e48ea207-1b2d-460f-b13a-46721f677a63\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7vfx2"
Apr 20 20:08:52.025277 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.025093 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d422197f-777e-4035-a81e-8dcf213b4d0a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-pw58h\" (UID: \"d422197f-777e-4035-a81e-8dcf213b4d0a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h"
Apr 20 20:08:52.025277 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.025119 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d422197f-777e-4035-a81e-8dcf213b4d0a-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-pw58h\" (UID: \"d422197f-777e-4035-a81e-8dcf213b4d0a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h"
Apr 20 20:08:52.025277 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.025136 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/d422197f-777e-4035-a81e-8dcf213b4d0a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-pw58h\" (UID: \"d422197f-777e-4035-a81e-8dcf213b4d0a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h"
Apr 20 20:08:52.025277 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.025155 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b5d9430a-17b7-4994-9576-0dd1247ec436-root\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.025277 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.025210 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b5d9430a-17b7-4994-9576-0dd1247ec436-node-exporter-wtmp\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.107576 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.107539 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-799fc4df48-mqvj9" event={"ID":"90559697-e5dd-4c9e-b3e3-d63828850d61","Type":"ContainerStarted","Data":"639292617edd7fe48fa9512833edbfa1127b1326f620a52c711b14539fa128a6"}
Apr 20 20:08:52.125916 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.125866 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-799fc4df48-mqvj9" podStartSLOduration=1.4755305220000001 podStartE2EDuration="4.125852066s" podCreationTimestamp="2026-04-20 20:08:48 +0000 UTC" firstStartedPulling="2026-04-20 20:08:48.506199469 +0000 UTC m=+169.545688995" lastFinishedPulling="2026-04-20 20:08:51.156521008 +0000 UTC m=+172.196010539" observedRunningTime="2026-04-20 20:08:52.124689738 +0000 UTC m=+173.164179287" watchObservedRunningTime="2026-04-20 20:08:52.125852066 +0000 UTC m=+173.165341613"
Apr 20 20:08:52.126114 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.126094 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qnzmk\" (UniqueName: \"kubernetes.io/projected/e48ea207-1b2d-460f-b13a-46721f677a63-kube-api-access-qnzmk\") pod \"openshift-state-metrics-9d44df66c-7vfx2\" (UID: \"e48ea207-1b2d-460f-b13a-46721f677a63\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7vfx2"
Apr 20 20:08:52.126187 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.126124 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d422197f-777e-4035-a81e-8dcf213b4d0a-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-pw58h\" (UID: \"d422197f-777e-4035-a81e-8dcf213b4d0a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h"
Apr 20 20:08:52.126277 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.126255 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5d9430a-17b7-4994-9576-0dd1247ec436-metrics-client-ca\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.126342 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.126311 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b5d9430a-17b7-4994-9576-0dd1247ec436-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.126393 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.126365 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e48ea207-1b2d-460f-b13a-46721f677a63-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-7vfx2\" (UID: \"e48ea207-1b2d-460f-b13a-46721f677a63\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7vfx2"
Apr 20 20:08:52.126478 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.126399 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b5d9430a-17b7-4994-9576-0dd1247ec436-node-exporter-textfile\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.126478 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:52.126367 2571 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 20 20:08:52.126478 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.126442 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4pp5\" (UniqueName: \"kubernetes.io/projected/d422197f-777e-4035-a81e-8dcf213b4d0a-kube-api-access-z4pp5\") pod \"kube-state-metrics-69db897b98-pw58h\" (UID: \"d422197f-777e-4035-a81e-8dcf213b4d0a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h"
Apr 20 20:08:52.126478 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.126474 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b5d9430a-17b7-4994-9576-0dd1247ec436-node-exporter-accelerators-collector-config\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.126680 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.126503 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b5d9430a-17b7-4994-9576-0dd1247ec436-node-exporter-tls\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.126680 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:52.126524 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d422197f-777e-4035-a81e-8dcf213b4d0a-kube-state-metrics-tls podName:d422197f-777e-4035-a81e-8dcf213b4d0a nodeName:}" failed. No retries permitted until 2026-04-20 20:08:52.626505029 +0000 UTC m=+173.665994567 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/d422197f-777e-4035-a81e-8dcf213b4d0a-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-pw58h" (UID: "d422197f-777e-4035-a81e-8dcf213b4d0a") : secret "kube-state-metrics-tls" not found
Apr 20 20:08:52.126680 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.126560 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5d9430a-17b7-4994-9576-0dd1247ec436-sys\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.126680 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:52.126582 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 20 20:08:52.126680 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.126601 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgsvd\" (UniqueName: \"kubernetes.io/projected/b5d9430a-17b7-4994-9576-0dd1247ec436-kube-api-access-bgsvd\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.126680 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:52.126627 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5d9430a-17b7-4994-9576-0dd1247ec436-node-exporter-tls podName:b5d9430a-17b7-4994-9576-0dd1247ec436 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:52.626614228 +0000 UTC m=+173.666103761 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/b5d9430a-17b7-4994-9576-0dd1247ec436-node-exporter-tls") pod "node-exporter-8s5lf" (UID: "b5d9430a-17b7-4994-9576-0dd1247ec436") : secret "node-exporter-tls" not found
Apr 20 20:08:52.126680 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.126644 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e48ea207-1b2d-460f-b13a-46721f677a63-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-7vfx2\" (UID: \"e48ea207-1b2d-460f-b13a-46721f677a63\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7vfx2"
Apr 20 20:08:52.127048 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.126699 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d422197f-777e-4035-a81e-8dcf213b4d0a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-pw58h\" (UID: \"d422197f-777e-4035-a81e-8dcf213b4d0a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h"
Apr 20 20:08:52.127048 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.126751 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d422197f-777e-4035-a81e-8dcf213b4d0a-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-pw58h\" (UID: \"d422197f-777e-4035-a81e-8dcf213b4d0a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h"
Apr 20 20:08:52.127048 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.126787 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/d422197f-777e-4035-a81e-8dcf213b4d0a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-pw58h\" (UID: \"d422197f-777e-4035-a81e-8dcf213b4d0a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h"
Apr 20 20:08:52.127048 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.126849 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b5d9430a-17b7-4994-9576-0dd1247ec436-root\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.127048 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.126893 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b5d9430a-17b7-4994-9576-0dd1247ec436-node-exporter-wtmp\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.127048 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.126938 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e48ea207-1b2d-460f-b13a-46721f677a63-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-7vfx2\" (UID: \"e48ea207-1b2d-460f-b13a-46721f677a63\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7vfx2"
Apr 20 20:08:52.127048 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.126972 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d422197f-777e-4035-a81e-8dcf213b4d0a-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-pw58h\" (UID: \"d422197f-777e-4035-a81e-8dcf213b4d0a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h"
Apr 20 20:08:52.127048 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.126979 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5d9430a-17b7-4994-9576-0dd1247ec436-metrics-client-ca\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.127443 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.127093 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e48ea207-1b2d-460f-b13a-46721f677a63-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-7vfx2\" (UID: \"e48ea207-1b2d-460f-b13a-46721f677a63\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7vfx2"
Apr 20 20:08:52.127443 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.127105 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b5d9430a-17b7-4994-9576-0dd1247ec436-node-exporter-wtmp\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.127443 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.127129 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5d9430a-17b7-4994-9576-0dd1247ec436-sys\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.127443 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.127291 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d422197f-777e-4035-a81e-8dcf213b4d0a-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-pw58h\" (UID: \"d422197f-777e-4035-a81e-8dcf213b4d0a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h"
Apr 20 20:08:52.127443 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.127404 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b5d9430a-17b7-4994-9576-0dd1247ec436-node-exporter-textfile\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.127693 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.127452 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b5d9430a-17b7-4994-9576-0dd1247ec436-root\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.127693 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.127519 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b5d9430a-17b7-4994-9576-0dd1247ec436-node-exporter-accelerators-collector-config\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.127885 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.127858 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d422197f-777e-4035-a81e-8dcf213b4d0a-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-pw58h\" (UID: \"d422197f-777e-4035-a81e-8dcf213b4d0a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h"
Apr 20 20:08:52.128407 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.128380 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/d422197f-777e-4035-a81e-8dcf213b4d0a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-pw58h\" (UID: \"d422197f-777e-4035-a81e-8dcf213b4d0a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h"
Apr 20 20:08:52.129445 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.129394 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b5d9430a-17b7-4994-9576-0dd1247ec436-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.129535 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.129399 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d422197f-777e-4035-a81e-8dcf213b4d0a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-pw58h\" (UID: \"d422197f-777e-4035-a81e-8dcf213b4d0a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h"
Apr 20 20:08:52.129936 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.129917 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e48ea207-1b2d-460f-b13a-46721f677a63-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-7vfx2\" (UID: \"e48ea207-1b2d-460f-b13a-46721f677a63\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7vfx2"
Apr 20 20:08:52.129992 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.129966 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e48ea207-1b2d-460f-b13a-46721f677a63-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-7vfx2\" (UID: \"e48ea207-1b2d-460f-b13a-46721f677a63\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7vfx2"
Apr 20 20:08:52.137892 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.137868 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4pp5\" (UniqueName: \"kubernetes.io/projected/d422197f-777e-4035-a81e-8dcf213b4d0a-kube-api-access-z4pp5\") pod \"kube-state-metrics-69db897b98-pw58h\" (UID: \"d422197f-777e-4035-a81e-8dcf213b4d0a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h"
Apr 20 20:08:52.138476 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.138456 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnzmk\" (UniqueName: \"kubernetes.io/projected/e48ea207-1b2d-460f-b13a-46721f677a63-kube-api-access-qnzmk\") pod \"openshift-state-metrics-9d44df66c-7vfx2\" (UID: \"e48ea207-1b2d-460f-b13a-46721f677a63\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7vfx2"
Apr 20 20:08:52.138571 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.138558 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgsvd\" (UniqueName: \"kubernetes.io/projected/b5d9430a-17b7-4994-9576-0dd1247ec436-kube-api-access-bgsvd\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.224614 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.224538 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7vfx2"
Apr 20 20:08:52.346400 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.346346 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-7vfx2"]
Apr 20 20:08:52.348986 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:08:52.348961 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode48ea207_1b2d_460f_b13a_46721f677a63.slice/crio-1c2daf7ba1b140f7b289910a6a9432f29ffc2311c453e6dd6b4dcf3161892c85 WatchSource:0}: Error finding container 1c2daf7ba1b140f7b289910a6a9432f29ffc2311c453e6dd6b4dcf3161892c85: Status 404 returned error can't find the container with id 1c2daf7ba1b140f7b289910a6a9432f29ffc2311c453e6dd6b4dcf3161892c85
Apr 20 20:08:52.631676 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.631634 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b5d9430a-17b7-4994-9576-0dd1247ec436-node-exporter-tls\") pod \"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf"
Apr 20 20:08:52.631830 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.631737 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d422197f-777e-4035-a81e-8dcf213b4d0a-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-pw58h\" (UID: \"d422197f-777e-4035-a81e-8dcf213b4d0a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h"
Apr 20 20:08:52.634085 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.634058 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b5d9430a-17b7-4994-9576-0dd1247ec436-node-exporter-tls\") pod 
\"node-exporter-8s5lf\" (UID: \"b5d9430a-17b7-4994-9576-0dd1247ec436\") " pod="openshift-monitoring/node-exporter-8s5lf" Apr 20 20:08:52.634214 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.634196 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d422197f-777e-4035-a81e-8dcf213b4d0a-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-pw58h\" (UID: \"d422197f-777e-4035-a81e-8dcf213b4d0a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h" Apr 20 20:08:52.830392 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.830358 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h" Apr 20 20:08:52.857795 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.857767 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8s5lf" Apr 20 20:08:52.866332 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:08:52.866296 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5d9430a_17b7_4994_9576_0dd1247ec436.slice/crio-f69938fc52421d3377fc4b60cfafaeb8df012fd5cbb7210165c90679c8c414e5 WatchSource:0}: Error finding container f69938fc52421d3377fc4b60cfafaeb8df012fd5cbb7210165c90679c8c414e5: Status 404 returned error can't find the container with id f69938fc52421d3377fc4b60cfafaeb8df012fd5cbb7210165c90679c8c414e5 Apr 20 20:08:52.968554 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.968522 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-pw58h"] Apr 20 20:08:52.971984 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:08:52.971936 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd422197f_777e_4035_a81e_8dcf213b4d0a.slice/crio-2ade0f665056a52bd345009d36391ce14f8b8aecdfcc7dea34634c7460eb70f3 WatchSource:0}: Error finding container 2ade0f665056a52bd345009d36391ce14f8b8aecdfcc7dea34634c7460eb70f3: Status 404 returned error can't find the container with id 2ade0f665056a52bd345009d36391ce14f8b8aecdfcc7dea34634c7460eb70f3 Apr 20 20:08:52.984739 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.984710 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:08:52.988888 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.988865 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:52.995083 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.994783 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 20:08:52.995083 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.994831 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 20:08:52.995083 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.994863 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 20:08:52.995083 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.994783 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 20:08:52.995083 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.995002 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 20:08:52.995083 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.995044 
2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 20:08:52.995083 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.995044 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 20:08:52.995534 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.995300 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-4b9mz\"" Apr 20 20:08:52.995597 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.995546 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 20:08:52.995597 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:52.995566 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 20:08:53.002238 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.002175 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:08:53.035483 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.035445 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.035653 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.035494 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/689afe77-0baf-42f8-aabf-a28730e1c663-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: 
\"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.035653 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.035526 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.035653 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.035553 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncjhm\" (UniqueName: \"kubernetes.io/projected/689afe77-0baf-42f8-aabf-a28730e1c663-kube-api-access-ncjhm\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.035653 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.035622 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.035653 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.035645 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/689afe77-0baf-42f8-aabf-a28730e1c663-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.035905 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.035674 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-web-config\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.035905 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.035704 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.035905 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.035766 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-config-volume\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.035905 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.035867 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/689afe77-0baf-42f8-aabf-a28730e1c663-config-out\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.035905 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.035895 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/689afe77-0baf-42f8-aabf-a28730e1c663-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" 
Apr 20 20:08:53.036182 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.035970 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/689afe77-0baf-42f8-aabf-a28730e1c663-tls-assets\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.036182 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.035996 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.073254 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.073222 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-n8mvh" Apr 20 20:08:53.112143 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.112061 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8s5lf" event={"ID":"b5d9430a-17b7-4994-9576-0dd1247ec436","Type":"ContainerStarted","Data":"f69938fc52421d3377fc4b60cfafaeb8df012fd5cbb7210165c90679c8c414e5"} Apr 20 20:08:53.113463 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.113407 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h" event={"ID":"d422197f-777e-4035-a81e-8dcf213b4d0a","Type":"ContainerStarted","Data":"2ade0f665056a52bd345009d36391ce14f8b8aecdfcc7dea34634c7460eb70f3"} Apr 20 20:08:53.115517 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.115491 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7vfx2" 
event={"ID":"e48ea207-1b2d-460f-b13a-46721f677a63","Type":"ContainerStarted","Data":"bdb3e0622942f4651a320459584b012f806eab8ee6bb8bc8ae8609e3de3f36b6"} Apr 20 20:08:53.115634 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.115523 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7vfx2" event={"ID":"e48ea207-1b2d-460f-b13a-46721f677a63","Type":"ContainerStarted","Data":"bf8302fa3030c8fe9bbe49df5a1bd0a9ade52bd443605e408ffb89483bc1efbe"} Apr 20 20:08:53.115634 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.115537 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7vfx2" event={"ID":"e48ea207-1b2d-460f-b13a-46721f677a63","Type":"ContainerStarted","Data":"1c2daf7ba1b140f7b289910a6a9432f29ffc2311c453e6dd6b4dcf3161892c85"} Apr 20 20:08:53.136577 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.136537 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.136577 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.136576 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ncjhm\" (UniqueName: \"kubernetes.io/projected/689afe77-0baf-42f8-aabf-a28730e1c663-kube-api-access-ncjhm\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.136819 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.136610 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.136819 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.136636 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/689afe77-0baf-42f8-aabf-a28730e1c663-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.136819 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.136676 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-web-config\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.136819 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:53.136699 2571 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 20 20:08:53.136819 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.136717 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.136819 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.136756 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-config-volume\") pod \"alertmanager-main-0\" (UID: 
\"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.136819 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:53.136769 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-main-tls podName:689afe77-0baf-42f8-aabf-a28730e1c663 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:53.636748217 +0000 UTC m=+174.676237757 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "689afe77-0baf-42f8-aabf-a28730e1c663") : secret "alertmanager-main-tls" not found Apr 20 20:08:53.137184 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.136841 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/689afe77-0baf-42f8-aabf-a28730e1c663-config-out\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.137184 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.136870 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/689afe77-0baf-42f8-aabf-a28730e1c663-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.137184 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.137004 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/689afe77-0baf-42f8-aabf-a28730e1c663-tls-assets\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 
20:08:53.137184 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.137040 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.137184 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.137102 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.137184 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.137129 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/689afe77-0baf-42f8-aabf-a28730e1c663-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.138138 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.137838 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/689afe77-0baf-42f8-aabf-a28730e1c663-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.138334 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:53.137459 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/689afe77-0baf-42f8-aabf-a28730e1c663-alertmanager-trusted-ca-bundle podName:689afe77-0baf-42f8-aabf-a28730e1c663 nodeName:}" 
failed. No retries permitted until 2026-04-20 20:08:53.637415905 +0000 UTC m=+174.676905433 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/689afe77-0baf-42f8-aabf-a28730e1c663-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "689afe77-0baf-42f8-aabf-a28730e1c663") : configmap references non-existent config key: ca-bundle.crt Apr 20 20:08:53.138920 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.138892 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/689afe77-0baf-42f8-aabf-a28730e1c663-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.140316 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.140289 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.140576 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.140552 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.141260 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.141235 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/689afe77-0baf-42f8-aabf-a28730e1c663-tls-assets\") pod 
\"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.141355 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.141262 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-web-config\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.141355 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.141265 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/689afe77-0baf-42f8-aabf-a28730e1c663-config-out\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.141908 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.141884 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-config-volume\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.142858 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.142840 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.142960 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.142939 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-kube-rbac-proxy-web\") pod 
\"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.146507 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.146487 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncjhm\" (UniqueName: \"kubernetes.io/projected/689afe77-0baf-42f8-aabf-a28730e1c663-kube-api-access-ncjhm\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.640755 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.640658 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.640755 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.640743 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/689afe77-0baf-42f8-aabf-a28730e1c663-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:08:53.640966 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:53.640857 2571 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 20 20:08:53.640966 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:08:53.640945 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-main-tls podName:689afe77-0baf-42f8-aabf-a28730e1c663 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:08:54.640922638 +0000 UTC m=+175.680412178 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "689afe77-0baf-42f8-aabf-a28730e1c663") : secret "alertmanager-main-tls" not found
Apr 20 20:08:53.641713 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:53.641691 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/689afe77-0baf-42f8-aabf-a28730e1c663-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:54.120627 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:54.120587 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8s5lf" event={"ID":"b5d9430a-17b7-4994-9576-0dd1247ec436","Type":"ContainerStarted","Data":"acaa407de9f3c4106d039397192a1ecce87a52038511e9f9c3f2f2f9912c3626"}
Apr 20 20:08:54.122728 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:54.122690 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7vfx2" event={"ID":"e48ea207-1b2d-460f-b13a-46721f677a63","Type":"ContainerStarted","Data":"308d23c22b3a5245414934b38f69cd84d330a705a5b8fd9348fb4c8577c424ef"}
Apr 20 20:08:54.183514 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:54.183463 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7vfx2" podStartSLOduration=2.291615541 podStartE2EDuration="3.183443798s" podCreationTimestamp="2026-04-20 20:08:51 +0000 UTC" firstStartedPulling="2026-04-20 20:08:52.477038853 +0000 UTC m=+173.516528377" lastFinishedPulling="2026-04-20 20:08:53.368867094 +0000 UTC m=+174.408356634" observedRunningTime="2026-04-20 20:08:54.182773558 +0000 UTC m=+175.222263104" watchObservedRunningTime="2026-04-20 20:08:54.183443798 +0000 UTC m=+175.222933336"
Apr 20 20:08:54.651166 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:54.651131 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:54.653711 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:54.653689 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:54.803017 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:54.802988 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:08:54.936560 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:54.936536 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 20:08:54.938368 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:08:54.938337 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod689afe77_0baf_42f8_aabf_a28730e1c663.slice/crio-5496c26339b61ac0538e847107602a4e63cf65698070742856924b6e3d0e442d WatchSource:0}: Error finding container 5496c26339b61ac0538e847107602a4e63cf65698070742856924b6e3d0e442d: Status 404 returned error can't find the container with id 5496c26339b61ac0538e847107602a4e63cf65698070742856924b6e3d0e442d
Apr 20 20:08:55.127512 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:55.127477 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h" event={"ID":"d422197f-777e-4035-a81e-8dcf213b4d0a","Type":"ContainerStarted","Data":"8f427228410a8bb3f408bce552950ec2f477ea280ba455e8bed1beaba05199e3"}
Apr 20 20:08:55.127512 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:55.127515 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h" event={"ID":"d422197f-777e-4035-a81e-8dcf213b4d0a","Type":"ContainerStarted","Data":"20d1e682136a2e4c12103cdd0dbda9d32067303aab35c07325017951e9c2dd41"}
Apr 20 20:08:55.127992 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:55.127532 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h" event={"ID":"d422197f-777e-4035-a81e-8dcf213b4d0a","Type":"ContainerStarted","Data":"be3b9e79d88b6b768860df2503b7725c3bd04925ff9a629058ecd2f78d4ae43e"}
Apr 20 20:08:55.128636 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:55.128612 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"689afe77-0baf-42f8-aabf-a28730e1c663","Type":"ContainerStarted","Data":"5496c26339b61ac0538e847107602a4e63cf65698070742856924b6e3d0e442d"}
Apr 20 20:08:55.129804 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:55.129780 2571 generic.go:358] "Generic (PLEG): container finished" podID="b5d9430a-17b7-4994-9576-0dd1247ec436" containerID="acaa407de9f3c4106d039397192a1ecce87a52038511e9f9c3f2f2f9912c3626" exitCode=0
Apr 20 20:08:55.129900 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:55.129861 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8s5lf" event={"ID":"b5d9430a-17b7-4994-9576-0dd1247ec436","Type":"ContainerDied","Data":"acaa407de9f3c4106d039397192a1ecce87a52038511e9f9c3f2f2f9912c3626"}
Apr 20 20:08:55.148867 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:55.148826 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-pw58h" podStartSLOduration=2.939318863 podStartE2EDuration="4.14881341s" podCreationTimestamp="2026-04-20 20:08:51 +0000 UTC" firstStartedPulling="2026-04-20 20:08:52.974085587 +0000 UTC m=+174.013575114" lastFinishedPulling="2026-04-20 20:08:54.183580134 +0000 UTC m=+175.223069661" observedRunningTime="2026-04-20 20:08:55.147909296 +0000 UTC m=+176.187398853" watchObservedRunningTime="2026-04-20 20:08:55.14881341 +0000 UTC m=+176.188302956"
Apr 20 20:08:56.050690 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.050668 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-56448cd7d-vzb9k"]
Apr 20 20:08:56.053867 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.053849 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56448cd7d-vzb9k"
Apr 20 20:08:56.066023 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.065405 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56448cd7d-vzb9k"]
Apr 20 20:08:56.073342 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.072711 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 20 20:08:56.134193 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.134168 2571 generic.go:358] "Generic (PLEG): container finished" podID="689afe77-0baf-42f8-aabf-a28730e1c663" containerID="525340c82e7e258af0b47538cae9314f2403bf9f16da188a958132c617e08fd9" exitCode=0
Apr 20 20:08:56.134581 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.134250 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"689afe77-0baf-42f8-aabf-a28730e1c663","Type":"ContainerDied","Data":"525340c82e7e258af0b47538cae9314f2403bf9f16da188a958132c617e08fd9"}
Apr 20 20:08:56.136205 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.136181 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8s5lf" event={"ID":"b5d9430a-17b7-4994-9576-0dd1247ec436","Type":"ContainerStarted","Data":"2adc1528dd80674a0ce2247c2257a14469f02ba9d29bdf0f995312fa2793a705"}
Apr 20 20:08:56.136296 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.136213 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8s5lf" event={"ID":"b5d9430a-17b7-4994-9576-0dd1247ec436","Type":"ContainerStarted","Data":"4dccbaa97f96f7697f9e5b7b7f521c150aada92a3b4cd94b5d8a4623306a4309"}
Apr 20 20:08:56.169352 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.169326 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-console-serving-cert\") pod \"console-56448cd7d-vzb9k\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " pod="openshift-console/console-56448cd7d-vzb9k"
Apr 20 20:08:56.169508 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.169489 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-service-ca\") pod \"console-56448cd7d-vzb9k\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " pod="openshift-console/console-56448cd7d-vzb9k"
Apr 20 20:08:56.169593 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.169523 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5slx5\" (UniqueName: \"kubernetes.io/projected/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-kube-api-access-5slx5\") pod \"console-56448cd7d-vzb9k\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " pod="openshift-console/console-56448cd7d-vzb9k"
Apr 20 20:08:56.169751 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.169714 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-oauth-serving-cert\") pod \"console-56448cd7d-vzb9k\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " pod="openshift-console/console-56448cd7d-vzb9k"
Apr 20 20:08:56.169823 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.169799 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-trusted-ca-bundle\") pod \"console-56448cd7d-vzb9k\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " pod="openshift-console/console-56448cd7d-vzb9k"
Apr 20 20:08:56.169984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.169963 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-console-oauth-config\") pod \"console-56448cd7d-vzb9k\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " pod="openshift-console/console-56448cd7d-vzb9k"
Apr 20 20:08:56.170257 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.170230 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-console-config\") pod \"console-56448cd7d-vzb9k\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " pod="openshift-console/console-56448cd7d-vzb9k"
Apr 20 20:08:56.186094 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.186052 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-8s5lf" podStartSLOduration=4.282661118 podStartE2EDuration="5.18604262s" podCreationTimestamp="2026-04-20 20:08:51 +0000 UTC" firstStartedPulling="2026-04-20 20:08:52.868087195 +0000 UTC m=+173.907576737" lastFinishedPulling="2026-04-20 20:08:53.771468713 +0000 UTC m=+174.810958239" observedRunningTime="2026-04-20 20:08:56.184633146 +0000 UTC m=+177.224122692" watchObservedRunningTime="2026-04-20 20:08:56.18604262 +0000 UTC m=+177.225532166"
Apr 20 20:08:56.271205 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.271178 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-oauth-serving-cert\") pod \"console-56448cd7d-vzb9k\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " pod="openshift-console/console-56448cd7d-vzb9k"
Apr 20 20:08:56.271319 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.271212 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-trusted-ca-bundle\") pod \"console-56448cd7d-vzb9k\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " pod="openshift-console/console-56448cd7d-vzb9k"
Apr 20 20:08:56.271319 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.271249 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-console-oauth-config\") pod \"console-56448cd7d-vzb9k\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " pod="openshift-console/console-56448cd7d-vzb9k"
Apr 20 20:08:56.271319 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.271273 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-console-config\") pod \"console-56448cd7d-vzb9k\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " pod="openshift-console/console-56448cd7d-vzb9k"
Apr 20 20:08:56.271319 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.271297 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-console-serving-cert\") pod \"console-56448cd7d-vzb9k\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " pod="openshift-console/console-56448cd7d-vzb9k"
Apr 20 20:08:56.271535 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.271323 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-service-ca\") pod \"console-56448cd7d-vzb9k\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " pod="openshift-console/console-56448cd7d-vzb9k"
Apr 20 20:08:56.271535 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.271337 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5slx5\" (UniqueName: \"kubernetes.io/projected/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-kube-api-access-5slx5\") pod \"console-56448cd7d-vzb9k\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " pod="openshift-console/console-56448cd7d-vzb9k"
Apr 20 20:08:56.271987 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.271957 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-oauth-serving-cert\") pod \"console-56448cd7d-vzb9k\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " pod="openshift-console/console-56448cd7d-vzb9k"
Apr 20 20:08:56.272119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.272104 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-trusted-ca-bundle\") pod \"console-56448cd7d-vzb9k\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " pod="openshift-console/console-56448cd7d-vzb9k"
Apr 20 20:08:56.272162 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.272115 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-console-config\") pod \"console-56448cd7d-vzb9k\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " pod="openshift-console/console-56448cd7d-vzb9k"
Apr 20 20:08:56.272524 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.272507 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-service-ca\") pod \"console-56448cd7d-vzb9k\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " pod="openshift-console/console-56448cd7d-vzb9k"
Apr 20 20:08:56.274501 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.274478 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-console-serving-cert\") pod \"console-56448cd7d-vzb9k\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " pod="openshift-console/console-56448cd7d-vzb9k"
Apr 20 20:08:56.274573 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.274485 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-console-oauth-config\") pod \"console-56448cd7d-vzb9k\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " pod="openshift-console/console-56448cd7d-vzb9k"
Apr 20 20:08:56.279956 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.279919 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5slx5\" (UniqueName: \"kubernetes.io/projected/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-kube-api-access-5slx5\") pod \"console-56448cd7d-vzb9k\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " pod="openshift-console/console-56448cd7d-vzb9k"
Apr 20 20:08:56.373350 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.373314 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56448cd7d-vzb9k"
Apr 20 20:08:56.491268 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:56.491237 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56448cd7d-vzb9k"]
Apr 20 20:08:56.494914 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:08:56.494885 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fcde684_dc76_4cf0_b02b_c9d8bc27703f.slice/crio-e08165dbab43ef723259a29c94bff4d3fa54e8e3147beb5ed6fcfbb8d60d4676 WatchSource:0}: Error finding container e08165dbab43ef723259a29c94bff4d3fa54e8e3147beb5ed6fcfbb8d60d4676: Status 404 returned error can't find the container with id e08165dbab43ef723259a29c94bff4d3fa54e8e3147beb5ed6fcfbb8d60d4676
Apr 20 20:08:57.141063 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:57.141027 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56448cd7d-vzb9k" event={"ID":"7fcde684-dc76-4cf0-b02b-c9d8bc27703f","Type":"ContainerStarted","Data":"1c600e4d9af77e2794d8043c1beb9a6121971c5285dc84ef32a80fa66ac8c638"}
Apr 20 20:08:57.141517 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:57.141072 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56448cd7d-vzb9k" event={"ID":"7fcde684-dc76-4cf0-b02b-c9d8bc27703f","Type":"ContainerStarted","Data":"e08165dbab43ef723259a29c94bff4d3fa54e8e3147beb5ed6fcfbb8d60d4676"}
Apr 20 20:08:57.161325 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:57.161272 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56448cd7d-vzb9k" podStartSLOduration=1.161257136 podStartE2EDuration="1.161257136s" podCreationTimestamp="2026-04-20 20:08:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:08:57.159540945 +0000 UTC m=+178.199030485" watchObservedRunningTime="2026-04-20 20:08:57.161257136 +0000 UTC m=+178.200746682"
Apr 20 20:08:58.148270 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:58.148229 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"689afe77-0baf-42f8-aabf-a28730e1c663","Type":"ContainerStarted","Data":"6670288f366cb270940e7acb6c043e90ed7d7eb8f113fa7884016d209bd1e598"}
Apr 20 20:08:58.148729 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:58.148277 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"689afe77-0baf-42f8-aabf-a28730e1c663","Type":"ContainerStarted","Data":"ce922266eecce4e7eeed17167268480eecfd64d03410c59a2f7cf1b8069b0d70"}
Apr 20 20:08:58.148729 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:58.148294 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"689afe77-0baf-42f8-aabf-a28730e1c663","Type":"ContainerStarted","Data":"fe70cbe6aba93eb1c70e5b4edd62fac56cad1af33f7e71aea532f8995ea83072"}
Apr 20 20:08:58.148729 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:58.148306 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"689afe77-0baf-42f8-aabf-a28730e1c663","Type":"ContainerStarted","Data":"8605dff8170b83f6facff19fe0df8cc61e1f580e257debd85cac36def8d59cea"}
Apr 20 20:08:58.148729 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:58.148319 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"689afe77-0baf-42f8-aabf-a28730e1c663","Type":"ContainerStarted","Data":"406796faf2058bf914fe9981e4e02e0669e29cba8364f7f837ae8e1332af3958"}
Apr 20 20:08:58.380162 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:58.380128 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-799fc4df48-mqvj9"
Apr 20 20:08:58.380491 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:58.380465 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-799fc4df48-mqvj9"
Apr 20 20:08:58.385935 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:58.385909 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-799fc4df48-mqvj9"
Apr 20 20:08:58.447867 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:58.447782 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt"
Apr 20 20:08:59.157617 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:59.157574 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"689afe77-0baf-42f8-aabf-a28730e1c663","Type":"ContainerStarted","Data":"9b0b6b6a2c63178905f1c14f97f6bd70d005cb7a85b6b23c5666c6ed99ced3b1"}
Apr 20 20:08:59.161475 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:59.161448 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-799fc4df48-mqvj9"
Apr 20 20:08:59.186510 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:08:59.186466 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.505123676 podStartE2EDuration="7.186451322s" podCreationTimestamp="2026-04-20 20:08:52 +0000 UTC" firstStartedPulling="2026-04-20 20:08:54.940287787 +0000 UTC m=+175.979777311" lastFinishedPulling="2026-04-20 20:08:58.62161542 +0000 UTC m=+179.661104957" observedRunningTime="2026-04-20 20:08:59.184268684 +0000 UTC m=+180.223758230" watchObservedRunningTime="2026-04-20 20:08:59.186451322 +0000 UTC m=+180.225940880"
Apr 20 20:09:00.479669 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.479634 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56448cd7d-vzb9k"]
Apr 20 20:09:00.510019 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.509991 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7db6575c65-48mn2"]
Apr 20 20:09:00.514339 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.514320 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7db6575c65-48mn2"
Apr 20 20:09:00.520573 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.520546 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7db6575c65-48mn2"]
Apr 20 20:09:00.613566 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.613539 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-console-oauth-config\") pod \"console-7db6575c65-48mn2\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " pod="openshift-console/console-7db6575c65-48mn2"
Apr 20 20:09:00.613675 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.613568 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpvr5\" (UniqueName: \"kubernetes.io/projected/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-kube-api-access-cpvr5\") pod \"console-7db6575c65-48mn2\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " pod="openshift-console/console-7db6575c65-48mn2"
Apr 20 20:09:00.613675 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.613643 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-console-serving-cert\") pod \"console-7db6575c65-48mn2\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " pod="openshift-console/console-7db6575c65-48mn2"
Apr 20 20:09:00.613758 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.613684 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-console-config\") pod \"console-7db6575c65-48mn2\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " pod="openshift-console/console-7db6575c65-48mn2"
Apr 20 20:09:00.613758 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.613731 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-service-ca\") pod \"console-7db6575c65-48mn2\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " pod="openshift-console/console-7db6575c65-48mn2"
Apr 20 20:09:00.613816 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.613775 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-trusted-ca-bundle\") pod \"console-7db6575c65-48mn2\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " pod="openshift-console/console-7db6575c65-48mn2"
Apr 20 20:09:00.613853 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.613813 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-oauth-serving-cert\") pod \"console-7db6575c65-48mn2\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " pod="openshift-console/console-7db6575c65-48mn2"
Apr 20 20:09:00.715196 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.715162 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-console-oauth-config\") pod \"console-7db6575c65-48mn2\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " pod="openshift-console/console-7db6575c65-48mn2"
Apr 20 20:09:00.715196 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.715196 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpvr5\" (UniqueName: \"kubernetes.io/projected/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-kube-api-access-cpvr5\") pod \"console-7db6575c65-48mn2\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " pod="openshift-console/console-7db6575c65-48mn2"
Apr 20 20:09:00.715387 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.715230 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-console-serving-cert\") pod \"console-7db6575c65-48mn2\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " pod="openshift-console/console-7db6575c65-48mn2"
Apr 20 20:09:00.715387 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.715266 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-console-config\") pod \"console-7db6575c65-48mn2\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " pod="openshift-console/console-7db6575c65-48mn2"
Apr 20 20:09:00.715387 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.715291 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-service-ca\") pod \"console-7db6575c65-48mn2\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " pod="openshift-console/console-7db6575c65-48mn2"
Apr 20 20:09:00.715387 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.715317 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-trusted-ca-bundle\") pod \"console-7db6575c65-48mn2\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " pod="openshift-console/console-7db6575c65-48mn2"
Apr 20 20:09:00.715387 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.715351 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-oauth-serving-cert\") pod \"console-7db6575c65-48mn2\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " pod="openshift-console/console-7db6575c65-48mn2"
Apr 20 20:09:00.716217 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.716186 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-oauth-serving-cert\") pod \"console-7db6575c65-48mn2\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " pod="openshift-console/console-7db6575c65-48mn2"
Apr 20 20:09:00.716345 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.716186 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-service-ca\") pod \"console-7db6575c65-48mn2\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " pod="openshift-console/console-7db6575c65-48mn2"
Apr 20 20:09:00.716345 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.716309 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-console-config\") pod \"console-7db6575c65-48mn2\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " pod="openshift-console/console-7db6575c65-48mn2"
Apr 20 20:09:00.716464 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.716306 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-trusted-ca-bundle\") pod \"console-7db6575c65-48mn2\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " pod="openshift-console/console-7db6575c65-48mn2"
Apr 20 20:09:00.717794 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.717778 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-console-serving-cert\") pod \"console-7db6575c65-48mn2\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " pod="openshift-console/console-7db6575c65-48mn2"
Apr 20 20:09:00.718370 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.718354 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-console-oauth-config\") pod \"console-7db6575c65-48mn2\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " pod="openshift-console/console-7db6575c65-48mn2"
Apr 20 20:09:00.723662 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.723641 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpvr5\" (UniqueName: \"kubernetes.io/projected/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-kube-api-access-cpvr5\") pod \"console-7db6575c65-48mn2\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " pod="openshift-console/console-7db6575c65-48mn2"
Apr 20 20:09:00.823820 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.823792 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7db6575c65-48mn2"
Apr 20 20:09:00.958464 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:00.958414 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7db6575c65-48mn2"]
Apr 20 20:09:00.961106 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:09:00.961082 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdfb92cd_2029_45d6_aa1b_7b28cb1ad973.slice/crio-817fa58f256b85397d4af7009dff74b2db4162747ada14ed8a657861db2b6a8e WatchSource:0}: Error finding container 817fa58f256b85397d4af7009dff74b2db4162747ada14ed8a657861db2b6a8e: Status 404 returned error can't find the container with id 817fa58f256b85397d4af7009dff74b2db4162747ada14ed8a657861db2b6a8e
Apr 20 20:09:01.165560 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:01.165478 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7db6575c65-48mn2" event={"ID":"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973","Type":"ContainerStarted","Data":"12c49800dfff57fdd62dc902ec1f554e200d362a49f363ab8b74201898d8f471"}
Apr 20 20:09:01.165560 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:01.165514 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7db6575c65-48mn2" event={"ID":"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973","Type":"ContainerStarted","Data":"817fa58f256b85397d4af7009dff74b2db4162747ada14ed8a657861db2b6a8e"}
Apr 20 20:09:01.182748 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:01.182694 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7db6575c65-48mn2" podStartSLOduration=1.182680386 podStartE2EDuration="1.182680386s" podCreationTimestamp="2026-04-20 20:09:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:09:01.182241299 +0000 UTC m=+182.221730857" watchObservedRunningTime="2026-04-20 20:09:01.182680386 +0000 UTC m=+182.222169933"
Apr 20 20:09:03.462256 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.462192 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" podUID="df27caca-27b2-4278-b3e1-119d6af3a947" containerName="registry" containerID="cri-o://945e77343ece114d8cf8afc20c356abc7d9d289b77dfa8c30d7a81c76ee8e144" gracePeriod=30
Apr 20 20:09:03.691149 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.691126 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt"
Apr 20 20:09:03.741803 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.741737 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/df27caca-27b2-4278-b3e1-119d6af3a947-registry-certificates\") pod \"df27caca-27b2-4278-b3e1-119d6af3a947\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") "
Apr 20 20:09:03.741803 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.741791 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kqsg\" (UniqueName: \"kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-kube-api-access-6kqsg\") pod \"df27caca-27b2-4278-b3e1-119d6af3a947\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") "
Apr 20 20:09:03.741987 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.741828 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/df27caca-27b2-4278-b3e1-119d6af3a947-installation-pull-secrets\") pod \"df27caca-27b2-4278-b3e1-119d6af3a947\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") "
Apr 20 20:09:03.741987 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.741865 2571 reconciler_common.go:162]
"operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-bound-sa-token\") pod \"df27caca-27b2-4278-b3e1-119d6af3a947\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " Apr 20 20:09:03.741987 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.741885 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/df27caca-27b2-4278-b3e1-119d6af3a947-ca-trust-extracted\") pod \"df27caca-27b2-4278-b3e1-119d6af3a947\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " Apr 20 20:09:03.741987 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.741920 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-registry-tls\") pod \"df27caca-27b2-4278-b3e1-119d6af3a947\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " Apr 20 20:09:03.741987 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.741952 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/df27caca-27b2-4278-b3e1-119d6af3a947-image-registry-private-configuration\") pod \"df27caca-27b2-4278-b3e1-119d6af3a947\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " Apr 20 20:09:03.741987 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.741973 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df27caca-27b2-4278-b3e1-119d6af3a947-trusted-ca\") pod \"df27caca-27b2-4278-b3e1-119d6af3a947\" (UID: \"df27caca-27b2-4278-b3e1-119d6af3a947\") " Apr 20 20:09:03.742271 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.742250 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/df27caca-27b2-4278-b3e1-119d6af3a947-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "df27caca-27b2-4278-b3e1-119d6af3a947" (UID: "df27caca-27b2-4278-b3e1-119d6af3a947"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:03.742577 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.742536 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df27caca-27b2-4278-b3e1-119d6af3a947-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "df27caca-27b2-4278-b3e1-119d6af3a947" (UID: "df27caca-27b2-4278-b3e1-119d6af3a947"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:03.744658 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.744575 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df27caca-27b2-4278-b3e1-119d6af3a947-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "df27caca-27b2-4278-b3e1-119d6af3a947" (UID: "df27caca-27b2-4278-b3e1-119d6af3a947"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:03.744658 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.744640 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df27caca-27b2-4278-b3e1-119d6af3a947-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "df27caca-27b2-4278-b3e1-119d6af3a947" (UID: "df27caca-27b2-4278-b3e1-119d6af3a947"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:03.744795 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.744696 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "df27caca-27b2-4278-b3e1-119d6af3a947" (UID: "df27caca-27b2-4278-b3e1-119d6af3a947"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:09:03.744795 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.744715 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "df27caca-27b2-4278-b3e1-119d6af3a947" (UID: "df27caca-27b2-4278-b3e1-119d6af3a947"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:09:03.744795 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.744725 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-kube-api-access-6kqsg" (OuterVolumeSpecName: "kube-api-access-6kqsg") pod "df27caca-27b2-4278-b3e1-119d6af3a947" (UID: "df27caca-27b2-4278-b3e1-119d6af3a947"). InnerVolumeSpecName "kube-api-access-6kqsg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:09:03.750601 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.750579 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df27caca-27b2-4278-b3e1-119d6af3a947-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "df27caca-27b2-4278-b3e1-119d6af3a947" (UID: "df27caca-27b2-4278-b3e1-119d6af3a947"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:09:03.843030 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.843002 2571 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/df27caca-27b2-4278-b3e1-119d6af3a947-installation-pull-secrets\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:03.843030 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.843028 2571 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-bound-sa-token\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:03.843159 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.843038 2571 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/df27caca-27b2-4278-b3e1-119d6af3a947-ca-trust-extracted\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:03.843159 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.843047 2571 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-registry-tls\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:03.843159 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.843057 2571 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/df27caca-27b2-4278-b3e1-119d6af3a947-image-registry-private-configuration\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:03.843159 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.843066 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df27caca-27b2-4278-b3e1-119d6af3a947-trusted-ca\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 
20:09:03.843159 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.843074 2571 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/df27caca-27b2-4278-b3e1-119d6af3a947-registry-certificates\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:03.843159 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:03.843082 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6kqsg\" (UniqueName: \"kubernetes.io/projected/df27caca-27b2-4278-b3e1-119d6af3a947-kube-api-access-6kqsg\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:04.175034 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:04.174999 2571 generic.go:358] "Generic (PLEG): container finished" podID="df27caca-27b2-4278-b3e1-119d6af3a947" containerID="945e77343ece114d8cf8afc20c356abc7d9d289b77dfa8c30d7a81c76ee8e144" exitCode=0 Apr 20 20:09:04.175166 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:04.175050 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" event={"ID":"df27caca-27b2-4278-b3e1-119d6af3a947","Type":"ContainerDied","Data":"945e77343ece114d8cf8afc20c356abc7d9d289b77dfa8c30d7a81c76ee8e144"} Apr 20 20:09:04.175166 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:04.175067 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" Apr 20 20:09:04.175166 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:04.175085 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c6bf74cfb-rqlgt" event={"ID":"df27caca-27b2-4278-b3e1-119d6af3a947","Type":"ContainerDied","Data":"d22bac5702bc7f327b6e5152d2f6fcd3c80dc02b8c621212654e5687c632fbc5"} Apr 20 20:09:04.175166 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:04.175106 2571 scope.go:117] "RemoveContainer" containerID="945e77343ece114d8cf8afc20c356abc7d9d289b77dfa8c30d7a81c76ee8e144" Apr 20 20:09:04.183530 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:04.183509 2571 scope.go:117] "RemoveContainer" containerID="945e77343ece114d8cf8afc20c356abc7d9d289b77dfa8c30d7a81c76ee8e144" Apr 20 20:09:04.183778 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:09:04.183758 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"945e77343ece114d8cf8afc20c356abc7d9d289b77dfa8c30d7a81c76ee8e144\": container with ID starting with 945e77343ece114d8cf8afc20c356abc7d9d289b77dfa8c30d7a81c76ee8e144 not found: ID does not exist" containerID="945e77343ece114d8cf8afc20c356abc7d9d289b77dfa8c30d7a81c76ee8e144" Apr 20 20:09:04.183854 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:04.183786 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"945e77343ece114d8cf8afc20c356abc7d9d289b77dfa8c30d7a81c76ee8e144"} err="failed to get container status \"945e77343ece114d8cf8afc20c356abc7d9d289b77dfa8c30d7a81c76ee8e144\": rpc error: code = NotFound desc = could not find container \"945e77343ece114d8cf8afc20c356abc7d9d289b77dfa8c30d7a81c76ee8e144\": container with ID starting with 945e77343ece114d8cf8afc20c356abc7d9d289b77dfa8c30d7a81c76ee8e144 not found: ID does not exist" Apr 20 20:09:04.199066 ip-10-0-135-184 kubenswrapper[2571]: I0420 
20:09:04.199046 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-c6bf74cfb-rqlgt"] Apr 20 20:09:04.202107 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:04.202088 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-c6bf74cfb-rqlgt"] Apr 20 20:09:05.527479 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:05.527413 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df27caca-27b2-4278-b3e1-119d6af3a947" path="/var/lib/kubelet/pods/df27caca-27b2-4278-b3e1-119d6af3a947/volumes" Apr 20 20:09:06.373645 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.373608 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-56448cd7d-vzb9k" Apr 20 20:09:06.532033 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.531995 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7db6575c65-48mn2"] Apr 20 20:09:06.567734 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.567701 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6dff456c99-rql5b"] Apr 20 20:09:06.568043 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.568030 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df27caca-27b2-4278-b3e1-119d6af3a947" containerName="registry" Apr 20 20:09:06.568087 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.568046 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="df27caca-27b2-4278-b3e1-119d6af3a947" containerName="registry" Apr 20 20:09:06.568118 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.568102 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="df27caca-27b2-4278-b3e1-119d6af3a947" containerName="registry" Apr 20 20:09:06.572880 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.572864 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:06.582648 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.582622 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6dff456c99-rql5b"] Apr 20 20:09:06.666949 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.666860 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-console-serving-cert\") pod \"console-6dff456c99-rql5b\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:06.666949 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.666913 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-service-ca\") pod \"console-6dff456c99-rql5b\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:06.666949 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.666931 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-console-config\") pod \"console-6dff456c99-rql5b\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:06.667148 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.667015 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26pmq\" (UniqueName: \"kubernetes.io/projected/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-kube-api-access-26pmq\") pod \"console-6dff456c99-rql5b\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " pod="openshift-console/console-6dff456c99-rql5b" Apr 20 
20:09:06.667148 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.667044 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-oauth-serving-cert\") pod \"console-6dff456c99-rql5b\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:06.667148 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.667073 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-trusted-ca-bundle\") pod \"console-6dff456c99-rql5b\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:06.667148 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.667122 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-console-oauth-config\") pod \"console-6dff456c99-rql5b\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:06.768323 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.768286 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-trusted-ca-bundle\") pod \"console-6dff456c99-rql5b\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:06.768477 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.768336 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-console-oauth-config\") pod 
\"console-6dff456c99-rql5b\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:06.768477 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.768364 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-console-serving-cert\") pod \"console-6dff456c99-rql5b\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:06.768477 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.768399 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-service-ca\") pod \"console-6dff456c99-rql5b\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:06.768477 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.768415 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-console-config\") pod \"console-6dff456c99-rql5b\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:06.768674 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.768490 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26pmq\" (UniqueName: \"kubernetes.io/projected/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-kube-api-access-26pmq\") pod \"console-6dff456c99-rql5b\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:06.768674 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.768555 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-oauth-serving-cert\") pod \"console-6dff456c99-rql5b\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:06.769204 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.769153 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-trusted-ca-bundle\") pod \"console-6dff456c99-rql5b\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:06.769299 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.769286 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-oauth-serving-cert\") pod \"console-6dff456c99-rql5b\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:06.769463 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.769443 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-service-ca\") pod \"console-6dff456c99-rql5b\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:06.769512 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.769479 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-console-config\") pod \"console-6dff456c99-rql5b\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:06.770964 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.770945 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-console-serving-cert\") pod \"console-6dff456c99-rql5b\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:06.771068 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.771017 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-console-oauth-config\") pod \"console-6dff456c99-rql5b\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:06.776533 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.776509 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-26pmq\" (UniqueName: \"kubernetes.io/projected/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-kube-api-access-26pmq\") pod \"console-6dff456c99-rql5b\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:06.881584 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:06.881549 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:07.002224 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:07.002189 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6dff456c99-rql5b"] Apr 20 20:09:07.004525 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:09:07.004499 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6b91604_e5cc_4420_8304_b2cc7e9fbaf1.slice/crio-62b88b22609188c2b48f3cf85287fe2b99ce8620da5a0c6767428eb4385e8712 WatchSource:0}: Error finding container 62b88b22609188c2b48f3cf85287fe2b99ce8620da5a0c6767428eb4385e8712: Status 404 returned error can't find the container with id 62b88b22609188c2b48f3cf85287fe2b99ce8620da5a0c6767428eb4385e8712 Apr 20 20:09:07.190197 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:07.190118 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dff456c99-rql5b" event={"ID":"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1","Type":"ContainerStarted","Data":"8ac5710b3558a59b3975d7e0281b474112722ad9f28dc9e1c472dcc4c86f7725"} Apr 20 20:09:07.190197 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:07.190153 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dff456c99-rql5b" event={"ID":"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1","Type":"ContainerStarted","Data":"62b88b22609188c2b48f3cf85287fe2b99ce8620da5a0c6767428eb4385e8712"} Apr 20 20:09:07.212691 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:07.212651 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6dff456c99-rql5b" podStartSLOduration=1.212639745 podStartE2EDuration="1.212639745s" podCreationTimestamp="2026-04-20 20:09:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:09:07.212217067 +0000 UTC 
m=+188.251706613" watchObservedRunningTime="2026-04-20 20:09:07.212639745 +0000 UTC m=+188.252129268" Apr 20 20:09:10.824787 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:10.824754 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7db6575c65-48mn2" Apr 20 20:09:16.882146 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:16.882108 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:16.882706 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:16.882178 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:16.887844 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:16.887823 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:17.221973 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:17.221885 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:09:17.269201 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:17.269168 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-799fc4df48-mqvj9"] Apr 20 20:09:25.498817 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:25.498756 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-56448cd7d-vzb9k" podUID="7fcde684-dc76-4cf0-b02b-c9d8bc27703f" containerName="console" containerID="cri-o://1c600e4d9af77e2794d8043c1beb9a6121971c5285dc84ef32a80fa66ac8c638" gracePeriod=15 Apr 20 20:09:25.739877 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:25.739855 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56448cd7d-vzb9k_7fcde684-dc76-4cf0-b02b-c9d8bc27703f/console/0.log" Apr 20 20:09:25.740004 ip-10-0-135-184 
kubenswrapper[2571]: I0420 20:09:25.739921 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56448cd7d-vzb9k" Apr 20 20:09:25.836136 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:25.836042 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-service-ca\") pod \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " Apr 20 20:09:25.836136 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:25.836093 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-oauth-serving-cert\") pod \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " Apr 20 20:09:25.836136 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:25.836131 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5slx5\" (UniqueName: \"kubernetes.io/projected/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-kube-api-access-5slx5\") pod \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " Apr 20 20:09:25.836357 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:25.836249 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-console-config\") pod \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " Apr 20 20:09:25.836357 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:25.836302 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-trusted-ca-bundle\") pod 
\"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " Apr 20 20:09:25.836357 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:25.836342 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-console-oauth-config\") pod \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " Apr 20 20:09:25.836508 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:25.836395 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-service-ca" (OuterVolumeSpecName: "service-ca") pod "7fcde684-dc76-4cf0-b02b-c9d8bc27703f" (UID: "7fcde684-dc76-4cf0-b02b-c9d8bc27703f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:25.836578 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:25.836558 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7fcde684-dc76-4cf0-b02b-c9d8bc27703f" (UID: "7fcde684-dc76-4cf0-b02b-c9d8bc27703f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:25.836635 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:25.836608 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-console-config" (OuterVolumeSpecName: "console-config") pod "7fcde684-dc76-4cf0-b02b-c9d8bc27703f" (UID: "7fcde684-dc76-4cf0-b02b-c9d8bc27703f"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:25.836695 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:25.836658 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-service-ca\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:25.836695 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:25.836679 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-oauth-serving-cert\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:25.836793 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:25.836713 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7fcde684-dc76-4cf0-b02b-c9d8bc27703f" (UID: "7fcde684-dc76-4cf0-b02b-c9d8bc27703f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:25.838453 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:25.838407 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-kube-api-access-5slx5" (OuterVolumeSpecName: "kube-api-access-5slx5") pod "7fcde684-dc76-4cf0-b02b-c9d8bc27703f" (UID: "7fcde684-dc76-4cf0-b02b-c9d8bc27703f"). InnerVolumeSpecName "kube-api-access-5slx5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:09:25.838583 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:25.838510 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7fcde684-dc76-4cf0-b02b-c9d8bc27703f" (UID: "7fcde684-dc76-4cf0-b02b-c9d8bc27703f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:25.937005 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:25.936960 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-console-serving-cert\") pod \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\" (UID: \"7fcde684-dc76-4cf0-b02b-c9d8bc27703f\") " Apr 20 20:09:25.937195 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:25.937182 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5slx5\" (UniqueName: \"kubernetes.io/projected/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-kube-api-access-5slx5\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:25.937257 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:25.937198 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-console-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:25.937257 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:25.937207 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-trusted-ca-bundle\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:25.937257 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:25.937216 2571 reconciler_common.go:299] "Volume detached 
for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-console-oauth-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:25.939148 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:25.939126 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7fcde684-dc76-4cf0-b02b-c9d8bc27703f" (UID: "7fcde684-dc76-4cf0-b02b-c9d8bc27703f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:26.038342 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:26.038311 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7fcde684-dc76-4cf0-b02b-c9d8bc27703f-console-serving-cert\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:26.245036 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:26.244953 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56448cd7d-vzb9k_7fcde684-dc76-4cf0-b02b-c9d8bc27703f/console/0.log" Apr 20 20:09:26.245036 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:26.244993 2571 generic.go:358] "Generic (PLEG): container finished" podID="7fcde684-dc76-4cf0-b02b-c9d8bc27703f" containerID="1c600e4d9af77e2794d8043c1beb9a6121971c5285dc84ef32a80fa66ac8c638" exitCode=2 Apr 20 20:09:26.245248 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:26.245076 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56448cd7d-vzb9k" Apr 20 20:09:26.245248 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:26.245089 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56448cd7d-vzb9k" event={"ID":"7fcde684-dc76-4cf0-b02b-c9d8bc27703f","Type":"ContainerDied","Data":"1c600e4d9af77e2794d8043c1beb9a6121971c5285dc84ef32a80fa66ac8c638"} Apr 20 20:09:26.245248 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:26.245128 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56448cd7d-vzb9k" event={"ID":"7fcde684-dc76-4cf0-b02b-c9d8bc27703f","Type":"ContainerDied","Data":"e08165dbab43ef723259a29c94bff4d3fa54e8e3147beb5ed6fcfbb8d60d4676"} Apr 20 20:09:26.245248 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:26.245143 2571 scope.go:117] "RemoveContainer" containerID="1c600e4d9af77e2794d8043c1beb9a6121971c5285dc84ef32a80fa66ac8c638" Apr 20 20:09:26.258329 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:26.258310 2571 scope.go:117] "RemoveContainer" containerID="1c600e4d9af77e2794d8043c1beb9a6121971c5285dc84ef32a80fa66ac8c638" Apr 20 20:09:26.258671 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:09:26.258649 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c600e4d9af77e2794d8043c1beb9a6121971c5285dc84ef32a80fa66ac8c638\": container with ID starting with 1c600e4d9af77e2794d8043c1beb9a6121971c5285dc84ef32a80fa66ac8c638 not found: ID does not exist" containerID="1c600e4d9af77e2794d8043c1beb9a6121971c5285dc84ef32a80fa66ac8c638" Apr 20 20:09:26.258832 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:26.258678 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c600e4d9af77e2794d8043c1beb9a6121971c5285dc84ef32a80fa66ac8c638"} err="failed to get container status \"1c600e4d9af77e2794d8043c1beb9a6121971c5285dc84ef32a80fa66ac8c638\": rpc error: code = 
NotFound desc = could not find container \"1c600e4d9af77e2794d8043c1beb9a6121971c5285dc84ef32a80fa66ac8c638\": container with ID starting with 1c600e4d9af77e2794d8043c1beb9a6121971c5285dc84ef32a80fa66ac8c638 not found: ID does not exist" Apr 20 20:09:26.268339 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:26.268316 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56448cd7d-vzb9k"] Apr 20 20:09:26.272134 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:26.272113 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-56448cd7d-vzb9k"] Apr 20 20:09:27.527477 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:27.527442 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fcde684-dc76-4cf0-b02b-c9d8bc27703f" path="/var/lib/kubelet/pods/7fcde684-dc76-4cf0-b02b-c9d8bc27703f/volumes" Apr 20 20:09:28.252489 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:28.252455 2571 generic.go:358] "Generic (PLEG): container finished" podID="0b604c7c-dbda-486e-9ca5-fd23ee10bc87" containerID="b98e5d6e80a67d8fc5ef2e630a709532f553d959b066ee051b7a053ce6960d19" exitCode=0 Apr 20 20:09:28.252652 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:28.252519 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8ftcl" event={"ID":"0b604c7c-dbda-486e-9ca5-fd23ee10bc87","Type":"ContainerDied","Data":"b98e5d6e80a67d8fc5ef2e630a709532f553d959b066ee051b7a053ce6960d19"} Apr 20 20:09:28.252868 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:28.252853 2571 scope.go:117] "RemoveContainer" containerID="b98e5d6e80a67d8fc5ef2e630a709532f553d959b066ee051b7a053ce6960d19" Apr 20 20:09:29.258461 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:29.258402 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8ftcl" 
event={"ID":"0b604c7c-dbda-486e-9ca5-fd23ee10bc87","Type":"ContainerStarted","Data":"8033d220eb77c3639e859c8409986becfdb17c40e2979a9e67c15494b1f70801"} Apr 20 20:09:31.550975 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:31.550910 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7db6575c65-48mn2" podUID="cdfb92cd-2029-45d6-aa1b-7b28cb1ad973" containerName="console" containerID="cri-o://12c49800dfff57fdd62dc902ec1f554e200d362a49f363ab8b74201898d8f471" gracePeriod=15 Apr 20 20:09:31.796518 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:31.796497 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7db6575c65-48mn2_cdfb92cd-2029-45d6-aa1b-7b28cb1ad973/console/0.log" Apr 20 20:09:31.796627 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:31.796558 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7db6575c65-48mn2" Apr 20 20:09:31.884982 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:31.884899 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-console-serving-cert\") pod \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " Apr 20 20:09:31.884982 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:31.884940 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-console-oauth-config\") pod \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " Apr 20 20:09:31.884982 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:31.884981 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-trusted-ca-bundle\") pod \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " Apr 20 20:09:31.885239 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:31.885038 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-console-config\") pod \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " Apr 20 20:09:31.885239 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:31.885065 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpvr5\" (UniqueName: \"kubernetes.io/projected/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-kube-api-access-cpvr5\") pod \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " Apr 20 20:09:31.885239 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:31.885087 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-oauth-serving-cert\") pod \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " Apr 20 20:09:31.885239 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:31.885112 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-service-ca\") pod \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\" (UID: \"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973\") " Apr 20 20:09:31.885491 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:31.885455 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-console-config" (OuterVolumeSpecName: "console-config") pod 
"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973" (UID: "cdfb92cd-2029-45d6-aa1b-7b28cb1ad973"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:31.885565 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:31.885518 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "cdfb92cd-2029-45d6-aa1b-7b28cb1ad973" (UID: "cdfb92cd-2029-45d6-aa1b-7b28cb1ad973"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:31.885637 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:31.885603 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-service-ca" (OuterVolumeSpecName: "service-ca") pod "cdfb92cd-2029-45d6-aa1b-7b28cb1ad973" (UID: "cdfb92cd-2029-45d6-aa1b-7b28cb1ad973"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:31.885756 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:31.885678 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "cdfb92cd-2029-45d6-aa1b-7b28cb1ad973" (UID: "cdfb92cd-2029-45d6-aa1b-7b28cb1ad973"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:31.885827 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:31.885807 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-console-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:31.885885 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:31.885832 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-oauth-serving-cert\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:31.885885 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:31.885847 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-service-ca\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:31.885885 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:31.885862 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-trusted-ca-bundle\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:31.887140 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:31.887119 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "cdfb92cd-2029-45d6-aa1b-7b28cb1ad973" (UID: "cdfb92cd-2029-45d6-aa1b-7b28cb1ad973"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:31.887814 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:31.887779 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "cdfb92cd-2029-45d6-aa1b-7b28cb1ad973" (UID: "cdfb92cd-2029-45d6-aa1b-7b28cb1ad973"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:31.887814 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:31.887791 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-kube-api-access-cpvr5" (OuterVolumeSpecName: "kube-api-access-cpvr5") pod "cdfb92cd-2029-45d6-aa1b-7b28cb1ad973" (UID: "cdfb92cd-2029-45d6-aa1b-7b28cb1ad973"). InnerVolumeSpecName "kube-api-access-cpvr5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:09:31.986406 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:31.986359 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-console-serving-cert\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:31.986406 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:31.986401 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-console-oauth-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:31.986406 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:31.986412 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cpvr5\" (UniqueName: \"kubernetes.io/projected/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973-kube-api-access-cpvr5\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:32.267755 
ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:32.267725 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7db6575c65-48mn2_cdfb92cd-2029-45d6-aa1b-7b28cb1ad973/console/0.log" Apr 20 20:09:32.267901 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:32.267766 2571 generic.go:358] "Generic (PLEG): container finished" podID="cdfb92cd-2029-45d6-aa1b-7b28cb1ad973" containerID="12c49800dfff57fdd62dc902ec1f554e200d362a49f363ab8b74201898d8f471" exitCode=2 Apr 20 20:09:32.267901 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:32.267804 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7db6575c65-48mn2" event={"ID":"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973","Type":"ContainerDied","Data":"12c49800dfff57fdd62dc902ec1f554e200d362a49f363ab8b74201898d8f471"} Apr 20 20:09:32.267901 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:32.267826 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7db6575c65-48mn2" event={"ID":"cdfb92cd-2029-45d6-aa1b-7b28cb1ad973","Type":"ContainerDied","Data":"817fa58f256b85397d4af7009dff74b2db4162747ada14ed8a657861db2b6a8e"} Apr 20 20:09:32.267901 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:32.267840 2571 scope.go:117] "RemoveContainer" containerID="12c49800dfff57fdd62dc902ec1f554e200d362a49f363ab8b74201898d8f471" Apr 20 20:09:32.267901 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:32.267838 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7db6575c65-48mn2" Apr 20 20:09:32.276756 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:32.276738 2571 scope.go:117] "RemoveContainer" containerID="12c49800dfff57fdd62dc902ec1f554e200d362a49f363ab8b74201898d8f471" Apr 20 20:09:32.276989 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:09:32.276970 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12c49800dfff57fdd62dc902ec1f554e200d362a49f363ab8b74201898d8f471\": container with ID starting with 12c49800dfff57fdd62dc902ec1f554e200d362a49f363ab8b74201898d8f471 not found: ID does not exist" containerID="12c49800dfff57fdd62dc902ec1f554e200d362a49f363ab8b74201898d8f471" Apr 20 20:09:32.277058 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:32.276998 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12c49800dfff57fdd62dc902ec1f554e200d362a49f363ab8b74201898d8f471"} err="failed to get container status \"12c49800dfff57fdd62dc902ec1f554e200d362a49f363ab8b74201898d8f471\": rpc error: code = NotFound desc = could not find container \"12c49800dfff57fdd62dc902ec1f554e200d362a49f363ab8b74201898d8f471\": container with ID starting with 12c49800dfff57fdd62dc902ec1f554e200d362a49f363ab8b74201898d8f471 not found: ID does not exist" Apr 20 20:09:32.288769 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:32.288742 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7db6575c65-48mn2"] Apr 20 20:09:32.293021 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:32.292999 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7db6575c65-48mn2"] Apr 20 20:09:33.526909 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:33.526868 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdfb92cd-2029-45d6-aa1b-7b28cb1ad973" 
path="/var/lib/kubelet/pods/cdfb92cd-2029-45d6-aa1b-7b28cb1ad973/volumes" Apr 20 20:09:42.291157 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:42.291099 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-799fc4df48-mqvj9" podUID="90559697-e5dd-4c9e-b3e3-d63828850d61" containerName="console" containerID="cri-o://639292617edd7fe48fa9512833edbfa1127b1326f620a52c711b14539fa128a6" gracePeriod=15 Apr 20 20:09:42.557414 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:42.557391 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-799fc4df48-mqvj9_90559697-e5dd-4c9e-b3e3-d63828850d61/console/0.log" Apr 20 20:09:42.557561 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:42.557477 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-799fc4df48-mqvj9" Apr 20 20:09:42.678065 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:42.678031 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90559697-e5dd-4c9e-b3e3-d63828850d61-service-ca\") pod \"90559697-e5dd-4c9e-b3e3-d63828850d61\" (UID: \"90559697-e5dd-4c9e-b3e3-d63828850d61\") " Apr 20 20:09:42.678354 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:42.678336 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/90559697-e5dd-4c9e-b3e3-d63828850d61-oauth-serving-cert\") pod \"90559697-e5dd-4c9e-b3e3-d63828850d61\" (UID: \"90559697-e5dd-4c9e-b3e3-d63828850d61\") " Apr 20 20:09:42.678950 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:42.678588 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90559697-e5dd-4c9e-b3e3-d63828850d61-service-ca" (OuterVolumeSpecName: "service-ca") pod "90559697-e5dd-4c9e-b3e3-d63828850d61" (UID: 
"90559697-e5dd-4c9e-b3e3-d63828850d61"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:42.678950 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:42.678796 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90559697-e5dd-4c9e-b3e3-d63828850d61-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "90559697-e5dd-4c9e-b3e3-d63828850d61" (UID: "90559697-e5dd-4c9e-b3e3-d63828850d61"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:42.679159 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:42.679141 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/90559697-e5dd-4c9e-b3e3-d63828850d61-console-config\") pod \"90559697-e5dd-4c9e-b3e3-d63828850d61\" (UID: \"90559697-e5dd-4c9e-b3e3-d63828850d61\") " Apr 20 20:09:42.679654 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:42.679534 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90559697-e5dd-4c9e-b3e3-d63828850d61-console-config" (OuterVolumeSpecName: "console-config") pod "90559697-e5dd-4c9e-b3e3-d63828850d61" (UID: "90559697-e5dd-4c9e-b3e3-d63828850d61"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:42.680931 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:42.679608 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/90559697-e5dd-4c9e-b3e3-d63828850d61-console-serving-cert\") pod \"90559697-e5dd-4c9e-b3e3-d63828850d61\" (UID: \"90559697-e5dd-4c9e-b3e3-d63828850d61\") " Apr 20 20:09:42.680931 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:42.680111 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/90559697-e5dd-4c9e-b3e3-d63828850d61-console-oauth-config\") pod \"90559697-e5dd-4c9e-b3e3-d63828850d61\" (UID: \"90559697-e5dd-4c9e-b3e3-d63828850d61\") " Apr 20 20:09:42.680931 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:42.680199 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z546s\" (UniqueName: \"kubernetes.io/projected/90559697-e5dd-4c9e-b3e3-d63828850d61-kube-api-access-z546s\") pod \"90559697-e5dd-4c9e-b3e3-d63828850d61\" (UID: \"90559697-e5dd-4c9e-b3e3-d63828850d61\") " Apr 20 20:09:42.680931 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:42.680465 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90559697-e5dd-4c9e-b3e3-d63828850d61-service-ca\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:42.680931 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:42.680487 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/90559697-e5dd-4c9e-b3e3-d63828850d61-oauth-serving-cert\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:42.680931 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:42.680503 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/90559697-e5dd-4c9e-b3e3-d63828850d61-console-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:42.686681 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:42.686644 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90559697-e5dd-4c9e-b3e3-d63828850d61-kube-api-access-z546s" (OuterVolumeSpecName: "kube-api-access-z546s") pod "90559697-e5dd-4c9e-b3e3-d63828850d61" (UID: "90559697-e5dd-4c9e-b3e3-d63828850d61"). InnerVolumeSpecName "kube-api-access-z546s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:09:42.687893 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:42.687856 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90559697-e5dd-4c9e-b3e3-d63828850d61-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "90559697-e5dd-4c9e-b3e3-d63828850d61" (UID: "90559697-e5dd-4c9e-b3e3-d63828850d61"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:42.690412 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:42.690387 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90559697-e5dd-4c9e-b3e3-d63828850d61-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "90559697-e5dd-4c9e-b3e3-d63828850d61" (UID: "90559697-e5dd-4c9e-b3e3-d63828850d61"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:42.781130 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:42.781045 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/90559697-e5dd-4c9e-b3e3-d63828850d61-console-serving-cert\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:42.781130 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:42.781084 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/90559697-e5dd-4c9e-b3e3-d63828850d61-console-oauth-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:42.781130 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:42.781102 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z546s\" (UniqueName: \"kubernetes.io/projected/90559697-e5dd-4c9e-b3e3-d63828850d61-kube-api-access-z546s\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:09:43.303194 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:43.303165 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-799fc4df48-mqvj9_90559697-e5dd-4c9e-b3e3-d63828850d61/console/0.log" Apr 20 20:09:43.303626 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:43.303205 2571 generic.go:358] "Generic (PLEG): container finished" podID="90559697-e5dd-4c9e-b3e3-d63828850d61" containerID="639292617edd7fe48fa9512833edbfa1127b1326f620a52c711b14539fa128a6" exitCode=2 Apr 20 20:09:43.303626 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:43.303260 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-799fc4df48-mqvj9" event={"ID":"90559697-e5dd-4c9e-b3e3-d63828850d61","Type":"ContainerDied","Data":"639292617edd7fe48fa9512833edbfa1127b1326f620a52c711b14539fa128a6"} Apr 20 20:09:43.303626 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:43.303265 2571 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-console/console-799fc4df48-mqvj9"
Apr 20 20:09:43.303626 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:43.303299 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-799fc4df48-mqvj9" event={"ID":"90559697-e5dd-4c9e-b3e3-d63828850d61","Type":"ContainerDied","Data":"c52cbe4e5d1dfb14fd448d11ccce461df1b25fc46c808367c71ca9a98531f0b1"}
Apr 20 20:09:43.303626 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:43.303314 2571 scope.go:117] "RemoveContainer" containerID="639292617edd7fe48fa9512833edbfa1127b1326f620a52c711b14539fa128a6"
Apr 20 20:09:43.312039 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:43.312021 2571 scope.go:117] "RemoveContainer" containerID="639292617edd7fe48fa9512833edbfa1127b1326f620a52c711b14539fa128a6"
Apr 20 20:09:43.312289 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:09:43.312272 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"639292617edd7fe48fa9512833edbfa1127b1326f620a52c711b14539fa128a6\": container with ID starting with 639292617edd7fe48fa9512833edbfa1127b1326f620a52c711b14539fa128a6 not found: ID does not exist" containerID="639292617edd7fe48fa9512833edbfa1127b1326f620a52c711b14539fa128a6"
Apr 20 20:09:43.312338 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:43.312297 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"639292617edd7fe48fa9512833edbfa1127b1326f620a52c711b14539fa128a6"} err="failed to get container status \"639292617edd7fe48fa9512833edbfa1127b1326f620a52c711b14539fa128a6\": rpc error: code = NotFound desc = could not find container \"639292617edd7fe48fa9512833edbfa1127b1326f620a52c711b14539fa128a6\": container with ID starting with 639292617edd7fe48fa9512833edbfa1127b1326f620a52c711b14539fa128a6 not found: ID does not exist"
Apr 20 20:09:43.323721 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:43.323689 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-799fc4df48-mqvj9"]
Apr 20 20:09:43.332696 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:43.332670 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-799fc4df48-mqvj9"]
Apr 20 20:09:43.526636 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:09:43.526605 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90559697-e5dd-4c9e-b3e3-d63828850d61" path="/var/lib/kubelet/pods/90559697-e5dd-4c9e-b3e3-d63828850d61/volumes"
Apr 20 20:10:10.312317 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.312278 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs\") pod \"network-metrics-daemon-9sbrz\" (UID: \"012dcd86-26f0-4115-bd86-d5066c900541\") " pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:10:10.314719 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.314692 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/012dcd86-26f0-4115-bd86-d5066c900541-metrics-certs\") pod \"network-metrics-daemon-9sbrz\" (UID: \"012dcd86-26f0-4115-bd86-d5066c900541\") " pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:10:10.529989 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.529962 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-4tpsm\""
Apr 20 20:10:10.538165 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.538148 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9sbrz"
Apr 20 20:10:10.656613 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.656575 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9sbrz"]
Apr 20 20:10:10.660160 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:10:10.660123 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod012dcd86_26f0_4115_bd86_d5066c900541.slice/crio-e9291750238604ae6366d91b5f0edefa7fbb3f55fdbad9e6e806345ba51747d2 WatchSource:0}: Error finding container e9291750238604ae6366d91b5f0edefa7fbb3f55fdbad9e6e806345ba51747d2: Status 404 returned error can't find the container with id e9291750238604ae6366d91b5f0edefa7fbb3f55fdbad9e6e806345ba51747d2
Apr 20 20:10:10.794828 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.794796 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7d756b6df4-svlfv"]
Apr 20 20:10:10.795179 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.795164 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90559697-e5dd-4c9e-b3e3-d63828850d61" containerName="console"
Apr 20 20:10:10.795245 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.795181 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="90559697-e5dd-4c9e-b3e3-d63828850d61" containerName="console"
Apr 20 20:10:10.795245 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.795193 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7fcde684-dc76-4cf0-b02b-c9d8bc27703f" containerName="console"
Apr 20 20:10:10.795245 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.795198 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fcde684-dc76-4cf0-b02b-c9d8bc27703f" containerName="console"
Apr 20 20:10:10.795245 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.795205 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdfb92cd-2029-45d6-aa1b-7b28cb1ad973" containerName="console"
Apr 20 20:10:10.795245 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.795210 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdfb92cd-2029-45d6-aa1b-7b28cb1ad973" containerName="console"
Apr 20 20:10:10.795394 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.795268 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="7fcde684-dc76-4cf0-b02b-c9d8bc27703f" containerName="console"
Apr 20 20:10:10.795394 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.795276 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="cdfb92cd-2029-45d6-aa1b-7b28cb1ad973" containerName="console"
Apr 20 20:10:10.795394 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.795283 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="90559697-e5dd-4c9e-b3e3-d63828850d61" containerName="console"
Apr 20 20:10:10.799480 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.799460 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d756b6df4-svlfv"
Apr 20 20:10:10.811196 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.811167 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d756b6df4-svlfv"]
Apr 20 20:10:10.817008 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.816983 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-console-serving-cert\") pod \"console-7d756b6df4-svlfv\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") " pod="openshift-console/console-7d756b6df4-svlfv"
Apr 20 20:10:10.817095 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.817044 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-console-oauth-config\") pod \"console-7d756b6df4-svlfv\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") " pod="openshift-console/console-7d756b6df4-svlfv"
Apr 20 20:10:10.817095 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.817074 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-trusted-ca-bundle\") pod \"console-7d756b6df4-svlfv\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") " pod="openshift-console/console-7d756b6df4-svlfv"
Apr 20 20:10:10.817168 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.817133 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-console-config\") pod \"console-7d756b6df4-svlfv\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") " pod="openshift-console/console-7d756b6df4-svlfv"
Apr 20 20:10:10.817208 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.817193 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7j55\" (UniqueName: \"kubernetes.io/projected/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-kube-api-access-r7j55\") pod \"console-7d756b6df4-svlfv\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") " pod="openshift-console/console-7d756b6df4-svlfv"
Apr 20 20:10:10.817243 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.817218 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-service-ca\") pod \"console-7d756b6df4-svlfv\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") " pod="openshift-console/console-7d756b6df4-svlfv"
Apr 20 20:10:10.817275 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.817246 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-oauth-serving-cert\") pod \"console-7d756b6df4-svlfv\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") " pod="openshift-console/console-7d756b6df4-svlfv"
Apr 20 20:10:10.918547 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.918476 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-console-config\") pod \"console-7d756b6df4-svlfv\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") " pod="openshift-console/console-7d756b6df4-svlfv"
Apr 20 20:10:10.918547 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.918518 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7j55\" (UniqueName: \"kubernetes.io/projected/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-kube-api-access-r7j55\") pod \"console-7d756b6df4-svlfv\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") " pod="openshift-console/console-7d756b6df4-svlfv"
Apr 20 20:10:10.918693 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.918566 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-service-ca\") pod \"console-7d756b6df4-svlfv\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") " pod="openshift-console/console-7d756b6df4-svlfv"
Apr 20 20:10:10.918693 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.918615 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-oauth-serving-cert\") pod \"console-7d756b6df4-svlfv\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") " pod="openshift-console/console-7d756b6df4-svlfv"
Apr 20 20:10:10.918833 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.918808 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-console-serving-cert\") pod \"console-7d756b6df4-svlfv\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") " pod="openshift-console/console-7d756b6df4-svlfv"
Apr 20 20:10:10.918903 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.918886 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-console-oauth-config\") pod \"console-7d756b6df4-svlfv\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") " pod="openshift-console/console-7d756b6df4-svlfv"
Apr 20 20:10:10.918955 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.918920 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-trusted-ca-bundle\") pod \"console-7d756b6df4-svlfv\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") " pod="openshift-console/console-7d756b6df4-svlfv"
Apr 20 20:10:10.919345 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.919312 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-service-ca\") pod \"console-7d756b6df4-svlfv\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") " pod="openshift-console/console-7d756b6df4-svlfv"
Apr 20 20:10:10.919345 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.919328 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-console-config\") pod \"console-7d756b6df4-svlfv\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") " pod="openshift-console/console-7d756b6df4-svlfv"
Apr 20 20:10:10.919345 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.919342 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-oauth-serving-cert\") pod \"console-7d756b6df4-svlfv\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") " pod="openshift-console/console-7d756b6df4-svlfv"
Apr 20 20:10:10.919653 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.919637 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-trusted-ca-bundle\") pod \"console-7d756b6df4-svlfv\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") " pod="openshift-console/console-7d756b6df4-svlfv"
Apr 20 20:10:10.921391 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.921374 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-console-serving-cert\") pod \"console-7d756b6df4-svlfv\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") " pod="openshift-console/console-7d756b6df4-svlfv"
Apr 20 20:10:10.921458 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.921408 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-console-oauth-config\") pod \"console-7d756b6df4-svlfv\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") " pod="openshift-console/console-7d756b6df4-svlfv"
Apr 20 20:10:10.933789 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:10.933767 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7j55\" (UniqueName: \"kubernetes.io/projected/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-kube-api-access-r7j55\") pod \"console-7d756b6df4-svlfv\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") " pod="openshift-console/console-7d756b6df4-svlfv"
Apr 20 20:10:11.108835 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:11.108793 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d756b6df4-svlfv"
Apr 20 20:10:11.275167 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:11.275122 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d756b6df4-svlfv"]
Apr 20 20:10:11.278408 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:10:11.278382 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5aff7e3c_f58f_418e_9e7e_5377d4ba84ab.slice/crio-fc4d2849d92b775c99e06f21f6b832e04468aeb85d96fa0c6e9a4b4f173ee732 WatchSource:0}: Error finding container fc4d2849d92b775c99e06f21f6b832e04468aeb85d96fa0c6e9a4b4f173ee732: Status 404 returned error can't find the container with id fc4d2849d92b775c99e06f21f6b832e04468aeb85d96fa0c6e9a4b4f173ee732
Apr 20 20:10:11.383734 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:11.383695 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d756b6df4-svlfv" event={"ID":"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab","Type":"ContainerStarted","Data":"da779d6fb29b470d619323d691421cb30132be875d4adccc6dfa290468680eac"}
Apr 20 20:10:11.383734 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:11.383741 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d756b6df4-svlfv" event={"ID":"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab","Type":"ContainerStarted","Data":"fc4d2849d92b775c99e06f21f6b832e04468aeb85d96fa0c6e9a4b4f173ee732"}
Apr 20 20:10:11.384976 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:11.384955 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9sbrz" event={"ID":"012dcd86-26f0-4115-bd86-d5066c900541","Type":"ContainerStarted","Data":"e9291750238604ae6366d91b5f0edefa7fbb3f55fdbad9e6e806345ba51747d2"}
Apr 20 20:10:11.415398 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:11.414720 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7d756b6df4-svlfv" podStartSLOduration=1.414691823 podStartE2EDuration="1.414691823s" podCreationTimestamp="2026-04-20 20:10:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:10:11.413601358 +0000 UTC m=+252.453090905" watchObservedRunningTime="2026-04-20 20:10:11.414691823 +0000 UTC m=+252.454181371"
Apr 20 20:10:12.389637 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:12.389596 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9sbrz" event={"ID":"012dcd86-26f0-4115-bd86-d5066c900541","Type":"ContainerStarted","Data":"94c8e2adfb2fa1c012e62a4d19f9d99b7ac8ae993bd2688816a7b11d361dac5f"}
Apr 20 20:10:12.389637 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:12.389641 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9sbrz" event={"ID":"012dcd86-26f0-4115-bd86-d5066c900541","Type":"ContainerStarted","Data":"f09b73d0a0d0f7106d48c309f2d2fccb946ca5e7d8216ed1e2223d003720c263"}
Apr 20 20:10:12.400411 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:12.400383 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 20:10:12.401032 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:12.400893 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="alertmanager" containerID="cri-o://406796faf2058bf914fe9981e4e02e0669e29cba8364f7f837ae8e1332af3958" gracePeriod=120
Apr 20 20:10:12.401032 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:12.400916 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="kube-rbac-proxy-web" containerID="cri-o://fe70cbe6aba93eb1c70e5b4edd62fac56cad1af33f7e71aea532f8995ea83072" gracePeriod=120
Apr 20 20:10:12.401032 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:12.400931 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="prom-label-proxy" containerID="cri-o://9b0b6b6a2c63178905f1c14f97f6bd70d005cb7a85b6b23c5666c6ed99ced3b1" gracePeriod=120
Apr 20 20:10:12.401032 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:12.400941 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="config-reloader" containerID="cri-o://8605dff8170b83f6facff19fe0df8cc61e1f580e257debd85cac36def8d59cea" gracePeriod=120
Apr 20 20:10:12.401032 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:12.400896 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="kube-rbac-proxy" containerID="cri-o://ce922266eecce4e7eeed17167268480eecfd64d03410c59a2f7cf1b8069b0d70" gracePeriod=120
Apr 20 20:10:12.401032 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:12.400930 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="kube-rbac-proxy-metric" containerID="cri-o://6670288f366cb270940e7acb6c043e90ed7d7eb8f113fa7884016d209bd1e598" gracePeriod=120
Apr 20 20:10:12.407548 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:12.407454 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-9sbrz" podStartSLOduration=252.39800483 podStartE2EDuration="4m13.40740633s" podCreationTimestamp="2026-04-20 20:05:59 +0000 UTC" firstStartedPulling="2026-04-20 20:10:10.661844983 +0000 UTC m=+251.701334506" lastFinishedPulling="2026-04-20 20:10:11.671246463 +0000 UTC m=+252.710736006" observedRunningTime="2026-04-20 20:10:12.406898178 +0000 UTC m=+253.446387724" watchObservedRunningTime="2026-04-20 20:10:12.40740633 +0000 UTC m=+253.446895877"
Apr 20 20:10:13.394712 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.394672 2571 generic.go:358] "Generic (PLEG): container finished" podID="689afe77-0baf-42f8-aabf-a28730e1c663" containerID="9b0b6b6a2c63178905f1c14f97f6bd70d005cb7a85b6b23c5666c6ed99ced3b1" exitCode=0
Apr 20 20:10:13.394712 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.394703 2571 generic.go:358] "Generic (PLEG): container finished" podID="689afe77-0baf-42f8-aabf-a28730e1c663" containerID="ce922266eecce4e7eeed17167268480eecfd64d03410c59a2f7cf1b8069b0d70" exitCode=0
Apr 20 20:10:13.394712 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.394710 2571 generic.go:358] "Generic (PLEG): container finished" podID="689afe77-0baf-42f8-aabf-a28730e1c663" containerID="8605dff8170b83f6facff19fe0df8cc61e1f580e257debd85cac36def8d59cea" exitCode=0
Apr 20 20:10:13.394712 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.394718 2571 generic.go:358] "Generic (PLEG): container finished" podID="689afe77-0baf-42f8-aabf-a28730e1c663" containerID="406796faf2058bf914fe9981e4e02e0669e29cba8364f7f837ae8e1332af3958" exitCode=0
Apr 20 20:10:13.395164 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.394739 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"689afe77-0baf-42f8-aabf-a28730e1c663","Type":"ContainerDied","Data":"9b0b6b6a2c63178905f1c14f97f6bd70d005cb7a85b6b23c5666c6ed99ced3b1"}
Apr 20 20:10:13.395164 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.394771 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"689afe77-0baf-42f8-aabf-a28730e1c663","Type":"ContainerDied","Data":"ce922266eecce4e7eeed17167268480eecfd64d03410c59a2f7cf1b8069b0d70"}
Apr 20 20:10:13.395164 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.394785 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"689afe77-0baf-42f8-aabf-a28730e1c663","Type":"ContainerDied","Data":"8605dff8170b83f6facff19fe0df8cc61e1f580e257debd85cac36def8d59cea"}
Apr 20 20:10:13.395164 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.394797 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"689afe77-0baf-42f8-aabf-a28730e1c663","Type":"ContainerDied","Data":"406796faf2058bf914fe9981e4e02e0669e29cba8364f7f837ae8e1332af3958"}
Apr 20 20:10:13.647249 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.647188 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 20:10:13.744094 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.744058 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/689afe77-0baf-42f8-aabf-a28730e1c663-tls-assets\") pod \"689afe77-0baf-42f8-aabf-a28730e1c663\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") "
Apr 20 20:10:13.744283 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.744103 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-kube-rbac-proxy-metric\") pod \"689afe77-0baf-42f8-aabf-a28730e1c663\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") "
Apr 20 20:10:13.744283 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.744142 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-kube-rbac-proxy\") pod \"689afe77-0baf-42f8-aabf-a28730e1c663\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") "
Apr 20 20:10:13.744283 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.744179 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/689afe77-0baf-42f8-aabf-a28730e1c663-alertmanager-trusted-ca-bundle\") pod \"689afe77-0baf-42f8-aabf-a28730e1c663\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") "
Apr 20 20:10:13.744283 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.744208 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-main-tls\") pod \"689afe77-0baf-42f8-aabf-a28730e1c663\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") "
Apr 20 20:10:13.744521 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.744347 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-web-config\") pod \"689afe77-0baf-42f8-aabf-a28730e1c663\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") "
Apr 20 20:10:13.744521 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.744397 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncjhm\" (UniqueName: \"kubernetes.io/projected/689afe77-0baf-42f8-aabf-a28730e1c663-kube-api-access-ncjhm\") pod \"689afe77-0baf-42f8-aabf-a28730e1c663\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") "
Apr 20 20:10:13.744521 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.744454 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-kube-rbac-proxy-web\") pod \"689afe77-0baf-42f8-aabf-a28730e1c663\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") "
Apr 20 20:10:13.744521 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.744485 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-cluster-tls-config\") pod \"689afe77-0baf-42f8-aabf-a28730e1c663\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") "
Apr 20 20:10:13.744521 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.744511 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/689afe77-0baf-42f8-aabf-a28730e1c663-metrics-client-ca\") pod \"689afe77-0baf-42f8-aabf-a28730e1c663\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") "
Apr 20 20:10:13.744783 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.744564 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-config-volume\") pod \"689afe77-0baf-42f8-aabf-a28730e1c663\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") "
Apr 20 20:10:13.744783 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.744610 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/689afe77-0baf-42f8-aabf-a28730e1c663-config-out\") pod \"689afe77-0baf-42f8-aabf-a28730e1c663\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") "
Apr 20 20:10:13.744783 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.744614 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/689afe77-0baf-42f8-aabf-a28730e1c663-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "689afe77-0baf-42f8-aabf-a28730e1c663" (UID: "689afe77-0baf-42f8-aabf-a28730e1c663"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:10:13.744783 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.744644 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/689afe77-0baf-42f8-aabf-a28730e1c663-alertmanager-main-db\") pod \"689afe77-0baf-42f8-aabf-a28730e1c663\" (UID: \"689afe77-0baf-42f8-aabf-a28730e1c663\") "
Apr 20 20:10:13.745160 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.745070 2571 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/689afe77-0baf-42f8-aabf-a28730e1c663-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:10:13.745370 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.745347 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/689afe77-0baf-42f8-aabf-a28730e1c663-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "689afe77-0baf-42f8-aabf-a28730e1c663" (UID: "689afe77-0baf-42f8-aabf-a28730e1c663"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:10:13.746854 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.746831 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "689afe77-0baf-42f8-aabf-a28730e1c663" (UID: "689afe77-0baf-42f8-aabf-a28730e1c663"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:10:13.747329 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.747234 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "689afe77-0baf-42f8-aabf-a28730e1c663" (UID: "689afe77-0baf-42f8-aabf-a28730e1c663"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:10:13.747417 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.747376 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/689afe77-0baf-42f8-aabf-a28730e1c663-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "689afe77-0baf-42f8-aabf-a28730e1c663" (UID: "689afe77-0baf-42f8-aabf-a28730e1c663"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:10:13.747417 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.747385 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "689afe77-0baf-42f8-aabf-a28730e1c663" (UID: "689afe77-0baf-42f8-aabf-a28730e1c663"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:10:13.747632 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.747604 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-config-volume" (OuterVolumeSpecName: "config-volume") pod "689afe77-0baf-42f8-aabf-a28730e1c663" (UID: "689afe77-0baf-42f8-aabf-a28730e1c663"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:10:13.747739 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.747702 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/689afe77-0baf-42f8-aabf-a28730e1c663-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "689afe77-0baf-42f8-aabf-a28730e1c663" (UID: "689afe77-0baf-42f8-aabf-a28730e1c663"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:10:13.747928 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.747895 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/689afe77-0baf-42f8-aabf-a28730e1c663-config-out" (OuterVolumeSpecName: "config-out") pod "689afe77-0baf-42f8-aabf-a28730e1c663" (UID: "689afe77-0baf-42f8-aabf-a28730e1c663"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:10:13.749209 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.749190 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/689afe77-0baf-42f8-aabf-a28730e1c663-kube-api-access-ncjhm" (OuterVolumeSpecName: "kube-api-access-ncjhm") pod "689afe77-0baf-42f8-aabf-a28730e1c663" (UID: "689afe77-0baf-42f8-aabf-a28730e1c663"). InnerVolumeSpecName "kube-api-access-ncjhm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:10:13.749336 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.749316 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "689afe77-0baf-42f8-aabf-a28730e1c663" (UID: "689afe77-0baf-42f8-aabf-a28730e1c663"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:10:13.752774 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.752748 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "689afe77-0baf-42f8-aabf-a28730e1c663" (UID: "689afe77-0baf-42f8-aabf-a28730e1c663"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:10:13.759436 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.759399 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-web-config" (OuterVolumeSpecName: "web-config") pod "689afe77-0baf-42f8-aabf-a28730e1c663" (UID: "689afe77-0baf-42f8-aabf-a28730e1c663"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:10:13.845708 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.845661 2571 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-config-volume\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:10:13.845708 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.845704 2571 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/689afe77-0baf-42f8-aabf-a28730e1c663-config-out\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:10:13.845708 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.845714 2571 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/689afe77-0baf-42f8-aabf-a28730e1c663-alertmanager-main-db\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:10:13.845708 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.845723 2571 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/689afe77-0baf-42f8-aabf-a28730e1c663-tls-assets\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:10:13.845940 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.845733 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:10:13.845940 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.845743 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:10:13.845940 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.845754 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-main-tls\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:10:13.845940 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.845763 2571 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-web-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:10:13.845940 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.845772 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ncjhm\" (UniqueName: \"kubernetes.io/projected/689afe77-0baf-42f8-aabf-a28730e1c663-kube-api-access-ncjhm\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:10:13.845940 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.845781 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:10:13.845940 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.845790 2571 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/689afe77-0baf-42f8-aabf-a28730e1c663-cluster-tls-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:10:13.845940 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:13.845798 2571 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/689afe77-0baf-42f8-aabf-a28730e1c663-metrics-client-ca\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:10:14.404676 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.404640 2571 generic.go:358] "Generic (PLEG): container finished" podID="689afe77-0baf-42f8-aabf-a28730e1c663" containerID="6670288f366cb270940e7acb6c043e90ed7d7eb8f113fa7884016d209bd1e598" exitCode=0
Apr 20 20:10:14.404676 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.404666 2571 generic.go:358] "Generic (PLEG): container finished" podID="689afe77-0baf-42f8-aabf-a28730e1c663" containerID="fe70cbe6aba93eb1c70e5b4edd62fac56cad1af33f7e71aea532f8995ea83072" exitCode=0
Apr 20 20:10:14.405193 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.404707 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"689afe77-0baf-42f8-aabf-a28730e1c663","Type":"ContainerDied","Data":"6670288f366cb270940e7acb6c043e90ed7d7eb8f113fa7884016d209bd1e598"}
Apr 20 20:10:14.405193 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.404733 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0"
event={"ID":"689afe77-0baf-42f8-aabf-a28730e1c663","Type":"ContainerDied","Data":"fe70cbe6aba93eb1c70e5b4edd62fac56cad1af33f7e71aea532f8995ea83072"} Apr 20 20:10:14.405193 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.404743 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"689afe77-0baf-42f8-aabf-a28730e1c663","Type":"ContainerDied","Data":"5496c26339b61ac0538e847107602a4e63cf65698070742856924b6e3d0e442d"} Apr 20 20:10:14.405193 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.404758 2571 scope.go:117] "RemoveContainer" containerID="9b0b6b6a2c63178905f1c14f97f6bd70d005cb7a85b6b23c5666c6ed99ced3b1" Apr 20 20:10:14.405193 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.404786 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.412120 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.412050 2571 scope.go:117] "RemoveContainer" containerID="6670288f366cb270940e7acb6c043e90ed7d7eb8f113fa7884016d209bd1e598" Apr 20 20:10:14.419630 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.419615 2571 scope.go:117] "RemoveContainer" containerID="ce922266eecce4e7eeed17167268480eecfd64d03410c59a2f7cf1b8069b0d70" Apr 20 20:10:14.425610 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.425595 2571 scope.go:117] "RemoveContainer" containerID="fe70cbe6aba93eb1c70e5b4edd62fac56cad1af33f7e71aea532f8995ea83072" Apr 20 20:10:14.428239 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.428215 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:10:14.432940 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.432924 2571 scope.go:117] "RemoveContainer" containerID="8605dff8170b83f6facff19fe0df8cc61e1f580e257debd85cac36def8d59cea" Apr 20 20:10:14.433634 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.433612 2571 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:10:14.442123 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.442105 2571 scope.go:117] "RemoveContainer" containerID="406796faf2058bf914fe9981e4e02e0669e29cba8364f7f837ae8e1332af3958" Apr 20 20:10:14.448984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.448958 2571 scope.go:117] "RemoveContainer" containerID="525340c82e7e258af0b47538cae9314f2403bf9f16da188a958132c617e08fd9" Apr 20 20:10:14.455130 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.455114 2571 scope.go:117] "RemoveContainer" containerID="9b0b6b6a2c63178905f1c14f97f6bd70d005cb7a85b6b23c5666c6ed99ced3b1" Apr 20 20:10:14.455365 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:10:14.455346 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b0b6b6a2c63178905f1c14f97f6bd70d005cb7a85b6b23c5666c6ed99ced3b1\": container with ID starting with 9b0b6b6a2c63178905f1c14f97f6bd70d005cb7a85b6b23c5666c6ed99ced3b1 not found: ID does not exist" containerID="9b0b6b6a2c63178905f1c14f97f6bd70d005cb7a85b6b23c5666c6ed99ced3b1" Apr 20 20:10:14.455437 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.455375 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b0b6b6a2c63178905f1c14f97f6bd70d005cb7a85b6b23c5666c6ed99ced3b1"} err="failed to get container status \"9b0b6b6a2c63178905f1c14f97f6bd70d005cb7a85b6b23c5666c6ed99ced3b1\": rpc error: code = NotFound desc = could not find container \"9b0b6b6a2c63178905f1c14f97f6bd70d005cb7a85b6b23c5666c6ed99ced3b1\": container with ID starting with 9b0b6b6a2c63178905f1c14f97f6bd70d005cb7a85b6b23c5666c6ed99ced3b1 not found: ID does not exist" Apr 20 20:10:14.455437 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.455394 2571 scope.go:117] "RemoveContainer" containerID="6670288f366cb270940e7acb6c043e90ed7d7eb8f113fa7884016d209bd1e598" Apr 20 20:10:14.455665 ip-10-0-135-184 
kubenswrapper[2571]: E0420 20:10:14.455649 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6670288f366cb270940e7acb6c043e90ed7d7eb8f113fa7884016d209bd1e598\": container with ID starting with 6670288f366cb270940e7acb6c043e90ed7d7eb8f113fa7884016d209bd1e598 not found: ID does not exist" containerID="6670288f366cb270940e7acb6c043e90ed7d7eb8f113fa7884016d209bd1e598" Apr 20 20:10:14.455716 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.455669 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6670288f366cb270940e7acb6c043e90ed7d7eb8f113fa7884016d209bd1e598"} err="failed to get container status \"6670288f366cb270940e7acb6c043e90ed7d7eb8f113fa7884016d209bd1e598\": rpc error: code = NotFound desc = could not find container \"6670288f366cb270940e7acb6c043e90ed7d7eb8f113fa7884016d209bd1e598\": container with ID starting with 6670288f366cb270940e7acb6c043e90ed7d7eb8f113fa7884016d209bd1e598 not found: ID does not exist" Apr 20 20:10:14.455716 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.455683 2571 scope.go:117] "RemoveContainer" containerID="ce922266eecce4e7eeed17167268480eecfd64d03410c59a2f7cf1b8069b0d70" Apr 20 20:10:14.455937 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:10:14.455911 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce922266eecce4e7eeed17167268480eecfd64d03410c59a2f7cf1b8069b0d70\": container with ID starting with ce922266eecce4e7eeed17167268480eecfd64d03410c59a2f7cf1b8069b0d70 not found: ID does not exist" containerID="ce922266eecce4e7eeed17167268480eecfd64d03410c59a2f7cf1b8069b0d70" Apr 20 20:10:14.456039 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.455944 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce922266eecce4e7eeed17167268480eecfd64d03410c59a2f7cf1b8069b0d70"} 
err="failed to get container status \"ce922266eecce4e7eeed17167268480eecfd64d03410c59a2f7cf1b8069b0d70\": rpc error: code = NotFound desc = could not find container \"ce922266eecce4e7eeed17167268480eecfd64d03410c59a2f7cf1b8069b0d70\": container with ID starting with ce922266eecce4e7eeed17167268480eecfd64d03410c59a2f7cf1b8069b0d70 not found: ID does not exist" Apr 20 20:10:14.456039 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.455967 2571 scope.go:117] "RemoveContainer" containerID="fe70cbe6aba93eb1c70e5b4edd62fac56cad1af33f7e71aea532f8995ea83072" Apr 20 20:10:14.456262 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:10:14.456224 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe70cbe6aba93eb1c70e5b4edd62fac56cad1af33f7e71aea532f8995ea83072\": container with ID starting with fe70cbe6aba93eb1c70e5b4edd62fac56cad1af33f7e71aea532f8995ea83072 not found: ID does not exist" containerID="fe70cbe6aba93eb1c70e5b4edd62fac56cad1af33f7e71aea532f8995ea83072" Apr 20 20:10:14.456345 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.456253 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe70cbe6aba93eb1c70e5b4edd62fac56cad1af33f7e71aea532f8995ea83072"} err="failed to get container status \"fe70cbe6aba93eb1c70e5b4edd62fac56cad1af33f7e71aea532f8995ea83072\": rpc error: code = NotFound desc = could not find container \"fe70cbe6aba93eb1c70e5b4edd62fac56cad1af33f7e71aea532f8995ea83072\": container with ID starting with fe70cbe6aba93eb1c70e5b4edd62fac56cad1af33f7e71aea532f8995ea83072 not found: ID does not exist" Apr 20 20:10:14.456345 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.456275 2571 scope.go:117] "RemoveContainer" containerID="8605dff8170b83f6facff19fe0df8cc61e1f580e257debd85cac36def8d59cea" Apr 20 20:10:14.456879 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:10:14.456834 2571 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"8605dff8170b83f6facff19fe0df8cc61e1f580e257debd85cac36def8d59cea\": container with ID starting with 8605dff8170b83f6facff19fe0df8cc61e1f580e257debd85cac36def8d59cea not found: ID does not exist" containerID="8605dff8170b83f6facff19fe0df8cc61e1f580e257debd85cac36def8d59cea" Apr 20 20:10:14.456879 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.456868 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8605dff8170b83f6facff19fe0df8cc61e1f580e257debd85cac36def8d59cea"} err="failed to get container status \"8605dff8170b83f6facff19fe0df8cc61e1f580e257debd85cac36def8d59cea\": rpc error: code = NotFound desc = could not find container \"8605dff8170b83f6facff19fe0df8cc61e1f580e257debd85cac36def8d59cea\": container with ID starting with 8605dff8170b83f6facff19fe0df8cc61e1f580e257debd85cac36def8d59cea not found: ID does not exist" Apr 20 20:10:14.457017 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.456889 2571 scope.go:117] "RemoveContainer" containerID="406796faf2058bf914fe9981e4e02e0669e29cba8364f7f837ae8e1332af3958" Apr 20 20:10:14.457189 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:10:14.457172 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"406796faf2058bf914fe9981e4e02e0669e29cba8364f7f837ae8e1332af3958\": container with ID starting with 406796faf2058bf914fe9981e4e02e0669e29cba8364f7f837ae8e1332af3958 not found: ID does not exist" containerID="406796faf2058bf914fe9981e4e02e0669e29cba8364f7f837ae8e1332af3958" Apr 20 20:10:14.457266 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.457197 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"406796faf2058bf914fe9981e4e02e0669e29cba8364f7f837ae8e1332af3958"} err="failed to get container status \"406796faf2058bf914fe9981e4e02e0669e29cba8364f7f837ae8e1332af3958\": rpc 
error: code = NotFound desc = could not find container \"406796faf2058bf914fe9981e4e02e0669e29cba8364f7f837ae8e1332af3958\": container with ID starting with 406796faf2058bf914fe9981e4e02e0669e29cba8364f7f837ae8e1332af3958 not found: ID does not exist" Apr 20 20:10:14.457266 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.457215 2571 scope.go:117] "RemoveContainer" containerID="525340c82e7e258af0b47538cae9314f2403bf9f16da188a958132c617e08fd9" Apr 20 20:10:14.457523 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:10:14.457487 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"525340c82e7e258af0b47538cae9314f2403bf9f16da188a958132c617e08fd9\": container with ID starting with 525340c82e7e258af0b47538cae9314f2403bf9f16da188a958132c617e08fd9 not found: ID does not exist" containerID="525340c82e7e258af0b47538cae9314f2403bf9f16da188a958132c617e08fd9" Apr 20 20:10:14.457594 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.457518 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"525340c82e7e258af0b47538cae9314f2403bf9f16da188a958132c617e08fd9"} err="failed to get container status \"525340c82e7e258af0b47538cae9314f2403bf9f16da188a958132c617e08fd9\": rpc error: code = NotFound desc = could not find container \"525340c82e7e258af0b47538cae9314f2403bf9f16da188a958132c617e08fd9\": container with ID starting with 525340c82e7e258af0b47538cae9314f2403bf9f16da188a958132c617e08fd9 not found: ID does not exist" Apr 20 20:10:14.457594 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.457538 2571 scope.go:117] "RemoveContainer" containerID="9b0b6b6a2c63178905f1c14f97f6bd70d005cb7a85b6b23c5666c6ed99ced3b1" Apr 20 20:10:14.457801 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.457776 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b0b6b6a2c63178905f1c14f97f6bd70d005cb7a85b6b23c5666c6ed99ced3b1"} 
err="failed to get container status \"9b0b6b6a2c63178905f1c14f97f6bd70d005cb7a85b6b23c5666c6ed99ced3b1\": rpc error: code = NotFound desc = could not find container \"9b0b6b6a2c63178905f1c14f97f6bd70d005cb7a85b6b23c5666c6ed99ced3b1\": container with ID starting with 9b0b6b6a2c63178905f1c14f97f6bd70d005cb7a85b6b23c5666c6ed99ced3b1 not found: ID does not exist" Apr 20 20:10:14.457845 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.457804 2571 scope.go:117] "RemoveContainer" containerID="6670288f366cb270940e7acb6c043e90ed7d7eb8f113fa7884016d209bd1e598" Apr 20 20:10:14.457895 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.457876 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:10:14.458091 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458060 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6670288f366cb270940e7acb6c043e90ed7d7eb8f113fa7884016d209bd1e598"} err="failed to get container status \"6670288f366cb270940e7acb6c043e90ed7d7eb8f113fa7884016d209bd1e598\": rpc error: code = NotFound desc = could not find container \"6670288f366cb270940e7acb6c043e90ed7d7eb8f113fa7884016d209bd1e598\": container with ID starting with 6670288f366cb270940e7acb6c043e90ed7d7eb8f113fa7884016d209bd1e598 not found: ID does not exist" Apr 20 20:10:14.458153 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458093 2571 scope.go:117] "RemoveContainer" containerID="ce922266eecce4e7eeed17167268480eecfd64d03410c59a2f7cf1b8069b0d70" Apr 20 20:10:14.458229 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458217 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="kube-rbac-proxy" Apr 20 20:10:14.458277 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458234 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" 
containerName="kube-rbac-proxy" Apr 20 20:10:14.458277 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458253 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="alertmanager" Apr 20 20:10:14.458277 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458263 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="alertmanager" Apr 20 20:10:14.458277 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458274 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="config-reloader" Apr 20 20:10:14.458471 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458280 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="config-reloader" Apr 20 20:10:14.458471 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458294 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="prom-label-proxy" Apr 20 20:10:14.458471 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458300 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="prom-label-proxy" Apr 20 20:10:14.458471 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458313 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="kube-rbac-proxy-metric" Apr 20 20:10:14.458471 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458328 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="kube-rbac-proxy-metric" Apr 20 20:10:14.458471 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458341 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="init-config-reloader" Apr 20 20:10:14.458471 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458346 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="init-config-reloader" Apr 20 20:10:14.458471 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458351 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="kube-rbac-proxy-web" Apr 20 20:10:14.458471 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458357 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="kube-rbac-proxy-web" Apr 20 20:10:14.458471 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458305 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce922266eecce4e7eeed17167268480eecfd64d03410c59a2f7cf1b8069b0d70"} err="failed to get container status \"ce922266eecce4e7eeed17167268480eecfd64d03410c59a2f7cf1b8069b0d70\": rpc error: code = NotFound desc = could not find container \"ce922266eecce4e7eeed17167268480eecfd64d03410c59a2f7cf1b8069b0d70\": container with ID starting with ce922266eecce4e7eeed17167268480eecfd64d03410c59a2f7cf1b8069b0d70 not found: ID does not exist" Apr 20 20:10:14.458471 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458405 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="kube-rbac-proxy-web" Apr 20 20:10:14.458471 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458415 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="prom-label-proxy" Apr 20 20:10:14.458471 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458404 2571 scope.go:117] "RemoveContainer" 
containerID="fe70cbe6aba93eb1c70e5b4edd62fac56cad1af33f7e71aea532f8995ea83072" Apr 20 20:10:14.458471 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458440 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="config-reloader" Apr 20 20:10:14.458471 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458452 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="kube-rbac-proxy-metric" Apr 20 20:10:14.458471 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458458 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="alertmanager" Apr 20 20:10:14.458471 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458464 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" containerName="kube-rbac-proxy" Apr 20 20:10:14.459094 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458685 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe70cbe6aba93eb1c70e5b4edd62fac56cad1af33f7e71aea532f8995ea83072"} err="failed to get container status \"fe70cbe6aba93eb1c70e5b4edd62fac56cad1af33f7e71aea532f8995ea83072\": rpc error: code = NotFound desc = could not find container \"fe70cbe6aba93eb1c70e5b4edd62fac56cad1af33f7e71aea532f8995ea83072\": container with ID starting with fe70cbe6aba93eb1c70e5b4edd62fac56cad1af33f7e71aea532f8995ea83072 not found: ID does not exist" Apr 20 20:10:14.459094 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458704 2571 scope.go:117] "RemoveContainer" containerID="8605dff8170b83f6facff19fe0df8cc61e1f580e257debd85cac36def8d59cea" Apr 20 20:10:14.459094 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458877 2571 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8605dff8170b83f6facff19fe0df8cc61e1f580e257debd85cac36def8d59cea"} err="failed to get container status \"8605dff8170b83f6facff19fe0df8cc61e1f580e257debd85cac36def8d59cea\": rpc error: code = NotFound desc = could not find container \"8605dff8170b83f6facff19fe0df8cc61e1f580e257debd85cac36def8d59cea\": container with ID starting with 8605dff8170b83f6facff19fe0df8cc61e1f580e257debd85cac36def8d59cea not found: ID does not exist" Apr 20 20:10:14.459094 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.458900 2571 scope.go:117] "RemoveContainer" containerID="406796faf2058bf914fe9981e4e02e0669e29cba8364f7f837ae8e1332af3958" Apr 20 20:10:14.459094 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.459085 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"406796faf2058bf914fe9981e4e02e0669e29cba8364f7f837ae8e1332af3958"} err="failed to get container status \"406796faf2058bf914fe9981e4e02e0669e29cba8364f7f837ae8e1332af3958\": rpc error: code = NotFound desc = could not find container \"406796faf2058bf914fe9981e4e02e0669e29cba8364f7f837ae8e1332af3958\": container with ID starting with 406796faf2058bf914fe9981e4e02e0669e29cba8364f7f837ae8e1332af3958 not found: ID does not exist" Apr 20 20:10:14.459296 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.459103 2571 scope.go:117] "RemoveContainer" containerID="525340c82e7e258af0b47538cae9314f2403bf9f16da188a958132c617e08fd9" Apr 20 20:10:14.459333 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.459315 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"525340c82e7e258af0b47538cae9314f2403bf9f16da188a958132c617e08fd9"} err="failed to get container status \"525340c82e7e258af0b47538cae9314f2403bf9f16da188a958132c617e08fd9\": rpc error: code = NotFound desc = could not find container \"525340c82e7e258af0b47538cae9314f2403bf9f16da188a958132c617e08fd9\": container with ID starting with 
525340c82e7e258af0b47538cae9314f2403bf9f16da188a958132c617e08fd9 not found: ID does not exist" Apr 20 20:10:14.463818 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.463801 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.466649 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.466617 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 20:10:14.466755 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.466721 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 20:10:14.466755 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.466744 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 20:10:14.466853 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.466746 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 20:10:14.467322 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.467306 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 20:10:14.467382 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.467325 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-4b9mz\"" Apr 20 20:10:14.467382 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.467315 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 20:10:14.467502 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.467394 2571 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 20:10:14.467502 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.467349 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 20:10:14.472398 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.472380 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 20:10:14.476077 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.476037 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:10:14.550837 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.550803 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fff0c0d6-bd89-4736-a984-a965f262948b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.550976 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.550850 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fff0c0d6-bd89-4736-a984-a965f262948b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.550976 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.550881 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fff0c0d6-bd89-4736-a984-a965f262948b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.550976 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.550901 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fff0c0d6-bd89-4736-a984-a965f262948b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.550976 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.550921 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fff0c0d6-bd89-4736-a984-a965f262948b-config-out\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.551132 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.550994 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fff0c0d6-bd89-4736-a984-a965f262948b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.551132 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.551019 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8vhc\" (UniqueName: \"kubernetes.io/projected/fff0c0d6-bd89-4736-a984-a965f262948b-kube-api-access-h8vhc\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.551132 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.551035 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/fff0c0d6-bd89-4736-a984-a965f262948b-config-volume\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.551132 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.551049 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fff0c0d6-bd89-4736-a984-a965f262948b-web-config\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.551132 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.551070 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fff0c0d6-bd89-4736-a984-a965f262948b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.551132 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.551095 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fff0c0d6-bd89-4736-a984-a965f262948b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.551132 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.551112 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fff0c0d6-bd89-4736-a984-a965f262948b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.551132 
ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.551129 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fff0c0d6-bd89-4736-a984-a965f262948b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.652293 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.652237 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fff0c0d6-bd89-4736-a984-a965f262948b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.652293 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.652301 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fff0c0d6-bd89-4736-a984-a965f262948b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.652568 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.652328 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fff0c0d6-bd89-4736-a984-a965f262948b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.652568 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.652351 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fff0c0d6-bd89-4736-a984-a965f262948b-config-out\") pod \"alertmanager-main-0\" 
(UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.652568 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.652380 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fff0c0d6-bd89-4736-a984-a965f262948b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.652568 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.652415 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8vhc\" (UniqueName: \"kubernetes.io/projected/fff0c0d6-bd89-4736-a984-a965f262948b-kube-api-access-h8vhc\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.652766 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.652635 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fff0c0d6-bd89-4736-a984-a965f262948b-config-volume\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.652766 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.652663 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fff0c0d6-bd89-4736-a984-a965f262948b-web-config\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.652766 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.652689 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fff0c0d6-bd89-4736-a984-a965f262948b-alertmanager-main-db\") pod 
\"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.652766 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.652720 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fff0c0d6-bd89-4736-a984-a965f262948b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.652766 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.652747 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fff0c0d6-bd89-4736-a984-a965f262948b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.652998 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.652778 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fff0c0d6-bd89-4736-a984-a965f262948b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.652998 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.652826 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fff0c0d6-bd89-4736-a984-a965f262948b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.653317 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.653289 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fff0c0d6-bd89-4736-a984-a965f262948b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.653780 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.653749 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fff0c0d6-bd89-4736-a984-a965f262948b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.655813 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.655606 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fff0c0d6-bd89-4736-a984-a965f262948b-config-out\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.655813 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.655707 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fff0c0d6-bd89-4736-a984-a965f262948b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.655813 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.655790 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fff0c0d6-bd89-4736-a984-a965f262948b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.656213 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.656069 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fff0c0d6-bd89-4736-a984-a965f262948b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.656213 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.656146 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fff0c0d6-bd89-4736-a984-a965f262948b-web-config\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.656341 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.656235 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fff0c0d6-bd89-4736-a984-a965f262948b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.656341 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.656315 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fff0c0d6-bd89-4736-a984-a965f262948b-config-volume\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.656442 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.656385 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fff0c0d6-bd89-4736-a984-a965f262948b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.656792 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.656771 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fff0c0d6-bd89-4736-a984-a965f262948b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.658004 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.657984 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fff0c0d6-bd89-4736-a984-a965f262948b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.660532 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.660513 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8vhc\" (UniqueName: \"kubernetes.io/projected/fff0c0d6-bd89-4736-a984-a965f262948b-kube-api-access-h8vhc\") pod \"alertmanager-main-0\" (UID: \"fff0c0d6-bd89-4736-a984-a965f262948b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.774896 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.774852 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:10:14.900449 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:14.900407 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:10:14.902822 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:10:14.902793 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfff0c0d6_bd89_4736_a984_a965f262948b.slice/crio-837ca63e14f0af6f2fe4ed6789afbfdb8b5c2447ec8824bdc24355290b328572 WatchSource:0}: Error finding container 837ca63e14f0af6f2fe4ed6789afbfdb8b5c2447ec8824bdc24355290b328572: Status 404 returned error can't find the container with id 837ca63e14f0af6f2fe4ed6789afbfdb8b5c2447ec8824bdc24355290b328572 Apr 20 20:10:15.409538 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:15.409502 2571 generic.go:358] "Generic (PLEG): container finished" podID="fff0c0d6-bd89-4736-a984-a965f262948b" containerID="35cb9c1f7ab008fc6a6ab57f432ee481c1c042fd89b8fc055d072165b3c9aafa" exitCode=0 Apr 20 20:10:15.409949 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:15.409592 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fff0c0d6-bd89-4736-a984-a965f262948b","Type":"ContainerDied","Data":"35cb9c1f7ab008fc6a6ab57f432ee481c1c042fd89b8fc055d072165b3c9aafa"} Apr 20 20:10:15.409949 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:15.409627 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fff0c0d6-bd89-4736-a984-a965f262948b","Type":"ContainerStarted","Data":"837ca63e14f0af6f2fe4ed6789afbfdb8b5c2447ec8824bdc24355290b328572"} Apr 20 20:10:15.527551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:15.527526 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="689afe77-0baf-42f8-aabf-a28730e1c663" 
path="/var/lib/kubelet/pods/689afe77-0baf-42f8-aabf-a28730e1c663/volumes" Apr 20 20:10:16.419184 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:16.419145 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fff0c0d6-bd89-4736-a984-a965f262948b","Type":"ContainerStarted","Data":"862fd5de46c768799f7cee7a1d2f1f3fb84ba6cfa43f078c5dd40fa40368e041"} Apr 20 20:10:16.419586 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:16.419190 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fff0c0d6-bd89-4736-a984-a965f262948b","Type":"ContainerStarted","Data":"e79f0b19f60170e7734a34fbe0823e80548b1a3f37112f970a3a2fd57e379c8d"} Apr 20 20:10:16.419586 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:16.419205 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fff0c0d6-bd89-4736-a984-a965f262948b","Type":"ContainerStarted","Data":"c4eeefea0cea84bbf597eb7cb460e17076a5ca16cc0a36c8dbea182fef5c132a"} Apr 20 20:10:16.419586 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:16.419216 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fff0c0d6-bd89-4736-a984-a965f262948b","Type":"ContainerStarted","Data":"5ed09c6d8aa02cf4e98a136b6462933193a74acd65a9ebbadfe67d2f0806fe8d"} Apr 20 20:10:16.419586 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:16.419229 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fff0c0d6-bd89-4736-a984-a965f262948b","Type":"ContainerStarted","Data":"ab921a47aa9bffaf8d46cabe37b65fc6909da0b06d6da09c8a52c3a2fec96cd7"} Apr 20 20:10:16.419586 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:16.419241 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"fff0c0d6-bd89-4736-a984-a965f262948b","Type":"ContainerStarted","Data":"bca0c8a234cb0f4f05e1e985b87ac25e0263aaa53922e84828b4eb842aab1256"} Apr 20 20:10:16.445243 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:16.445186 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.445168087 podStartE2EDuration="2.445168087s" podCreationTimestamp="2026-04-20 20:10:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:10:16.443772225 +0000 UTC m=+257.483261771" watchObservedRunningTime="2026-04-20 20:10:16.445168087 +0000 UTC m=+257.484657630" Apr 20 20:10:21.109147 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:21.109069 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7d756b6df4-svlfv" Apr 20 20:10:21.109568 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:21.109231 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7d756b6df4-svlfv" Apr 20 20:10:21.113829 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:21.113804 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7d756b6df4-svlfv" Apr 20 20:10:21.437914 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:21.437840 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7d756b6df4-svlfv" Apr 20 20:10:21.486270 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:21.486236 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6dff456c99-rql5b"] Apr 20 20:10:46.505324 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:46.505250 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6dff456c99-rql5b" 
podUID="f6b91604-e5cc-4420-8304-b2cc7e9fbaf1" containerName="console" containerID="cri-o://8ac5710b3558a59b3975d7e0281b474112722ad9f28dc9e1c472dcc4c86f7725" gracePeriod=15 Apr 20 20:10:46.737594 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:46.737568 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6dff456c99-rql5b_f6b91604-e5cc-4420-8304-b2cc7e9fbaf1/console/0.log" Apr 20 20:10:46.737732 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:46.737633 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:10:46.806262 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:46.806184 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-trusted-ca-bundle\") pod \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " Apr 20 20:10:46.806416 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:46.806264 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-console-serving-cert\") pod \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " Apr 20 20:10:46.806416 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:46.806365 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-oauth-serving-cert\") pod \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " Apr 20 20:10:46.806416 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:46.806387 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-console-config\") pod \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " Apr 20 20:10:46.806416 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:46.806402 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-service-ca\") pod \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " Apr 20 20:10:46.806655 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:46.806444 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26pmq\" (UniqueName: \"kubernetes.io/projected/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-kube-api-access-26pmq\") pod \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " Apr 20 20:10:46.806655 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:46.806580 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-console-oauth-config\") pod \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\" (UID: \"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1\") " Apr 20 20:10:46.806767 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:46.806601 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f6b91604-e5cc-4420-8304-b2cc7e9fbaf1" (UID: "f6b91604-e5cc-4420-8304-b2cc7e9fbaf1"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:10:46.806819 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:46.806786 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f6b91604-e5cc-4420-8304-b2cc7e9fbaf1" (UID: "f6b91604-e5cc-4420-8304-b2cc7e9fbaf1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:10:46.806872 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:46.806853 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-console-config" (OuterVolumeSpecName: "console-config") pod "f6b91604-e5cc-4420-8304-b2cc7e9fbaf1" (UID: "f6b91604-e5cc-4420-8304-b2cc7e9fbaf1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:10:46.806920 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:46.806872 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-trusted-ca-bundle\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:10:46.806920 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:46.806890 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-oauth-serving-cert\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:10:46.806983 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:46.806943 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-service-ca" (OuterVolumeSpecName: "service-ca") pod "f6b91604-e5cc-4420-8304-b2cc7e9fbaf1" (UID: "f6b91604-e5cc-4420-8304-b2cc7e9fbaf1"). 
InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:10:46.808727 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:46.808706 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f6b91604-e5cc-4420-8304-b2cc7e9fbaf1" (UID: "f6b91604-e5cc-4420-8304-b2cc7e9fbaf1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:10:46.809070 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:46.809052 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f6b91604-e5cc-4420-8304-b2cc7e9fbaf1" (UID: "f6b91604-e5cc-4420-8304-b2cc7e9fbaf1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:10:46.809129 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:46.809062 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-kube-api-access-26pmq" (OuterVolumeSpecName: "kube-api-access-26pmq") pod "f6b91604-e5cc-4420-8304-b2cc7e9fbaf1" (UID: "f6b91604-e5cc-4420-8304-b2cc7e9fbaf1"). InnerVolumeSpecName "kube-api-access-26pmq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:10:46.907525 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:46.907489 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-console-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:10:46.907525 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:46.907517 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-service-ca\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:10:46.907525 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:46.907526 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-26pmq\" (UniqueName: \"kubernetes.io/projected/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-kube-api-access-26pmq\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:10:46.907525 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:46.907536 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-console-oauth-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:10:46.907777 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:46.907546 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1-console-serving-cert\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:10:47.509353 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:47.509325 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6dff456c99-rql5b_f6b91604-e5cc-4420-8304-b2cc7e9fbaf1/console/0.log" Apr 20 20:10:47.509824 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:47.509364 2571 generic.go:358] "Generic 
(PLEG): container finished" podID="f6b91604-e5cc-4420-8304-b2cc7e9fbaf1" containerID="8ac5710b3558a59b3975d7e0281b474112722ad9f28dc9e1c472dcc4c86f7725" exitCode=2 Apr 20 20:10:47.509824 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:47.509456 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6dff456c99-rql5b" Apr 20 20:10:47.509824 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:47.509466 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dff456c99-rql5b" event={"ID":"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1","Type":"ContainerDied","Data":"8ac5710b3558a59b3975d7e0281b474112722ad9f28dc9e1c472dcc4c86f7725"} Apr 20 20:10:47.509824 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:47.509513 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dff456c99-rql5b" event={"ID":"f6b91604-e5cc-4420-8304-b2cc7e9fbaf1","Type":"ContainerDied","Data":"62b88b22609188c2b48f3cf85287fe2b99ce8620da5a0c6767428eb4385e8712"} Apr 20 20:10:47.509824 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:47.509537 2571 scope.go:117] "RemoveContainer" containerID="8ac5710b3558a59b3975d7e0281b474112722ad9f28dc9e1c472dcc4c86f7725" Apr 20 20:10:47.517600 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:47.517577 2571 scope.go:117] "RemoveContainer" containerID="8ac5710b3558a59b3975d7e0281b474112722ad9f28dc9e1c472dcc4c86f7725" Apr 20 20:10:47.517829 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:10:47.517812 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ac5710b3558a59b3975d7e0281b474112722ad9f28dc9e1c472dcc4c86f7725\": container with ID starting with 8ac5710b3558a59b3975d7e0281b474112722ad9f28dc9e1c472dcc4c86f7725 not found: ID does not exist" containerID="8ac5710b3558a59b3975d7e0281b474112722ad9f28dc9e1c472dcc4c86f7725" Apr 20 20:10:47.517872 ip-10-0-135-184 kubenswrapper[2571]: I0420 
20:10:47.517836 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ac5710b3558a59b3975d7e0281b474112722ad9f28dc9e1c472dcc4c86f7725"} err="failed to get container status \"8ac5710b3558a59b3975d7e0281b474112722ad9f28dc9e1c472dcc4c86f7725\": rpc error: code = NotFound desc = could not find container \"8ac5710b3558a59b3975d7e0281b474112722ad9f28dc9e1c472dcc4c86f7725\": container with ID starting with 8ac5710b3558a59b3975d7e0281b474112722ad9f28dc9e1c472dcc4c86f7725 not found: ID does not exist"
Apr 20 20:10:47.530038 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:47.530010 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6dff456c99-rql5b"]
Apr 20 20:10:47.534174 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:47.534143 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6dff456c99-rql5b"]
Apr 20 20:10:49.526951 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:49.526911 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6b91604-e5cc-4420-8304-b2cc7e9fbaf1" path="/var/lib/kubelet/pods/f6b91604-e5cc-4420-8304-b2cc7e9fbaf1/volumes"
Apr 20 20:10:58.317686 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:58.317650 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss"]
Apr 20 20:10:58.318193 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:58.318083 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6b91604-e5cc-4420-8304-b2cc7e9fbaf1" containerName="console"
Apr 20 20:10:58.318193 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:58.318102 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6b91604-e5cc-4420-8304-b2cc7e9fbaf1" containerName="console"
Apr 20 20:10:58.318193 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:58.318172 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6b91604-e5cc-4420-8304-b2cc7e9fbaf1" containerName="console"
Apr 20 20:10:58.322644 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:58.322624 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss"
Apr 20 20:10:58.325308 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:58.325289 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wwghg\""
Apr 20 20:10:58.325467 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:58.325287 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 20:10:58.326401 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:58.326385 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 20:10:58.329096 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:58.329069 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss"]
Apr 20 20:10:58.393139 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:58.393081 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81957881-eddf-460e-8aba-87a673a0dcaf-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss\" (UID: \"81957881-eddf-460e-8aba-87a673a0dcaf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss"
Apr 20 20:10:58.393139 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:58.393153 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrmtt\" (UniqueName: \"kubernetes.io/projected/81957881-eddf-460e-8aba-87a673a0dcaf-kube-api-access-vrmtt\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss\" (UID: \"81957881-eddf-460e-8aba-87a673a0dcaf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss"
Apr 20 20:10:58.393363 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:58.393193 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81957881-eddf-460e-8aba-87a673a0dcaf-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss\" (UID: \"81957881-eddf-460e-8aba-87a673a0dcaf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss"
Apr 20 20:10:58.494416 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:58.494381 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81957881-eddf-460e-8aba-87a673a0dcaf-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss\" (UID: \"81957881-eddf-460e-8aba-87a673a0dcaf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss"
Apr 20 20:10:58.494619 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:58.494467 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrmtt\" (UniqueName: \"kubernetes.io/projected/81957881-eddf-460e-8aba-87a673a0dcaf-kube-api-access-vrmtt\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss\" (UID: \"81957881-eddf-460e-8aba-87a673a0dcaf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss"
Apr 20 20:10:58.494619 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:58.494503 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81957881-eddf-460e-8aba-87a673a0dcaf-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss\" (UID: \"81957881-eddf-460e-8aba-87a673a0dcaf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss"
Apr 20 20:10:58.494872 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:58.494850 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81957881-eddf-460e-8aba-87a673a0dcaf-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss\" (UID: \"81957881-eddf-460e-8aba-87a673a0dcaf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss"
Apr 20 20:10:58.494933 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:58.494882 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81957881-eddf-460e-8aba-87a673a0dcaf-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss\" (UID: \"81957881-eddf-460e-8aba-87a673a0dcaf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss"
Apr 20 20:10:58.503669 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:58.503643 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrmtt\" (UniqueName: \"kubernetes.io/projected/81957881-eddf-460e-8aba-87a673a0dcaf-kube-api-access-vrmtt\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss\" (UID: \"81957881-eddf-460e-8aba-87a673a0dcaf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss"
Apr 20 20:10:58.633072 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:58.632982 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss"
Apr 20 20:10:58.752410 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:58.752387 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss"]
Apr 20 20:10:58.754656 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:10:58.754632 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81957881_eddf_460e_8aba_87a673a0dcaf.slice/crio-1b02e6ab7a32d3be3b4925f1465deeb92bbf8c37ba1fc47604c656c83e6266a4 WatchSource:0}: Error finding container 1b02e6ab7a32d3be3b4925f1465deeb92bbf8c37ba1fc47604c656c83e6266a4: Status 404 returned error can't find the container with id 1b02e6ab7a32d3be3b4925f1465deeb92bbf8c37ba1fc47604c656c83e6266a4
Apr 20 20:10:59.408611 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:59.408584 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kldp5_997e9539-5288-4af5-92f4-55d8ccefbbf7/ovn-acl-logging/0.log"
Apr 20 20:10:59.409521 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:59.409493 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kldp5_997e9539-5288-4af5-92f4-55d8ccefbbf7/ovn-acl-logging/0.log"
Apr 20 20:10:59.412499 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:59.412478 2571 kubelet.go:1628] "Image garbage collection succeeded"
Apr 20 20:10:59.546226 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:10:59.546091 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss" event={"ID":"81957881-eddf-460e-8aba-87a673a0dcaf","Type":"ContainerStarted","Data":"1b02e6ab7a32d3be3b4925f1465deeb92bbf8c37ba1fc47604c656c83e6266a4"}
Apr 20 20:11:04.561651 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:04.561609 2571 generic.go:358] "Generic (PLEG): container finished" podID="81957881-eddf-460e-8aba-87a673a0dcaf" containerID="ae0007be16304993d6d4d7c79c19431c01b86d1e12fb76530f73c8fce0901bb3" exitCode=0
Apr 20 20:11:04.562114 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:04.561678 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss" event={"ID":"81957881-eddf-460e-8aba-87a673a0dcaf","Type":"ContainerDied","Data":"ae0007be16304993d6d4d7c79c19431c01b86d1e12fb76530f73c8fce0901bb3"}
Apr 20 20:11:04.562704 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:04.562689 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 20:11:06.568999 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:06.568970 2571 generic.go:358] "Generic (PLEG): container finished" podID="81957881-eddf-460e-8aba-87a673a0dcaf" containerID="f35904ce9db82570912642ca4007fb4db77d62a7ffd469bdef3bd078295a0b59" exitCode=0
Apr 20 20:11:06.569338 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:06.569017 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss" event={"ID":"81957881-eddf-460e-8aba-87a673a0dcaf","Type":"ContainerDied","Data":"f35904ce9db82570912642ca4007fb4db77d62a7ffd469bdef3bd078295a0b59"}
Apr 20 20:11:13.591854 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:13.591822 2571 generic.go:358] "Generic (PLEG): container finished" podID="81957881-eddf-460e-8aba-87a673a0dcaf" containerID="2a45dbb8c199b546a521465e18482246f6d2fe213b920d6997d88c91d9970801" exitCode=0
Apr 20 20:11:13.592231 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:13.591901 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss" event={"ID":"81957881-eddf-460e-8aba-87a673a0dcaf","Type":"ContainerDied","Data":"2a45dbb8c199b546a521465e18482246f6d2fe213b920d6997d88c91d9970801"}
Apr 20 20:11:14.713688 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:14.713664 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss"
Apr 20 20:11:14.739651 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:14.739623 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81957881-eddf-460e-8aba-87a673a0dcaf-bundle\") pod \"81957881-eddf-460e-8aba-87a673a0dcaf\" (UID: \"81957881-eddf-460e-8aba-87a673a0dcaf\") "
Apr 20 20:11:14.739800 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:14.739717 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81957881-eddf-460e-8aba-87a673a0dcaf-util\") pod \"81957881-eddf-460e-8aba-87a673a0dcaf\" (UID: \"81957881-eddf-460e-8aba-87a673a0dcaf\") "
Apr 20 20:11:14.739800 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:14.739751 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrmtt\" (UniqueName: \"kubernetes.io/projected/81957881-eddf-460e-8aba-87a673a0dcaf-kube-api-access-vrmtt\") pod \"81957881-eddf-460e-8aba-87a673a0dcaf\" (UID: \"81957881-eddf-460e-8aba-87a673a0dcaf\") "
Apr 20 20:11:14.740301 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:14.740267 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81957881-eddf-460e-8aba-87a673a0dcaf-bundle" (OuterVolumeSpecName: "bundle") pod "81957881-eddf-460e-8aba-87a673a0dcaf" (UID: "81957881-eddf-460e-8aba-87a673a0dcaf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:11:14.742173 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:14.742139 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81957881-eddf-460e-8aba-87a673a0dcaf-kube-api-access-vrmtt" (OuterVolumeSpecName: "kube-api-access-vrmtt") pod "81957881-eddf-460e-8aba-87a673a0dcaf" (UID: "81957881-eddf-460e-8aba-87a673a0dcaf"). InnerVolumeSpecName "kube-api-access-vrmtt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:11:14.744108 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:14.744087 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81957881-eddf-460e-8aba-87a673a0dcaf-util" (OuterVolumeSpecName: "util") pod "81957881-eddf-460e-8aba-87a673a0dcaf" (UID: "81957881-eddf-460e-8aba-87a673a0dcaf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:11:14.840529 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:14.840484 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vrmtt\" (UniqueName: \"kubernetes.io/projected/81957881-eddf-460e-8aba-87a673a0dcaf-kube-api-access-vrmtt\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:11:14.840529 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:14.840524 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81957881-eddf-460e-8aba-87a673a0dcaf-bundle\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:11:14.840529 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:14.840538 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81957881-eddf-460e-8aba-87a673a0dcaf-util\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:11:15.598389 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:15.598363 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss"
Apr 20 20:11:15.598573 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:15.598359 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cvgnss" event={"ID":"81957881-eddf-460e-8aba-87a673a0dcaf","Type":"ContainerDied","Data":"1b02e6ab7a32d3be3b4925f1465deeb92bbf8c37ba1fc47604c656c83e6266a4"}
Apr 20 20:11:15.598573 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:15.598473 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b02e6ab7a32d3be3b4925f1465deeb92bbf8c37ba1fc47604c656c83e6266a4"
Apr 20 20:11:19.954945 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:19.954909 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xvnr2"]
Apr 20 20:11:19.955416 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:19.955315 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81957881-eddf-460e-8aba-87a673a0dcaf" containerName="util"
Apr 20 20:11:19.955416 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:19.955329 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="81957881-eddf-460e-8aba-87a673a0dcaf" containerName="util"
Apr 20 20:11:19.955416 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:19.955339 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81957881-eddf-460e-8aba-87a673a0dcaf" containerName="pull"
Apr 20 20:11:19.955416 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:19.955347 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="81957881-eddf-460e-8aba-87a673a0dcaf" containerName="pull"
Apr 20 20:11:19.955416 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:19.955371 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81957881-eddf-460e-8aba-87a673a0dcaf" containerName="extract"
Apr 20 20:11:19.955416 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:19.955379 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="81957881-eddf-460e-8aba-87a673a0dcaf" containerName="extract"
Apr 20 20:11:19.955681 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:19.955453 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="81957881-eddf-460e-8aba-87a673a0dcaf" containerName="extract"
Apr 20 20:11:19.996231 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:19.996201 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xvnr2"]
Apr 20 20:11:19.996375 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:19.996315 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xvnr2"
Apr 20 20:11:20.004191 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:20.004164 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-9sxhj\""
Apr 20 20:11:20.004477 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:20.004443 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 20 20:11:20.004599 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:20.004561 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 20 20:11:20.004599 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:20.004588 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 20 20:11:20.085730 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:20.085700 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/efb263a7-d50a-4e29-8a17-a5d09d5c7bbe-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xvnr2\" (UID: \"efb263a7-d50a-4e29-8a17-a5d09d5c7bbe\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xvnr2"
Apr 20 20:11:20.085906 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:20.085769 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5cjx\" (UniqueName: \"kubernetes.io/projected/efb263a7-d50a-4e29-8a17-a5d09d5c7bbe-kube-api-access-s5cjx\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xvnr2\" (UID: \"efb263a7-d50a-4e29-8a17-a5d09d5c7bbe\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xvnr2"
Apr 20 20:11:20.186132 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:20.186095 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/efb263a7-d50a-4e29-8a17-a5d09d5c7bbe-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xvnr2\" (UID: \"efb263a7-d50a-4e29-8a17-a5d09d5c7bbe\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xvnr2"
Apr 20 20:11:20.186294 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:20.186165 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5cjx\" (UniqueName: \"kubernetes.io/projected/efb263a7-d50a-4e29-8a17-a5d09d5c7bbe-kube-api-access-s5cjx\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xvnr2\" (UID: \"efb263a7-d50a-4e29-8a17-a5d09d5c7bbe\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xvnr2"
Apr 20 20:11:20.188745 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:20.188725 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/efb263a7-d50a-4e29-8a17-a5d09d5c7bbe-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xvnr2\" (UID: \"efb263a7-d50a-4e29-8a17-a5d09d5c7bbe\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xvnr2"
Apr 20 20:11:20.197936 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:20.197911 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5cjx\" (UniqueName: \"kubernetes.io/projected/efb263a7-d50a-4e29-8a17-a5d09d5c7bbe-kube-api-access-s5cjx\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xvnr2\" (UID: \"efb263a7-d50a-4e29-8a17-a5d09d5c7bbe\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xvnr2"
Apr 20 20:11:20.306315 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:20.306283 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xvnr2"
Apr 20 20:11:20.428110 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:20.428075 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xvnr2"]
Apr 20 20:11:20.430817 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:11:20.430785 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefb263a7_d50a_4e29_8a17_a5d09d5c7bbe.slice/crio-acd2f2b849107ebb3f934329e599cfd551534a96891eb08fb47604fac9cc6631 WatchSource:0}: Error finding container acd2f2b849107ebb3f934329e599cfd551534a96891eb08fb47604fac9cc6631: Status 404 returned error can't find the container with id acd2f2b849107ebb3f934329e599cfd551534a96891eb08fb47604fac9cc6631
Apr 20 20:11:20.615002 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:20.614922 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xvnr2" event={"ID":"efb263a7-d50a-4e29-8a17-a5d09d5c7bbe","Type":"ContainerStarted","Data":"acd2f2b849107ebb3f934329e599cfd551534a96891eb08fb47604fac9cc6631"}
Apr 20 20:11:25.346664 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:25.346633 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-dx4nr"]
Apr 20 20:11:25.350134 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:25.350112 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-dx4nr"
Apr 20 20:11:25.352941 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:25.352915 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 20 20:11:25.353043 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:25.352919 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 20 20:11:25.353043 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:25.352971 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-m9cwt\""
Apr 20 20:11:25.357959 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:25.357939 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-dx4nr"]
Apr 20 20:11:25.433482 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:25.433451 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28f2q\" (UniqueName: \"kubernetes.io/projected/3120646b-079d-424b-a31c-2db07264a522-kube-api-access-28f2q\") pod \"keda-operator-ffbb595cb-dx4nr\" (UID: \"3120646b-079d-424b-a31c-2db07264a522\") " pod="openshift-keda/keda-operator-ffbb595cb-dx4nr"
Apr 20 20:11:25.433625 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:25.433497 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/3120646b-079d-424b-a31c-2db07264a522-cabundle0\") pod \"keda-operator-ffbb595cb-dx4nr\" (UID: \"3120646b-079d-424b-a31c-2db07264a522\") " pod="openshift-keda/keda-operator-ffbb595cb-dx4nr"
Apr 20 20:11:25.433625 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:25.433535 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3120646b-079d-424b-a31c-2db07264a522-certificates\") pod \"keda-operator-ffbb595cb-dx4nr\" (UID: \"3120646b-079d-424b-a31c-2db07264a522\") " pod="openshift-keda/keda-operator-ffbb595cb-dx4nr"
Apr 20 20:11:25.534477 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:25.534448 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28f2q\" (UniqueName: \"kubernetes.io/projected/3120646b-079d-424b-a31c-2db07264a522-kube-api-access-28f2q\") pod \"keda-operator-ffbb595cb-dx4nr\" (UID: \"3120646b-079d-424b-a31c-2db07264a522\") " pod="openshift-keda/keda-operator-ffbb595cb-dx4nr"
Apr 20 20:11:25.534614 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:25.534487 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/3120646b-079d-424b-a31c-2db07264a522-cabundle0\") pod \"keda-operator-ffbb595cb-dx4nr\" (UID: \"3120646b-079d-424b-a31c-2db07264a522\") " pod="openshift-keda/keda-operator-ffbb595cb-dx4nr"
Apr 20 20:11:25.534614 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:25.534512 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3120646b-079d-424b-a31c-2db07264a522-certificates\") pod \"keda-operator-ffbb595cb-dx4nr\" (UID: \"3120646b-079d-424b-a31c-2db07264a522\") " pod="openshift-keda/keda-operator-ffbb595cb-dx4nr"
Apr 20 20:11:25.534614 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:11:25.534593 2571 secret.go:281] references non-existent secret key: ca.crt
Apr 20 20:11:25.534614 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:11:25.534603 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 20 20:11:25.534614 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:11:25.534612 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-dx4nr: references non-existent secret key: ca.crt
Apr 20 20:11:25.534795 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:11:25.534665 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3120646b-079d-424b-a31c-2db07264a522-certificates podName:3120646b-079d-424b-a31c-2db07264a522 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:26.034651422 +0000 UTC m=+327.074140946 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3120646b-079d-424b-a31c-2db07264a522-certificates") pod "keda-operator-ffbb595cb-dx4nr" (UID: "3120646b-079d-424b-a31c-2db07264a522") : references non-existent secret key: ca.crt
Apr 20 20:11:25.535065 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:25.535047 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/3120646b-079d-424b-a31c-2db07264a522-cabundle0\") pod \"keda-operator-ffbb595cb-dx4nr\" (UID: \"3120646b-079d-424b-a31c-2db07264a522\") " pod="openshift-keda/keda-operator-ffbb595cb-dx4nr"
Apr 20 20:11:25.555994 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:25.555962 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28f2q\" (UniqueName: \"kubernetes.io/projected/3120646b-079d-424b-a31c-2db07264a522-kube-api-access-28f2q\") pod \"keda-operator-ffbb595cb-dx4nr\" (UID: \"3120646b-079d-424b-a31c-2db07264a522\") " pod="openshift-keda/keda-operator-ffbb595cb-dx4nr"
Apr 20 20:11:25.632052 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:25.631965 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xvnr2" event={"ID":"efb263a7-d50a-4e29-8a17-a5d09d5c7bbe","Type":"ContainerStarted","Data":"d2555ba68abe7d600770905626e015a34e785f218ce669cff01d0448a93617fe"}
Apr 20 20:11:25.632052 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:25.632027 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xvnr2"
Apr 20 20:11:25.665109 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:25.665057 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xvnr2" podStartSLOduration=2.344266856 podStartE2EDuration="6.665040324s" podCreationTimestamp="2026-04-20 20:11:19 +0000 UTC" firstStartedPulling="2026-04-20 20:11:20.43262161 +0000 UTC m=+321.472111133" lastFinishedPulling="2026-04-20 20:11:24.753395075 +0000 UTC m=+325.792884601" observedRunningTime="2026-04-20 20:11:25.662066873 +0000 UTC m=+326.701556419" watchObservedRunningTime="2026-04-20 20:11:25.665040324 +0000 UTC m=+326.704529871"
Apr 20 20:11:26.039242 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:26.039208 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3120646b-079d-424b-a31c-2db07264a522-certificates\") pod \"keda-operator-ffbb595cb-dx4nr\" (UID: \"3120646b-079d-424b-a31c-2db07264a522\") " pod="openshift-keda/keda-operator-ffbb595cb-dx4nr"
Apr 20 20:11:26.039403 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:11:26.039359 2571 secret.go:281] references non-existent secret key: ca.crt
Apr 20 20:11:26.039403 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:11:26.039379 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 20 20:11:26.039403 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:11:26.039389 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-dx4nr: references non-existent secret key: ca.crt
Apr 20 20:11:26.039532 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:11:26.039467 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3120646b-079d-424b-a31c-2db07264a522-certificates podName:3120646b-079d-424b-a31c-2db07264a522 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:27.039449987 +0000 UTC m=+328.078939512 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3120646b-079d-424b-a31c-2db07264a522-certificates") pod "keda-operator-ffbb595cb-dx4nr" (UID: "3120646b-079d-424b-a31c-2db07264a522") : references non-existent secret key: ca.crt
Apr 20 20:11:27.047627 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:27.047585 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3120646b-079d-424b-a31c-2db07264a522-certificates\") pod \"keda-operator-ffbb595cb-dx4nr\" (UID: \"3120646b-079d-424b-a31c-2db07264a522\") " pod="openshift-keda/keda-operator-ffbb595cb-dx4nr"
Apr 20 20:11:27.048042 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:11:27.047724 2571 secret.go:281] references non-existent secret key: ca.crt
Apr 20 20:11:27.048042 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:11:27.047741 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 20 20:11:27.048042 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:11:27.047752 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-dx4nr: references non-existent secret key: ca.crt
Apr 20 20:11:27.048042 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:11:27.047812 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3120646b-079d-424b-a31c-2db07264a522-certificates podName:3120646b-079d-424b-a31c-2db07264a522 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:29.047797387 +0000 UTC m=+330.087286911 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3120646b-079d-424b-a31c-2db07264a522-certificates") pod "keda-operator-ffbb595cb-dx4nr" (UID: "3120646b-079d-424b-a31c-2db07264a522") : references non-existent secret key: ca.crt
Apr 20 20:11:29.063752 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:29.063725 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3120646b-079d-424b-a31c-2db07264a522-certificates\") pod \"keda-operator-ffbb595cb-dx4nr\" (UID: \"3120646b-079d-424b-a31c-2db07264a522\") " pod="openshift-keda/keda-operator-ffbb595cb-dx4nr"
Apr 20 20:11:29.064127 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:11:29.063835 2571 secret.go:281] references non-existent secret key: ca.crt
Apr 20 20:11:29.064127 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:11:29.063847 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 20 20:11:29.064127 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:11:29.063855 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-dx4nr: references non-existent secret key: ca.crt
Apr 20 20:11:29.064127 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:11:29.063914 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3120646b-079d-424b-a31c-2db07264a522-certificates podName:3120646b-079d-424b-a31c-2db07264a522 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:33.06390134 +0000 UTC m=+334.103390863 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3120646b-079d-424b-a31c-2db07264a522-certificates") pod "keda-operator-ffbb595cb-dx4nr" (UID: "3120646b-079d-424b-a31c-2db07264a522") : references non-existent secret key: ca.crt
Apr 20 20:11:33.096448 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:33.096379 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3120646b-079d-424b-a31c-2db07264a522-certificates\") pod \"keda-operator-ffbb595cb-dx4nr\" (UID: \"3120646b-079d-424b-a31c-2db07264a522\") " pod="openshift-keda/keda-operator-ffbb595cb-dx4nr"
Apr 20 20:11:33.099093 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:33.099065 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3120646b-079d-424b-a31c-2db07264a522-certificates\") pod \"keda-operator-ffbb595cb-dx4nr\" (UID: \"3120646b-079d-424b-a31c-2db07264a522\") " pod="openshift-keda/keda-operator-ffbb595cb-dx4nr"
Apr 20 20:11:33.160851 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:33.160803 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-dx4nr"
Apr 20 20:11:33.282687 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:33.282662 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-dx4nr"]
Apr 20 20:11:33.284821 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:11:33.284784 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3120646b_079d_424b_a31c_2db07264a522.slice/crio-7497b0be6fb6113de5867e669a883003a66d72103cec9f59a6e5a2e8c6a5d1f8 WatchSource:0}: Error finding container 7497b0be6fb6113de5867e669a883003a66d72103cec9f59a6e5a2e8c6a5d1f8: Status 404 returned error can't find the container with id 7497b0be6fb6113de5867e669a883003a66d72103cec9f59a6e5a2e8c6a5d1f8
Apr 20 20:11:33.659917 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:33.659880 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-dx4nr" event={"ID":"3120646b-079d-424b-a31c-2db07264a522","Type":"ContainerStarted","Data":"7497b0be6fb6113de5867e669a883003a66d72103cec9f59a6e5a2e8c6a5d1f8"}
Apr 20 20:11:36.672289 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:36.672201 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-dx4nr" event={"ID":"3120646b-079d-424b-a31c-2db07264a522","Type":"ContainerStarted","Data":"35e5029c0ac498520edc5c30386eda77cd9639d0677413c4539818c4a92af96a"}
Apr 20 20:11:36.672664 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:36.672309 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-dx4nr"
Apr 20 20:11:36.710514 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:36.710468 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-dx4nr" podStartSLOduration=8.981475094 podStartE2EDuration="11.710453622s" podCreationTimestamp="2026-04-20 20:11:25 +0000 UTC" firstStartedPulling="2026-04-20 20:11:33.286184181 +0000 UTC m=+334.325673704" lastFinishedPulling="2026-04-20 20:11:36.015162704 +0000 UTC m=+337.054652232" observedRunningTime="2026-04-20 20:11:36.708360948 +0000 UTC m=+337.747850492" watchObservedRunningTime="2026-04-20 20:11:36.710453622 +0000 UTC m=+337.749943163"
Apr 20 20:11:46.637336 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:46.637304 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xvnr2"
Apr 20 20:11:57.678411 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:11:57.678340 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-dx4nr"
Apr 20 20:12:32.370071 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.370028 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-gzfb6"]
Apr 20 20:12:32.373191 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.373171 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-8zbhw"]
Apr 20 20:12:32.373354 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.373334 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-gzfb6"
Apr 20 20:12:32.376120 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.376105 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8zbhw" Apr 20 20:12:32.378752 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.378729 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 20 20:12:32.379274 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.379257 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 20 20:12:32.381400 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.381381 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-xdksc\"" Apr 20 20:12:32.381868 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.381851 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 20 20:12:32.382084 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.382063 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-gzfb6"] Apr 20 20:12:32.384297 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.384279 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 20 20:12:32.385536 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.385518 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-szhrw\"" Apr 20 20:12:32.391661 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.391639 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-8zbhw"] Apr 20 20:12:32.454272 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.454242 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-xfmm8"] Apr 20 20:12:32.457287 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.457265 2571 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-xfmm8" Apr 20 20:12:32.461319 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.461301 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-h9zb4\"" Apr 20 20:12:32.461434 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.461334 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 20 20:12:32.466610 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.466587 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2fa9258-8ac4-437c-ae86-0be123a966e4-cert\") pod \"kserve-controller-manager-6f655776dd-gzfb6\" (UID: \"d2fa9258-8ac4-437c-ae86-0be123a966e4\") " pod="kserve/kserve-controller-manager-6f655776dd-gzfb6" Apr 20 20:12:32.466724 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.466624 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e7cfa8d-c464-4632-8bd5-63a6a9a8bf66-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-8zbhw\" (UID: \"6e7cfa8d-c464-4632-8bd5-63a6a9a8bf66\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8zbhw" Apr 20 20:12:32.466724 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.466662 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz7lw\" (UniqueName: \"kubernetes.io/projected/d2fa9258-8ac4-437c-ae86-0be123a966e4-kube-api-access-mz7lw\") pod \"kserve-controller-manager-6f655776dd-gzfb6\" (UID: \"d2fa9258-8ac4-437c-ae86-0be123a966e4\") " pod="kserve/kserve-controller-manager-6f655776dd-gzfb6" Apr 20 20:12:32.466833 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.466729 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-dwn9x\" (UniqueName: \"kubernetes.io/projected/6e7cfa8d-c464-4632-8bd5-63a6a9a8bf66-kube-api-access-dwn9x\") pod \"llmisvc-controller-manager-68cc5db7c4-8zbhw\" (UID: \"6e7cfa8d-c464-4632-8bd5-63a6a9a8bf66\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8zbhw" Apr 20 20:12:32.473398 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.473376 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-xfmm8"] Apr 20 20:12:32.567769 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.567740 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/14c1f75f-12c2-4d61-86c7-49b01cae6c40-data\") pod \"seaweedfs-86cc847c5c-xfmm8\" (UID: \"14c1f75f-12c2-4d61-86c7-49b01cae6c40\") " pod="kserve/seaweedfs-86cc847c5c-xfmm8" Apr 20 20:12:32.567980 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.567785 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2fa9258-8ac4-437c-ae86-0be123a966e4-cert\") pod \"kserve-controller-manager-6f655776dd-gzfb6\" (UID: \"d2fa9258-8ac4-437c-ae86-0be123a966e4\") " pod="kserve/kserve-controller-manager-6f655776dd-gzfb6" Apr 20 20:12:32.567980 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.567807 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e7cfa8d-c464-4632-8bd5-63a6a9a8bf66-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-8zbhw\" (UID: \"6e7cfa8d-c464-4632-8bd5-63a6a9a8bf66\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8zbhw" Apr 20 20:12:32.567980 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:12:32.567898 2571 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 20 20:12:32.567980 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.567911 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mz7lw\" (UniqueName: \"kubernetes.io/projected/d2fa9258-8ac4-437c-ae86-0be123a966e4-kube-api-access-mz7lw\") pod \"kserve-controller-manager-6f655776dd-gzfb6\" (UID: \"d2fa9258-8ac4-437c-ae86-0be123a966e4\") " pod="kserve/kserve-controller-manager-6f655776dd-gzfb6" Apr 20 20:12:32.567980 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:12:32.567955 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e7cfa8d-c464-4632-8bd5-63a6a9a8bf66-cert podName:6e7cfa8d-c464-4632-8bd5-63a6a9a8bf66 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:33.067932681 +0000 UTC m=+394.107422210 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e7cfa8d-c464-4632-8bd5-63a6a9a8bf66-cert") pod "llmisvc-controller-manager-68cc5db7c4-8zbhw" (UID: "6e7cfa8d-c464-4632-8bd5-63a6a9a8bf66") : secret "llmisvc-webhook-server-cert" not found Apr 20 20:12:32.568272 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.567991 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwn9x\" (UniqueName: \"kubernetes.io/projected/6e7cfa8d-c464-4632-8bd5-63a6a9a8bf66-kube-api-access-dwn9x\") pod \"llmisvc-controller-manager-68cc5db7c4-8zbhw\" (UID: \"6e7cfa8d-c464-4632-8bd5-63a6a9a8bf66\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8zbhw" Apr 20 20:12:32.568272 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.568052 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl7lw\" (UniqueName: \"kubernetes.io/projected/14c1f75f-12c2-4d61-86c7-49b01cae6c40-kube-api-access-rl7lw\") pod \"seaweedfs-86cc847c5c-xfmm8\" (UID: \"14c1f75f-12c2-4d61-86c7-49b01cae6c40\") " pod="kserve/seaweedfs-86cc847c5c-xfmm8" Apr 20 20:12:32.570997 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.570977 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2fa9258-8ac4-437c-ae86-0be123a966e4-cert\") pod \"kserve-controller-manager-6f655776dd-gzfb6\" (UID: \"d2fa9258-8ac4-437c-ae86-0be123a966e4\") " pod="kserve/kserve-controller-manager-6f655776dd-gzfb6" Apr 20 20:12:32.578504 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.578482 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwn9x\" (UniqueName: \"kubernetes.io/projected/6e7cfa8d-c464-4632-8bd5-63a6a9a8bf66-kube-api-access-dwn9x\") pod \"llmisvc-controller-manager-68cc5db7c4-8zbhw\" (UID: \"6e7cfa8d-c464-4632-8bd5-63a6a9a8bf66\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8zbhw" Apr 20 20:12:32.578618 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.578481 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz7lw\" (UniqueName: \"kubernetes.io/projected/d2fa9258-8ac4-437c-ae86-0be123a966e4-kube-api-access-mz7lw\") pod \"kserve-controller-manager-6f655776dd-gzfb6\" (UID: \"d2fa9258-8ac4-437c-ae86-0be123a966e4\") " pod="kserve/kserve-controller-manager-6f655776dd-gzfb6" Apr 20 20:12:32.668626 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.668555 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/14c1f75f-12c2-4d61-86c7-49b01cae6c40-data\") pod \"seaweedfs-86cc847c5c-xfmm8\" (UID: \"14c1f75f-12c2-4d61-86c7-49b01cae6c40\") " pod="kserve/seaweedfs-86cc847c5c-xfmm8" Apr 20 20:12:32.668752 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.668638 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rl7lw\" (UniqueName: \"kubernetes.io/projected/14c1f75f-12c2-4d61-86c7-49b01cae6c40-kube-api-access-rl7lw\") pod \"seaweedfs-86cc847c5c-xfmm8\" (UID: \"14c1f75f-12c2-4d61-86c7-49b01cae6c40\") " pod="kserve/seaweedfs-86cc847c5c-xfmm8" Apr 20 
20:12:32.668914 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.668895 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/14c1f75f-12c2-4d61-86c7-49b01cae6c40-data\") pod \"seaweedfs-86cc847c5c-xfmm8\" (UID: \"14c1f75f-12c2-4d61-86c7-49b01cae6c40\") " pod="kserve/seaweedfs-86cc847c5c-xfmm8" Apr 20 20:12:32.679902 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.679871 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl7lw\" (UniqueName: \"kubernetes.io/projected/14c1f75f-12c2-4d61-86c7-49b01cae6c40-kube-api-access-rl7lw\") pod \"seaweedfs-86cc847c5c-xfmm8\" (UID: \"14c1f75f-12c2-4d61-86c7-49b01cae6c40\") " pod="kserve/seaweedfs-86cc847c5c-xfmm8" Apr 20 20:12:32.684705 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.684680 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-gzfb6" Apr 20 20:12:32.767774 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.767740 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-xfmm8" Apr 20 20:12:32.803683 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.803659 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-gzfb6"] Apr 20 20:12:32.842789 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.842757 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-gzfb6" event={"ID":"d2fa9258-8ac4-437c-ae86-0be123a966e4","Type":"ContainerStarted","Data":"1fe499ce76e1a92b32bf2f55389fc0d4a17cb7bbb9722e765489b9d5d1ab0853"} Apr 20 20:12:32.890484 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:32.890362 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-xfmm8"] Apr 20 20:12:32.892951 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:12:32.892923 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14c1f75f_12c2_4d61_86c7_49b01cae6c40.slice/crio-1165379b49786f554440695fccb54ee6b7a2d85b939640edc0d619e6e413bf8e WatchSource:0}: Error finding container 1165379b49786f554440695fccb54ee6b7a2d85b939640edc0d619e6e413bf8e: Status 404 returned error can't find the container with id 1165379b49786f554440695fccb54ee6b7a2d85b939640edc0d619e6e413bf8e Apr 20 20:12:33.072069 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:33.072032 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e7cfa8d-c464-4632-8bd5-63a6a9a8bf66-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-8zbhw\" (UID: \"6e7cfa8d-c464-4632-8bd5-63a6a9a8bf66\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8zbhw" Apr 20 20:12:33.074611 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:33.074586 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e7cfa8d-c464-4632-8bd5-63a6a9a8bf66-cert\") 
pod \"llmisvc-controller-manager-68cc5db7c4-8zbhw\" (UID: \"6e7cfa8d-c464-4632-8bd5-63a6a9a8bf66\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8zbhw" Apr 20 20:12:33.291596 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:33.291561 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8zbhw" Apr 20 20:12:33.464319 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:33.464263 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-8zbhw"] Apr 20 20:12:33.491290 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:12:33.491250 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6e7cfa8d_c464_4632_8bd5_63a6a9a8bf66.slice/crio-e743ef2f36e1ff596e5e436e3161e58b4ba967425869f4937a62dad0e92fe593 WatchSource:0}: Error finding container e743ef2f36e1ff596e5e436e3161e58b4ba967425869f4937a62dad0e92fe593: Status 404 returned error can't find the container with id e743ef2f36e1ff596e5e436e3161e58b4ba967425869f4937a62dad0e92fe593 Apr 20 20:12:33.853412 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:33.853326 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8zbhw" event={"ID":"6e7cfa8d-c464-4632-8bd5-63a6a9a8bf66","Type":"ContainerStarted","Data":"e743ef2f36e1ff596e5e436e3161e58b4ba967425869f4937a62dad0e92fe593"} Apr 20 20:12:33.860458 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:33.860399 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-xfmm8" event={"ID":"14c1f75f-12c2-4d61-86c7-49b01cae6c40","Type":"ContainerStarted","Data":"1165379b49786f554440695fccb54ee6b7a2d85b939640edc0d619e6e413bf8e"} Apr 20 20:12:37.875504 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:37.875465 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-gzfb6" 
event={"ID":"d2fa9258-8ac4-437c-ae86-0be123a966e4","Type":"ContainerStarted","Data":"a9bcf5864c677c04661c2b3179f5e6e74b9633e2591412531b83d74af177d0a4"} Apr 20 20:12:37.875969 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:37.875549 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6f655776dd-gzfb6" Apr 20 20:12:37.876721 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:37.876701 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8zbhw" event={"ID":"6e7cfa8d-c464-4632-8bd5-63a6a9a8bf66","Type":"ContainerStarted","Data":"b43e3cb1cced51a9908553cf27ed3c47f9dd5221c625c62e02a78777235df6ba"} Apr 20 20:12:37.876859 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:37.876789 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8zbhw" Apr 20 20:12:37.877918 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:37.877898 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-xfmm8" event={"ID":"14c1f75f-12c2-4d61-86c7-49b01cae6c40","Type":"ContainerStarted","Data":"af1f2dd89a28b4bb8df0d23afd40669cff9b50f0848d6f60ff2dc8b50ce3203f"} Apr 20 20:12:37.878040 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:37.878026 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-xfmm8" Apr 20 20:12:37.909317 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:37.909233 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6f655776dd-gzfb6" podStartSLOduration=1.758796491 podStartE2EDuration="5.909221062s" podCreationTimestamp="2026-04-20 20:12:32 +0000 UTC" firstStartedPulling="2026-04-20 20:12:32.808520547 +0000 UTC m=+393.848010072" lastFinishedPulling="2026-04-20 20:12:36.958945117 +0000 UTC m=+397.998434643" observedRunningTime="2026-04-20 20:12:37.907005404 
+0000 UTC m=+398.946494951" watchObservedRunningTime="2026-04-20 20:12:37.909221062 +0000 UTC m=+398.948710607" Apr 20 20:12:37.953032 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:37.952987 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-xfmm8" podStartSLOduration=1.14643884 podStartE2EDuration="5.952973859s" podCreationTimestamp="2026-04-20 20:12:32 +0000 UTC" firstStartedPulling="2026-04-20 20:12:32.894214287 +0000 UTC m=+393.933703811" lastFinishedPulling="2026-04-20 20:12:37.700749288 +0000 UTC m=+398.740238830" observedRunningTime="2026-04-20 20:12:37.945672089 +0000 UTC m=+398.985161635" watchObservedRunningTime="2026-04-20 20:12:37.952973859 +0000 UTC m=+398.992463405" Apr 20 20:12:37.966329 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:37.966280 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8zbhw" podStartSLOduration=1.762896543 podStartE2EDuration="5.966266393s" podCreationTimestamp="2026-04-20 20:12:32 +0000 UTC" firstStartedPulling="2026-04-20 20:12:33.493014197 +0000 UTC m=+394.532503723" lastFinishedPulling="2026-04-20 20:12:37.696384047 +0000 UTC m=+398.735873573" observedRunningTime="2026-04-20 20:12:37.964979978 +0000 UTC m=+399.004469529" watchObservedRunningTime="2026-04-20 20:12:37.966266393 +0000 UTC m=+399.005755979" Apr 20 20:12:43.884554 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:12:43.884519 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-xfmm8" Apr 20 20:13:08.883767 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:08.883738 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8zbhw" Apr 20 20:13:08.886927 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:08.886893 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve/kserve-controller-manager-6f655776dd-gzfb6" Apr 20 20:13:10.396459 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.396415 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-gzfb6"] Apr 20 20:13:10.396876 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.396648 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-6f655776dd-gzfb6" podUID="d2fa9258-8ac4-437c-ae86-0be123a966e4" containerName="manager" containerID="cri-o://a9bcf5864c677c04661c2b3179f5e6e74b9633e2591412531b83d74af177d0a4" gracePeriod=10 Apr 20 20:13:10.424259 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.424225 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-gthrt"] Apr 20 20:13:10.492764 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.492735 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-gthrt"] Apr 20 20:13:10.492893 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.492875 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-gthrt" Apr 20 20:13:10.589240 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.589201 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkf29\" (UniqueName: \"kubernetes.io/projected/e64c308a-f9e2-49d3-990e-86dc9f689be4-kube-api-access-qkf29\") pod \"kserve-controller-manager-6f655776dd-gthrt\" (UID: \"e64c308a-f9e2-49d3-990e-86dc9f689be4\") " pod="kserve/kserve-controller-manager-6f655776dd-gthrt" Apr 20 20:13:10.589442 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.589297 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e64c308a-f9e2-49d3-990e-86dc9f689be4-cert\") pod \"kserve-controller-manager-6f655776dd-gthrt\" (UID: \"e64c308a-f9e2-49d3-990e-86dc9f689be4\") " pod="kserve/kserve-controller-manager-6f655776dd-gthrt" Apr 20 20:13:10.648080 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.648015 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-gzfb6" Apr 20 20:13:10.689906 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.689872 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2fa9258-8ac4-437c-ae86-0be123a966e4-cert\") pod \"d2fa9258-8ac4-437c-ae86-0be123a966e4\" (UID: \"d2fa9258-8ac4-437c-ae86-0be123a966e4\") " Apr 20 20:13:10.690081 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.689978 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e64c308a-f9e2-49d3-990e-86dc9f689be4-cert\") pod \"kserve-controller-manager-6f655776dd-gthrt\" (UID: \"e64c308a-f9e2-49d3-990e-86dc9f689be4\") " pod="kserve/kserve-controller-manager-6f655776dd-gthrt" Apr 20 20:13:10.690130 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.690078 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkf29\" (UniqueName: \"kubernetes.io/projected/e64c308a-f9e2-49d3-990e-86dc9f689be4-kube-api-access-qkf29\") pod \"kserve-controller-manager-6f655776dd-gthrt\" (UID: \"e64c308a-f9e2-49d3-990e-86dc9f689be4\") " pod="kserve/kserve-controller-manager-6f655776dd-gthrt" Apr 20 20:13:10.692290 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.692249 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2fa9258-8ac4-437c-ae86-0be123a966e4-cert" (OuterVolumeSpecName: "cert") pod "d2fa9258-8ac4-437c-ae86-0be123a966e4" (UID: "d2fa9258-8ac4-437c-ae86-0be123a966e4"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:13:10.692592 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.692573 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e64c308a-f9e2-49d3-990e-86dc9f689be4-cert\") pod \"kserve-controller-manager-6f655776dd-gthrt\" (UID: \"e64c308a-f9e2-49d3-990e-86dc9f689be4\") " pod="kserve/kserve-controller-manager-6f655776dd-gthrt" Apr 20 20:13:10.699487 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.699462 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkf29\" (UniqueName: \"kubernetes.io/projected/e64c308a-f9e2-49d3-990e-86dc9f689be4-kube-api-access-qkf29\") pod \"kserve-controller-manager-6f655776dd-gthrt\" (UID: \"e64c308a-f9e2-49d3-990e-86dc9f689be4\") " pod="kserve/kserve-controller-manager-6f655776dd-gthrt" Apr 20 20:13:10.790491 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.790449 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz7lw\" (UniqueName: \"kubernetes.io/projected/d2fa9258-8ac4-437c-ae86-0be123a966e4-kube-api-access-mz7lw\") pod \"d2fa9258-8ac4-437c-ae86-0be123a966e4\" (UID: \"d2fa9258-8ac4-437c-ae86-0be123a966e4\") " Apr 20 20:13:10.790684 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.790642 2571 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2fa9258-8ac4-437c-ae86-0be123a966e4-cert\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:13:10.792698 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.792666 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2fa9258-8ac4-437c-ae86-0be123a966e4-kube-api-access-mz7lw" (OuterVolumeSpecName: "kube-api-access-mz7lw") pod "d2fa9258-8ac4-437c-ae86-0be123a966e4" (UID: "d2fa9258-8ac4-437c-ae86-0be123a966e4"). InnerVolumeSpecName "kube-api-access-mz7lw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:13:10.873572 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.873530 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-gthrt" Apr 20 20:13:10.891581 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.891552 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mz7lw\" (UniqueName: \"kubernetes.io/projected/d2fa9258-8ac4-437c-ae86-0be123a966e4-kube-api-access-mz7lw\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:13:10.986240 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.986199 2571 generic.go:358] "Generic (PLEG): container finished" podID="d2fa9258-8ac4-437c-ae86-0be123a966e4" containerID="a9bcf5864c677c04661c2b3179f5e6e74b9633e2591412531b83d74af177d0a4" exitCode=0 Apr 20 20:13:10.986411 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.986252 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-gzfb6" event={"ID":"d2fa9258-8ac4-437c-ae86-0be123a966e4","Type":"ContainerDied","Data":"a9bcf5864c677c04661c2b3179f5e6e74b9633e2591412531b83d74af177d0a4"} Apr 20 20:13:10.986411 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.986273 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-gzfb6"
Apr 20 20:13:10.986411 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.986289 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-gzfb6" event={"ID":"d2fa9258-8ac4-437c-ae86-0be123a966e4","Type":"ContainerDied","Data":"1fe499ce76e1a92b32bf2f55389fc0d4a17cb7bbb9722e765489b9d5d1ab0853"}
Apr 20 20:13:10.986411 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.986310 2571 scope.go:117] "RemoveContainer" containerID="a9bcf5864c677c04661c2b3179f5e6e74b9633e2591412531b83d74af177d0a4"
Apr 20 20:13:10.995630 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.995609 2571 scope.go:117] "RemoveContainer" containerID="a9bcf5864c677c04661c2b3179f5e6e74b9633e2591412531b83d74af177d0a4"
Apr 20 20:13:10.995947 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:13:10.995926 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9bcf5864c677c04661c2b3179f5e6e74b9633e2591412531b83d74af177d0a4\": container with ID starting with a9bcf5864c677c04661c2b3179f5e6e74b9633e2591412531b83d74af177d0a4 not found: ID does not exist" containerID="a9bcf5864c677c04661c2b3179f5e6e74b9633e2591412531b83d74af177d0a4"
Apr 20 20:13:10.996035 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.995955 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-gthrt"]
Apr 20 20:13:10.996035 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:10.995961 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9bcf5864c677c04661c2b3179f5e6e74b9633e2591412531b83d74af177d0a4"} err="failed to get container status \"a9bcf5864c677c04661c2b3179f5e6e74b9633e2591412531b83d74af177d0a4\": rpc error: code = NotFound desc = could not find container \"a9bcf5864c677c04661c2b3179f5e6e74b9633e2591412531b83d74af177d0a4\": container with ID starting with a9bcf5864c677c04661c2b3179f5e6e74b9633e2591412531b83d74af177d0a4 not found: ID does not exist"
Apr 20 20:13:10.998876 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:13:10.998852 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode64c308a_f9e2_49d3_990e_86dc9f689be4.slice/crio-04e9f31b0e8531c1c8a93aed92e580eaa9fc73417b7f6dfbda1e31632c7e6e57 WatchSource:0}: Error finding container 04e9f31b0e8531c1c8a93aed92e580eaa9fc73417b7f6dfbda1e31632c7e6e57: Status 404 returned error can't find the container with id 04e9f31b0e8531c1c8a93aed92e580eaa9fc73417b7f6dfbda1e31632c7e6e57
Apr 20 20:13:11.009227 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:11.009195 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-gzfb6"]
Apr 20 20:13:11.011501 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:11.011475 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-gzfb6"]
Apr 20 20:13:11.527268 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:11.527224 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2fa9258-8ac4-437c-ae86-0be123a966e4" path="/var/lib/kubelet/pods/d2fa9258-8ac4-437c-ae86-0be123a966e4/volumes"
Apr 20 20:13:11.992201 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:11.992098 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-gthrt" event={"ID":"e64c308a-f9e2-49d3-990e-86dc9f689be4","Type":"ContainerStarted","Data":"c67937ef438f849f60a7af6f165c1392181448f04de85b6fcafed46ca8ed2a77"}
Apr 20 20:13:11.992201 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:11.992137 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-gthrt" event={"ID":"e64c308a-f9e2-49d3-990e-86dc9f689be4","Type":"ContainerStarted","Data":"04e9f31b0e8531c1c8a93aed92e580eaa9fc73417b7f6dfbda1e31632c7e6e57"}
Apr 20 20:13:11.992201 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:11.992176 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6f655776dd-gthrt"
Apr 20 20:13:12.010123 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:12.010065 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6f655776dd-gthrt" podStartSLOduration=1.269356151 podStartE2EDuration="2.010049365s" podCreationTimestamp="2026-04-20 20:13:10 +0000 UTC" firstStartedPulling="2026-04-20 20:13:11.000337784 +0000 UTC m=+432.039827328" lastFinishedPulling="2026-04-20 20:13:11.741031007 +0000 UTC m=+432.780520542" observedRunningTime="2026-04-20 20:13:12.008227749 +0000 UTC m=+433.047717291" watchObservedRunningTime="2026-04-20 20:13:12.010049365 +0000 UTC m=+433.049538910"
Apr 20 20:13:43.001025 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:43.000990 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6f655776dd-gthrt"
Apr 20 20:13:43.930605 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:43.930568 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-lgtp4"]
Apr 20 20:13:43.930908 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:43.930896 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2fa9258-8ac4-437c-ae86-0be123a966e4" containerName="manager"
Apr 20 20:13:43.930955 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:43.930911 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2fa9258-8ac4-437c-ae86-0be123a966e4" containerName="manager"
Apr 20 20:13:43.930988 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:43.930973 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2fa9258-8ac4-437c-ae86-0be123a966e4" containerName="manager"
Apr 20 20:13:43.934068 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:43.934053 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-lgtp4"
Apr 20 20:13:43.936954 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:43.936932 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-5zzd4\""
Apr 20 20:13:43.937058 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:43.936937 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 20 20:13:43.944823 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:43.944803 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-lgtp4"]
Apr 20 20:13:44.053720 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:44.053685 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c3e845f7-93bc-431d-9e1c-3a863b9719ce-tls-certs\") pod \"model-serving-api-86f7b4b499-lgtp4\" (UID: \"c3e845f7-93bc-431d-9e1c-3a863b9719ce\") " pod="kserve/model-serving-api-86f7b4b499-lgtp4"
Apr 20 20:13:44.054114 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:44.053763 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzptz\" (UniqueName: \"kubernetes.io/projected/c3e845f7-93bc-431d-9e1c-3a863b9719ce-kube-api-access-mzptz\") pod \"model-serving-api-86f7b4b499-lgtp4\" (UID: \"c3e845f7-93bc-431d-9e1c-3a863b9719ce\") " pod="kserve/model-serving-api-86f7b4b499-lgtp4"
Apr 20 20:13:44.154509 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:44.154471 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c3e845f7-93bc-431d-9e1c-3a863b9719ce-tls-certs\") pod \"model-serving-api-86f7b4b499-lgtp4\" (UID: \"c3e845f7-93bc-431d-9e1c-3a863b9719ce\") " pod="kserve/model-serving-api-86f7b4b499-lgtp4"
Apr 20 20:13:44.154715 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:44.154536 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzptz\" (UniqueName: \"kubernetes.io/projected/c3e845f7-93bc-431d-9e1c-3a863b9719ce-kube-api-access-mzptz\") pod \"model-serving-api-86f7b4b499-lgtp4\" (UID: \"c3e845f7-93bc-431d-9e1c-3a863b9719ce\") " pod="kserve/model-serving-api-86f7b4b499-lgtp4"
Apr 20 20:13:44.157058 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:44.157033 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c3e845f7-93bc-431d-9e1c-3a863b9719ce-tls-certs\") pod \"model-serving-api-86f7b4b499-lgtp4\" (UID: \"c3e845f7-93bc-431d-9e1c-3a863b9719ce\") " pod="kserve/model-serving-api-86f7b4b499-lgtp4"
Apr 20 20:13:44.163636 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:44.163612 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzptz\" (UniqueName: \"kubernetes.io/projected/c3e845f7-93bc-431d-9e1c-3a863b9719ce-kube-api-access-mzptz\") pod \"model-serving-api-86f7b4b499-lgtp4\" (UID: \"c3e845f7-93bc-431d-9e1c-3a863b9719ce\") " pod="kserve/model-serving-api-86f7b4b499-lgtp4"
Apr 20 20:13:44.245406 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:44.245311 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-lgtp4"
Apr 20 20:13:44.371375 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:44.371337 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-lgtp4"]
Apr 20 20:13:44.374548 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:13:44.374517 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3e845f7_93bc_431d_9e1c_3a863b9719ce.slice/crio-29c34e7d2e894c6bb0dd556573a64f5a1971eb6e7f77bd72fd0af721d27572fb WatchSource:0}: Error finding container 29c34e7d2e894c6bb0dd556573a64f5a1971eb6e7f77bd72fd0af721d27572fb: Status 404 returned error can't find the container with id 29c34e7d2e894c6bb0dd556573a64f5a1971eb6e7f77bd72fd0af721d27572fb
Apr 20 20:13:45.098131 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:45.098086 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-lgtp4" event={"ID":"c3e845f7-93bc-431d-9e1c-3a863b9719ce","Type":"ContainerStarted","Data":"29c34e7d2e894c6bb0dd556573a64f5a1971eb6e7f77bd72fd0af721d27572fb"}
Apr 20 20:13:47.106577 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:47.106537 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-lgtp4" event={"ID":"c3e845f7-93bc-431d-9e1c-3a863b9719ce","Type":"ContainerStarted","Data":"8a61ae1709aa90a6e0ecf85c6ab57b145b5c844686d42b3ff0f977c6dad22670"}
Apr 20 20:13:47.106967 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:47.106664 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-lgtp4"
Apr 20 20:13:58.113750 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:58.113722 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-lgtp4"
Apr 20 20:13:58.133891 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:13:58.133840 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-lgtp4" podStartSLOduration=12.904714226 podStartE2EDuration="15.1338257s" podCreationTimestamp="2026-04-20 20:13:43 +0000 UTC" firstStartedPulling="2026-04-20 20:13:44.376297139 +0000 UTC m=+465.415786664" lastFinishedPulling="2026-04-20 20:13:46.60540861 +0000 UTC m=+467.644898138" observedRunningTime="2026-04-20 20:13:47.125396797 +0000 UTC m=+468.164886343" watchObservedRunningTime="2026-04-20 20:13:58.1338257 +0000 UTC m=+479.173315245"
Apr 20 20:14:06.691598 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:06.691559 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-b5dbdd44-g2gvj"]
Apr 20 20:14:06.695948 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:06.695928 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:06.707832 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:06.707811 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b5dbdd44-g2gvj"]
Apr 20 20:14:06.852467 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:06.852410 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6-console-serving-cert\") pod \"console-b5dbdd44-g2gvj\" (UID: \"a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6\") " pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:06.852657 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:06.852477 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6-console-oauth-config\") pod \"console-b5dbdd44-g2gvj\" (UID: \"a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6\") " pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:06.852657 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:06.852523 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6-console-config\") pod \"console-b5dbdd44-g2gvj\" (UID: \"a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6\") " pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:06.852657 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:06.852553 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6-service-ca\") pod \"console-b5dbdd44-g2gvj\" (UID: \"a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6\") " pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:06.852657 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:06.852582 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6-trusted-ca-bundle\") pod \"console-b5dbdd44-g2gvj\" (UID: \"a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6\") " pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:06.852657 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:06.852611 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6-oauth-serving-cert\") pod \"console-b5dbdd44-g2gvj\" (UID: \"a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6\") " pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:06.852657 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:06.852626 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9jvl\" (UniqueName: \"kubernetes.io/projected/a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6-kube-api-access-p9jvl\") pod \"console-b5dbdd44-g2gvj\" (UID: \"a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6\") " pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:06.953554 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:06.953480 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6-trusted-ca-bundle\") pod \"console-b5dbdd44-g2gvj\" (UID: \"a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6\") " pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:06.953554 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:06.953528 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6-oauth-serving-cert\") pod \"console-b5dbdd44-g2gvj\" (UID: \"a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6\") " pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:06.953554 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:06.953544 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p9jvl\" (UniqueName: \"kubernetes.io/projected/a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6-kube-api-access-p9jvl\") pod \"console-b5dbdd44-g2gvj\" (UID: \"a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6\") " pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:06.953816 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:06.953609 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6-console-serving-cert\") pod \"console-b5dbdd44-g2gvj\" (UID: \"a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6\") " pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:06.953816 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:06.953652 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6-console-oauth-config\") pod \"console-b5dbdd44-g2gvj\" (UID: \"a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6\") " pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:06.953816 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:06.953695 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6-console-config\") pod \"console-b5dbdd44-g2gvj\" (UID: \"a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6\") " pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:06.953816 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:06.953739 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6-service-ca\") pod \"console-b5dbdd44-g2gvj\" (UID: \"a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6\") " pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:06.954447 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:06.954380 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6-service-ca\") pod \"console-b5dbdd44-g2gvj\" (UID: \"a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6\") " pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:06.954807 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:06.954414 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6-trusted-ca-bundle\") pod \"console-b5dbdd44-g2gvj\" (UID: \"a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6\") " pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:06.954807 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:06.954414 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6-oauth-serving-cert\") pod \"console-b5dbdd44-g2gvj\" (UID: \"a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6\") " pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:06.954940 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:06.954465 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6-console-config\") pod \"console-b5dbdd44-g2gvj\" (UID: \"a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6\") " pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:06.956119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:06.956098 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6-console-oauth-config\") pod \"console-b5dbdd44-g2gvj\" (UID: \"a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6\") " pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:06.956310 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:06.956292 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6-console-serving-cert\") pod \"console-b5dbdd44-g2gvj\" (UID: \"a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6\") " pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:06.961700 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:06.961678 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9jvl\" (UniqueName: \"kubernetes.io/projected/a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6-kube-api-access-p9jvl\") pod \"console-b5dbdd44-g2gvj\" (UID: \"a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6\") " pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:07.005261 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:07.005218 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:07.136503 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:07.136468 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b5dbdd44-g2gvj"]
Apr 20 20:14:07.138480 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:14:07.138455 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1c7430d_63f9_45d5_986a_3c4c7e8b6ef6.slice/crio-3fef356c7de2a46cd146fa4ffa43615cdc5726ae945b537c63f094b4d8d8a7f3 WatchSource:0}: Error finding container 3fef356c7de2a46cd146fa4ffa43615cdc5726ae945b537c63f094b4d8d8a7f3: Status 404 returned error can't find the container with id 3fef356c7de2a46cd146fa4ffa43615cdc5726ae945b537c63f094b4d8d8a7f3
Apr 20 20:14:07.176310 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:07.176281 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b5dbdd44-g2gvj" event={"ID":"a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6","Type":"ContainerStarted","Data":"3fef356c7de2a46cd146fa4ffa43615cdc5726ae945b537c63f094b4d8d8a7f3"}
Apr 20 20:14:08.181328 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:08.181290 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b5dbdd44-g2gvj" event={"ID":"a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6","Type":"ContainerStarted","Data":"1b2b9f1cc6e57c4a15b511932f10c2c75988dd121217f2ed3a6ed237ceee768c"}
Apr 20 20:14:08.200450 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:08.200375 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-b5dbdd44-g2gvj" podStartSLOduration=2.200360525 podStartE2EDuration="2.200360525s" podCreationTimestamp="2026-04-20 20:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:14:08.198203437 +0000 UTC m=+489.237692983" watchObservedRunningTime="2026-04-20 20:14:08.200360525 +0000 UTC m=+489.239850071"
Apr 20 20:14:17.005915 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:17.005874 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:17.005915 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:17.005920 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:17.010790 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:17.010767 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:17.218489 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:17.218459 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-b5dbdd44-g2gvj"
Apr 20 20:14:17.262298 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:17.262219 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d756b6df4-svlfv"]
Apr 20 20:14:20.379284 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:20.379245 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"]
Apr 20 20:14:20.385238 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:20.385214 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"
Apr 20 20:14:20.387840 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:20.387812 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-8ea43-predictor-serving-cert\""
Apr 20 20:14:20.387840 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:20.387818 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-8ea43-kube-rbac-proxy-sar-config\""
Apr 20 20:14:20.388085 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:20.387859 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lk7n4\""
Apr 20 20:14:20.388085 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:20.387961 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 20 20:14:20.388085 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:20.388045 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 20 20:14:20.392480 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:20.392456 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"]
Apr 20 20:14:20.461640 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:20.461608 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/75b0814e-b821-4519-974f-33493a091ca5-proxy-tls\") pod \"isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8\" (UID: \"75b0814e-b821-4519-974f-33493a091ca5\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"
Apr 20 20:14:20.461794 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:20.461665 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-raw-sklearn-batcher-8ea43-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/75b0814e-b821-4519-974f-33493a091ca5-isvc-raw-sklearn-batcher-8ea43-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8\" (UID: \"75b0814e-b821-4519-974f-33493a091ca5\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"
Apr 20 20:14:20.461794 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:20.461698 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlq8w\" (UniqueName: \"kubernetes.io/projected/75b0814e-b821-4519-974f-33493a091ca5-kube-api-access-mlq8w\") pod \"isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8\" (UID: \"75b0814e-b821-4519-974f-33493a091ca5\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"
Apr 20 20:14:20.461794 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:20.461738 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/75b0814e-b821-4519-974f-33493a091ca5-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8\" (UID: \"75b0814e-b821-4519-974f-33493a091ca5\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"
Apr 20 20:14:20.562758 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:20.562712 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/75b0814e-b821-4519-974f-33493a091ca5-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8\" (UID: \"75b0814e-b821-4519-974f-33493a091ca5\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"
Apr 20 20:14:20.562937 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:20.562831 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/75b0814e-b821-4519-974f-33493a091ca5-proxy-tls\") pod \"isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8\" (UID: \"75b0814e-b821-4519-974f-33493a091ca5\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"
Apr 20 20:14:20.562937 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:20.562866 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-raw-sklearn-batcher-8ea43-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/75b0814e-b821-4519-974f-33493a091ca5-isvc-raw-sklearn-batcher-8ea43-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8\" (UID: \"75b0814e-b821-4519-974f-33493a091ca5\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"
Apr 20 20:14:20.562937 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:20.562908 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlq8w\" (UniqueName: \"kubernetes.io/projected/75b0814e-b821-4519-974f-33493a091ca5-kube-api-access-mlq8w\") pod \"isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8\" (UID: \"75b0814e-b821-4519-974f-33493a091ca5\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"
Apr 20 20:14:20.563186 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:20.563162 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/75b0814e-b821-4519-974f-33493a091ca5-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8\" (UID: \"75b0814e-b821-4519-974f-33493a091ca5\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"
Apr 20 20:14:20.563466 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:20.563409 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-raw-sklearn-batcher-8ea43-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/75b0814e-b821-4519-974f-33493a091ca5-isvc-raw-sklearn-batcher-8ea43-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8\" (UID: \"75b0814e-b821-4519-974f-33493a091ca5\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"
Apr 20 20:14:20.565672 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:20.565645 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/75b0814e-b821-4519-974f-33493a091ca5-proxy-tls\") pod \"isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8\" (UID: \"75b0814e-b821-4519-974f-33493a091ca5\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"
Apr 20 20:14:20.573666 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:20.573639 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlq8w\" (UniqueName: \"kubernetes.io/projected/75b0814e-b821-4519-974f-33493a091ca5-kube-api-access-mlq8w\") pod \"isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8\" (UID: \"75b0814e-b821-4519-974f-33493a091ca5\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"
Apr 20 20:14:20.697543 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:20.697453 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"
Apr 20 20:14:20.826628 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:20.826522 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"]
Apr 20 20:14:20.828951 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:14:20.828920 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75b0814e_b821_4519_974f_33493a091ca5.slice/crio-d450558dfd6dc9bf209596053da92e9b6eecc782e225eb350ca9636cd4c98af8 WatchSource:0}: Error finding container d450558dfd6dc9bf209596053da92e9b6eecc782e225eb350ca9636cd4c98af8: Status 404 returned error can't find the container with id d450558dfd6dc9bf209596053da92e9b6eecc782e225eb350ca9636cd4c98af8
Apr 20 20:14:21.227719 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:21.227681 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" event={"ID":"75b0814e-b821-4519-974f-33493a091ca5","Type":"ContainerStarted","Data":"d450558dfd6dc9bf209596053da92e9b6eecc782e225eb350ca9636cd4c98af8"}
Apr 20 20:14:24.242724 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:24.242676 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" event={"ID":"75b0814e-b821-4519-974f-33493a091ca5","Type":"ContainerStarted","Data":"005b3d961857206c090d3a876f24c309feafa88331e1c00945ab2e2a5bd7953b"}
Apr 20 20:14:28.257264 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:28.257228 2571 generic.go:358] "Generic (PLEG): container finished" podID="75b0814e-b821-4519-974f-33493a091ca5" containerID="005b3d961857206c090d3a876f24c309feafa88331e1c00945ab2e2a5bd7953b" exitCode=0
Apr 20 20:14:28.257662 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:28.257275 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" event={"ID":"75b0814e-b821-4519-974f-33493a091ca5","Type":"ContainerDied","Data":"005b3d961857206c090d3a876f24c309feafa88331e1c00945ab2e2a5bd7953b"}
Apr 20 20:14:42.283766 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:42.283711 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7d756b6df4-svlfv" podUID="5aff7e3c-f58f-418e-9e7e-5377d4ba84ab" containerName="console" containerID="cri-o://da779d6fb29b470d619323d691421cb30132be875d4adccc6dfa290468680eac" gracePeriod=15
Apr 20 20:14:42.319599 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:42.319558 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" event={"ID":"75b0814e-b821-4519-974f-33493a091ca5","Type":"ContainerStarted","Data":"93acc8693871fc641ad21fddceeee0bd499589841c915263528cb488110b7a39"}
Apr 20 20:14:43.327857 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.327827 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d756b6df4-svlfv_5aff7e3c-f58f-418e-9e7e-5377d4ba84ab/console/0.log"
Apr 20 20:14:43.328270 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.327872 2571 generic.go:358] "Generic (PLEG): container finished" podID="5aff7e3c-f58f-418e-9e7e-5377d4ba84ab" containerID="da779d6fb29b470d619323d691421cb30132be875d4adccc6dfa290468680eac" exitCode=2
Apr 20 20:14:43.328270 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.327960 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d756b6df4-svlfv" event={"ID":"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab","Type":"ContainerDied","Data":"da779d6fb29b470d619323d691421cb30132be875d4adccc6dfa290468680eac"}
Apr 20 20:14:43.604606 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.604391 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d756b6df4-svlfv_5aff7e3c-f58f-418e-9e7e-5377d4ba84ab/console/0.log"
Apr 20 20:14:43.604606 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.604494 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d756b6df4-svlfv"
Apr 20 20:14:43.677796 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.677758 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-service-ca\") pod \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") "
Apr 20 20:14:43.677971 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.677804 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7j55\" (UniqueName: \"kubernetes.io/projected/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-kube-api-access-r7j55\") pod \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") "
Apr 20 20:14:43.677971 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.677836 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-oauth-serving-cert\") pod \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") "
Apr 20 20:14:43.677971 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.677937 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-console-serving-cert\") pod \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") "
Apr 20 20:14:43.677971 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.677962 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-trusted-ca-bundle\") pod \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") " Apr 20 20:14:43.678192 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.677997 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-console-config\") pod \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") " Apr 20 20:14:43.678192 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.678040 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-console-oauth-config\") pod \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\" (UID: \"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab\") " Apr 20 20:14:43.678342 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.678304 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-service-ca" (OuterVolumeSpecName: "service-ca") pod "5aff7e3c-f58f-418e-9e7e-5377d4ba84ab" (UID: "5aff7e3c-f58f-418e-9e7e-5377d4ba84ab"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:14:43.678527 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.678484 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5aff7e3c-f58f-418e-9e7e-5377d4ba84ab" (UID: "5aff7e3c-f58f-418e-9e7e-5377d4ba84ab"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:14:43.678527 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.678498 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-console-config" (OuterVolumeSpecName: "console-config") pod "5aff7e3c-f58f-418e-9e7e-5377d4ba84ab" (UID: "5aff7e3c-f58f-418e-9e7e-5377d4ba84ab"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:14:43.678651 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.678615 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5aff7e3c-f58f-418e-9e7e-5377d4ba84ab" (UID: "5aff7e3c-f58f-418e-9e7e-5377d4ba84ab"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:14:43.680302 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.680269 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5aff7e3c-f58f-418e-9e7e-5377d4ba84ab" (UID: "5aff7e3c-f58f-418e-9e7e-5377d4ba84ab"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:14:43.680397 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.680316 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-kube-api-access-r7j55" (OuterVolumeSpecName: "kube-api-access-r7j55") pod "5aff7e3c-f58f-418e-9e7e-5377d4ba84ab" (UID: "5aff7e3c-f58f-418e-9e7e-5377d4ba84ab"). InnerVolumeSpecName "kube-api-access-r7j55". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:14:43.680397 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.680338 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5aff7e3c-f58f-418e-9e7e-5377d4ba84ab" (UID: "5aff7e3c-f58f-418e-9e7e-5377d4ba84ab"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:14:43.779247 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.779208 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-console-serving-cert\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:14:43.779247 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.779239 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-trusted-ca-bundle\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:14:43.779247 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.779250 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-console-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:14:43.779247 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.779259 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-console-oauth-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:14:43.779548 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.779268 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-service-ca\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:14:43.779548 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.779281 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r7j55\" (UniqueName: \"kubernetes.io/projected/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-kube-api-access-r7j55\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:14:43.779548 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:43.779289 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab-oauth-serving-cert\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:14:44.335850 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:44.335809 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d756b6df4-svlfv_5aff7e3c-f58f-418e-9e7e-5377d4ba84ab/console/0.log" Apr 20 20:14:44.336307 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:44.336003 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d756b6df4-svlfv" event={"ID":"5aff7e3c-f58f-418e-9e7e-5377d4ba84ab","Type":"ContainerDied","Data":"fc4d2849d92b775c99e06f21f6b832e04468aeb85d96fa0c6e9a4b4f173ee732"} Apr 20 20:14:44.336307 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:44.336053 2571 scope.go:117] "RemoveContainer" containerID="da779d6fb29b470d619323d691421cb30132be875d4adccc6dfa290468680eac" Apr 20 20:14:44.336307 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:44.336248 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d756b6df4-svlfv" Apr 20 20:14:44.340345 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:44.340307 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" event={"ID":"75b0814e-b821-4519-974f-33493a091ca5","Type":"ContainerStarted","Data":"2d32f62296eaccfee530c6ea1499592a6bcfc06036626f2d57b358d4506efe7a"} Apr 20 20:14:44.374588 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:44.374549 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d756b6df4-svlfv"] Apr 20 20:14:44.427072 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:44.427033 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7d756b6df4-svlfv"] Apr 20 20:14:45.527984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:45.527952 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aff7e3c-f58f-418e-9e7e-5377d4ba84ab" path="/var/lib/kubelet/pods/5aff7e3c-f58f-418e-9e7e-5377d4ba84ab/volumes" Apr 20 20:14:47.354676 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:47.354642 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" event={"ID":"75b0814e-b821-4519-974f-33493a091ca5","Type":"ContainerStarted","Data":"512ea8b5ddb6f1ecb3c14864be631fdb02e364d3b9a4cc5ca53f44caf1d7e1d8"} Apr 20 20:14:47.355169 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:47.354851 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" Apr 20 20:14:47.379220 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:47.379167 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podStartSLOduration=1.670973759 
podStartE2EDuration="27.379152322s" podCreationTimestamp="2026-04-20 20:14:20 +0000 UTC" firstStartedPulling="2026-04-20 20:14:20.831050114 +0000 UTC m=+501.870539641" lastFinishedPulling="2026-04-20 20:14:46.539228678 +0000 UTC m=+527.578718204" observedRunningTime="2026-04-20 20:14:47.377139517 +0000 UTC m=+528.416629059" watchObservedRunningTime="2026-04-20 20:14:47.379152322 +0000 UTC m=+528.418641868" Apr 20 20:14:48.358614 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:48.358522 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" Apr 20 20:14:48.358614 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:48.358562 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" Apr 20 20:14:48.360192 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:48.360158 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 20 20:14:48.360865 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:48.360834 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:14:48.363737 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:48.363719 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" Apr 20 20:14:49.362072 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:49.362031 2571 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 20 20:14:49.362529 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:49.362443 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:14:50.365678 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:50.365634 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 20 20:14:50.366077 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:14:50.365945 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:15:00.366202 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:15:00.366147 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 20 20:15:00.366707 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:15:00.366654 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" 
podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:15:10.366529 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:15:10.366483 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 20 20:15:10.366948 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:15:10.366870 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:15:20.366064 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:15:20.366020 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 20 20:15:20.366597 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:15:20.366571 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:15:30.366549 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:15:30.366500 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: 
connection refused" Apr 20 20:15:30.366979 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:15:30.366903 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:15:40.366140 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:15:40.366086 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 20 20:15:40.366595 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:15:40.366531 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:15:50.366575 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:15:50.366537 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" Apr 20 20:15:50.366972 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:15:50.366627 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" Apr 20 20:15:59.438867 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:15:59.438841 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kldp5_997e9539-5288-4af5-92f4-55d8ccefbbf7/ovn-acl-logging/0.log" Apr 20 20:15:59.439906 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:15:59.439886 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kldp5_997e9539-5288-4af5-92f4-55d8ccefbbf7/ovn-acl-logging/0.log" Apr 20 20:16:05.529933 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.529898 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"] Apr 20 20:16:05.532322 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.530408 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kserve-container" containerID="cri-o://93acc8693871fc641ad21fddceeee0bd499589841c915263528cb488110b7a39" gracePeriod=30 Apr 20 20:16:05.532322 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.530466 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kube-rbac-proxy" containerID="cri-o://2d32f62296eaccfee530c6ea1499592a6bcfc06036626f2d57b358d4506efe7a" gracePeriod=30 Apr 20 20:16:05.532322 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.530411 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="agent" containerID="cri-o://512ea8b5ddb6f1ecb3c14864be631fdb02e364d3b9a4cc5ca53f44caf1d7e1d8" gracePeriod=30 Apr 20 20:16:05.634619 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.634586 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv"] Apr 20 20:16:05.634959 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.634945 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="5aff7e3c-f58f-418e-9e7e-5377d4ba84ab" containerName="console" Apr 20 20:16:05.635011 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.634961 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aff7e3c-f58f-418e-9e7e-5377d4ba84ab" containerName="console" Apr 20 20:16:05.635057 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.635036 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5aff7e3c-f58f-418e-9e7e-5377d4ba84ab" containerName="console" Apr 20 20:16:05.638533 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.638508 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" Apr 20 20:16:05.641091 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.640896 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-87d08-predictor-serving-cert\"" Apr 20 20:16:05.641091 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.641010 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-87d08-kube-rbac-proxy-sar-config\"" Apr 20 20:16:05.648217 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.648184 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv"] Apr 20 20:16:05.662415 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.662388 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5b9121c-dafe-40ae-8da8-c95937dd3823-proxy-tls\") pod \"isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv\" (UID: \"b5b9121c-dafe-40ae-8da8-c95937dd3823\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" Apr 20 20:16:05.662587 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.662503 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-87d08-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5b9121c-dafe-40ae-8da8-c95937dd3823-isvc-sklearn-graph-raw-87d08-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv\" (UID: \"b5b9121c-dafe-40ae-8da8-c95937dd3823\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" Apr 20 20:16:05.662587 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.662537 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rn87\" (UniqueName: \"kubernetes.io/projected/b5b9121c-dafe-40ae-8da8-c95937dd3823-kube-api-access-4rn87\") pod \"isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv\" (UID: \"b5b9121c-dafe-40ae-8da8-c95937dd3823\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" Apr 20 20:16:05.662587 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.662566 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5b9121c-dafe-40ae-8da8-c95937dd3823-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv\" (UID: \"b5b9121c-dafe-40ae-8da8-c95937dd3823\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" Apr 20 20:16:05.711041 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.711008 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22"] Apr 20 20:16:05.715045 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.715022 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" Apr 20 20:16:05.717514 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.717488 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-87d08-kube-rbac-proxy-sar-config\"" Apr 20 20:16:05.717627 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.717493 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-87d08-predictor-serving-cert\"" Apr 20 20:16:05.725306 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.725244 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22"] Apr 20 20:16:05.762980 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.762950 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-87d08-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5b9121c-dafe-40ae-8da8-c95937dd3823-isvc-sklearn-graph-raw-87d08-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv\" (UID: \"b5b9121c-dafe-40ae-8da8-c95937dd3823\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" Apr 20 20:16:05.763160 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.762986 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rn87\" (UniqueName: \"kubernetes.io/projected/b5b9121c-dafe-40ae-8da8-c95937dd3823-kube-api-access-4rn87\") pod \"isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv\" (UID: \"b5b9121c-dafe-40ae-8da8-c95937dd3823\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" Apr 20 20:16:05.763160 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.763011 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5b9121c-dafe-40ae-8da8-c95937dd3823-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv\" (UID: \"b5b9121c-dafe-40ae-8da8-c95937dd3823\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" Apr 20 20:16:05.763160 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.763055 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-87d08-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0dec70b8-66c8-428b-a746-6c4139f337ab-isvc-xgboost-graph-raw-87d08-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22\" (UID: \"0dec70b8-66c8-428b-a746-6c4139f337ab\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" Apr 20 20:16:05.763160 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.763091 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0dec70b8-66c8-428b-a746-6c4139f337ab-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22\" (UID: \"0dec70b8-66c8-428b-a746-6c4139f337ab\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" Apr 20 20:16:05.763160 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.763127 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5b9121c-dafe-40ae-8da8-c95937dd3823-proxy-tls\") pod \"isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv\" (UID: \"b5b9121c-dafe-40ae-8da8-c95937dd3823\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" Apr 20 20:16:05.763470 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.763159 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgmhf\" (UniqueName: \"kubernetes.io/projected/0dec70b8-66c8-428b-a746-6c4139f337ab-kube-api-access-sgmhf\") pod \"isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22\" (UID: \"0dec70b8-66c8-428b-a746-6c4139f337ab\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22"
Apr 20 20:16:05.763470 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.763189 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0dec70b8-66c8-428b-a746-6c4139f337ab-proxy-tls\") pod \"isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22\" (UID: \"0dec70b8-66c8-428b-a746-6c4139f337ab\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22"
Apr 20 20:16:05.763592 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.763477 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5b9121c-dafe-40ae-8da8-c95937dd3823-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv\" (UID: \"b5b9121c-dafe-40ae-8da8-c95937dd3823\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv"
Apr 20 20:16:05.763759 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.763735 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-87d08-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5b9121c-dafe-40ae-8da8-c95937dd3823-isvc-sklearn-graph-raw-87d08-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv\" (UID: \"b5b9121c-dafe-40ae-8da8-c95937dd3823\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv"
Apr 20 20:16:05.765629 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.765606 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5b9121c-dafe-40ae-8da8-c95937dd3823-proxy-tls\") pod \"isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv\" (UID: \"b5b9121c-dafe-40ae-8da8-c95937dd3823\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv"
Apr 20 20:16:05.770610 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.770589 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rn87\" (UniqueName: \"kubernetes.io/projected/b5b9121c-dafe-40ae-8da8-c95937dd3823-kube-api-access-4rn87\") pod \"isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv\" (UID: \"b5b9121c-dafe-40ae-8da8-c95937dd3823\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv"
Apr 20 20:16:05.863957 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.863861 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-87d08-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0dec70b8-66c8-428b-a746-6c4139f337ab-isvc-xgboost-graph-raw-87d08-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22\" (UID: \"0dec70b8-66c8-428b-a746-6c4139f337ab\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22"
Apr 20 20:16:05.863957 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.863915 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0dec70b8-66c8-428b-a746-6c4139f337ab-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22\" (UID: \"0dec70b8-66c8-428b-a746-6c4139f337ab\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22"
Apr 20 20:16:05.864194 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.863959 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgmhf\" (UniqueName: \"kubernetes.io/projected/0dec70b8-66c8-428b-a746-6c4139f337ab-kube-api-access-sgmhf\") pod \"isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22\" (UID: \"0dec70b8-66c8-428b-a746-6c4139f337ab\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22"
Apr 20 20:16:05.864194 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.863987 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0dec70b8-66c8-428b-a746-6c4139f337ab-proxy-tls\") pod \"isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22\" (UID: \"0dec70b8-66c8-428b-a746-6c4139f337ab\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22"
Apr 20 20:16:05.864194 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:16:05.864154 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-serving-cert: secret "isvc-xgboost-graph-raw-87d08-predictor-serving-cert" not found
Apr 20 20:16:05.864351 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:16:05.864226 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dec70b8-66c8-428b-a746-6c4139f337ab-proxy-tls podName:0dec70b8-66c8-428b-a746-6c4139f337ab nodeName:}" failed. No retries permitted until 2026-04-20 20:16:06.364205452 +0000 UTC m=+607.403694981 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/0dec70b8-66c8-428b-a746-6c4139f337ab-proxy-tls") pod "isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" (UID: "0dec70b8-66c8-428b-a746-6c4139f337ab") : secret "isvc-xgboost-graph-raw-87d08-predictor-serving-cert" not found
Apr 20 20:16:05.864419 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.864345 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0dec70b8-66c8-428b-a746-6c4139f337ab-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22\" (UID: \"0dec70b8-66c8-428b-a746-6c4139f337ab\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22"
Apr 20 20:16:05.864552 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.864533 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-87d08-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0dec70b8-66c8-428b-a746-6c4139f337ab-isvc-xgboost-graph-raw-87d08-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22\" (UID: \"0dec70b8-66c8-428b-a746-6c4139f337ab\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22"
Apr 20 20:16:05.874847 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.874825 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgmhf\" (UniqueName: \"kubernetes.io/projected/0dec70b8-66c8-428b-a746-6c4139f337ab-kube-api-access-sgmhf\") pod \"isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22\" (UID: \"0dec70b8-66c8-428b-a746-6c4139f337ab\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22"
Apr 20 20:16:05.952362 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:05.952327 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv"
Apr 20 20:16:06.078381 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:06.078347 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv"]
Apr 20 20:16:06.082768 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:16:06.082741 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5b9121c_dafe_40ae_8da8_c95937dd3823.slice/crio-4587fe57da6de9d289614d3d58e3deb6ecad1f50089f928e48c62066cb3bc987 WatchSource:0}: Error finding container 4587fe57da6de9d289614d3d58e3deb6ecad1f50089f928e48c62066cb3bc987: Status 404 returned error can't find the container with id 4587fe57da6de9d289614d3d58e3deb6ecad1f50089f928e48c62066cb3bc987
Apr 20 20:16:06.084487 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:06.084472 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 20:16:06.369110 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:06.369079 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0dec70b8-66c8-428b-a746-6c4139f337ab-proxy-tls\") pod \"isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22\" (UID: \"0dec70b8-66c8-428b-a746-6c4139f337ab\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22"
Apr 20 20:16:06.371551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:06.371532 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0dec70b8-66c8-428b-a746-6c4139f337ab-proxy-tls\") pod \"isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22\" (UID: \"0dec70b8-66c8-428b-a746-6c4139f337ab\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22"
Apr 20 20:16:06.627921 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:06.627889 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22"
Apr 20 20:16:06.631465 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:06.631438 2571 generic.go:358] "Generic (PLEG): container finished" podID="75b0814e-b821-4519-974f-33493a091ca5" containerID="2d32f62296eaccfee530c6ea1499592a6bcfc06036626f2d57b358d4506efe7a" exitCode=2
Apr 20 20:16:06.631465 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:06.631452 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" event={"ID":"75b0814e-b821-4519-974f-33493a091ca5","Type":"ContainerDied","Data":"2d32f62296eaccfee530c6ea1499592a6bcfc06036626f2d57b358d4506efe7a"}
Apr 20 20:16:06.632872 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:06.632847 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" event={"ID":"b5b9121c-dafe-40ae-8da8-c95937dd3823","Type":"ContainerStarted","Data":"3bf8c4880d36556053d260725cf54228cfcb01ad4b76ca9bea111bf4c38c1138"}
Apr 20 20:16:06.632988 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:06.632878 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" event={"ID":"b5b9121c-dafe-40ae-8da8-c95937dd3823","Type":"ContainerStarted","Data":"4587fe57da6de9d289614d3d58e3deb6ecad1f50089f928e48c62066cb3bc987"}
Apr 20 20:16:06.762881 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:06.762849 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22"]
Apr 20 20:16:06.765579 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:16:06.765545 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dec70b8_66c8_428b_a746_6c4139f337ab.slice/crio-6a18eecffbc6402166b2b1cddffc931631b7e1ceed3454dc76e312421b2f07c1 WatchSource:0}: Error finding container 6a18eecffbc6402166b2b1cddffc931631b7e1ceed3454dc76e312421b2f07c1: Status 404 returned error can't find the container with id 6a18eecffbc6402166b2b1cddffc931631b7e1ceed3454dc76e312421b2f07c1
Apr 20 20:16:07.638195 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:07.638161 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" event={"ID":"0dec70b8-66c8-428b-a746-6c4139f337ab","Type":"ContainerStarted","Data":"14210322efa8ae61fef2e887cf8509cd14f95f64006f5c6648b3bd7329bc9f39"}
Apr 20 20:16:07.638595 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:07.638204 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" event={"ID":"0dec70b8-66c8-428b-a746-6c4139f337ab","Type":"ContainerStarted","Data":"6a18eecffbc6402166b2b1cddffc931631b7e1ceed3454dc76e312421b2f07c1"}
Apr 20 20:16:08.359781 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:08.359738 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection refused"
Apr 20 20:16:10.365977 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:10.365926 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 20 20:16:10.366368 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:10.366285 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:16:10.649847 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:10.649754 2571 generic.go:358] "Generic (PLEG): container finished" podID="b5b9121c-dafe-40ae-8da8-c95937dd3823" containerID="3bf8c4880d36556053d260725cf54228cfcb01ad4b76ca9bea111bf4c38c1138" exitCode=0
Apr 20 20:16:10.649847 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:10.649801 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" event={"ID":"b5b9121c-dafe-40ae-8da8-c95937dd3823","Type":"ContainerDied","Data":"3bf8c4880d36556053d260725cf54228cfcb01ad4b76ca9bea111bf4c38c1138"}
Apr 20 20:16:10.651278 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:10.651256 2571 generic.go:358] "Generic (PLEG): container finished" podID="0dec70b8-66c8-428b-a746-6c4139f337ab" containerID="14210322efa8ae61fef2e887cf8509cd14f95f64006f5c6648b3bd7329bc9f39" exitCode=0
Apr 20 20:16:10.651391 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:10.651339 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" event={"ID":"0dec70b8-66c8-428b-a746-6c4139f337ab","Type":"ContainerDied","Data":"14210322efa8ae61fef2e887cf8509cd14f95f64006f5c6648b3bd7329bc9f39"}
Apr 20 20:16:10.653765 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:10.653746 2571 generic.go:358] "Generic (PLEG): container finished" podID="75b0814e-b821-4519-974f-33493a091ca5" containerID="93acc8693871fc641ad21fddceeee0bd499589841c915263528cb488110b7a39" exitCode=0
Apr 20 20:16:10.653849 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:10.653776 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" event={"ID":"75b0814e-b821-4519-974f-33493a091ca5","Type":"ContainerDied","Data":"93acc8693871fc641ad21fddceeee0bd499589841c915263528cb488110b7a39"}
Apr 20 20:16:11.660015 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:11.659983 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" event={"ID":"b5b9121c-dafe-40ae-8da8-c95937dd3823","Type":"ContainerStarted","Data":"78c3bdfb6b25ad62834be863766c0ec6da656e256b1aaa35936fbfc7dea00a30"}
Apr 20 20:16:11.660504 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:11.660030 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" event={"ID":"b5b9121c-dafe-40ae-8da8-c95937dd3823","Type":"ContainerStarted","Data":"943db10690a19311e9eeb0b5c9705d82eceaf6dd5dd532fd8b58617d82542eef"}
Apr 20 20:16:11.660504 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:11.660357 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv"
Apr 20 20:16:11.684959 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:11.684904 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" podStartSLOduration=6.684884514 podStartE2EDuration="6.684884514s" podCreationTimestamp="2026-04-20 20:16:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:16:11.683071592 +0000 UTC m=+612.722561139" watchObservedRunningTime="2026-04-20 20:16:11.684884514 +0000 UTC m=+612.724374061"
Apr 20 20:16:12.664750 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:12.664713 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv"
Apr 20 20:16:12.666347 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:12.666304 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" podUID="b5b9121c-dafe-40ae-8da8-c95937dd3823" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 20 20:16:13.358943 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:13.358901 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection refused"
Apr 20 20:16:13.668813 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:13.668715 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" podUID="b5b9121c-dafe-40ae-8da8-c95937dd3823" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 20 20:16:18.359642 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:18.359534 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection refused"
Apr 20 20:16:18.360083 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:18.359834 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"
Apr 20 20:16:18.673404 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:18.673328 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv"
Apr 20 20:16:18.673796 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:18.673771 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" podUID="b5b9121c-dafe-40ae-8da8-c95937dd3823" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 20 20:16:20.366008 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:20.365956 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 20 20:16:20.366495 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:20.366356 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:16:23.359363 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:23.359320 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection refused"
Apr 20 20:16:28.359929 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:28.359875 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection refused"
Apr 20 20:16:28.673942 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:28.673850 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" podUID="b5b9121c-dafe-40ae-8da8-c95937dd3823" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 20 20:16:30.365808 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:30.365723 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 20 20:16:30.366350 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:30.365902 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"
Apr 20 20:16:30.366350 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:30.366108 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:16:30.366350 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:30.366246 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"
Apr 20 20:16:33.359199 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:33.359134 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection refused"
Apr 20 20:16:34.749010 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:34.748976 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" event={"ID":"0dec70b8-66c8-428b-a746-6c4139f337ab","Type":"ContainerStarted","Data":"e2e9c2afc29c54c2fcd3c897cf5c82cd411269f74ebd4b05fd2eaea898c16e6d"}
Apr 20 20:16:34.749371 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:34.749025 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" event={"ID":"0dec70b8-66c8-428b-a746-6c4139f337ab","Type":"ContainerStarted","Data":"2ee70ebfdeab8e2855d979a1e4f8286a65b65171ee36d13f819e9deb07b9ec03"}
Apr 20 20:16:34.749371 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:34.749238 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22"
Apr 20 20:16:34.769725 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:34.769674 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" podStartSLOduration=6.148770339 podStartE2EDuration="29.76966045s" podCreationTimestamp="2026-04-20 20:16:05 +0000 UTC" firstStartedPulling="2026-04-20 20:16:10.652737074 +0000 UTC m=+611.692226598" lastFinishedPulling="2026-04-20 20:16:34.273627186 +0000 UTC m=+635.313116709" observedRunningTime="2026-04-20 20:16:34.768203355 +0000 UTC m=+635.807692903" watchObservedRunningTime="2026-04-20 20:16:34.76966045 +0000 UTC m=+635.809149995"
Apr 20 20:16:35.692714 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.692690 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"
Apr 20 20:16:35.754617 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.754589 2571 generic.go:358] "Generic (PLEG): container finished" podID="75b0814e-b821-4519-974f-33493a091ca5" containerID="512ea8b5ddb6f1ecb3c14864be631fdb02e364d3b9a4cc5ca53f44caf1d7e1d8" exitCode=0
Apr 20 20:16:35.755049 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.754677 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" event={"ID":"75b0814e-b821-4519-974f-33493a091ca5","Type":"ContainerDied","Data":"512ea8b5ddb6f1ecb3c14864be631fdb02e364d3b9a4cc5ca53f44caf1d7e1d8"}
Apr 20 20:16:35.755049 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.754728 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8" event={"ID":"75b0814e-b821-4519-974f-33493a091ca5","Type":"ContainerDied","Data":"d450558dfd6dc9bf209596053da92e9b6eecc782e225eb350ca9636cd4c98af8"}
Apr 20 20:16:35.755049 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.754766 2571 scope.go:117] "RemoveContainer" containerID="512ea8b5ddb6f1ecb3c14864be631fdb02e364d3b9a4cc5ca53f44caf1d7e1d8"
Apr 20 20:16:35.755049 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.754772 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"
Apr 20 20:16:35.755266 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.755215 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22"
Apr 20 20:16:35.756944 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.756852 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" podUID="0dec70b8-66c8-428b-a746-6c4139f337ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 20 20:16:35.763946 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.763929 2571 scope.go:117] "RemoveContainer" containerID="2d32f62296eaccfee530c6ea1499592a6bcfc06036626f2d57b358d4506efe7a"
Apr 20 20:16:35.771504 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.771485 2571 scope.go:117] "RemoveContainer" containerID="93acc8693871fc641ad21fddceeee0bd499589841c915263528cb488110b7a39"
Apr 20 20:16:35.778912 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.778880 2571 scope.go:117] "RemoveContainer" containerID="005b3d961857206c090d3a876f24c309feafa88331e1c00945ab2e2a5bd7953b"
Apr 20 20:16:35.785962 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.785945 2571 scope.go:117] "RemoveContainer" containerID="512ea8b5ddb6f1ecb3c14864be631fdb02e364d3b9a4cc5ca53f44caf1d7e1d8"
Apr 20 20:16:35.786231 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:16:35.786210 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"512ea8b5ddb6f1ecb3c14864be631fdb02e364d3b9a4cc5ca53f44caf1d7e1d8\": container with ID starting with 512ea8b5ddb6f1ecb3c14864be631fdb02e364d3b9a4cc5ca53f44caf1d7e1d8 not found: ID does not exist" containerID="512ea8b5ddb6f1ecb3c14864be631fdb02e364d3b9a4cc5ca53f44caf1d7e1d8"
Apr 20 20:16:35.786284 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.786241 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"512ea8b5ddb6f1ecb3c14864be631fdb02e364d3b9a4cc5ca53f44caf1d7e1d8"} err="failed to get container status \"512ea8b5ddb6f1ecb3c14864be631fdb02e364d3b9a4cc5ca53f44caf1d7e1d8\": rpc error: code = NotFound desc = could not find container \"512ea8b5ddb6f1ecb3c14864be631fdb02e364d3b9a4cc5ca53f44caf1d7e1d8\": container with ID starting with 512ea8b5ddb6f1ecb3c14864be631fdb02e364d3b9a4cc5ca53f44caf1d7e1d8 not found: ID does not exist"
Apr 20 20:16:35.786284 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.786260 2571 scope.go:117] "RemoveContainer" containerID="2d32f62296eaccfee530c6ea1499592a6bcfc06036626f2d57b358d4506efe7a"
Apr 20 20:16:35.786537 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:16:35.786517 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d32f62296eaccfee530c6ea1499592a6bcfc06036626f2d57b358d4506efe7a\": container with ID starting with 2d32f62296eaccfee530c6ea1499592a6bcfc06036626f2d57b358d4506efe7a not found: ID does not exist" containerID="2d32f62296eaccfee530c6ea1499592a6bcfc06036626f2d57b358d4506efe7a"
Apr 20 20:16:35.786619 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.786547 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d32f62296eaccfee530c6ea1499592a6bcfc06036626f2d57b358d4506efe7a"} err="failed to get container status \"2d32f62296eaccfee530c6ea1499592a6bcfc06036626f2d57b358d4506efe7a\": rpc error: code = NotFound desc = could not find container \"2d32f62296eaccfee530c6ea1499592a6bcfc06036626f2d57b358d4506efe7a\": container with ID starting with 2d32f62296eaccfee530c6ea1499592a6bcfc06036626f2d57b358d4506efe7a not found: ID does not exist"
Apr 20 20:16:35.786619 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.786570 2571 scope.go:117] "RemoveContainer" containerID="93acc8693871fc641ad21fddceeee0bd499589841c915263528cb488110b7a39"
Apr 20 20:16:35.786817 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:16:35.786799 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93acc8693871fc641ad21fddceeee0bd499589841c915263528cb488110b7a39\": container with ID starting with 93acc8693871fc641ad21fddceeee0bd499589841c915263528cb488110b7a39 not found: ID does not exist" containerID="93acc8693871fc641ad21fddceeee0bd499589841c915263528cb488110b7a39"
Apr 20 20:16:35.786856 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.786822 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93acc8693871fc641ad21fddceeee0bd499589841c915263528cb488110b7a39"} err="failed to get container status \"93acc8693871fc641ad21fddceeee0bd499589841c915263528cb488110b7a39\": rpc error: code = NotFound desc = could not find container \"93acc8693871fc641ad21fddceeee0bd499589841c915263528cb488110b7a39\": container with ID starting with 93acc8693871fc641ad21fddceeee0bd499589841c915263528cb488110b7a39 not found: ID does not exist"
Apr 20 20:16:35.786856 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.786838 2571 scope.go:117] "RemoveContainer" containerID="005b3d961857206c090d3a876f24c309feafa88331e1c00945ab2e2a5bd7953b"
Apr 20 20:16:35.787051 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:16:35.787034 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"005b3d961857206c090d3a876f24c309feafa88331e1c00945ab2e2a5bd7953b\": container with ID starting with 005b3d961857206c090d3a876f24c309feafa88331e1c00945ab2e2a5bd7953b not found: ID does not exist" containerID="005b3d961857206c090d3a876f24c309feafa88331e1c00945ab2e2a5bd7953b"
Apr 20 20:16:35.787100 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.787056 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"005b3d961857206c090d3a876f24c309feafa88331e1c00945ab2e2a5bd7953b"} err="failed to get container status \"005b3d961857206c090d3a876f24c309feafa88331e1c00945ab2e2a5bd7953b\": rpc error: code = NotFound desc = could not find container \"005b3d961857206c090d3a876f24c309feafa88331e1c00945ab2e2a5bd7953b\": container with ID starting with 005b3d961857206c090d3a876f24c309feafa88331e1c00945ab2e2a5bd7953b not found: ID does not exist"
Apr 20 20:16:35.844318 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.844274 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-raw-sklearn-batcher-8ea43-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/75b0814e-b821-4519-974f-33493a091ca5-isvc-raw-sklearn-batcher-8ea43-kube-rbac-proxy-sar-config\") pod \"75b0814e-b821-4519-974f-33493a091ca5\" (UID: \"75b0814e-b821-4519-974f-33493a091ca5\") "
Apr 20 20:16:35.844538 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.844329 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/75b0814e-b821-4519-974f-33493a091ca5-proxy-tls\") pod \"75b0814e-b821-4519-974f-33493a091ca5\" (UID: \"75b0814e-b821-4519-974f-33493a091ca5\") "
Apr 20 20:16:35.844538 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.844360 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/75b0814e-b821-4519-974f-33493a091ca5-kserve-provision-location\") pod \"75b0814e-b821-4519-974f-33493a091ca5\" (UID: \"75b0814e-b821-4519-974f-33493a091ca5\") "
Apr 20 20:16:35.844538 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.844415 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlq8w\" (UniqueName: \"kubernetes.io/projected/75b0814e-b821-4519-974f-33493a091ca5-kube-api-access-mlq8w\") pod \"75b0814e-b821-4519-974f-33493a091ca5\" (UID: \"75b0814e-b821-4519-974f-33493a091ca5\") "
Apr 20 20:16:35.844773 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.844723 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75b0814e-b821-4519-974f-33493a091ca5-isvc-raw-sklearn-batcher-8ea43-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-raw-sklearn-batcher-8ea43-kube-rbac-proxy-sar-config") pod "75b0814e-b821-4519-974f-33493a091ca5" (UID: "75b0814e-b821-4519-974f-33493a091ca5"). InnerVolumeSpecName "isvc-raw-sklearn-batcher-8ea43-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:16:35.844773 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.844766 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75b0814e-b821-4519-974f-33493a091ca5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "75b0814e-b821-4519-974f-33493a091ca5" (UID: "75b0814e-b821-4519-974f-33493a091ca5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:16:35.846824 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.846801 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75b0814e-b821-4519-974f-33493a091ca5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "75b0814e-b821-4519-974f-33493a091ca5" (UID: "75b0814e-b821-4519-974f-33493a091ca5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:16:35.846948 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.846842 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75b0814e-b821-4519-974f-33493a091ca5-kube-api-access-mlq8w" (OuterVolumeSpecName: "kube-api-access-mlq8w") pod "75b0814e-b821-4519-974f-33493a091ca5" (UID: "75b0814e-b821-4519-974f-33493a091ca5"). InnerVolumeSpecName "kube-api-access-mlq8w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:16:35.945812 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.945775 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-raw-sklearn-batcher-8ea43-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/75b0814e-b821-4519-974f-33493a091ca5-isvc-raw-sklearn-batcher-8ea43-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:16:35.945812 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.945807 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/75b0814e-b821-4519-974f-33493a091ca5-proxy-tls\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:16:35.945812 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.945817 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/75b0814e-b821-4519-974f-33493a091ca5-kserve-provision-location\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:16:35.946040 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:35.945829 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mlq8w\" (UniqueName: \"kubernetes.io/projected/75b0814e-b821-4519-974f-33493a091ca5-kube-api-access-mlq8w\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:16:36.078148 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:36.078119 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"]
Apr 20 20:16:36.081981 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:36.081959 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8ea43-predictor-64b4f7648f-65rl8"]
Apr 20 20:16:36.759021 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:36.758978 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" podUID="0dec70b8-66c8-428b-a746-6c4139f337ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 20 20:16:37.528916 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:37.528867 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75b0814e-b821-4519-974f-33493a091ca5" path="/var/lib/kubelet/pods/75b0814e-b821-4519-974f-33493a091ca5/volumes"
Apr 20 20:16:38.674172 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:38.674134 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" podUID="b5b9121c-dafe-40ae-8da8-c95937dd3823" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 20 20:16:41.762945 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:41.762914 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22"
Apr 20 20:16:41.763465 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:41.763418 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" podUID="0dec70b8-66c8-428b-a746-6c4139f337ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 20 20:16:48.673941 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:48.673901 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" podUID="b5b9121c-dafe-40ae-8da8-c95937dd3823" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 20 20:16:51.763788 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:51.763742 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" podUID="0dec70b8-66c8-428b-a746-6c4139f337ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 20 20:16:58.673936 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:16:58.673899 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" podUID="b5b9121c-dafe-40ae-8da8-c95937dd3823" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 20 20:17:01.763519 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:01.763466 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" podUID="0dec70b8-66c8-428b-a746-6c4139f337ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 20 20:17:08.673833 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:08.673793 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" podUID="b5b9121c-dafe-40ae-8da8-c95937dd3823" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 20 20:17:11.764268 ip-10-0-135-184 kubenswrapper[2571]: I0420 
20:17:11.764231 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" podUID="0dec70b8-66c8-428b-a746-6c4139f337ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 20 20:17:18.675360 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:18.675331 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" Apr 20 20:17:21.763450 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:21.763390 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" podUID="0dec70b8-66c8-428b-a746-6c4139f337ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 20 20:17:31.763885 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:31.763840 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" podUID="0dec70b8-66c8-428b-a746-6c4139f337ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 20 20:17:41.764549 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:41.764521 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" Apr 20 20:17:55.858786 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:55.858686 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv"] Apr 20 20:17:55.859247 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:55.859153 2571 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" podUID="b5b9121c-dafe-40ae-8da8-c95937dd3823" containerName="kserve-container" containerID="cri-o://943db10690a19311e9eeb0b5c9705d82eceaf6dd5dd532fd8b58617d82542eef" gracePeriod=30 Apr 20 20:17:55.859324 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:55.859224 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" podUID="b5b9121c-dafe-40ae-8da8-c95937dd3823" containerName="kube-rbac-proxy" containerID="cri-o://78c3bdfb6b25ad62834be863766c0ec6da656e256b1aaa35936fbfc7dea00a30" gracePeriod=30 Apr 20 20:17:55.925875 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:55.925832 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs"] Apr 20 20:17:55.926504 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:55.926481 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kube-rbac-proxy" Apr 20 20:17:55.926504 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:55.926504 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kube-rbac-proxy" Apr 20 20:17:55.926706 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:55.926526 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="storage-initializer" Apr 20 20:17:55.926706 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:55.926535 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="storage-initializer" Apr 20 20:17:55.926706 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:55.926552 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kserve-container" Apr 20 20:17:55.926706 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:55.926561 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kserve-container" Apr 20 20:17:55.926706 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:55.926584 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="agent" Apr 20 20:17:55.926706 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:55.926593 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="agent" Apr 20 20:17:55.926706 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:55.926675 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kserve-container" Apr 20 20:17:55.926706 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:55.926691 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="agent" Apr 20 20:17:55.926706 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:55.926701 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="75b0814e-b821-4519-974f-33493a091ca5" containerName="kube-rbac-proxy" Apr 20 20:17:55.930533 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:55.930514 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" Apr 20 20:17:55.932927 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:55.932904 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-3bab1-predictor-serving-cert\"" Apr 20 20:17:55.933157 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:55.933136 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config\"" Apr 20 20:17:55.940936 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:55.940915 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs"] Apr 20 20:17:56.008158 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.008116 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22"] Apr 20 20:17:56.008516 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.008469 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" podUID="0dec70b8-66c8-428b-a746-6c4139f337ab" containerName="kserve-container" containerID="cri-o://2ee70ebfdeab8e2855d979a1e4f8286a65b65171ee36d13f819e9deb07b9ec03" gracePeriod=30 Apr 20 20:17:56.008666 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.008543 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" podUID="0dec70b8-66c8-428b-a746-6c4139f337ab" containerName="kube-rbac-proxy" containerID="cri-o://e2e9c2afc29c54c2fcd3c897cf5c82cd411269f74ebd4b05fd2eaea898c16e6d" gracePeriod=30 Apr 20 20:17:56.008740 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.008681 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f367b892-3117-4f93-b513-6a8a323a86e4-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs\" (UID: \"f367b892-3117-4f93-b513-6a8a323a86e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" Apr 20 20:17:56.008740 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.008716 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzspx\" (UniqueName: \"kubernetes.io/projected/f367b892-3117-4f93-b513-6a8a323a86e4-kube-api-access-lzspx\") pod \"isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs\" (UID: \"f367b892-3117-4f93-b513-6a8a323a86e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" Apr 20 20:17:56.008848 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.008821 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f367b892-3117-4f93-b513-6a8a323a86e4-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs\" (UID: \"f367b892-3117-4f93-b513-6a8a323a86e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" Apr 20 20:17:56.008971 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.008878 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f367b892-3117-4f93-b513-6a8a323a86e4-isvc-sklearn-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs\" (UID: \"f367b892-3117-4f93-b513-6a8a323a86e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" Apr 20 
20:17:56.020358 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.020335 2571 generic.go:358] "Generic (PLEG): container finished" podID="b5b9121c-dafe-40ae-8da8-c95937dd3823" containerID="78c3bdfb6b25ad62834be863766c0ec6da656e256b1aaa35936fbfc7dea00a30" exitCode=2 Apr 20 20:17:56.020562 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.020408 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" event={"ID":"b5b9121c-dafe-40ae-8da8-c95937dd3823","Type":"ContainerDied","Data":"78c3bdfb6b25ad62834be863766c0ec6da656e256b1aaa35936fbfc7dea00a30"} Apr 20 20:17:56.041837 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.041812 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6"] Apr 20 20:17:56.045437 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.045409 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" Apr 20 20:17:56.048004 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.047987 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config\"" Apr 20 20:17:56.048004 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.047997 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-3bab1-predictor-serving-cert\"" Apr 20 20:17:56.053176 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.053158 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6"] Apr 20 20:17:56.109324 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.109253 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86334311-672d-4ba3-b0d8-59aa9263198f-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6\" (UID: \"86334311-672d-4ba3-b0d8-59aa9263198f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" Apr 20 20:17:56.109324 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.109296 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f367b892-3117-4f93-b513-6a8a323a86e4-isvc-sklearn-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs\" (UID: \"f367b892-3117-4f93-b513-6a8a323a86e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" Apr 20 20:17:56.109324 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.109321 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f367b892-3117-4f93-b513-6a8a323a86e4-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs\" (UID: \"f367b892-3117-4f93-b513-6a8a323a86e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" Apr 20 20:17:56.109630 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.109347 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzspx\" (UniqueName: \"kubernetes.io/projected/f367b892-3117-4f93-b513-6a8a323a86e4-kube-api-access-lzspx\") pod \"isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs\" (UID: \"f367b892-3117-4f93-b513-6a8a323a86e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" Apr 20 20:17:56.109630 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.109384 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6274q\" (UniqueName: \"kubernetes.io/projected/86334311-672d-4ba3-b0d8-59aa9263198f-kube-api-access-6274q\") pod \"isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6\" (UID: \"86334311-672d-4ba3-b0d8-59aa9263198f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" Apr 20 20:17:56.109630 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:17:56.109501 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-serving-cert: secret "isvc-sklearn-graph-raw-hpa-3bab1-predictor-serving-cert" not found Apr 20 20:17:56.109630 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.109536 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86334311-672d-4ba3-b0d8-59aa9263198f-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6\" (UID: \"86334311-672d-4ba3-b0d8-59aa9263198f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" Apr 20 20:17:56.109630 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:17:56.109576 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f367b892-3117-4f93-b513-6a8a323a86e4-proxy-tls podName:f367b892-3117-4f93-b513-6a8a323a86e4 nodeName:}" failed. No retries permitted until 2026-04-20 20:17:56.609552174 +0000 UTC m=+717.649041700 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f367b892-3117-4f93-b513-6a8a323a86e4-proxy-tls") pod "isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" (UID: "f367b892-3117-4f93-b513-6a8a323a86e4") : secret "isvc-sklearn-graph-raw-hpa-3bab1-predictor-serving-cert" not found Apr 20 20:17:56.109849 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.109659 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/86334311-672d-4ba3-b0d8-59aa9263198f-isvc-xgboost-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6\" (UID: \"86334311-672d-4ba3-b0d8-59aa9263198f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" Apr 20 20:17:56.109849 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.109703 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f367b892-3117-4f93-b513-6a8a323a86e4-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs\" (UID: \"f367b892-3117-4f93-b513-6a8a323a86e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" Apr 20 20:17:56.109965 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.109946 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f367b892-3117-4f93-b513-6a8a323a86e4-isvc-sklearn-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs\" (UID: \"f367b892-3117-4f93-b513-6a8a323a86e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" Apr 20 
20:17:56.110003 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.109986 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f367b892-3117-4f93-b513-6a8a323a86e4-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs\" (UID: \"f367b892-3117-4f93-b513-6a8a323a86e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" Apr 20 20:17:56.118244 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.118180 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzspx\" (UniqueName: \"kubernetes.io/projected/f367b892-3117-4f93-b513-6a8a323a86e4-kube-api-access-lzspx\") pod \"isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs\" (UID: \"f367b892-3117-4f93-b513-6a8a323a86e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" Apr 20 20:17:56.210519 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.210484 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/86334311-672d-4ba3-b0d8-59aa9263198f-isvc-xgboost-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6\" (UID: \"86334311-672d-4ba3-b0d8-59aa9263198f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" Apr 20 20:17:56.210725 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.210551 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86334311-672d-4ba3-b0d8-59aa9263198f-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6\" (UID: \"86334311-672d-4ba3-b0d8-59aa9263198f\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" Apr 20 20:17:56.210725 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.210613 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6274q\" (UniqueName: \"kubernetes.io/projected/86334311-672d-4ba3-b0d8-59aa9263198f-kube-api-access-6274q\") pod \"isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6\" (UID: \"86334311-672d-4ba3-b0d8-59aa9263198f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" Apr 20 20:17:56.210725 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.210670 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86334311-672d-4ba3-b0d8-59aa9263198f-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6\" (UID: \"86334311-672d-4ba3-b0d8-59aa9263198f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" Apr 20 20:17:56.210906 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:17:56.210827 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-serving-cert: secret "isvc-xgboost-graph-raw-hpa-3bab1-predictor-serving-cert" not found Apr 20 20:17:56.210906 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:17:56.210900 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86334311-672d-4ba3-b0d8-59aa9263198f-proxy-tls podName:86334311-672d-4ba3-b0d8-59aa9263198f nodeName:}" failed. No retries permitted until 2026-04-20 20:17:56.710877537 +0000 UTC m=+717.750367061 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/86334311-672d-4ba3-b0d8-59aa9263198f-proxy-tls") pod "isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" (UID: "86334311-672d-4ba3-b0d8-59aa9263198f") : secret "isvc-xgboost-graph-raw-hpa-3bab1-predictor-serving-cert" not found Apr 20 20:17:56.211030 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.211002 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86334311-672d-4ba3-b0d8-59aa9263198f-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6\" (UID: \"86334311-672d-4ba3-b0d8-59aa9263198f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" Apr 20 20:17:56.211260 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.211242 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/86334311-672d-4ba3-b0d8-59aa9263198f-isvc-xgboost-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6\" (UID: \"86334311-672d-4ba3-b0d8-59aa9263198f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" Apr 20 20:17:56.219130 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.219102 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6274q\" (UniqueName: \"kubernetes.io/projected/86334311-672d-4ba3-b0d8-59aa9263198f-kube-api-access-6274q\") pod \"isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6\" (UID: \"86334311-672d-4ba3-b0d8-59aa9263198f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" Apr 20 20:17:56.613296 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.613262 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f367b892-3117-4f93-b513-6a8a323a86e4-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs\" (UID: \"f367b892-3117-4f93-b513-6a8a323a86e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" Apr 20 20:17:56.615756 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.615729 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f367b892-3117-4f93-b513-6a8a323a86e4-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs\" (UID: \"f367b892-3117-4f93-b513-6a8a323a86e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" Apr 20 20:17:56.714446 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.714393 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86334311-672d-4ba3-b0d8-59aa9263198f-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6\" (UID: \"86334311-672d-4ba3-b0d8-59aa9263198f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" Apr 20 20:17:56.716894 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.716876 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86334311-672d-4ba3-b0d8-59aa9263198f-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6\" (UID: \"86334311-672d-4ba3-b0d8-59aa9263198f\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" Apr 20 20:17:56.760062 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.760026 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" 
podUID="0dec70b8-66c8-428b-a746-6c4139f337ab" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.34:8643/healthz\": dial tcp 10.133.0.34:8643: connect: connection refused" Apr 20 20:17:56.841648 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.841612 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" Apr 20 20:17:56.955976 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.955940 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" Apr 20 20:17:56.963848 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:56.963820 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs"] Apr 20 20:17:56.966605 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:17:56.966578 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf367b892_3117_4f93_b513_6a8a323a86e4.slice/crio-a2d6d2b6412da1f2b082a4d351f19c39dd7dda0450f181e4bddc252ca991be17 WatchSource:0}: Error finding container a2d6d2b6412da1f2b082a4d351f19c39dd7dda0450f181e4bddc252ca991be17: Status 404 returned error can't find the container with id a2d6d2b6412da1f2b082a4d351f19c39dd7dda0450f181e4bddc252ca991be17 Apr 20 20:17:57.026913 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:57.026693 2571 generic.go:358] "Generic (PLEG): container finished" podID="0dec70b8-66c8-428b-a746-6c4139f337ab" containerID="e2e9c2afc29c54c2fcd3c897cf5c82cd411269f74ebd4b05fd2eaea898c16e6d" exitCode=2 Apr 20 20:17:57.026913 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:57.026802 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" 
event={"ID":"0dec70b8-66c8-428b-a746-6c4139f337ab","Type":"ContainerDied","Data":"e2e9c2afc29c54c2fcd3c897cf5c82cd411269f74ebd4b05fd2eaea898c16e6d"} Apr 20 20:17:57.028296 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:57.028256 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" event={"ID":"f367b892-3117-4f93-b513-6a8a323a86e4","Type":"ContainerStarted","Data":"a2d6d2b6412da1f2b082a4d351f19c39dd7dda0450f181e4bddc252ca991be17"} Apr 20 20:17:57.088118 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:57.088092 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6"] Apr 20 20:17:57.096092 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:17:57.096041 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86334311_672d_4ba3_b0d8_59aa9263198f.slice/crio-b0fb75fdef92ef4dab2c6ac2cb21532aa8ce1cdf774cd1cfef0d1aff4e29bab6 WatchSource:0}: Error finding container b0fb75fdef92ef4dab2c6ac2cb21532aa8ce1cdf774cd1cfef0d1aff4e29bab6: Status 404 returned error can't find the container with id b0fb75fdef92ef4dab2c6ac2cb21532aa8ce1cdf774cd1cfef0d1aff4e29bab6 Apr 20 20:17:58.033223 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:58.033184 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" event={"ID":"f367b892-3117-4f93-b513-6a8a323a86e4","Type":"ContainerStarted","Data":"a34b6db78e48daa6553fd66f0b7d7a0add6fb4b258564858b9b02dc24be54472"} Apr 20 20:17:58.034701 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:58.034677 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" 
event={"ID":"86334311-672d-4ba3-b0d8-59aa9263198f","Type":"ContainerStarted","Data":"0bce135bb6b173721d55b8c5cc69d3e19e17d9098847bbd96e6b92ed335b60bd"} Apr 20 20:17:58.034850 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:58.034703 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" event={"ID":"86334311-672d-4ba3-b0d8-59aa9263198f","Type":"ContainerStarted","Data":"b0fb75fdef92ef4dab2c6ac2cb21532aa8ce1cdf774cd1cfef0d1aff4e29bab6"} Apr 20 20:17:58.669098 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:58.669059 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" podUID="b5b9121c-dafe-40ae-8da8-c95937dd3823" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.33:8643/healthz\": dial tcp 10.133.0.33:8643: connect: connection refused" Apr 20 20:17:58.674410 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:58.674385 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" podUID="b5b9121c-dafe-40ae-8da8-c95937dd3823" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 20 20:17:59.766472 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:59.766451 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" Apr 20 20:17:59.844767 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:59.844689 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgmhf\" (UniqueName: \"kubernetes.io/projected/0dec70b8-66c8-428b-a746-6c4139f337ab-kube-api-access-sgmhf\") pod \"0dec70b8-66c8-428b-a746-6c4139f337ab\" (UID: \"0dec70b8-66c8-428b-a746-6c4139f337ab\") " Apr 20 20:17:59.844767 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:59.844729 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0dec70b8-66c8-428b-a746-6c4139f337ab-proxy-tls\") pod \"0dec70b8-66c8-428b-a746-6c4139f337ab\" (UID: \"0dec70b8-66c8-428b-a746-6c4139f337ab\") " Apr 20 20:17:59.844974 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:59.844785 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0dec70b8-66c8-428b-a746-6c4139f337ab-kserve-provision-location\") pod \"0dec70b8-66c8-428b-a746-6c4139f337ab\" (UID: \"0dec70b8-66c8-428b-a746-6c4139f337ab\") " Apr 20 20:17:59.844974 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:59.844831 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-87d08-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0dec70b8-66c8-428b-a746-6c4139f337ab-isvc-xgboost-graph-raw-87d08-kube-rbac-proxy-sar-config\") pod \"0dec70b8-66c8-428b-a746-6c4139f337ab\" (UID: \"0dec70b8-66c8-428b-a746-6c4139f337ab\") " Apr 20 20:17:59.845119 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:59.845095 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dec70b8-66c8-428b-a746-6c4139f337ab-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "0dec70b8-66c8-428b-a746-6c4139f337ab" (UID: "0dec70b8-66c8-428b-a746-6c4139f337ab"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:17:59.845182 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:59.845159 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dec70b8-66c8-428b-a746-6c4139f337ab-isvc-xgboost-graph-raw-87d08-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-87d08-kube-rbac-proxy-sar-config") pod "0dec70b8-66c8-428b-a746-6c4139f337ab" (UID: "0dec70b8-66c8-428b-a746-6c4139f337ab"). InnerVolumeSpecName "isvc-xgboost-graph-raw-87d08-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:17:59.847007 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:59.846986 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dec70b8-66c8-428b-a746-6c4139f337ab-kube-api-access-sgmhf" (OuterVolumeSpecName: "kube-api-access-sgmhf") pod "0dec70b8-66c8-428b-a746-6c4139f337ab" (UID: "0dec70b8-66c8-428b-a746-6c4139f337ab"). InnerVolumeSpecName "kube-api-access-sgmhf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:17:59.847121 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:59.847041 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dec70b8-66c8-428b-a746-6c4139f337ab-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0dec70b8-66c8-428b-a746-6c4139f337ab" (UID: "0dec70b8-66c8-428b-a746-6c4139f337ab"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:17:59.945702 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:59.945673 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0dec70b8-66c8-428b-a746-6c4139f337ab-kserve-provision-location\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:17:59.945702 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:59.945698 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-87d08-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0dec70b8-66c8-428b-a746-6c4139f337ab-isvc-xgboost-graph-raw-87d08-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:17:59.945881 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:59.945711 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sgmhf\" (UniqueName: \"kubernetes.io/projected/0dec70b8-66c8-428b-a746-6c4139f337ab-kube-api-access-sgmhf\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:17:59.945881 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:17:59.945721 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0dec70b8-66c8-428b-a746-6c4139f337ab-proxy-tls\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:18:00.049404 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.049367 2571 generic.go:358] "Generic (PLEG): container finished" podID="0dec70b8-66c8-428b-a746-6c4139f337ab" containerID="2ee70ebfdeab8e2855d979a1e4f8286a65b65171ee36d13f819e9deb07b9ec03" exitCode=0 Apr 20 20:18:00.049604 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.049454 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" 
event={"ID":"0dec70b8-66c8-428b-a746-6c4139f337ab","Type":"ContainerDied","Data":"2ee70ebfdeab8e2855d979a1e4f8286a65b65171ee36d13f819e9deb07b9ec03"} Apr 20 20:18:00.049604 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.049496 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" event={"ID":"0dec70b8-66c8-428b-a746-6c4139f337ab","Type":"ContainerDied","Data":"6a18eecffbc6402166b2b1cddffc931631b7e1ceed3454dc76e312421b2f07c1"} Apr 20 20:18:00.049604 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.049506 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22" Apr 20 20:18:00.049782 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.049511 2571 scope.go:117] "RemoveContainer" containerID="e2e9c2afc29c54c2fcd3c897cf5c82cd411269f74ebd4b05fd2eaea898c16e6d" Apr 20 20:18:00.058155 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.058126 2571 scope.go:117] "RemoveContainer" containerID="2ee70ebfdeab8e2855d979a1e4f8286a65b65171ee36d13f819e9deb07b9ec03" Apr 20 20:18:00.065531 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.065515 2571 scope.go:117] "RemoveContainer" containerID="14210322efa8ae61fef2e887cf8509cd14f95f64006f5c6648b3bd7329bc9f39" Apr 20 20:18:00.072225 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.072199 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22"] Apr 20 20:18:00.073506 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.073391 2571 scope.go:117] "RemoveContainer" containerID="e2e9c2afc29c54c2fcd3c897cf5c82cd411269f74ebd4b05fd2eaea898c16e6d" Apr 20 20:18:00.074237 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:18:00.074209 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e2e9c2afc29c54c2fcd3c897cf5c82cd411269f74ebd4b05fd2eaea898c16e6d\": container with ID starting with e2e9c2afc29c54c2fcd3c897cf5c82cd411269f74ebd4b05fd2eaea898c16e6d not found: ID does not exist" containerID="e2e9c2afc29c54c2fcd3c897cf5c82cd411269f74ebd4b05fd2eaea898c16e6d" Apr 20 20:18:00.074331 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.074249 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2e9c2afc29c54c2fcd3c897cf5c82cd411269f74ebd4b05fd2eaea898c16e6d"} err="failed to get container status \"e2e9c2afc29c54c2fcd3c897cf5c82cd411269f74ebd4b05fd2eaea898c16e6d\": rpc error: code = NotFound desc = could not find container \"e2e9c2afc29c54c2fcd3c897cf5c82cd411269f74ebd4b05fd2eaea898c16e6d\": container with ID starting with e2e9c2afc29c54c2fcd3c897cf5c82cd411269f74ebd4b05fd2eaea898c16e6d not found: ID does not exist" Apr 20 20:18:00.074331 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.074271 2571 scope.go:117] "RemoveContainer" containerID="2ee70ebfdeab8e2855d979a1e4f8286a65b65171ee36d13f819e9deb07b9ec03" Apr 20 20:18:00.074608 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:18:00.074568 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ee70ebfdeab8e2855d979a1e4f8286a65b65171ee36d13f819e9deb07b9ec03\": container with ID starting with 2ee70ebfdeab8e2855d979a1e4f8286a65b65171ee36d13f819e9deb07b9ec03 not found: ID does not exist" containerID="2ee70ebfdeab8e2855d979a1e4f8286a65b65171ee36d13f819e9deb07b9ec03" Apr 20 20:18:00.074676 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.074607 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ee70ebfdeab8e2855d979a1e4f8286a65b65171ee36d13f819e9deb07b9ec03"} err="failed to get container status \"2ee70ebfdeab8e2855d979a1e4f8286a65b65171ee36d13f819e9deb07b9ec03\": rpc error: code = NotFound desc = could not find container 
\"2ee70ebfdeab8e2855d979a1e4f8286a65b65171ee36d13f819e9deb07b9ec03\": container with ID starting with 2ee70ebfdeab8e2855d979a1e4f8286a65b65171ee36d13f819e9deb07b9ec03 not found: ID does not exist" Apr 20 20:18:00.074676 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.074629 2571 scope.go:117] "RemoveContainer" containerID="14210322efa8ae61fef2e887cf8509cd14f95f64006f5c6648b3bd7329bc9f39" Apr 20 20:18:00.074967 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:18:00.074943 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14210322efa8ae61fef2e887cf8509cd14f95f64006f5c6648b3bd7329bc9f39\": container with ID starting with 14210322efa8ae61fef2e887cf8509cd14f95f64006f5c6648b3bd7329bc9f39 not found: ID does not exist" containerID="14210322efa8ae61fef2e887cf8509cd14f95f64006f5c6648b3bd7329bc9f39" Apr 20 20:18:00.075067 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.074973 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14210322efa8ae61fef2e887cf8509cd14f95f64006f5c6648b3bd7329bc9f39"} err="failed to get container status \"14210322efa8ae61fef2e887cf8509cd14f95f64006f5c6648b3bd7329bc9f39\": rpc error: code = NotFound desc = could not find container \"14210322efa8ae61fef2e887cf8509cd14f95f64006f5c6648b3bd7329bc9f39\": container with ID starting with 14210322efa8ae61fef2e887cf8509cd14f95f64006f5c6648b3bd7329bc9f39 not found: ID does not exist" Apr 20 20:18:00.076264 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.076245 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-87d08-predictor-84bdc99457-kpb22"] Apr 20 20:18:00.383698 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.383676 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" Apr 20 20:18:00.450849 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.450782 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5b9121c-dafe-40ae-8da8-c95937dd3823-proxy-tls\") pod \"b5b9121c-dafe-40ae-8da8-c95937dd3823\" (UID: \"b5b9121c-dafe-40ae-8da8-c95937dd3823\") " Apr 20 20:18:00.450962 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.450880 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rn87\" (UniqueName: \"kubernetes.io/projected/b5b9121c-dafe-40ae-8da8-c95937dd3823-kube-api-access-4rn87\") pod \"b5b9121c-dafe-40ae-8da8-c95937dd3823\" (UID: \"b5b9121c-dafe-40ae-8da8-c95937dd3823\") " Apr 20 20:18:00.450962 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.450924 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-87d08-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5b9121c-dafe-40ae-8da8-c95937dd3823-isvc-sklearn-graph-raw-87d08-kube-rbac-proxy-sar-config\") pod \"b5b9121c-dafe-40ae-8da8-c95937dd3823\" (UID: \"b5b9121c-dafe-40ae-8da8-c95937dd3823\") " Apr 20 20:18:00.451042 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.450964 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5b9121c-dafe-40ae-8da8-c95937dd3823-kserve-provision-location\") pod \"b5b9121c-dafe-40ae-8da8-c95937dd3823\" (UID: \"b5b9121c-dafe-40ae-8da8-c95937dd3823\") " Apr 20 20:18:00.451308 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.451285 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b9121c-dafe-40ae-8da8-c95937dd3823-isvc-sklearn-graph-raw-87d08-kube-rbac-proxy-sar-config" 
(OuterVolumeSpecName: "isvc-sklearn-graph-raw-87d08-kube-rbac-proxy-sar-config") pod "b5b9121c-dafe-40ae-8da8-c95937dd3823" (UID: "b5b9121c-dafe-40ae-8da8-c95937dd3823"). InnerVolumeSpecName "isvc-sklearn-graph-raw-87d08-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:18:00.451589 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.451304 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5b9121c-dafe-40ae-8da8-c95937dd3823-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b5b9121c-dafe-40ae-8da8-c95937dd3823" (UID: "b5b9121c-dafe-40ae-8da8-c95937dd3823"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:18:00.452893 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.452873 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b9121c-dafe-40ae-8da8-c95937dd3823-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b5b9121c-dafe-40ae-8da8-c95937dd3823" (UID: "b5b9121c-dafe-40ae-8da8-c95937dd3823"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:18:00.452994 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.452959 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5b9121c-dafe-40ae-8da8-c95937dd3823-kube-api-access-4rn87" (OuterVolumeSpecName: "kube-api-access-4rn87") pod "b5b9121c-dafe-40ae-8da8-c95937dd3823" (UID: "b5b9121c-dafe-40ae-8da8-c95937dd3823"). InnerVolumeSpecName "kube-api-access-4rn87". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:18:00.551684 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.551651 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4rn87\" (UniqueName: \"kubernetes.io/projected/b5b9121c-dafe-40ae-8da8-c95937dd3823-kube-api-access-4rn87\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:18:00.551684 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.551682 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-87d08-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5b9121c-dafe-40ae-8da8-c95937dd3823-isvc-sklearn-graph-raw-87d08-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:18:00.551857 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.551692 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5b9121c-dafe-40ae-8da8-c95937dd3823-kserve-provision-location\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:18:00.551857 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:00.551701 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5b9121c-dafe-40ae-8da8-c95937dd3823-proxy-tls\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:18:01.053476 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:01.053446 2571 generic.go:358] "Generic (PLEG): container finished" podID="86334311-672d-4ba3-b0d8-59aa9263198f" containerID="0bce135bb6b173721d55b8c5cc69d3e19e17d9098847bbd96e6b92ed335b60bd" exitCode=0 Apr 20 20:18:01.053476 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:01.053459 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" 
event={"ID":"86334311-672d-4ba3-b0d8-59aa9263198f","Type":"ContainerDied","Data":"0bce135bb6b173721d55b8c5cc69d3e19e17d9098847bbd96e6b92ed335b60bd"} Apr 20 20:18:01.055169 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:01.055149 2571 generic.go:358] "Generic (PLEG): container finished" podID="b5b9121c-dafe-40ae-8da8-c95937dd3823" containerID="943db10690a19311e9eeb0b5c9705d82eceaf6dd5dd532fd8b58617d82542eef" exitCode=0 Apr 20 20:18:01.055272 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:01.055229 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" event={"ID":"b5b9121c-dafe-40ae-8da8-c95937dd3823","Type":"ContainerDied","Data":"943db10690a19311e9eeb0b5c9705d82eceaf6dd5dd532fd8b58617d82542eef"} Apr 20 20:18:01.055324 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:01.055271 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" event={"ID":"b5b9121c-dafe-40ae-8da8-c95937dd3823","Type":"ContainerDied","Data":"4587fe57da6de9d289614d3d58e3deb6ecad1f50089f928e48c62066cb3bc987"} Apr 20 20:18:01.055324 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:01.055294 2571 scope.go:117] "RemoveContainer" containerID="78c3bdfb6b25ad62834be863766c0ec6da656e256b1aaa35936fbfc7dea00a30" Apr 20 20:18:01.055324 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:01.055240 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv" Apr 20 20:18:01.057635 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:01.057615 2571 generic.go:358] "Generic (PLEG): container finished" podID="f367b892-3117-4f93-b513-6a8a323a86e4" containerID="a34b6db78e48daa6553fd66f0b7d7a0add6fb4b258564858b9b02dc24be54472" exitCode=0 Apr 20 20:18:01.057753 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:01.057643 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" event={"ID":"f367b892-3117-4f93-b513-6a8a323a86e4","Type":"ContainerDied","Data":"a34b6db78e48daa6553fd66f0b7d7a0add6fb4b258564858b9b02dc24be54472"} Apr 20 20:18:01.098319 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:01.098296 2571 scope.go:117] "RemoveContainer" containerID="943db10690a19311e9eeb0b5c9705d82eceaf6dd5dd532fd8b58617d82542eef" Apr 20 20:18:01.126548 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:01.126521 2571 scope.go:117] "RemoveContainer" containerID="3bf8c4880d36556053d260725cf54228cfcb01ad4b76ca9bea111bf4c38c1138" Apr 20 20:18:01.135050 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:01.135002 2571 scope.go:117] "RemoveContainer" containerID="78c3bdfb6b25ad62834be863766c0ec6da656e256b1aaa35936fbfc7dea00a30" Apr 20 20:18:01.135164 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:01.135147 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv"] Apr 20 20:18:01.135336 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:18:01.135318 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78c3bdfb6b25ad62834be863766c0ec6da656e256b1aaa35936fbfc7dea00a30\": container with ID starting with 78c3bdfb6b25ad62834be863766c0ec6da656e256b1aaa35936fbfc7dea00a30 not found: ID does not exist" 
containerID="78c3bdfb6b25ad62834be863766c0ec6da656e256b1aaa35936fbfc7dea00a30" Apr 20 20:18:01.135394 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:01.135344 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78c3bdfb6b25ad62834be863766c0ec6da656e256b1aaa35936fbfc7dea00a30"} err="failed to get container status \"78c3bdfb6b25ad62834be863766c0ec6da656e256b1aaa35936fbfc7dea00a30\": rpc error: code = NotFound desc = could not find container \"78c3bdfb6b25ad62834be863766c0ec6da656e256b1aaa35936fbfc7dea00a30\": container with ID starting with 78c3bdfb6b25ad62834be863766c0ec6da656e256b1aaa35936fbfc7dea00a30 not found: ID does not exist" Apr 20 20:18:01.135394 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:01.135361 2571 scope.go:117] "RemoveContainer" containerID="943db10690a19311e9eeb0b5c9705d82eceaf6dd5dd532fd8b58617d82542eef" Apr 20 20:18:01.135646 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:18:01.135624 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"943db10690a19311e9eeb0b5c9705d82eceaf6dd5dd532fd8b58617d82542eef\": container with ID starting with 943db10690a19311e9eeb0b5c9705d82eceaf6dd5dd532fd8b58617d82542eef not found: ID does not exist" containerID="943db10690a19311e9eeb0b5c9705d82eceaf6dd5dd532fd8b58617d82542eef" Apr 20 20:18:01.135711 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:01.135657 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"943db10690a19311e9eeb0b5c9705d82eceaf6dd5dd532fd8b58617d82542eef"} err="failed to get container status \"943db10690a19311e9eeb0b5c9705d82eceaf6dd5dd532fd8b58617d82542eef\": rpc error: code = NotFound desc = could not find container \"943db10690a19311e9eeb0b5c9705d82eceaf6dd5dd532fd8b58617d82542eef\": container with ID starting with 943db10690a19311e9eeb0b5c9705d82eceaf6dd5dd532fd8b58617d82542eef not found: ID does not exist" Apr 20 
20:18:01.135711 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:01.135675 2571 scope.go:117] "RemoveContainer" containerID="3bf8c4880d36556053d260725cf54228cfcb01ad4b76ca9bea111bf4c38c1138" Apr 20 20:18:01.136016 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:18:01.135980 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bf8c4880d36556053d260725cf54228cfcb01ad4b76ca9bea111bf4c38c1138\": container with ID starting with 3bf8c4880d36556053d260725cf54228cfcb01ad4b76ca9bea111bf4c38c1138 not found: ID does not exist" containerID="3bf8c4880d36556053d260725cf54228cfcb01ad4b76ca9bea111bf4c38c1138" Apr 20 20:18:01.136202 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:01.136171 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bf8c4880d36556053d260725cf54228cfcb01ad4b76ca9bea111bf4c38c1138"} err="failed to get container status \"3bf8c4880d36556053d260725cf54228cfcb01ad4b76ca9bea111bf4c38c1138\": rpc error: code = NotFound desc = could not find container \"3bf8c4880d36556053d260725cf54228cfcb01ad4b76ca9bea111bf4c38c1138\": container with ID starting with 3bf8c4880d36556053d260725cf54228cfcb01ad4b76ca9bea111bf4c38c1138 not found: ID does not exist" Apr 20 20:18:01.138400 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:01.138381 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-87d08-predictor-646b446c9c-m5xwv"] Apr 20 20:18:01.527896 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:01.527863 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dec70b8-66c8-428b-a746-6c4139f337ab" path="/var/lib/kubelet/pods/0dec70b8-66c8-428b-a746-6c4139f337ab/volumes" Apr 20 20:18:01.528347 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:01.528334 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5b9121c-dafe-40ae-8da8-c95937dd3823" 
path="/var/lib/kubelet/pods/b5b9121c-dafe-40ae-8da8-c95937dd3823/volumes" Apr 20 20:18:02.062723 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:02.062694 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" event={"ID":"f367b892-3117-4f93-b513-6a8a323a86e4","Type":"ContainerStarted","Data":"f97a6f7291eef156f337516312ac6c4b0d3cdfb8059191cab9489f97681fcefd"} Apr 20 20:18:02.062723 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:02.062727 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" event={"ID":"f367b892-3117-4f93-b513-6a8a323a86e4","Type":"ContainerStarted","Data":"64c7996e579ab9362c665744cf02454f34dabe85f48247222f56aed71e89a2e7"} Apr 20 20:18:02.063222 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:02.063081 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" Apr 20 20:18:02.063222 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:02.063105 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" Apr 20 20:18:02.064718 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:02.064684 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" podUID="f367b892-3117-4f93-b513-6a8a323a86e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 20 20:18:02.064850 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:02.064797 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" 
event={"ID":"86334311-672d-4ba3-b0d8-59aa9263198f","Type":"ContainerStarted","Data":"c6984da2234f90297a1e86b7c3b0bb8ca436595f5740171abc19d20e02f49b49"} Apr 20 20:18:02.064850 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:02.064827 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" event={"ID":"86334311-672d-4ba3-b0d8-59aa9263198f","Type":"ContainerStarted","Data":"f3da15e081ad6e915ec9eceed7c0aa0044d47d24fe37629a186aa888d7228d3f"} Apr 20 20:18:02.065120 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:02.065094 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" Apr 20 20:18:02.065120 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:02.065120 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" Apr 20 20:18:02.066160 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:02.066136 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" podUID="86334311-672d-4ba3-b0d8-59aa9263198f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 20 20:18:02.080809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:02.080771 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" podStartSLOduration=7.080758904 podStartE2EDuration="7.080758904s" podCreationTimestamp="2026-04-20 20:17:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:18:02.079565467 +0000 UTC m=+723.119055034" watchObservedRunningTime="2026-04-20 
20:18:02.080758904 +0000 UTC m=+723.120248450" Apr 20 20:18:02.098520 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:02.098467 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" podStartSLOduration=6.098448775 podStartE2EDuration="6.098448775s" podCreationTimestamp="2026-04-20 20:17:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:18:02.096975051 +0000 UTC m=+723.136464601" watchObservedRunningTime="2026-04-20 20:18:02.098448775 +0000 UTC m=+723.137938323" Apr 20 20:18:03.069295 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:03.069253 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" podUID="86334311-672d-4ba3-b0d8-59aa9263198f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 20 20:18:03.069720 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:03.069335 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" podUID="f367b892-3117-4f93-b513-6a8a323a86e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 20 20:18:08.074105 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:08.074075 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" Apr 20 20:18:08.074789 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:08.074720 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" Apr 20 20:18:08.074990 ip-10-0-135-184 kubenswrapper[2571]: I0420 
20:18:08.074964 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" podUID="86334311-672d-4ba3-b0d8-59aa9263198f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 20 20:18:08.075580 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:08.075546 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" podUID="f367b892-3117-4f93-b513-6a8a323a86e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 20 20:18:18.075217 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:18.075170 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" podUID="86334311-672d-4ba3-b0d8-59aa9263198f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 20 20:18:18.075743 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:18.075521 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" podUID="f367b892-3117-4f93-b513-6a8a323a86e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 20 20:18:28.075286 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:28.075250 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" podUID="86334311-672d-4ba3-b0d8-59aa9263198f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 20 20:18:28.075754 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:28.075492 2571 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" podUID="f367b892-3117-4f93-b513-6a8a323a86e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 20 20:18:38.075710 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:38.075672 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" podUID="f367b892-3117-4f93-b513-6a8a323a86e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 20 20:18:38.076079 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:38.075673 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" podUID="86334311-672d-4ba3-b0d8-59aa9263198f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 20 20:18:48.075151 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:48.075113 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" podUID="86334311-672d-4ba3-b0d8-59aa9263198f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 20 20:18:48.075541 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:48.075450 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" podUID="f367b892-3117-4f93-b513-6a8a323a86e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 20 20:18:58.075508 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:58.075468 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" podUID="f367b892-3117-4f93-b513-6a8a323a86e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 20 20:18:58.075876 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:18:58.075536 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" Apr 20 20:19:08.076393 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:08.076364 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" Apr 20 20:19:36.157533 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.157442 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs"] Apr 20 20:19:36.158017 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.157865 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" podUID="f367b892-3117-4f93-b513-6a8a323a86e4" containerName="kserve-container" containerID="cri-o://64c7996e579ab9362c665744cf02454f34dabe85f48247222f56aed71e89a2e7" gracePeriod=30 Apr 20 20:19:36.158017 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.157909 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" podUID="f367b892-3117-4f93-b513-6a8a323a86e4" containerName="kube-rbac-proxy" containerID="cri-o://f97a6f7291eef156f337516312ac6c4b0d3cdfb8059191cab9489f97681fcefd" gracePeriod=30 Apr 20 20:19:36.236201 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.236156 2571 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn"] Apr 20 20:19:36.236618 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.236599 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b9121c-dafe-40ae-8da8-c95937dd3823" containerName="storage-initializer" Apr 20 20:19:36.236714 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.236619 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b9121c-dafe-40ae-8da8-c95937dd3823" containerName="storage-initializer" Apr 20 20:19:36.236714 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.236640 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b9121c-dafe-40ae-8da8-c95937dd3823" containerName="kserve-container" Apr 20 20:19:36.236714 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.236648 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b9121c-dafe-40ae-8da8-c95937dd3823" containerName="kserve-container" Apr 20 20:19:36.236714 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.236661 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0dec70b8-66c8-428b-a746-6c4139f337ab" containerName="kube-rbac-proxy" Apr 20 20:19:36.236714 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.236669 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dec70b8-66c8-428b-a746-6c4139f337ab" containerName="kube-rbac-proxy" Apr 20 20:19:36.236714 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.236694 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b9121c-dafe-40ae-8da8-c95937dd3823" containerName="kube-rbac-proxy" Apr 20 20:19:36.236714 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.236702 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b9121c-dafe-40ae-8da8-c95937dd3823" containerName="kube-rbac-proxy" Apr 20 20:19:36.236714 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.236711 2571 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="0dec70b8-66c8-428b-a746-6c4139f337ab" containerName="kserve-container" Apr 20 20:19:36.237143 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.236720 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dec70b8-66c8-428b-a746-6c4139f337ab" containerName="kserve-container" Apr 20 20:19:36.237143 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.236741 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0dec70b8-66c8-428b-a746-6c4139f337ab" containerName="storage-initializer" Apr 20 20:19:36.237143 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.236750 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dec70b8-66c8-428b-a746-6c4139f337ab" containerName="storage-initializer" Apr 20 20:19:36.237143 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.236830 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0dec70b8-66c8-428b-a746-6c4139f337ab" containerName="kube-rbac-proxy" Apr 20 20:19:36.237143 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.236844 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5b9121c-dafe-40ae-8da8-c95937dd3823" containerName="kube-rbac-proxy" Apr 20 20:19:36.237143 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.236857 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0dec70b8-66c8-428b-a746-6c4139f337ab" containerName="kserve-container" Apr 20 20:19:36.237143 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.236871 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5b9121c-dafe-40ae-8da8-c95937dd3823" containerName="kserve-container" Apr 20 20:19:36.240611 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.240591 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" Apr 20 20:19:36.243733 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.243686 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-5be0c-kube-rbac-proxy-sar-config\"" Apr 20 20:19:36.243913 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.243894 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-5be0c-predictor-serving-cert\"" Apr 20 20:19:36.253117 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.253093 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn"] Apr 20 20:19:36.268576 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.268550 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82w9h\" (UniqueName: \"kubernetes.io/projected/cae461df-647e-4cbd-bf0b-c5333d1c15eb-kube-api-access-82w9h\") pod \"message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn\" (UID: \"cae461df-647e-4cbd-bf0b-c5333d1c15eb\") " pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" Apr 20 20:19:36.268728 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.268595 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-raw-5be0c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cae461df-647e-4cbd-bf0b-c5333d1c15eb-message-dumper-raw-5be0c-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn\" (UID: \"cae461df-647e-4cbd-bf0b-c5333d1c15eb\") " pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" Apr 20 20:19:36.268728 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.268682 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cae461df-647e-4cbd-bf0b-c5333d1c15eb-proxy-tls\") pod \"message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn\" (UID: \"cae461df-647e-4cbd-bf0b-c5333d1c15eb\") " pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" Apr 20 20:19:36.283772 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.283744 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6"] Apr 20 20:19:36.284065 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.284040 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" podUID="86334311-672d-4ba3-b0d8-59aa9263198f" containerName="kserve-container" containerID="cri-o://f3da15e081ad6e915ec9eceed7c0aa0044d47d24fe37629a186aa888d7228d3f" gracePeriod=30 Apr 20 20:19:36.284159 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.284072 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" podUID="86334311-672d-4ba3-b0d8-59aa9263198f" containerName="kube-rbac-proxy" containerID="cri-o://c6984da2234f90297a1e86b7c3b0bb8ca436595f5740171abc19d20e02f49b49" gracePeriod=30 Apr 20 20:19:36.369498 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.369467 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-raw-5be0c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cae461df-647e-4cbd-bf0b-c5333d1c15eb-message-dumper-raw-5be0c-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn\" (UID: \"cae461df-647e-4cbd-bf0b-c5333d1c15eb\") " pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" Apr 20 20:19:36.369676 ip-10-0-135-184 
kubenswrapper[2571]: I0420 20:19:36.369526 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cae461df-647e-4cbd-bf0b-c5333d1c15eb-proxy-tls\") pod \"message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn\" (UID: \"cae461df-647e-4cbd-bf0b-c5333d1c15eb\") " pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" Apr 20 20:19:36.369676 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.369615 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82w9h\" (UniqueName: \"kubernetes.io/projected/cae461df-647e-4cbd-bf0b-c5333d1c15eb-kube-api-access-82w9h\") pod \"message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn\" (UID: \"cae461df-647e-4cbd-bf0b-c5333d1c15eb\") " pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" Apr 20 20:19:36.370217 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.370152 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-raw-5be0c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cae461df-647e-4cbd-bf0b-c5333d1c15eb-message-dumper-raw-5be0c-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn\" (UID: \"cae461df-647e-4cbd-bf0b-c5333d1c15eb\") " pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" Apr 20 20:19:36.371998 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.371967 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cae461df-647e-4cbd-bf0b-c5333d1c15eb-proxy-tls\") pod \"message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn\" (UID: \"cae461df-647e-4cbd-bf0b-c5333d1c15eb\") " pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" Apr 20 20:19:36.377638 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.377613 2571 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-82w9h\" (UniqueName: \"kubernetes.io/projected/cae461df-647e-4cbd-bf0b-c5333d1c15eb-kube-api-access-82w9h\") pod \"message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn\" (UID: \"cae461df-647e-4cbd-bf0b-c5333d1c15eb\") " pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" Apr 20 20:19:36.382672 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.382647 2571 generic.go:358] "Generic (PLEG): container finished" podID="f367b892-3117-4f93-b513-6a8a323a86e4" containerID="f97a6f7291eef156f337516312ac6c4b0d3cdfb8059191cab9489f97681fcefd" exitCode=2 Apr 20 20:19:36.382775 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.382735 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" event={"ID":"f367b892-3117-4f93-b513-6a8a323a86e4","Type":"ContainerDied","Data":"f97a6f7291eef156f337516312ac6c4b0d3cdfb8059191cab9489f97681fcefd"} Apr 20 20:19:36.551324 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.551286 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" Apr 20 20:19:36.677484 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:36.677455 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn"] Apr 20 20:19:36.679666 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:19:36.679632 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcae461df_647e_4cbd_bf0b_c5333d1c15eb.slice/crio-61c4452920de7e6fc12f891e3568a5c1a44021464f4945d8b513fc8284b042a4 WatchSource:0}: Error finding container 61c4452920de7e6fc12f891e3568a5c1a44021464f4945d8b513fc8284b042a4: Status 404 returned error can't find the container with id 61c4452920de7e6fc12f891e3568a5c1a44021464f4945d8b513fc8284b042a4 Apr 20 20:19:37.388595 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:37.388555 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" event={"ID":"cae461df-647e-4cbd-bf0b-c5333d1c15eb","Type":"ContainerStarted","Data":"61c4452920de7e6fc12f891e3568a5c1a44021464f4945d8b513fc8284b042a4"} Apr 20 20:19:37.391444 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:37.391332 2571 generic.go:358] "Generic (PLEG): container finished" podID="86334311-672d-4ba3-b0d8-59aa9263198f" containerID="c6984da2234f90297a1e86b7c3b0bb8ca436595f5740171abc19d20e02f49b49" exitCode=2 Apr 20 20:19:37.391444 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:37.391384 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" event={"ID":"86334311-672d-4ba3-b0d8-59aa9263198f","Type":"ContainerDied","Data":"c6984da2234f90297a1e86b7c3b0bb8ca436595f5740171abc19d20e02f49b49"} Apr 20 20:19:38.069893 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:38.069851 2571 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" podUID="86334311-672d-4ba3-b0d8-59aa9263198f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.36:8643/healthz\": dial tcp 10.133.0.36:8643: connect: connection refused" Apr 20 20:19:38.070070 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:38.069860 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" podUID="f367b892-3117-4f93-b513-6a8a323a86e4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.35:8643/healthz\": dial tcp 10.133.0.35:8643: connect: connection refused" Apr 20 20:19:38.074800 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:38.074780 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" podUID="86334311-672d-4ba3-b0d8-59aa9263198f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 20 20:19:38.075982 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:38.075946 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" podUID="f367b892-3117-4f93-b513-6a8a323a86e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 20 20:19:38.397248 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:38.397139 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" event={"ID":"cae461df-647e-4cbd-bf0b-c5333d1c15eb","Type":"ContainerStarted","Data":"7caf6932d51077505fe34cf73cbbf91c4c9423d5be62c081f54dd0d0ea6138ed"} Apr 20 20:19:38.397248 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:38.397176 2571 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" event={"ID":"cae461df-647e-4cbd-bf0b-c5333d1c15eb","Type":"ContainerStarted","Data":"ee088144526618b13671339a6844f1ae8ede7e8e11dc176433dd44a25972a3f3"} Apr 20 20:19:38.397790 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:38.397287 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" Apr 20 20:19:38.413869 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:38.413810 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" podStartSLOduration=1.347128143 podStartE2EDuration="2.413792628s" podCreationTimestamp="2026-04-20 20:19:36 +0000 UTC" firstStartedPulling="2026-04-20 20:19:36.681322379 +0000 UTC m=+817.720811903" lastFinishedPulling="2026-04-20 20:19:37.747986861 +0000 UTC m=+818.787476388" observedRunningTime="2026-04-20 20:19:38.413307825 +0000 UTC m=+819.452797363" watchObservedRunningTime="2026-04-20 20:19:38.413792628 +0000 UTC m=+819.453282178" Apr 20 20:19:39.400989 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:39.400955 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" Apr 20 20:19:39.402627 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:39.402603 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" Apr 20 20:19:40.031899 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.031874 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" Apr 20 20:19:40.104827 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.104750 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86334311-672d-4ba3-b0d8-59aa9263198f-kserve-provision-location\") pod \"86334311-672d-4ba3-b0d8-59aa9263198f\" (UID: \"86334311-672d-4ba3-b0d8-59aa9263198f\") " Apr 20 20:19:40.104827 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.104806 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86334311-672d-4ba3-b0d8-59aa9263198f-proxy-tls\") pod \"86334311-672d-4ba3-b0d8-59aa9263198f\" (UID: \"86334311-672d-4ba3-b0d8-59aa9263198f\") " Apr 20 20:19:40.105063 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.104833 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6274q\" (UniqueName: \"kubernetes.io/projected/86334311-672d-4ba3-b0d8-59aa9263198f-kube-api-access-6274q\") pod \"86334311-672d-4ba3-b0d8-59aa9263198f\" (UID: \"86334311-672d-4ba3-b0d8-59aa9263198f\") " Apr 20 20:19:40.105063 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.104852 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/86334311-672d-4ba3-b0d8-59aa9263198f-isvc-xgboost-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config\") pod \"86334311-672d-4ba3-b0d8-59aa9263198f\" (UID: \"86334311-672d-4ba3-b0d8-59aa9263198f\") " Apr 20 20:19:40.105164 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.105124 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86334311-672d-4ba3-b0d8-59aa9263198f-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "86334311-672d-4ba3-b0d8-59aa9263198f" (UID: "86334311-672d-4ba3-b0d8-59aa9263198f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:19:40.105223 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.105202 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86334311-672d-4ba3-b0d8-59aa9263198f-isvc-xgboost-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config") pod "86334311-672d-4ba3-b0d8-59aa9263198f" (UID: "86334311-672d-4ba3-b0d8-59aa9263198f"). InnerVolumeSpecName "isvc-xgboost-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:19:40.106897 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.106873 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86334311-672d-4ba3-b0d8-59aa9263198f-kube-api-access-6274q" (OuterVolumeSpecName: "kube-api-access-6274q") pod "86334311-672d-4ba3-b0d8-59aa9263198f" (UID: "86334311-672d-4ba3-b0d8-59aa9263198f"). InnerVolumeSpecName "kube-api-access-6274q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:19:40.107063 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.107049 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86334311-672d-4ba3-b0d8-59aa9263198f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "86334311-672d-4ba3-b0d8-59aa9263198f" (UID: "86334311-672d-4ba3-b0d8-59aa9263198f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:19:40.206117 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.206091 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86334311-672d-4ba3-b0d8-59aa9263198f-kserve-provision-location\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:19:40.206117 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.206116 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86334311-672d-4ba3-b0d8-59aa9263198f-proxy-tls\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:19:40.206302 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.206127 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6274q\" (UniqueName: \"kubernetes.io/projected/86334311-672d-4ba3-b0d8-59aa9263198f-kube-api-access-6274q\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:19:40.206302 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.206137 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/86334311-672d-4ba3-b0d8-59aa9263198f-isvc-xgboost-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:19:40.406245 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.406216 2571 generic.go:358] "Generic (PLEG): container finished" podID="86334311-672d-4ba3-b0d8-59aa9263198f" containerID="f3da15e081ad6e915ec9eceed7c0aa0044d47d24fe37629a186aa888d7228d3f" exitCode=0 Apr 20 20:19:40.406622 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.406295 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6"
Apr 20 20:19:40.406622 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.406296 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" event={"ID":"86334311-672d-4ba3-b0d8-59aa9263198f","Type":"ContainerDied","Data":"f3da15e081ad6e915ec9eceed7c0aa0044d47d24fe37629a186aa888d7228d3f"}
Apr 20 20:19:40.406622 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.406353 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6" event={"ID":"86334311-672d-4ba3-b0d8-59aa9263198f","Type":"ContainerDied","Data":"b0fb75fdef92ef4dab2c6ac2cb21532aa8ce1cdf774cd1cfef0d1aff4e29bab6"}
Apr 20 20:19:40.406622 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.406383 2571 scope.go:117] "RemoveContainer" containerID="c6984da2234f90297a1e86b7c3b0bb8ca436595f5740171abc19d20e02f49b49"
Apr 20 20:19:40.461957 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.461926 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6"]
Apr 20 20:19:40.464875 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.464855 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3bab1-predictor-858479d9d9-6kbd6"]
Apr 20 20:19:40.469181 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.469165 2571 scope.go:117] "RemoveContainer" containerID="f3da15e081ad6e915ec9eceed7c0aa0044d47d24fe37629a186aa888d7228d3f"
Apr 20 20:19:40.476522 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.476491 2571 scope.go:117] "RemoveContainer" containerID="0bce135bb6b173721d55b8c5cc69d3e19e17d9098847bbd96e6b92ed335b60bd"
Apr 20 20:19:40.483391 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.483375 2571 scope.go:117] "RemoveContainer" containerID="c6984da2234f90297a1e86b7c3b0bb8ca436595f5740171abc19d20e02f49b49"
Apr 20 20:19:40.483694 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:19:40.483676 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6984da2234f90297a1e86b7c3b0bb8ca436595f5740171abc19d20e02f49b49\": container with ID starting with c6984da2234f90297a1e86b7c3b0bb8ca436595f5740171abc19d20e02f49b49 not found: ID does not exist" containerID="c6984da2234f90297a1e86b7c3b0bb8ca436595f5740171abc19d20e02f49b49"
Apr 20 20:19:40.483754 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.483703 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6984da2234f90297a1e86b7c3b0bb8ca436595f5740171abc19d20e02f49b49"} err="failed to get container status \"c6984da2234f90297a1e86b7c3b0bb8ca436595f5740171abc19d20e02f49b49\": rpc error: code = NotFound desc = could not find container \"c6984da2234f90297a1e86b7c3b0bb8ca436595f5740171abc19d20e02f49b49\": container with ID starting with c6984da2234f90297a1e86b7c3b0bb8ca436595f5740171abc19d20e02f49b49 not found: ID does not exist"
Apr 20 20:19:40.483754 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.483722 2571 scope.go:117] "RemoveContainer" containerID="f3da15e081ad6e915ec9eceed7c0aa0044d47d24fe37629a186aa888d7228d3f"
Apr 20 20:19:40.483949 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:19:40.483933 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3da15e081ad6e915ec9eceed7c0aa0044d47d24fe37629a186aa888d7228d3f\": container with ID starting with f3da15e081ad6e915ec9eceed7c0aa0044d47d24fe37629a186aa888d7228d3f not found: ID does not exist" containerID="f3da15e081ad6e915ec9eceed7c0aa0044d47d24fe37629a186aa888d7228d3f"
Apr 20 20:19:40.483990 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.483966 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3da15e081ad6e915ec9eceed7c0aa0044d47d24fe37629a186aa888d7228d3f"} err="failed to get container status \"f3da15e081ad6e915ec9eceed7c0aa0044d47d24fe37629a186aa888d7228d3f\": rpc error: code = NotFound desc = could not find container \"f3da15e081ad6e915ec9eceed7c0aa0044d47d24fe37629a186aa888d7228d3f\": container with ID starting with f3da15e081ad6e915ec9eceed7c0aa0044d47d24fe37629a186aa888d7228d3f not found: ID does not exist"
Apr 20 20:19:40.483990 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.483983 2571 scope.go:117] "RemoveContainer" containerID="0bce135bb6b173721d55b8c5cc69d3e19e17d9098847bbd96e6b92ed335b60bd"
Apr 20 20:19:40.484245 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:19:40.484230 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bce135bb6b173721d55b8c5cc69d3e19e17d9098847bbd96e6b92ed335b60bd\": container with ID starting with 0bce135bb6b173721d55b8c5cc69d3e19e17d9098847bbd96e6b92ed335b60bd not found: ID does not exist" containerID="0bce135bb6b173721d55b8c5cc69d3e19e17d9098847bbd96e6b92ed335b60bd"
Apr 20 20:19:40.484290 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.484248 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bce135bb6b173721d55b8c5cc69d3e19e17d9098847bbd96e6b92ed335b60bd"} err="failed to get container status \"0bce135bb6b173721d55b8c5cc69d3e19e17d9098847bbd96e6b92ed335b60bd\": rpc error: code = NotFound desc = could not find container \"0bce135bb6b173721d55b8c5cc69d3e19e17d9098847bbd96e6b92ed335b60bd\": container with ID starting with 0bce135bb6b173721d55b8c5cc69d3e19e17d9098847bbd96e6b92ed335b60bd not found: ID does not exist"
Apr 20 20:19:40.590574 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.590551 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs"
Apr 20 20:19:40.609549 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.609303 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzspx\" (UniqueName: \"kubernetes.io/projected/f367b892-3117-4f93-b513-6a8a323a86e4-kube-api-access-lzspx\") pod \"f367b892-3117-4f93-b513-6a8a323a86e4\" (UID: \"f367b892-3117-4f93-b513-6a8a323a86e4\") "
Apr 20 20:19:40.609549 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.609359 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f367b892-3117-4f93-b513-6a8a323a86e4-kserve-provision-location\") pod \"f367b892-3117-4f93-b513-6a8a323a86e4\" (UID: \"f367b892-3117-4f93-b513-6a8a323a86e4\") "
Apr 20 20:19:40.609549 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.609459 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f367b892-3117-4f93-b513-6a8a323a86e4-isvc-sklearn-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config\") pod \"f367b892-3117-4f93-b513-6a8a323a86e4\" (UID: \"f367b892-3117-4f93-b513-6a8a323a86e4\") "
Apr 20 20:19:40.609792 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.609538 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f367b892-3117-4f93-b513-6a8a323a86e4-proxy-tls\") pod \"f367b892-3117-4f93-b513-6a8a323a86e4\" (UID: \"f367b892-3117-4f93-b513-6a8a323a86e4\") "
Apr 20 20:19:40.610399 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.610329 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f367b892-3117-4f93-b513-6a8a323a86e4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f367b892-3117-4f93-b513-6a8a323a86e4" (UID: "f367b892-3117-4f93-b513-6a8a323a86e4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:19:40.611292 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.611266 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f367b892-3117-4f93-b513-6a8a323a86e4-isvc-sklearn-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config") pod "f367b892-3117-4f93-b513-6a8a323a86e4" (UID: "f367b892-3117-4f93-b513-6a8a323a86e4"). InnerVolumeSpecName "isvc-sklearn-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:19:40.612519 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.612497 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f367b892-3117-4f93-b513-6a8a323a86e4-kube-api-access-lzspx" (OuterVolumeSpecName: "kube-api-access-lzspx") pod "f367b892-3117-4f93-b513-6a8a323a86e4" (UID: "f367b892-3117-4f93-b513-6a8a323a86e4"). InnerVolumeSpecName "kube-api-access-lzspx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:19:40.612519 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.612516 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f367b892-3117-4f93-b513-6a8a323a86e4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f367b892-3117-4f93-b513-6a8a323a86e4" (UID: "f367b892-3117-4f93-b513-6a8a323a86e4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:19:40.710647 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.710579 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f367b892-3117-4f93-b513-6a8a323a86e4-proxy-tls\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:19:40.710647 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.710605 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lzspx\" (UniqueName: \"kubernetes.io/projected/f367b892-3117-4f93-b513-6a8a323a86e4-kube-api-access-lzspx\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:19:40.710647 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.710618 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f367b892-3117-4f93-b513-6a8a323a86e4-kserve-provision-location\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:19:40.710647 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:40.710628 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f367b892-3117-4f93-b513-6a8a323a86e4-isvc-sklearn-graph-raw-hpa-3bab1-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:19:41.412490 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:41.412455 2571 generic.go:358] "Generic (PLEG): container finished" podID="f367b892-3117-4f93-b513-6a8a323a86e4" containerID="64c7996e579ab9362c665744cf02454f34dabe85f48247222f56aed71e89a2e7" exitCode=0
Apr 20 20:19:41.412911 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:41.412523 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" event={"ID":"f367b892-3117-4f93-b513-6a8a323a86e4","Type":"ContainerDied","Data":"64c7996e579ab9362c665744cf02454f34dabe85f48247222f56aed71e89a2e7"}
Apr 20 20:19:41.412911 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:41.412553 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs" event={"ID":"f367b892-3117-4f93-b513-6a8a323a86e4","Type":"ContainerDied","Data":"a2d6d2b6412da1f2b082a4d351f19c39dd7dda0450f181e4bddc252ca991be17"}
Apr 20 20:19:41.412911 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:41.412560 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs"
Apr 20 20:19:41.412911 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:41.412570 2571 scope.go:117] "RemoveContainer" containerID="f97a6f7291eef156f337516312ac6c4b0d3cdfb8059191cab9489f97681fcefd"
Apr 20 20:19:41.421635 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:41.421617 2571 scope.go:117] "RemoveContainer" containerID="64c7996e579ab9362c665744cf02454f34dabe85f48247222f56aed71e89a2e7"
Apr 20 20:19:41.428537 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:41.428518 2571 scope.go:117] "RemoveContainer" containerID="a34b6db78e48daa6553fd66f0b7d7a0add6fb4b258564858b9b02dc24be54472"
Apr 20 20:19:41.434895 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:41.434871 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs"]
Apr 20 20:19:41.436136 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:41.436120 2571 scope.go:117] "RemoveContainer" containerID="f97a6f7291eef156f337516312ac6c4b0d3cdfb8059191cab9489f97681fcefd"
Apr 20 20:19:41.436380 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:19:41.436363 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f97a6f7291eef156f337516312ac6c4b0d3cdfb8059191cab9489f97681fcefd\": container with ID starting with f97a6f7291eef156f337516312ac6c4b0d3cdfb8059191cab9489f97681fcefd not found: ID does not exist" containerID="f97a6f7291eef156f337516312ac6c4b0d3cdfb8059191cab9489f97681fcefd"
Apr 20 20:19:41.436445 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:41.436388 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f97a6f7291eef156f337516312ac6c4b0d3cdfb8059191cab9489f97681fcefd"} err="failed to get container status \"f97a6f7291eef156f337516312ac6c4b0d3cdfb8059191cab9489f97681fcefd\": rpc error: code = NotFound desc = could not find container \"f97a6f7291eef156f337516312ac6c4b0d3cdfb8059191cab9489f97681fcefd\": container with ID starting with f97a6f7291eef156f337516312ac6c4b0d3cdfb8059191cab9489f97681fcefd not found: ID does not exist"
Apr 20 20:19:41.436445 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:41.436406 2571 scope.go:117] "RemoveContainer" containerID="64c7996e579ab9362c665744cf02454f34dabe85f48247222f56aed71e89a2e7"
Apr 20 20:19:41.436656 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:19:41.436636 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64c7996e579ab9362c665744cf02454f34dabe85f48247222f56aed71e89a2e7\": container with ID starting with 64c7996e579ab9362c665744cf02454f34dabe85f48247222f56aed71e89a2e7 not found: ID does not exist" containerID="64c7996e579ab9362c665744cf02454f34dabe85f48247222f56aed71e89a2e7"
Apr 20 20:19:41.436696 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:41.436665 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64c7996e579ab9362c665744cf02454f34dabe85f48247222f56aed71e89a2e7"} err="failed to get container status \"64c7996e579ab9362c665744cf02454f34dabe85f48247222f56aed71e89a2e7\": rpc error: code = NotFound desc = could not find container \"64c7996e579ab9362c665744cf02454f34dabe85f48247222f56aed71e89a2e7\": container with ID starting with 64c7996e579ab9362c665744cf02454f34dabe85f48247222f56aed71e89a2e7 not found: ID does not exist"
Apr 20 20:19:41.436696 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:41.436685 2571 scope.go:117] "RemoveContainer" containerID="a34b6db78e48daa6553fd66f0b7d7a0add6fb4b258564858b9b02dc24be54472"
Apr 20 20:19:41.436911 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:19:41.436895 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a34b6db78e48daa6553fd66f0b7d7a0add6fb4b258564858b9b02dc24be54472\": container with ID starting with a34b6db78e48daa6553fd66f0b7d7a0add6fb4b258564858b9b02dc24be54472 not found: ID does not exist" containerID="a34b6db78e48daa6553fd66f0b7d7a0add6fb4b258564858b9b02dc24be54472"
Apr 20 20:19:41.436958 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:41.436913 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a34b6db78e48daa6553fd66f0b7d7a0add6fb4b258564858b9b02dc24be54472"} err="failed to get container status \"a34b6db78e48daa6553fd66f0b7d7a0add6fb4b258564858b9b02dc24be54472\": rpc error: code = NotFound desc = could not find container \"a34b6db78e48daa6553fd66f0b7d7a0add6fb4b258564858b9b02dc24be54472\": container with ID starting with a34b6db78e48daa6553fd66f0b7d7a0add6fb4b258564858b9b02dc24be54472 not found: ID does not exist"
Apr 20 20:19:41.439884 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:41.439865 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3bab1-predictor-69b497f96f-lg7rs"]
Apr 20 20:19:41.527309 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:41.527281 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86334311-672d-4ba3-b0d8-59aa9263198f" path="/var/lib/kubelet/pods/86334311-672d-4ba3-b0d8-59aa9263198f/volumes"
Apr 20 20:19:41.527762 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:41.527749 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f367b892-3117-4f93-b513-6a8a323a86e4" path="/var/lib/kubelet/pods/f367b892-3117-4f93-b513-6a8a323a86e4/volumes"
Apr 20 20:19:46.418129 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:46.418095 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn"
Apr 20 20:19:56.248930 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.248893 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"]
Apr 20 20:19:56.249397 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.249380 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f367b892-3117-4f93-b513-6a8a323a86e4" containerName="kserve-container"
Apr 20 20:19:56.249480 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.249399 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f367b892-3117-4f93-b513-6a8a323a86e4" containerName="kserve-container"
Apr 20 20:19:56.249480 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.249410 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86334311-672d-4ba3-b0d8-59aa9263198f" containerName="storage-initializer"
Apr 20 20:19:56.249480 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.249418 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="86334311-672d-4ba3-b0d8-59aa9263198f" containerName="storage-initializer"
Apr 20 20:19:56.249480 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.249449 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86334311-672d-4ba3-b0d8-59aa9263198f" containerName="kube-rbac-proxy"
Apr 20 20:19:56.249480 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.249456 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="86334311-672d-4ba3-b0d8-59aa9263198f" containerName="kube-rbac-proxy"
Apr 20 20:19:56.249480 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.249464 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86334311-672d-4ba3-b0d8-59aa9263198f" containerName="kserve-container"
Apr 20 20:19:56.249480 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.249471 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="86334311-672d-4ba3-b0d8-59aa9263198f" containerName="kserve-container"
Apr 20 20:19:56.249480 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.249479 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f367b892-3117-4f93-b513-6a8a323a86e4" containerName="storage-initializer"
Apr 20 20:19:56.249728 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.249485 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f367b892-3117-4f93-b513-6a8a323a86e4" containerName="storage-initializer"
Apr 20 20:19:56.249728 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.249500 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f367b892-3117-4f93-b513-6a8a323a86e4" containerName="kube-rbac-proxy"
Apr 20 20:19:56.249728 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.249506 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f367b892-3117-4f93-b513-6a8a323a86e4" containerName="kube-rbac-proxy"
Apr 20 20:19:56.249728 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.249553 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f367b892-3117-4f93-b513-6a8a323a86e4" containerName="kserve-container"
Apr 20 20:19:56.249728 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.249561 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f367b892-3117-4f93-b513-6a8a323a86e4" containerName="kube-rbac-proxy"
Apr 20 20:19:56.249728 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.249570 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="86334311-672d-4ba3-b0d8-59aa9263198f" containerName="kserve-container"
Apr 20 20:19:56.249728 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.249576 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="86334311-672d-4ba3-b0d8-59aa9263198f" containerName="kube-rbac-proxy"
Apr 20 20:19:56.252887 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.252866 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"
Apr 20 20:19:56.255371 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.255352 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-5be0c-kube-rbac-proxy-sar-config\""
Apr 20 20:19:56.255475 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.255351 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-5be0c-predictor-serving-cert\""
Apr 20 20:19:56.261877 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.261855 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"]
Apr 20 20:19:56.339259 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.339232 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-kserve-provision-location\") pod \"isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb\" (UID: \"5aafa7c5-2a5a-4436-96c0-5528d9796f1e\") " pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"
Apr 20 20:19:56.339364 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.339267 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlxqv\" (UniqueName: \"kubernetes.io/projected/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-kube-api-access-jlxqv\") pod \"isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb\" (UID: \"5aafa7c5-2a5a-4436-96c0-5528d9796f1e\") " pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"
Apr 20 20:19:56.339364 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.339305 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-raw-5be0c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-isvc-logger-raw-5be0c-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb\" (UID: \"5aafa7c5-2a5a-4436-96c0-5528d9796f1e\") " pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"
Apr 20 20:19:56.339490 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.339368 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-proxy-tls\") pod \"isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb\" (UID: \"5aafa7c5-2a5a-4436-96c0-5528d9796f1e\") " pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"
Apr 20 20:19:56.440771 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.440739 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-kserve-provision-location\") pod \"isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb\" (UID: \"5aafa7c5-2a5a-4436-96c0-5528d9796f1e\") " pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"
Apr 20 20:19:56.440916 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.440778 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jlxqv\" (UniqueName: \"kubernetes.io/projected/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-kube-api-access-jlxqv\") pod \"isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb\" (UID: \"5aafa7c5-2a5a-4436-96c0-5528d9796f1e\") " pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"
Apr 20 20:19:56.440916 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.440813 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-raw-5be0c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-isvc-logger-raw-5be0c-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb\" (UID: \"5aafa7c5-2a5a-4436-96c0-5528d9796f1e\") " pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"
Apr 20 20:19:56.440916 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.440836 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-proxy-tls\") pod \"isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb\" (UID: \"5aafa7c5-2a5a-4436-96c0-5528d9796f1e\") " pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"
Apr 20 20:19:56.441082 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:19:56.440937 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-serving-cert: secret "isvc-logger-raw-5be0c-predictor-serving-cert" not found
Apr 20 20:19:56.441082 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:19:56.441003 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-proxy-tls podName:5aafa7c5-2a5a-4436-96c0-5528d9796f1e nodeName:}" failed. No retries permitted until 2026-04-20 20:19:56.940972211 +0000 UTC m=+837.980461734 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-proxy-tls") pod "isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" (UID: "5aafa7c5-2a5a-4436-96c0-5528d9796f1e") : secret "isvc-logger-raw-5be0c-predictor-serving-cert" not found
Apr 20 20:19:56.441174 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.441123 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-kserve-provision-location\") pod \"isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb\" (UID: \"5aafa7c5-2a5a-4436-96c0-5528d9796f1e\") " pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"
Apr 20 20:19:56.441496 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.441465 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-raw-5be0c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-isvc-logger-raw-5be0c-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb\" (UID: \"5aafa7c5-2a5a-4436-96c0-5528d9796f1e\") " pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"
Apr 20 20:19:56.449503 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.449484 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlxqv\" (UniqueName: \"kubernetes.io/projected/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-kube-api-access-jlxqv\") pod \"isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb\" (UID: \"5aafa7c5-2a5a-4436-96c0-5528d9796f1e\") " pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"
Apr 20 20:19:56.944742 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.944705 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-proxy-tls\") pod \"isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb\" (UID: \"5aafa7c5-2a5a-4436-96c0-5528d9796f1e\") " pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"
Apr 20 20:19:56.947152 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:56.947127 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-proxy-tls\") pod \"isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb\" (UID: \"5aafa7c5-2a5a-4436-96c0-5528d9796f1e\") " pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"
Apr 20 20:19:57.163910 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:57.163872 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"
Apr 20 20:19:57.287311 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:57.287280 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"]
Apr 20 20:19:57.290835 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:19:57.290801 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5aafa7c5_2a5a_4436_96c0_5528d9796f1e.slice/crio-ff6a5c7a2f7f1c704d77af352c2f948029191e26790776c5ac7a0af266118d4c WatchSource:0}: Error finding container ff6a5c7a2f7f1c704d77af352c2f948029191e26790776c5ac7a0af266118d4c: Status 404 returned error can't find the container with id ff6a5c7a2f7f1c704d77af352c2f948029191e26790776c5ac7a0af266118d4c
Apr 20 20:19:57.471533 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:57.471450 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" event={"ID":"5aafa7c5-2a5a-4436-96c0-5528d9796f1e","Type":"ContainerStarted","Data":"95da147bd97a9ac52914b06500d30dde63a55c1e9ba74e805167b22613c5a80d"}
Apr 20 20:19:57.471533 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:19:57.471493 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" event={"ID":"5aafa7c5-2a5a-4436-96c0-5528d9796f1e","Type":"ContainerStarted","Data":"ff6a5c7a2f7f1c704d77af352c2f948029191e26790776c5ac7a0af266118d4c"}
Apr 20 20:20:01.487501 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:01.487467 2571 generic.go:358] "Generic (PLEG): container finished" podID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerID="95da147bd97a9ac52914b06500d30dde63a55c1e9ba74e805167b22613c5a80d" exitCode=0
Apr 20 20:20:01.487847 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:01.487542 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" event={"ID":"5aafa7c5-2a5a-4436-96c0-5528d9796f1e","Type":"ContainerDied","Data":"95da147bd97a9ac52914b06500d30dde63a55c1e9ba74e805167b22613c5a80d"}
Apr 20 20:20:02.493612 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:02.493577 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" event={"ID":"5aafa7c5-2a5a-4436-96c0-5528d9796f1e","Type":"ContainerStarted","Data":"a8dd583a0607ee1631d4facc79fc4dce6464fbadae468b8db50ec87b4c7da31b"}
Apr 20 20:20:02.494055 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:02.493623 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" event={"ID":"5aafa7c5-2a5a-4436-96c0-5528d9796f1e","Type":"ContainerStarted","Data":"f8e5f54cb6be6cde2f9dea9e99c0d59aac9dcd9897bfe2a1d1e8a951c8ed84a0"}
Apr 20 20:20:02.494055 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:02.493636 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" event={"ID":"5aafa7c5-2a5a-4436-96c0-5528d9796f1e","Type":"ContainerStarted","Data":"b79c32a6c6699d10497526b01ae2bffe82472361f1c6ae049fc402f4af183b9e"}
Apr 20 20:20:02.494055 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:02.493929 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"
Apr 20 20:20:02.494055 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:02.493959 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"
Apr 20 20:20:02.495661 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:02.495636 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 20 20:20:02.514809 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:02.514756 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podStartSLOduration=6.514742713 podStartE2EDuration="6.514742713s" podCreationTimestamp="2026-04-20 20:19:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:20:02.514124621 +0000 UTC m=+843.553614169" watchObservedRunningTime="2026-04-20 20:20:02.514742713 +0000 UTC m=+843.554232262"
Apr 20 20:20:03.496923 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:03.496876 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"
Apr 20 20:20:03.497379 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:03.497052 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 20 20:20:03.498138 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:03.498108 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:20:04.500629 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:04.500584 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 20 20:20:04.501065 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:04.500936 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:20:09.505469 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:09.505414 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"
Apr 20 20:20:09.506159 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:09.506130 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 20 20:20:09.506350 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:09.506324 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:20:19.506995 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:19.506936 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 20 20:20:19.507394 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:19.507337 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:20:29.506151 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:29.506109 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 20 20:20:29.506587 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:29.506369 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:20:39.506807 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:39.506750 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 20 20:20:39.507313 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:39.507283 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:20:49.506649 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:49.506560 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 20 20:20:49.507211 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:49.507002 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:20:59.462948 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:59.462921 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kldp5_997e9539-5288-4af5-92f4-55d8ccefbbf7/ovn-acl-logging/0.log"
Apr 20 20:20:59.464457 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:59.464418 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kldp5_997e9539-5288-4af5-92f4-55d8ccefbbf7/ovn-acl-logging/0.log"
Apr 20 20:20:59.506627 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:59.506592 2571 prober.go:120] "Probe failed" probeType="Readiness"
pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 20 20:20:59.507085 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:20:59.507062 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:21:09.506641 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:09.506606 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" Apr 20 20:21:09.509110 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:09.507058 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" Apr 20 20:21:21.277244 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.277206 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn_cae461df-647e-4cbd-bf0b-c5333d1c15eb/kserve-container/0.log" Apr 20 20:21:21.448754 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.448717 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"] Apr 20 20:21:21.449192 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.449146 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kserve-container" containerID="cri-o://b79c32a6c6699d10497526b01ae2bffe82472361f1c6ae049fc402f4af183b9e" gracePeriod=30 Apr 20 20:21:21.449349 ip-10-0-135-184 
kubenswrapper[2571]: I0420 20:21:21.449323 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="agent" containerID="cri-o://a8dd583a0607ee1631d4facc79fc4dce6464fbadae468b8db50ec87b4c7da31b" gracePeriod=30 Apr 20 20:21:21.449534 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.449510 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kube-rbac-proxy" containerID="cri-o://f8e5f54cb6be6cde2f9dea9e99c0d59aac9dcd9897bfe2a1d1e8a951c8ed84a0" gracePeriod=30 Apr 20 20:21:21.483962 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.483931 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6"] Apr 20 20:21:21.487594 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.487577 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" Apr 20 20:21:21.490192 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.490172 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-30526-kube-rbac-proxy-sar-config\"" Apr 20 20:21:21.490192 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.490188 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-30526-predictor-serving-cert\"" Apr 20 20:21:21.495161 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.495138 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6"] Apr 20 20:21:21.540438 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.540346 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn"] Apr 20 20:21:21.540654 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.540621 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" podUID="cae461df-647e-4cbd-bf0b-c5333d1c15eb" containerName="kserve-container" containerID="cri-o://ee088144526618b13671339a6844f1ae8ede7e8e11dc176433dd44a25972a3f3" gracePeriod=30 Apr 20 20:21:21.540777 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.540668 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" podUID="cae461df-647e-4cbd-bf0b-c5333d1c15eb" containerName="kube-rbac-proxy" containerID="cri-o://7caf6932d51077505fe34cf73cbbf91c4c9423d5be62c081f54dd0d0ea6138ed" gracePeriod=30 Apr 20 20:21:21.650747 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.650718 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91d4ea93-1078-4593-a434-000073a3242a-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6\" (UID: \"91d4ea93-1078-4593-a434-000073a3242a\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" Apr 20 20:21:21.650951 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.650759 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8w66\" (UniqueName: \"kubernetes.io/projected/91d4ea93-1078-4593-a434-000073a3242a-kube-api-access-q8w66\") pod \"isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6\" (UID: \"91d4ea93-1078-4593-a434-000073a3242a\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" Apr 20 20:21:21.650951 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.650786 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-scale-raw-30526-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/91d4ea93-1078-4593-a434-000073a3242a-isvc-sklearn-scale-raw-30526-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6\" (UID: \"91d4ea93-1078-4593-a434-000073a3242a\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" Apr 20 20:21:21.650951 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.650922 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/91d4ea93-1078-4593-a434-000073a3242a-proxy-tls\") pod \"isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6\" (UID: \"91d4ea93-1078-4593-a434-000073a3242a\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" Apr 20 20:21:21.748468 ip-10-0-135-184 
kubenswrapper[2571]: I0420 20:21:21.748404 2571 generic.go:358] "Generic (PLEG): container finished" podID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerID="f8e5f54cb6be6cde2f9dea9e99c0d59aac9dcd9897bfe2a1d1e8a951c8ed84a0" exitCode=2 Apr 20 20:21:21.748644 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.748475 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" event={"ID":"5aafa7c5-2a5a-4436-96c0-5528d9796f1e","Type":"ContainerDied","Data":"f8e5f54cb6be6cde2f9dea9e99c0d59aac9dcd9897bfe2a1d1e8a951c8ed84a0"} Apr 20 20:21:21.750173 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.750151 2571 generic.go:358] "Generic (PLEG): container finished" podID="cae461df-647e-4cbd-bf0b-c5333d1c15eb" containerID="7caf6932d51077505fe34cf73cbbf91c4c9423d5be62c081f54dd0d0ea6138ed" exitCode=2 Apr 20 20:21:21.750173 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.750169 2571 generic.go:358] "Generic (PLEG): container finished" podID="cae461df-647e-4cbd-bf0b-c5333d1c15eb" containerID="ee088144526618b13671339a6844f1ae8ede7e8e11dc176433dd44a25972a3f3" exitCode=2 Apr 20 20:21:21.750326 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.750194 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" event={"ID":"cae461df-647e-4cbd-bf0b-c5333d1c15eb","Type":"ContainerDied","Data":"7caf6932d51077505fe34cf73cbbf91c4c9423d5be62c081f54dd0d0ea6138ed"} Apr 20 20:21:21.750326 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.750218 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" event={"ID":"cae461df-647e-4cbd-bf0b-c5333d1c15eb","Type":"ContainerDied","Data":"ee088144526618b13671339a6844f1ae8ede7e8e11dc176433dd44a25972a3f3"} Apr 20 20:21:21.751510 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.751493 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/91d4ea93-1078-4593-a434-000073a3242a-proxy-tls\") pod \"isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6\" (UID: \"91d4ea93-1078-4593-a434-000073a3242a\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" Apr 20 20:21:21.751598 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.751565 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91d4ea93-1078-4593-a434-000073a3242a-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6\" (UID: \"91d4ea93-1078-4593-a434-000073a3242a\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" Apr 20 20:21:21.751598 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.751591 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8w66\" (UniqueName: \"kubernetes.io/projected/91d4ea93-1078-4593-a434-000073a3242a-kube-api-access-q8w66\") pod \"isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6\" (UID: \"91d4ea93-1078-4593-a434-000073a3242a\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" Apr 20 20:21:21.751714 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.751617 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-scale-raw-30526-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/91d4ea93-1078-4593-a434-000073a3242a-isvc-sklearn-scale-raw-30526-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6\" (UID: \"91d4ea93-1078-4593-a434-000073a3242a\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" Apr 20 20:21:21.752040 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.752019 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91d4ea93-1078-4593-a434-000073a3242a-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6\" (UID: \"91d4ea93-1078-4593-a434-000073a3242a\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" Apr 20 20:21:21.752301 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.752285 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-scale-raw-30526-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/91d4ea93-1078-4593-a434-000073a3242a-isvc-sklearn-scale-raw-30526-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6\" (UID: \"91d4ea93-1078-4593-a434-000073a3242a\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" Apr 20 20:21:21.754136 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.754112 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/91d4ea93-1078-4593-a434-000073a3242a-proxy-tls\") pod \"isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6\" (UID: \"91d4ea93-1078-4593-a434-000073a3242a\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" Apr 20 20:21:21.759876 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.759843 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8w66\" (UniqueName: \"kubernetes.io/projected/91d4ea93-1078-4593-a434-000073a3242a-kube-api-access-q8w66\") pod \"isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6\" (UID: \"91d4ea93-1078-4593-a434-000073a3242a\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" Apr 20 20:21:21.779825 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.779806 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" Apr 20 20:21:21.800030 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.799955 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" Apr 20 20:21:21.852260 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.852180 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-raw-5be0c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cae461df-647e-4cbd-bf0b-c5333d1c15eb-message-dumper-raw-5be0c-kube-rbac-proxy-sar-config\") pod \"cae461df-647e-4cbd-bf0b-c5333d1c15eb\" (UID: \"cae461df-647e-4cbd-bf0b-c5333d1c15eb\") " Apr 20 20:21:21.852260 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.852234 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82w9h\" (UniqueName: \"kubernetes.io/projected/cae461df-647e-4cbd-bf0b-c5333d1c15eb-kube-api-access-82w9h\") pod \"cae461df-647e-4cbd-bf0b-c5333d1c15eb\" (UID: \"cae461df-647e-4cbd-bf0b-c5333d1c15eb\") " Apr 20 20:21:21.852527 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.852323 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cae461df-647e-4cbd-bf0b-c5333d1c15eb-proxy-tls\") pod \"cae461df-647e-4cbd-bf0b-c5333d1c15eb\" (UID: \"cae461df-647e-4cbd-bf0b-c5333d1c15eb\") " Apr 20 20:21:21.852628 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.852597 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cae461df-647e-4cbd-bf0b-c5333d1c15eb-message-dumper-raw-5be0c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-raw-5be0c-kube-rbac-proxy-sar-config") pod "cae461df-647e-4cbd-bf0b-c5333d1c15eb" (UID: "cae461df-647e-4cbd-bf0b-c5333d1c15eb"). 
InnerVolumeSpecName "message-dumper-raw-5be0c-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:21:21.854645 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.854620 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cae461df-647e-4cbd-bf0b-c5333d1c15eb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cae461df-647e-4cbd-bf0b-c5333d1c15eb" (UID: "cae461df-647e-4cbd-bf0b-c5333d1c15eb"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:21:21.855091 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.855069 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cae461df-647e-4cbd-bf0b-c5333d1c15eb-kube-api-access-82w9h" (OuterVolumeSpecName: "kube-api-access-82w9h") pod "cae461df-647e-4cbd-bf0b-c5333d1c15eb" (UID: "cae461df-647e-4cbd-bf0b-c5333d1c15eb"). InnerVolumeSpecName "kube-api-access-82w9h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:21:21.926235 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.926201 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6"] Apr 20 20:21:21.928822 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:21:21.928796 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91d4ea93_1078_4593_a434_000073a3242a.slice/crio-7ebdfa0cc52663fff3edac8bbd6cb4549fd48c2dd80469806598f58f172ba13c WatchSource:0}: Error finding container 7ebdfa0cc52663fff3edac8bbd6cb4549fd48c2dd80469806598f58f172ba13c: Status 404 returned error can't find the container with id 7ebdfa0cc52663fff3edac8bbd6cb4549fd48c2dd80469806598f58f172ba13c Apr 20 20:21:21.930657 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.930637 2571 provider.go:93] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Apr 20 20:21:21.952954 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.952930 2571 reconciler_common.go:299] "Volume detached for volume \"message-dumper-raw-5be0c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cae461df-647e-4cbd-bf0b-c5333d1c15eb-message-dumper-raw-5be0c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:21:21.953054 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.952958 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-82w9h\" (UniqueName: \"kubernetes.io/projected/cae461df-647e-4cbd-bf0b-c5333d1c15eb-kube-api-access-82w9h\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:21:21.953054 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:21.952973 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cae461df-647e-4cbd-bf0b-c5333d1c15eb-proxy-tls\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:21:22.754584 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:22.754548 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" event={"ID":"91d4ea93-1078-4593-a434-000073a3242a","Type":"ContainerStarted","Data":"95b2805383d1da1ebb2f9662691e86ebafb60792fff7699f14fca1bd188ac634"} Apr 20 20:21:22.755021 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:22.754595 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" event={"ID":"91d4ea93-1078-4593-a434-000073a3242a","Type":"ContainerStarted","Data":"7ebdfa0cc52663fff3edac8bbd6cb4549fd48c2dd80469806598f58f172ba13c"} Apr 20 20:21:22.755859 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:22.755839 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" event={"ID":"cae461df-647e-4cbd-bf0b-c5333d1c15eb","Type":"ContainerDied","Data":"61c4452920de7e6fc12f891e3568a5c1a44021464f4945d8b513fc8284b042a4"} Apr 20 20:21:22.755960 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:22.755872 2571 scope.go:117] "RemoveContainer" containerID="7caf6932d51077505fe34cf73cbbf91c4c9423d5be62c081f54dd0d0ea6138ed" Apr 20 20:21:22.755960 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:22.755911 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn" Apr 20 20:21:22.766353 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:22.766331 2571 scope.go:117] "RemoveContainer" containerID="ee088144526618b13671339a6844f1ae8ede7e8e11dc176433dd44a25972a3f3" Apr 20 20:21:22.786795 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:22.786771 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn"] Apr 20 20:21:22.788168 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:22.788148 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-5be0c-predictor-85d4fc8b75-sgjcn"] Apr 20 20:21:23.527246 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:23.527204 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cae461df-647e-4cbd-bf0b-c5333d1c15eb" path="/var/lib/kubelet/pods/cae461df-647e-4cbd-bf0b-c5333d1c15eb/volumes" Apr 20 20:21:24.500891 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:24.500850 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.38:8643/healthz\": dial tcp 10.133.0.38:8643: connect: connection refused" Apr 20 
20:21:25.767365 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:25.767336 2571 generic.go:358] "Generic (PLEG): container finished" podID="91d4ea93-1078-4593-a434-000073a3242a" containerID="95b2805383d1da1ebb2f9662691e86ebafb60792fff7699f14fca1bd188ac634" exitCode=0 Apr 20 20:21:25.767802 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:25.767406 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" event={"ID":"91d4ea93-1078-4593-a434-000073a3242a","Type":"ContainerDied","Data":"95b2805383d1da1ebb2f9662691e86ebafb60792fff7699f14fca1bd188ac634"} Apr 20 20:21:25.769672 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:25.769652 2571 generic.go:358] "Generic (PLEG): container finished" podID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerID="b79c32a6c6699d10497526b01ae2bffe82472361f1c6ae049fc402f4af183b9e" exitCode=0 Apr 20 20:21:25.769771 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:25.769689 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" event={"ID":"5aafa7c5-2a5a-4436-96c0-5528d9796f1e","Type":"ContainerDied","Data":"b79c32a6c6699d10497526b01ae2bffe82472361f1c6ae049fc402f4af183b9e"} Apr 20 20:21:26.774922 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:26.774883 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" event={"ID":"91d4ea93-1078-4593-a434-000073a3242a","Type":"ContainerStarted","Data":"fe571f02fd1b6a0e6c1a5f91b974dac213741ade633da77ab500954b4d89eb51"} Apr 20 20:21:26.774922 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:26.774921 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" 
event={"ID":"91d4ea93-1078-4593-a434-000073a3242a","Type":"ContainerStarted","Data":"5a55109a150b34cca95d926b51d0e169279d45648327ecec2ac2cd9e7ada8786"} Apr 20 20:21:26.775416 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:26.775206 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" Apr 20 20:21:26.775416 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:26.775334 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" Apr 20 20:21:26.776671 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:26.776644 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 20 20:21:26.794586 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:26.794542 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" podStartSLOduration=5.79453019 podStartE2EDuration="5.79453019s" podCreationTimestamp="2026-04-20 20:21:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:21:26.793214915 +0000 UTC m=+927.832704462" watchObservedRunningTime="2026-04-20 20:21:26.79453019 +0000 UTC m=+927.834019738" Apr 20 20:21:27.778730 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:27.778694 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: 
connect: connection refused" Apr 20 20:21:29.501262 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:29.501212 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.38:8643/healthz\": dial tcp 10.133.0.38:8643: connect: connection refused" Apr 20 20:21:29.506691 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:29.506658 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 20 20:21:29.508243 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:29.508215 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:21:32.782549 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:32.782519 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" Apr 20 20:21:32.783102 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:32.783078 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 20 20:21:34.501650 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:34.501612 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.38:8643/healthz\": dial tcp 10.133.0.38:8643: connect: connection refused"
Apr 20 20:21:34.502040 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:34.501737 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"
Apr 20 20:21:39.501588 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:39.501548 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.38:8643/healthz\": dial tcp 10.133.0.38:8643: connect: connection refused"
Apr 20 20:21:39.506963 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:39.506941 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 20 20:21:39.507417 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:39.507393 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:21:42.783287 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:42.783238 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 20 20:21:44.500957 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:44.500912 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.38:8643/healthz\": dial tcp 10.133.0.38:8643: connect: connection refused"
Apr 20 20:21:49.500913 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:49.500870 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.38:8643/healthz\": dial tcp 10.133.0.38:8643: connect: connection refused"
Apr 20 20:21:49.506230 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:49.506207 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 20 20:21:49.506345 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:49.506332 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"
Apr 20 20:21:49.507862 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:49.507830 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:21:49.507972 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:49.507958 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"
Apr 20 20:21:51.597375 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.597353 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"
Apr 20 20:21:51.709462 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.709356 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-kserve-provision-location\") pod \"5aafa7c5-2a5a-4436-96c0-5528d9796f1e\" (UID: \"5aafa7c5-2a5a-4436-96c0-5528d9796f1e\") "
Apr 20 20:21:51.709462 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.709391 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-raw-5be0c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-isvc-logger-raw-5be0c-kube-rbac-proxy-sar-config\") pod \"5aafa7c5-2a5a-4436-96c0-5528d9796f1e\" (UID: \"5aafa7c5-2a5a-4436-96c0-5528d9796f1e\") "
Apr 20 20:21:51.709462 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.709415 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-proxy-tls\") pod \"5aafa7c5-2a5a-4436-96c0-5528d9796f1e\" (UID: \"5aafa7c5-2a5a-4436-96c0-5528d9796f1e\") "
Apr 20 20:21:51.709762 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.709473 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlxqv\" (UniqueName: \"kubernetes.io/projected/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-kube-api-access-jlxqv\") pod \"5aafa7c5-2a5a-4436-96c0-5528d9796f1e\" (UID: \"5aafa7c5-2a5a-4436-96c0-5528d9796f1e\") "
Apr 20 20:21:51.709824 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.709762 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5aafa7c5-2a5a-4436-96c0-5528d9796f1e" (UID: "5aafa7c5-2a5a-4436-96c0-5528d9796f1e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:21:51.709875 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.709812 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-isvc-logger-raw-5be0c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-raw-5be0c-kube-rbac-proxy-sar-config") pod "5aafa7c5-2a5a-4436-96c0-5528d9796f1e" (UID: "5aafa7c5-2a5a-4436-96c0-5528d9796f1e"). InnerVolumeSpecName "isvc-logger-raw-5be0c-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:21:51.711701 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.711670 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5aafa7c5-2a5a-4436-96c0-5528d9796f1e" (UID: "5aafa7c5-2a5a-4436-96c0-5528d9796f1e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:21:51.711801 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.711782 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-kube-api-access-jlxqv" (OuterVolumeSpecName: "kube-api-access-jlxqv") pod "5aafa7c5-2a5a-4436-96c0-5528d9796f1e" (UID: "5aafa7c5-2a5a-4436-96c0-5528d9796f1e"). InnerVolumeSpecName "kube-api-access-jlxqv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:21:51.810138 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.810099 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-kserve-provision-location\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:21:51.810138 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.810133 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-raw-5be0c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-isvc-logger-raw-5be0c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:21:51.810138 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.810145 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-proxy-tls\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:21:51.810366 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.810155 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jlxqv\" (UniqueName: \"kubernetes.io/projected/5aafa7c5-2a5a-4436-96c0-5528d9796f1e-kube-api-access-jlxqv\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:21:51.858668 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.858629 2571 generic.go:358] "Generic (PLEG): container finished" podID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerID="a8dd583a0607ee1631d4facc79fc4dce6464fbadae468b8db50ec87b4c7da31b" exitCode=0
Apr 20 20:21:51.858863 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.858698 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" event={"ID":"5aafa7c5-2a5a-4436-96c0-5528d9796f1e","Type":"ContainerDied","Data":"a8dd583a0607ee1631d4facc79fc4dce6464fbadae468b8db50ec87b4c7da31b"}
Apr 20 20:21:51.858863 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.858743 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"
Apr 20 20:21:51.858863 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.858753 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb" event={"ID":"5aafa7c5-2a5a-4436-96c0-5528d9796f1e","Type":"ContainerDied","Data":"ff6a5c7a2f7f1c704d77af352c2f948029191e26790776c5ac7a0af266118d4c"}
Apr 20 20:21:51.858863 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.858771 2571 scope.go:117] "RemoveContainer" containerID="a8dd583a0607ee1631d4facc79fc4dce6464fbadae468b8db50ec87b4c7da31b"
Apr 20 20:21:51.866833 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.866815 2571 scope.go:117] "RemoveContainer" containerID="f8e5f54cb6be6cde2f9dea9e99c0d59aac9dcd9897bfe2a1d1e8a951c8ed84a0"
Apr 20 20:21:51.873899 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.873883 2571 scope.go:117] "RemoveContainer" containerID="b79c32a6c6699d10497526b01ae2bffe82472361f1c6ae049fc402f4af183b9e"
Apr 20 20:21:51.879856 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.879823 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"]
Apr 20 20:21:51.882839 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.882746 2571 scope.go:117] "RemoveContainer" containerID="95da147bd97a9ac52914b06500d30dde63a55c1e9ba74e805167b22613c5a80d"
Apr 20 20:21:51.884047 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.884026 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-5be0c-predictor-7c47c997b7-8k9kb"]
Apr 20 20:21:51.890049 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.890034 2571 scope.go:117] "RemoveContainer" containerID="a8dd583a0607ee1631d4facc79fc4dce6464fbadae468b8db50ec87b4c7da31b"
Apr 20 20:21:51.890274 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:21:51.890257 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8dd583a0607ee1631d4facc79fc4dce6464fbadae468b8db50ec87b4c7da31b\": container with ID starting with a8dd583a0607ee1631d4facc79fc4dce6464fbadae468b8db50ec87b4c7da31b not found: ID does not exist" containerID="a8dd583a0607ee1631d4facc79fc4dce6464fbadae468b8db50ec87b4c7da31b"
Apr 20 20:21:51.890338 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.890283 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8dd583a0607ee1631d4facc79fc4dce6464fbadae468b8db50ec87b4c7da31b"} err="failed to get container status \"a8dd583a0607ee1631d4facc79fc4dce6464fbadae468b8db50ec87b4c7da31b\": rpc error: code = NotFound desc = could not find container \"a8dd583a0607ee1631d4facc79fc4dce6464fbadae468b8db50ec87b4c7da31b\": container with ID starting with a8dd583a0607ee1631d4facc79fc4dce6464fbadae468b8db50ec87b4c7da31b not found: ID does not exist"
Apr 20 20:21:51.890338 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.890299 2571 scope.go:117] "RemoveContainer" containerID="f8e5f54cb6be6cde2f9dea9e99c0d59aac9dcd9897bfe2a1d1e8a951c8ed84a0"
Apr 20 20:21:51.890569 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:21:51.890555 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8e5f54cb6be6cde2f9dea9e99c0d59aac9dcd9897bfe2a1d1e8a951c8ed84a0\": container with ID starting with f8e5f54cb6be6cde2f9dea9e99c0d59aac9dcd9897bfe2a1d1e8a951c8ed84a0 not found: ID does not exist" containerID="f8e5f54cb6be6cde2f9dea9e99c0d59aac9dcd9897bfe2a1d1e8a951c8ed84a0"
Apr 20 20:21:51.890610 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.890572 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8e5f54cb6be6cde2f9dea9e99c0d59aac9dcd9897bfe2a1d1e8a951c8ed84a0"} err="failed to get container status \"f8e5f54cb6be6cde2f9dea9e99c0d59aac9dcd9897bfe2a1d1e8a951c8ed84a0\": rpc error: code = NotFound desc = could not find container \"f8e5f54cb6be6cde2f9dea9e99c0d59aac9dcd9897bfe2a1d1e8a951c8ed84a0\": container with ID starting with f8e5f54cb6be6cde2f9dea9e99c0d59aac9dcd9897bfe2a1d1e8a951c8ed84a0 not found: ID does not exist"
Apr 20 20:21:51.890610 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.890585 2571 scope.go:117] "RemoveContainer" containerID="b79c32a6c6699d10497526b01ae2bffe82472361f1c6ae049fc402f4af183b9e"
Apr 20 20:21:51.890866 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:21:51.890842 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b79c32a6c6699d10497526b01ae2bffe82472361f1c6ae049fc402f4af183b9e\": container with ID starting with b79c32a6c6699d10497526b01ae2bffe82472361f1c6ae049fc402f4af183b9e not found: ID does not exist" containerID="b79c32a6c6699d10497526b01ae2bffe82472361f1c6ae049fc402f4af183b9e"
Apr 20 20:21:51.890917 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.890865 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b79c32a6c6699d10497526b01ae2bffe82472361f1c6ae049fc402f4af183b9e"} err="failed to get container status \"b79c32a6c6699d10497526b01ae2bffe82472361f1c6ae049fc402f4af183b9e\": rpc error: code = NotFound desc = could not find container \"b79c32a6c6699d10497526b01ae2bffe82472361f1c6ae049fc402f4af183b9e\": container with ID starting with b79c32a6c6699d10497526b01ae2bffe82472361f1c6ae049fc402f4af183b9e not found: ID does not exist"
Apr 20 20:21:51.890917 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.890882 2571 scope.go:117] "RemoveContainer" containerID="95da147bd97a9ac52914b06500d30dde63a55c1e9ba74e805167b22613c5a80d"
Apr 20 20:21:51.891141 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:21:51.891124 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95da147bd97a9ac52914b06500d30dde63a55c1e9ba74e805167b22613c5a80d\": container with ID starting with 95da147bd97a9ac52914b06500d30dde63a55c1e9ba74e805167b22613c5a80d not found: ID does not exist" containerID="95da147bd97a9ac52914b06500d30dde63a55c1e9ba74e805167b22613c5a80d"
Apr 20 20:21:51.891189 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:51.891146 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95da147bd97a9ac52914b06500d30dde63a55c1e9ba74e805167b22613c5a80d"} err="failed to get container status \"95da147bd97a9ac52914b06500d30dde63a55c1e9ba74e805167b22613c5a80d\": rpc error: code = NotFound desc = could not find container \"95da147bd97a9ac52914b06500d30dde63a55c1e9ba74e805167b22613c5a80d\": container with ID starting with 95da147bd97a9ac52914b06500d30dde63a55c1e9ba74e805167b22613c5a80d not found: ID does not exist"
Apr 20 20:21:52.783842 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:52.783791 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 20 20:21:53.527402 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:21:53.527372 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" path="/var/lib/kubelet/pods/5aafa7c5-2a5a-4436-96c0-5528d9796f1e/volumes"
Apr 20 20:22:02.783898 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:22:02.783860 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 20 20:22:12.783662 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:22:12.783619 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 20 20:22:22.784024 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:22:22.783933 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 20 20:22:32.783335 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:22:32.783285 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 20 20:22:42.783690 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:22:42.783644 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 20 20:22:52.783061 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:22:52.783017 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 20 20:22:58.524085 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:22:58.524041 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 20 20:23:08.524209 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:08.524157 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 20 20:23:18.524694 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:18.524654 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 20 20:23:28.524934 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:28.524872 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 20 20:23:38.524631 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:38.524593 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6"
Apr 20 20:23:41.668530 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.668475 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6"]
Apr 20 20:23:41.669080 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.668929 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kserve-container" containerID="cri-o://5a55109a150b34cca95d926b51d0e169279d45648327ecec2ac2cd9e7ada8786" gracePeriod=30
Apr 20 20:23:41.669080 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.668965 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kube-rbac-proxy" containerID="cri-o://fe571f02fd1b6a0e6c1a5f91b974dac213741ade633da77ab500954b4d89eb51" gracePeriod=30
Apr 20 20:23:41.763613 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.763575 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj"]
Apr 20 20:23:41.763995 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.763979 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="storage-initializer"
Apr 20 20:23:41.764052 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.763997 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="storage-initializer"
Apr 20 20:23:41.764052 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.764005 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kube-rbac-proxy"
Apr 20 20:23:41.764052 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.764011 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kube-rbac-proxy"
Apr 20 20:23:41.764052 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.764020 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kserve-container"
Apr 20 20:23:41.764052 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.764025 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kserve-container"
Apr 20 20:23:41.764052 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.764040 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="agent"
Apr 20 20:23:41.764052 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.764045 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="agent"
Apr 20 20:23:41.764052 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.764053 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cae461df-647e-4cbd-bf0b-c5333d1c15eb" containerName="kube-rbac-proxy"
Apr 20 20:23:41.764302 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.764060 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae461df-647e-4cbd-bf0b-c5333d1c15eb" containerName="kube-rbac-proxy"
Apr 20 20:23:41.764302 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.764074 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cae461df-647e-4cbd-bf0b-c5333d1c15eb" containerName="kserve-container"
Apr 20 20:23:41.764302 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.764080 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae461df-647e-4cbd-bf0b-c5333d1c15eb" containerName="kserve-container"
Apr 20 20:23:41.764302 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.764133 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kube-rbac-proxy"
Apr 20 20:23:41.764302 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.764141 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="agent"
Apr 20 20:23:41.764302 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.764148 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="cae461df-647e-4cbd-bf0b-c5333d1c15eb" containerName="kserve-container"
Apr 20 20:23:41.764302 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.764154 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="cae461df-647e-4cbd-bf0b-c5333d1c15eb" containerName="kube-rbac-proxy"
Apr 20 20:23:41.764302 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.764161 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5aafa7c5-2a5a-4436-96c0-5528d9796f1e" containerName="kserve-container"
Apr 20 20:23:41.767540 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.767519 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj"
Apr 20 20:23:41.770075 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.770054 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-7f229f-predictor-serving-cert\""
Apr 20 20:23:41.770200 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.770089 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-7f229f-kube-rbac-proxy-sar-config\""
Apr 20 20:23:41.776079 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.776052 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj"]
Apr 20 20:23:41.825588 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.825549 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rt2t\" (UniqueName: \"kubernetes.io/projected/a7af65ce-7427-4518-b4d1-39db352355fb-kube-api-access-4rt2t\") pod \"isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj\" (UID: \"a7af65ce-7427-4518-b4d1-39db352355fb\") " pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj"
Apr 20 20:23:41.825756 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.825593 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7af65ce-7427-4518-b4d1-39db352355fb-kserve-provision-location\") pod \"isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj\" (UID: \"a7af65ce-7427-4518-b4d1-39db352355fb\") " pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj"
Apr 20 20:23:41.825756 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.825700 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7af65ce-7427-4518-b4d1-39db352355fb-proxy-tls\") pod \"isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj\" (UID: \"a7af65ce-7427-4518-b4d1-39db352355fb\") " pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj"
Apr 20 20:23:41.825833 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.825768 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-7f229f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a7af65ce-7427-4518-b4d1-39db352355fb-isvc-primary-7f229f-kube-rbac-proxy-sar-config\") pod \"isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj\" (UID: \"a7af65ce-7427-4518-b4d1-39db352355fb\") " pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj"
Apr 20 20:23:41.927101 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.927012 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rt2t\" (UniqueName: \"kubernetes.io/projected/a7af65ce-7427-4518-b4d1-39db352355fb-kube-api-access-4rt2t\") pod \"isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj\" (UID: \"a7af65ce-7427-4518-b4d1-39db352355fb\") " pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj"
Apr 20 20:23:41.927101 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.927061 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7af65ce-7427-4518-b4d1-39db352355fb-kserve-provision-location\") pod \"isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj\" (UID: \"a7af65ce-7427-4518-b4d1-39db352355fb\") " pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj"
Apr 20 20:23:41.927337 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.927108 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7af65ce-7427-4518-b4d1-39db352355fb-proxy-tls\") pod \"isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj\" (UID: \"a7af65ce-7427-4518-b4d1-39db352355fb\") " pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj"
Apr 20 20:23:41.927337 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.927168 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-7f229f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a7af65ce-7427-4518-b4d1-39db352355fb-isvc-primary-7f229f-kube-rbac-proxy-sar-config\") pod \"isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj\" (UID: \"a7af65ce-7427-4518-b4d1-39db352355fb\") " pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj"
Apr 20 20:23:41.927337 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:23:41.927282 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-primary-7f229f-predictor-serving-cert: secret "isvc-primary-7f229f-predictor-serving-cert" not found
Apr 20 20:23:41.927521 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:23:41.927360 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7af65ce-7427-4518-b4d1-39db352355fb-proxy-tls podName:a7af65ce-7427-4518-b4d1-39db352355fb nodeName:}" failed. No retries permitted until 2026-04-20 20:23:42.427337969 +0000 UTC m=+1063.466827495 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a7af65ce-7427-4518-b4d1-39db352355fb-proxy-tls") pod "isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" (UID: "a7af65ce-7427-4518-b4d1-39db352355fb") : secret "isvc-primary-7f229f-predictor-serving-cert" not found
Apr 20 20:23:41.927615 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.927589 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7af65ce-7427-4518-b4d1-39db352355fb-kserve-provision-location\") pod \"isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj\" (UID: \"a7af65ce-7427-4518-b4d1-39db352355fb\") " pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj"
Apr 20 20:23:41.927812 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.927794 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-7f229f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a7af65ce-7427-4518-b4d1-39db352355fb-isvc-primary-7f229f-kube-rbac-proxy-sar-config\") pod \"isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj\" (UID: \"a7af65ce-7427-4518-b4d1-39db352355fb\") " pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj"
Apr 20 20:23:41.935748 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:41.935714 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rt2t\" (UniqueName: \"kubernetes.io/projected/a7af65ce-7427-4518-b4d1-39db352355fb-kube-api-access-4rt2t\") pod \"isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj\" (UID: \"a7af65ce-7427-4518-b4d1-39db352355fb\") " pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj"
Apr 20 20:23:42.233982 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:42.233884 2571 generic.go:358] "Generic (PLEG): container finished" podID="91d4ea93-1078-4593-a434-000073a3242a" containerID="fe571f02fd1b6a0e6c1a5f91b974dac213741ade633da77ab500954b4d89eb51" exitCode=2
Apr 20 20:23:42.233982 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:42.233961 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" event={"ID":"91d4ea93-1078-4593-a434-000073a3242a","Type":"ContainerDied","Data":"fe571f02fd1b6a0e6c1a5f91b974dac213741ade633da77ab500954b4d89eb51"}
Apr 20 20:23:42.431107 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:42.431072 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7af65ce-7427-4518-b4d1-39db352355fb-proxy-tls\") pod \"isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj\" (UID: \"a7af65ce-7427-4518-b4d1-39db352355fb\") " pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj"
Apr 20 20:23:42.433760 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:42.433729 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7af65ce-7427-4518-b4d1-39db352355fb-proxy-tls\") pod \"isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj\" (UID: \"a7af65ce-7427-4518-b4d1-39db352355fb\") " pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj"
Apr 20 20:23:42.680569 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:42.680528 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj"
Apr 20 20:23:42.779287 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:42.779242 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.39:8643/healthz\": dial tcp 10.133.0.39:8643: connect: connection refused"
Apr 20 20:23:42.804918 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:42.804889 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj"]
Apr 20 20:23:42.807252 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:23:42.807223 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7af65ce_7427_4518_b4d1_39db352355fb.slice/crio-053a5f3b793ef2565ae694b7f182726d19fcbfed1e9e27d22cd14368855a5514 WatchSource:0}: Error finding container 053a5f3b793ef2565ae694b7f182726d19fcbfed1e9e27d22cd14368855a5514: Status 404 returned error can't find the container with id 053a5f3b793ef2565ae694b7f182726d19fcbfed1e9e27d22cd14368855a5514
Apr 20 20:23:43.239582 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:43.239545 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" event={"ID":"a7af65ce-7427-4518-b4d1-39db352355fb","Type":"ContainerStarted","Data":"aa104aac1d796bdc4188269414569535fbbcf6909427443ebf2db4fb699c51fc"}
Apr 20 20:23:43.239582 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:43.239583 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj"
event={"ID":"a7af65ce-7427-4518-b4d1-39db352355fb","Type":"ContainerStarted","Data":"053a5f3b793ef2565ae694b7f182726d19fcbfed1e9e27d22cd14368855a5514"} Apr 20 20:23:47.254940 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:47.254902 2571 generic.go:358] "Generic (PLEG): container finished" podID="a7af65ce-7427-4518-b4d1-39db352355fb" containerID="aa104aac1d796bdc4188269414569535fbbcf6909427443ebf2db4fb699c51fc" exitCode=0 Apr 20 20:23:47.255325 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:47.254978 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" event={"ID":"a7af65ce-7427-4518-b4d1-39db352355fb","Type":"ContainerDied","Data":"aa104aac1d796bdc4188269414569535fbbcf6909427443ebf2db4fb699c51fc"} Apr 20 20:23:47.779438 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:47.779386 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.39:8643/healthz\": dial tcp 10.133.0.39:8643: connect: connection refused" Apr 20 20:23:48.260335 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:48.260302 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" event={"ID":"a7af65ce-7427-4518-b4d1-39db352355fb","Type":"ContainerStarted","Data":"6c0a0c293277028a67536d54f114f5f5df4985aac98f6ecd5d26f7f9988ad0fc"} Apr 20 20:23:48.260751 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:48.260343 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" event={"ID":"a7af65ce-7427-4518-b4d1-39db352355fb","Type":"ContainerStarted","Data":"091cf9d25c8d86f366561c2cb15b396176e4f0341a3c7a3c881db6719b28a6ce"} Apr 20 20:23:48.260751 ip-10-0-135-184 kubenswrapper[2571]: I0420 
20:23:48.260669 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" Apr 20 20:23:48.260878 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:48.260800 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" Apr 20 20:23:48.262054 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:48.262026 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" podUID="a7af65ce-7427-4518-b4d1-39db352355fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 20 20:23:48.281530 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:48.281480 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" podStartSLOduration=7.281463015 podStartE2EDuration="7.281463015s" podCreationTimestamp="2026-04-20 20:23:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:23:48.278488467 +0000 UTC m=+1069.317978004" watchObservedRunningTime="2026-04-20 20:23:48.281463015 +0000 UTC m=+1069.320952563" Apr 20 20:23:48.524609 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:48.524518 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 20 20:23:49.264067 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:49.264028 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" 
podUID="a7af65ce-7427-4518-b4d1-39db352355fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 20 20:23:51.119275 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.119252 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" Apr 20 20:23:51.214553 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.214467 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/91d4ea93-1078-4593-a434-000073a3242a-proxy-tls\") pod \"91d4ea93-1078-4593-a434-000073a3242a\" (UID: \"91d4ea93-1078-4593-a434-000073a3242a\") " Apr 20 20:23:51.214734 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.214570 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-scale-raw-30526-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/91d4ea93-1078-4593-a434-000073a3242a-isvc-sklearn-scale-raw-30526-kube-rbac-proxy-sar-config\") pod \"91d4ea93-1078-4593-a434-000073a3242a\" (UID: \"91d4ea93-1078-4593-a434-000073a3242a\") " Apr 20 20:23:51.214734 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.214599 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8w66\" (UniqueName: \"kubernetes.io/projected/91d4ea93-1078-4593-a434-000073a3242a-kube-api-access-q8w66\") pod \"91d4ea93-1078-4593-a434-000073a3242a\" (UID: \"91d4ea93-1078-4593-a434-000073a3242a\") " Apr 20 20:23:51.214734 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.214616 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91d4ea93-1078-4593-a434-000073a3242a-kserve-provision-location\") pod \"91d4ea93-1078-4593-a434-000073a3242a\" (UID: 
\"91d4ea93-1078-4593-a434-000073a3242a\") " Apr 20 20:23:51.214978 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.214944 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91d4ea93-1078-4593-a434-000073a3242a-isvc-sklearn-scale-raw-30526-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-scale-raw-30526-kube-rbac-proxy-sar-config") pod "91d4ea93-1078-4593-a434-000073a3242a" (UID: "91d4ea93-1078-4593-a434-000073a3242a"). InnerVolumeSpecName "isvc-sklearn-scale-raw-30526-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:23:51.215100 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.215017 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91d4ea93-1078-4593-a434-000073a3242a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "91d4ea93-1078-4593-a434-000073a3242a" (UID: "91d4ea93-1078-4593-a434-000073a3242a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:23:51.216851 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.216828 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91d4ea93-1078-4593-a434-000073a3242a-kube-api-access-q8w66" (OuterVolumeSpecName: "kube-api-access-q8w66") pod "91d4ea93-1078-4593-a434-000073a3242a" (UID: "91d4ea93-1078-4593-a434-000073a3242a"). InnerVolumeSpecName "kube-api-access-q8w66". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:23:51.216931 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.216858 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91d4ea93-1078-4593-a434-000073a3242a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "91d4ea93-1078-4593-a434-000073a3242a" (UID: "91d4ea93-1078-4593-a434-000073a3242a"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:23:51.272125 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.272087 2571 generic.go:358] "Generic (PLEG): container finished" podID="91d4ea93-1078-4593-a434-000073a3242a" containerID="5a55109a150b34cca95d926b51d0e169279d45648327ecec2ac2cd9e7ada8786" exitCode=0 Apr 20 20:23:51.272312 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.272137 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" event={"ID":"91d4ea93-1078-4593-a434-000073a3242a","Type":"ContainerDied","Data":"5a55109a150b34cca95d926b51d0e169279d45648327ecec2ac2cd9e7ada8786"} Apr 20 20:23:51.272312 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.272178 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" Apr 20 20:23:51.272312 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.272193 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6" event={"ID":"91d4ea93-1078-4593-a434-000073a3242a","Type":"ContainerDied","Data":"7ebdfa0cc52663fff3edac8bbd6cb4549fd48c2dd80469806598f58f172ba13c"} Apr 20 20:23:51.272312 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.272209 2571 scope.go:117] "RemoveContainer" containerID="fe571f02fd1b6a0e6c1a5f91b974dac213741ade633da77ab500954b4d89eb51" Apr 20 20:23:51.280967 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.280949 2571 scope.go:117] "RemoveContainer" containerID="5a55109a150b34cca95d926b51d0e169279d45648327ecec2ac2cd9e7ada8786" Apr 20 20:23:51.290205 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.290147 2571 scope.go:117] "RemoveContainer" containerID="95b2805383d1da1ebb2f9662691e86ebafb60792fff7699f14fca1bd188ac634" Apr 20 20:23:51.294792 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.294761 2571 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6"] Apr 20 20:23:51.301048 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.301026 2571 scope.go:117] "RemoveContainer" containerID="fe571f02fd1b6a0e6c1a5f91b974dac213741ade633da77ab500954b4d89eb51" Apr 20 20:23:51.301356 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:23:51.301336 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe571f02fd1b6a0e6c1a5f91b974dac213741ade633da77ab500954b4d89eb51\": container with ID starting with fe571f02fd1b6a0e6c1a5f91b974dac213741ade633da77ab500954b4d89eb51 not found: ID does not exist" containerID="fe571f02fd1b6a0e6c1a5f91b974dac213741ade633da77ab500954b4d89eb51" Apr 20 20:23:51.301444 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.301365 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe571f02fd1b6a0e6c1a5f91b974dac213741ade633da77ab500954b4d89eb51"} err="failed to get container status \"fe571f02fd1b6a0e6c1a5f91b974dac213741ade633da77ab500954b4d89eb51\": rpc error: code = NotFound desc = could not find container \"fe571f02fd1b6a0e6c1a5f91b974dac213741ade633da77ab500954b4d89eb51\": container with ID starting with fe571f02fd1b6a0e6c1a5f91b974dac213741ade633da77ab500954b4d89eb51 not found: ID does not exist" Apr 20 20:23:51.301444 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.301384 2571 scope.go:117] "RemoveContainer" containerID="5a55109a150b34cca95d926b51d0e169279d45648327ecec2ac2cd9e7ada8786" Apr 20 20:23:51.301666 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:23:51.301645 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a55109a150b34cca95d926b51d0e169279d45648327ecec2ac2cd9e7ada8786\": container with ID starting with 5a55109a150b34cca95d926b51d0e169279d45648327ecec2ac2cd9e7ada8786 
not found: ID does not exist" containerID="5a55109a150b34cca95d926b51d0e169279d45648327ecec2ac2cd9e7ada8786" Apr 20 20:23:51.301729 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.301670 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a55109a150b34cca95d926b51d0e169279d45648327ecec2ac2cd9e7ada8786"} err="failed to get container status \"5a55109a150b34cca95d926b51d0e169279d45648327ecec2ac2cd9e7ada8786\": rpc error: code = NotFound desc = could not find container \"5a55109a150b34cca95d926b51d0e169279d45648327ecec2ac2cd9e7ada8786\": container with ID starting with 5a55109a150b34cca95d926b51d0e169279d45648327ecec2ac2cd9e7ada8786 not found: ID does not exist" Apr 20 20:23:51.301729 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.301685 2571 scope.go:117] "RemoveContainer" containerID="95b2805383d1da1ebb2f9662691e86ebafb60792fff7699f14fca1bd188ac634" Apr 20 20:23:51.301908 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:23:51.301891 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95b2805383d1da1ebb2f9662691e86ebafb60792fff7699f14fca1bd188ac634\": container with ID starting with 95b2805383d1da1ebb2f9662691e86ebafb60792fff7699f14fca1bd188ac634 not found: ID does not exist" containerID="95b2805383d1da1ebb2f9662691e86ebafb60792fff7699f14fca1bd188ac634" Apr 20 20:23:51.301964 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.301910 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b2805383d1da1ebb2f9662691e86ebafb60792fff7699f14fca1bd188ac634"} err="failed to get container status \"95b2805383d1da1ebb2f9662691e86ebafb60792fff7699f14fca1bd188ac634\": rpc error: code = NotFound desc = could not find container \"95b2805383d1da1ebb2f9662691e86ebafb60792fff7699f14fca1bd188ac634\": container with ID starting with 95b2805383d1da1ebb2f9662691e86ebafb60792fff7699f14fca1bd188ac634 not found: ID 
does not exist" Apr 20 20:23:51.302624 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.302604 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-30526-predictor-66fb4d7b67-rvhq6"] Apr 20 20:23:51.316364 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.316334 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-scale-raw-30526-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/91d4ea93-1078-4593-a434-000073a3242a-isvc-sklearn-scale-raw-30526-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:23:51.316364 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.316367 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q8w66\" (UniqueName: \"kubernetes.io/projected/91d4ea93-1078-4593-a434-000073a3242a-kube-api-access-q8w66\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:23:51.316561 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.316382 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91d4ea93-1078-4593-a434-000073a3242a-kserve-provision-location\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:23:51.316561 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.316395 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/91d4ea93-1078-4593-a434-000073a3242a-proxy-tls\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:23:51.527888 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:51.527856 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91d4ea93-1078-4593-a434-000073a3242a" path="/var/lib/kubelet/pods/91d4ea93-1078-4593-a434-000073a3242a/volumes" Apr 20 20:23:54.269134 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:54.269106 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" Apr 20 20:23:54.269734 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:23:54.269706 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" podUID="a7af65ce-7427-4518-b4d1-39db352355fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 20 20:24:04.269757 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:24:04.269713 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" podUID="a7af65ce-7427-4518-b4d1-39db352355fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 20 20:24:14.270498 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:24:14.270451 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" podUID="a7af65ce-7427-4518-b4d1-39db352355fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 20 20:24:24.269972 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:24:24.269930 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" podUID="a7af65ce-7427-4518-b4d1-39db352355fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 20 20:24:34.270315 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:24:34.270269 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" podUID="a7af65ce-7427-4518-b4d1-39db352355fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 20 20:24:44.269645 
ip-10-0-135-184 kubenswrapper[2571]: I0420 20:24:44.269604 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" podUID="a7af65ce-7427-4518-b4d1-39db352355fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 20 20:24:54.270319 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:24:54.270287 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" Apr 20 20:25:01.886827 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:01.886792 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg"] Apr 20 20:25:01.887215 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:01.887144 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kserve-container" Apr 20 20:25:01.887215 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:01.887155 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kserve-container" Apr 20 20:25:01.887215 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:01.887173 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="storage-initializer" Apr 20 20:25:01.887215 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:01.887179 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="storage-initializer" Apr 20 20:25:01.887215 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:01.887185 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kube-rbac-proxy" Apr 20 20:25:01.887215 ip-10-0-135-184 kubenswrapper[2571]: 
I0420 20:25:01.887191 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kube-rbac-proxy" Apr 20 20:25:01.887410 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:01.887242 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kserve-container" Apr 20 20:25:01.887410 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:01.887252 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="91d4ea93-1078-4593-a434-000073a3242a" containerName="kube-rbac-proxy" Apr 20 20:25:01.890306 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:01.890290 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" Apr 20 20:25:01.893164 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:01.893137 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-7f229f\"" Apr 20 20:25:01.893164 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:01.893137 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-7f229f-predictor-serving-cert\"" Apr 20 20:25:01.893465 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:01.893193 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-7f229f-kube-rbac-proxy-sar-config\"" Apr 20 20:25:01.894123 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:01.894099 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 20 20:25:01.894218 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:01.894110 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-7f229f-dockercfg-r4hnq\"" Apr 20 20:25:01.900770 ip-10-0-135-184 
kubenswrapper[2571]: I0420 20:25:01.900743 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg"] Apr 20 20:25:01.921054 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:01.921019 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ndkq\" (UniqueName: \"kubernetes.io/projected/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-kube-api-access-6ndkq\") pod \"isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg\" (UID: \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\") " pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" Apr 20 20:25:01.921230 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:01.921066 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-proxy-tls\") pod \"isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg\" (UID: \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\") " pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" Apr 20 20:25:01.921230 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:01.921126 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-7f229f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-isvc-secondary-7f229f-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg\" (UID: \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\") " pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" Apr 20 20:25:01.921230 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:01.921186 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-kserve-provision-location\") pod 
\"isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg\" (UID: \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\") " pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" Apr 20 20:25:01.921395 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:01.921238 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-cabundle-cert\") pod \"isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg\" (UID: \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\") " pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" Apr 20 20:25:02.022037 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:02.021989 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-kserve-provision-location\") pod \"isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg\" (UID: \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\") " pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" Apr 20 20:25:02.022237 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:02.022049 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-cabundle-cert\") pod \"isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg\" (UID: \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\") " pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" Apr 20 20:25:02.022237 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:02.022082 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ndkq\" (UniqueName: \"kubernetes.io/projected/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-kube-api-access-6ndkq\") pod \"isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg\" (UID: \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\") " 
pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" Apr 20 20:25:02.022237 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:02.022103 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-proxy-tls\") pod \"isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg\" (UID: \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\") " pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" Apr 20 20:25:02.022237 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:02.022136 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-7f229f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-isvc-secondary-7f229f-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg\" (UID: \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\") " pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" Apr 20 20:25:02.022547 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:02.022521 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-kserve-provision-location\") pod \"isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg\" (UID: \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\") " pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" Apr 20 20:25:02.022744 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:02.022723 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-cabundle-cert\") pod \"isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg\" (UID: \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\") " pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" Apr 20 20:25:02.022789 ip-10-0-135-184 
kubenswrapper[2571]: I0420 20:25:02.022767 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-7f229f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-isvc-secondary-7f229f-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg\" (UID: \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\") " pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" Apr 20 20:25:02.024742 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:02.024719 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-proxy-tls\") pod \"isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg\" (UID: \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\") " pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" Apr 20 20:25:02.030667 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:02.030638 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ndkq\" (UniqueName: \"kubernetes.io/projected/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-kube-api-access-6ndkq\") pod \"isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg\" (UID: \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\") " pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" Apr 20 20:25:02.203149 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:02.203050 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" Apr 20 20:25:02.327857 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:02.327830 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg"] Apr 20 20:25:02.330308 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:25:02.330279 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f731566_8fe5_4ec1_8d1d_e8fa572f4efc.slice/crio-07dca9914209721520804517642a533d8964fa74a4120081b194e4b5340884a0 WatchSource:0}: Error finding container 07dca9914209721520804517642a533d8964fa74a4120081b194e4b5340884a0: Status 404 returned error can't find the container with id 07dca9914209721520804517642a533d8964fa74a4120081b194e4b5340884a0 Apr 20 20:25:02.506314 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:02.506280 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" event={"ID":"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc","Type":"ContainerStarted","Data":"9fd4f1571aec553cc02e36977e57aea2bc8d75048e32e08f36d49bebba0ee1d2"} Apr 20 20:25:02.506535 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:02.506321 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" event={"ID":"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc","Type":"ContainerStarted","Data":"07dca9914209721520804517642a533d8964fa74a4120081b194e4b5340884a0"} Apr 20 20:25:05.518335 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:05.518301 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg_6f731566-8fe5-4ec1-8d1d-e8fa572f4efc/storage-initializer/0.log" Apr 20 20:25:05.518848 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:05.518347 2571 generic.go:358] "Generic (PLEG): 
container finished" podID="6f731566-8fe5-4ec1-8d1d-e8fa572f4efc" containerID="9fd4f1571aec553cc02e36977e57aea2bc8d75048e32e08f36d49bebba0ee1d2" exitCode=1 Apr 20 20:25:05.518848 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:05.518391 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" event={"ID":"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc","Type":"ContainerDied","Data":"9fd4f1571aec553cc02e36977e57aea2bc8d75048e32e08f36d49bebba0ee1d2"} Apr 20 20:25:06.524203 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:06.524176 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg_6f731566-8fe5-4ec1-8d1d-e8fa572f4efc/storage-initializer/0.log" Apr 20 20:25:06.524614 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:06.524247 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" event={"ID":"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc","Type":"ContainerStarted","Data":"c3938d3221ccd39323a4f2985f0568058817ee3eba06387158236687148e3641"} Apr 20 20:25:09.542530 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:09.542498 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg_6f731566-8fe5-4ec1-8d1d-e8fa572f4efc/storage-initializer/1.log" Apr 20 20:25:09.542962 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:09.542821 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg_6f731566-8fe5-4ec1-8d1d-e8fa572f4efc/storage-initializer/0.log" Apr 20 20:25:09.542962 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:09.542851 2571 generic.go:358] "Generic (PLEG): container finished" podID="6f731566-8fe5-4ec1-8d1d-e8fa572f4efc" containerID="c3938d3221ccd39323a4f2985f0568058817ee3eba06387158236687148e3641" exitCode=1 Apr 20 
20:25:09.542962 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:09.542938 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" event={"ID":"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc","Type":"ContainerDied","Data":"c3938d3221ccd39323a4f2985f0568058817ee3eba06387158236687148e3641"} Apr 20 20:25:09.543124 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:09.542994 2571 scope.go:117] "RemoveContainer" containerID="9fd4f1571aec553cc02e36977e57aea2bc8d75048e32e08f36d49bebba0ee1d2" Apr 20 20:25:09.543484 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:09.543465 2571 scope.go:117] "RemoveContainer" containerID="9fd4f1571aec553cc02e36977e57aea2bc8d75048e32e08f36d49bebba0ee1d2" Apr 20 20:25:09.553547 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:25:09.553520 2571 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg_kserve-ci-e2e-test_6f731566-8fe5-4ec1-8d1d-e8fa572f4efc_0 in pod sandbox 07dca9914209721520804517642a533d8964fa74a4120081b194e4b5340884a0 from index: no such id: '9fd4f1571aec553cc02e36977e57aea2bc8d75048e32e08f36d49bebba0ee1d2'" containerID="9fd4f1571aec553cc02e36977e57aea2bc8d75048e32e08f36d49bebba0ee1d2" Apr 20 20:25:09.553612 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:25:09.553566 2571 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg_kserve-ci-e2e-test_6f731566-8fe5-4ec1-8d1d-e8fa572f4efc_0 in pod sandbox 07dca9914209721520804517642a533d8964fa74a4120081b194e4b5340884a0 from index: no such id: '9fd4f1571aec553cc02e36977e57aea2bc8d75048e32e08f36d49bebba0ee1d2'; Skipping pod 
\"isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg_kserve-ci-e2e-test(6f731566-8fe5-4ec1-8d1d-e8fa572f4efc)\"" logger="UnhandledError" Apr 20 20:25:09.554911 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:25:09.554890 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg_kserve-ci-e2e-test(6f731566-8fe5-4ec1-8d1d-e8fa572f4efc)\"" pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" podUID="6f731566-8fe5-4ec1-8d1d-e8fa572f4efc" Apr 20 20:25:10.547738 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:10.547709 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg_6f731566-8fe5-4ec1-8d1d-e8fa572f4efc/storage-initializer/1.log" Apr 20 20:25:15.939456 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:15.939404 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg"] Apr 20 20:25:15.981748 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:15.981708 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj"] Apr 20 20:25:15.982169 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:15.982129 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" podUID="a7af65ce-7427-4518-b4d1-39db352355fb" containerName="kserve-container" containerID="cri-o://091cf9d25c8d86f366561c2cb15b396176e4f0341a3c7a3c881db6719b28a6ce" gracePeriod=30 Apr 20 20:25:15.982290 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:15.982169 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" 
podUID="a7af65ce-7427-4518-b4d1-39db352355fb" containerName="kube-rbac-proxy" containerID="cri-o://6c0a0c293277028a67536d54f114f5f5df4985aac98f6ecd5d26f7f9988ad0fc" gracePeriod=30 Apr 20 20:25:16.071895 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.071860 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt"] Apr 20 20:25:16.076144 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.076120 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt" Apr 20 20:25:16.078779 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.078756 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-a4d11b-predictor-serving-cert\"" Apr 20 20:25:16.078887 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.078866 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-a4d11b-kube-rbac-proxy-sar-config\"" Apr 20 20:25:16.078955 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.078908 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-a4d11b\"" Apr 20 20:25:16.078955 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.078929 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-a4d11b-dockercfg-2n2l7\"" Apr 20 20:25:16.086508 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.086483 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt"] Apr 20 20:25:16.102615 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.102594 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg_6f731566-8fe5-4ec1-8d1d-e8fa572f4efc/storage-initializer/1.log" Apr 20 20:25:16.102735 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.102656 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" Apr 20 20:25:16.142008 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.141979 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-kserve-provision-location\") pod \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\" (UID: \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\") " Apr 20 20:25:16.142156 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.142049 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-cabundle-cert\") pod \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\" (UID: \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\") " Apr 20 20:25:16.142156 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.142068 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ndkq\" (UniqueName: \"kubernetes.io/projected/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-kube-api-access-6ndkq\") pod \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\" (UID: \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\") " Apr 20 20:25:16.142156 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.142085 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-7f229f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-isvc-secondary-7f229f-kube-rbac-proxy-sar-config\") pod \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\" (UID: \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\") " Apr 20 
20:25:16.142156 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.142138 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-proxy-tls\") pod \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\" (UID: \"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc\") " Apr 20 20:25:16.142387 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.142252 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-292tw\" (UniqueName: \"kubernetes.io/projected/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-kube-api-access-292tw\") pod \"isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt\" (UID: \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\") " pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt" Apr 20 20:25:16.142387 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.142263 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6f731566-8fe5-4ec1-8d1d-e8fa572f4efc" (UID: "6f731566-8fe5-4ec1-8d1d-e8fa572f4efc"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:25:16.142387 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.142379 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-a4d11b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-isvc-init-fail-a4d11b-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt\" (UID: \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\") " pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt" Apr 20 20:25:16.142578 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.142404 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-cabundle-cert\") pod \"isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt\" (UID: \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\") " pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt" Apr 20 20:25:16.142578 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.142469 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-kserve-provision-location\") pod \"isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt\" (UID: \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\") " pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt" Apr 20 20:25:16.142578 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.142490 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "6f731566-8fe5-4ec1-8d1d-e8fa572f4efc" (UID: "6f731566-8fe5-4ec1-8d1d-e8fa572f4efc"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:25:16.142578 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.142503 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-proxy-tls\") pod \"isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt\" (UID: \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\") " pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt" Apr 20 20:25:16.142578 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.142524 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-isvc-secondary-7f229f-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-7f229f-kube-rbac-proxy-sar-config") pod "6f731566-8fe5-4ec1-8d1d-e8fa572f4efc" (UID: "6f731566-8fe5-4ec1-8d1d-e8fa572f4efc"). InnerVolumeSpecName "isvc-secondary-7f229f-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:25:16.142791 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.142583 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-kserve-provision-location\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:25:16.142791 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.142601 2571 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-cabundle-cert\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:25:16.144393 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.144368 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6f731566-8fe5-4ec1-8d1d-e8fa572f4efc" (UID: "6f731566-8fe5-4ec1-8d1d-e8fa572f4efc"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:25:16.144508 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.144438 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-kube-api-access-6ndkq" (OuterVolumeSpecName: "kube-api-access-6ndkq") pod "6f731566-8fe5-4ec1-8d1d-e8fa572f4efc" (UID: "6f731566-8fe5-4ec1-8d1d-e8fa572f4efc"). InnerVolumeSpecName "kube-api-access-6ndkq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:25:16.243551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.243466 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-292tw\" (UniqueName: \"kubernetes.io/projected/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-kube-api-access-292tw\") pod \"isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt\" (UID: \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\") " pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt" Apr 20 20:25:16.243551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.243529 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-a4d11b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-isvc-init-fail-a4d11b-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt\" (UID: \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\") " pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt" Apr 20 20:25:16.243551 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.243549 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-cabundle-cert\") pod \"isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt\" (UID: \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\") " pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt" Apr 20 20:25:16.243850 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.243575 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-kserve-provision-location\") pod \"isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt\" (UID: \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\") " pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt" Apr 20 
20:25:16.243850 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.243599 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-proxy-tls\") pod \"isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt\" (UID: \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\") " pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt" Apr 20 20:25:16.243850 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.243645 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6ndkq\" (UniqueName: \"kubernetes.io/projected/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-kube-api-access-6ndkq\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:25:16.243850 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.243662 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-7f229f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-isvc-secondary-7f229f-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:25:16.243850 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.243675 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc-proxy-tls\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:25:16.244131 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.244021 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-kserve-provision-location\") pod \"isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt\" (UID: \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\") " pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt" Apr 20 20:25:16.244305 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.244280 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-a4d11b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-isvc-init-fail-a4d11b-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt\" (UID: \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\") " pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt" Apr 20 20:25:16.244391 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.244368 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-cabundle-cert\") pod \"isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt\" (UID: \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\") " pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt" Apr 20 20:25:16.246139 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.246122 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-proxy-tls\") pod \"isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt\" (UID: \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\") " pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt" Apr 20 20:25:16.251149 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.251126 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-292tw\" (UniqueName: \"kubernetes.io/projected/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-kube-api-access-292tw\") pod \"isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt\" (UID: \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\") " pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt" Apr 20 20:25:16.387704 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.387667 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt" Apr 20 20:25:16.512122 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.512095 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt"] Apr 20 20:25:16.514515 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:25:16.514486 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfda7d385_7ff2_4f30_91cf_1a1c678a0f25.slice/crio-82fd209ec73d87dab80ec746febe0f0675516bfcd15aaa153fb5779fa85b84cb WatchSource:0}: Error finding container 82fd209ec73d87dab80ec746febe0f0675516bfcd15aaa153fb5779fa85b84cb: Status 404 returned error can't find the container with id 82fd209ec73d87dab80ec746febe0f0675516bfcd15aaa153fb5779fa85b84cb Apr 20 20:25:16.567347 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.567326 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg_6f731566-8fe5-4ec1-8d1d-e8fa572f4efc/storage-initializer/1.log" Apr 20 20:25:16.567489 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.567450 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" event={"ID":"6f731566-8fe5-4ec1-8d1d-e8fa572f4efc","Type":"ContainerDied","Data":"07dca9914209721520804517642a533d8964fa74a4120081b194e4b5340884a0"} Apr 20 20:25:16.567574 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.567495 2571 scope.go:117] "RemoveContainer" containerID="c3938d3221ccd39323a4f2985f0568058817ee3eba06387158236687148e3641" Apr 20 20:25:16.567574 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.567462 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg" Apr 20 20:25:16.569244 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.569011 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt" event={"ID":"fda7d385-7ff2-4f30-91cf-1a1c678a0f25","Type":"ContainerStarted","Data":"82fd209ec73d87dab80ec746febe0f0675516bfcd15aaa153fb5779fa85b84cb"} Apr 20 20:25:16.571222 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.571196 2571 generic.go:358] "Generic (PLEG): container finished" podID="a7af65ce-7427-4518-b4d1-39db352355fb" containerID="6c0a0c293277028a67536d54f114f5f5df4985aac98f6ecd5d26f7f9988ad0fc" exitCode=2 Apr 20 20:25:16.571328 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.571227 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" event={"ID":"a7af65ce-7427-4518-b4d1-39db352355fb","Type":"ContainerDied","Data":"6c0a0c293277028a67536d54f114f5f5df4985aac98f6ecd5d26f7f9988ad0fc"} Apr 20 20:25:16.604731 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.604700 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg"] Apr 20 20:25:16.609232 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:16.609205 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-7f229f-predictor-6cb8c497f5-bcglg"] Apr 20 20:25:17.528333 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:17.528297 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f731566-8fe5-4ec1-8d1d-e8fa572f4efc" path="/var/lib/kubelet/pods/6f731566-8fe5-4ec1-8d1d-e8fa572f4efc/volumes" Apr 20 20:25:17.576906 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:17.576870 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt" 
event={"ID":"fda7d385-7ff2-4f30-91cf-1a1c678a0f25","Type":"ContainerStarted","Data":"6f42ab70abb8090dcb16effbf5c34855b8bf691daa6c67584d9b81d5f0027de0"} Apr 20 20:25:19.264274 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:19.264228 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" podUID="a7af65ce-7427-4518-b4d1-39db352355fb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.40:8643/healthz\": dial tcp 10.133.0.40:8643: connect: connection refused" Apr 20 20:25:20.525623 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.525601 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" Apr 20 20:25:20.583376 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.583304 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rt2t\" (UniqueName: \"kubernetes.io/projected/a7af65ce-7427-4518-b4d1-39db352355fb-kube-api-access-4rt2t\") pod \"a7af65ce-7427-4518-b4d1-39db352355fb\" (UID: \"a7af65ce-7427-4518-b4d1-39db352355fb\") " Apr 20 20:25:20.583530 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.583405 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7af65ce-7427-4518-b4d1-39db352355fb-kserve-provision-location\") pod \"a7af65ce-7427-4518-b4d1-39db352355fb\" (UID: \"a7af65ce-7427-4518-b4d1-39db352355fb\") " Apr 20 20:25:20.583530 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.583448 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7af65ce-7427-4518-b4d1-39db352355fb-proxy-tls\") pod \"a7af65ce-7427-4518-b4d1-39db352355fb\" (UID: \"a7af65ce-7427-4518-b4d1-39db352355fb\") " Apr 20 20:25:20.583530 ip-10-0-135-184 
kubenswrapper[2571]: I0420 20:25:20.583487 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-7f229f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a7af65ce-7427-4518-b4d1-39db352355fb-isvc-primary-7f229f-kube-rbac-proxy-sar-config\") pod \"a7af65ce-7427-4518-b4d1-39db352355fb\" (UID: \"a7af65ce-7427-4518-b4d1-39db352355fb\") "
Apr 20 20:25:20.583764 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.583721 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7af65ce-7427-4518-b4d1-39db352355fb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a7af65ce-7427-4518-b4d1-39db352355fb" (UID: "a7af65ce-7427-4518-b4d1-39db352355fb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:25:20.583907 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.583879 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7af65ce-7427-4518-b4d1-39db352355fb-isvc-primary-7f229f-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-7f229f-kube-rbac-proxy-sar-config") pod "a7af65ce-7427-4518-b4d1-39db352355fb" (UID: "a7af65ce-7427-4518-b4d1-39db352355fb"). InnerVolumeSpecName "isvc-primary-7f229f-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:25:20.585745 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.585714 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7af65ce-7427-4518-b4d1-39db352355fb-kube-api-access-4rt2t" (OuterVolumeSpecName: "kube-api-access-4rt2t") pod "a7af65ce-7427-4518-b4d1-39db352355fb" (UID: "a7af65ce-7427-4518-b4d1-39db352355fb"). InnerVolumeSpecName "kube-api-access-4rt2t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:25:20.585874 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.585839 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7af65ce-7427-4518-b4d1-39db352355fb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a7af65ce-7427-4518-b4d1-39db352355fb" (UID: "a7af65ce-7427-4518-b4d1-39db352355fb"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:25:20.589316 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.589298 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt_fda7d385-7ff2-4f30-91cf-1a1c678a0f25/storage-initializer/0.log"
Apr 20 20:25:20.589470 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.589331 2571 generic.go:358] "Generic (PLEG): container finished" podID="fda7d385-7ff2-4f30-91cf-1a1c678a0f25" containerID="6f42ab70abb8090dcb16effbf5c34855b8bf691daa6c67584d9b81d5f0027de0" exitCode=1
Apr 20 20:25:20.589470 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.589411 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt" event={"ID":"fda7d385-7ff2-4f30-91cf-1a1c678a0f25","Type":"ContainerDied","Data":"6f42ab70abb8090dcb16effbf5c34855b8bf691daa6c67584d9b81d5f0027de0"}
Apr 20 20:25:20.591335 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.591309 2571 generic.go:358] "Generic (PLEG): container finished" podID="a7af65ce-7427-4518-b4d1-39db352355fb" containerID="091cf9d25c8d86f366561c2cb15b396176e4f0341a3c7a3c881db6719b28a6ce" exitCode=0
Apr 20 20:25:20.591482 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.591358 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" event={"ID":"a7af65ce-7427-4518-b4d1-39db352355fb","Type":"ContainerDied","Data":"091cf9d25c8d86f366561c2cb15b396176e4f0341a3c7a3c881db6719b28a6ce"}
Apr 20 20:25:20.591482 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.591381 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj"
Apr 20 20:25:20.591482 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.591408 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj" event={"ID":"a7af65ce-7427-4518-b4d1-39db352355fb","Type":"ContainerDied","Data":"053a5f3b793ef2565ae694b7f182726d19fcbfed1e9e27d22cd14368855a5514"}
Apr 20 20:25:20.591482 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.591454 2571 scope.go:117] "RemoveContainer" containerID="6c0a0c293277028a67536d54f114f5f5df4985aac98f6ecd5d26f7f9988ad0fc"
Apr 20 20:25:20.600478 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.600373 2571 scope.go:117] "RemoveContainer" containerID="091cf9d25c8d86f366561c2cb15b396176e4f0341a3c7a3c881db6719b28a6ce"
Apr 20 20:25:20.610412 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.610392 2571 scope.go:117] "RemoveContainer" containerID="aa104aac1d796bdc4188269414569535fbbcf6909427443ebf2db4fb699c51fc"
Apr 20 20:25:20.619408 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.619373 2571 scope.go:117] "RemoveContainer" containerID="6c0a0c293277028a67536d54f114f5f5df4985aac98f6ecd5d26f7f9988ad0fc"
Apr 20 20:25:20.619848 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:25:20.619822 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c0a0c293277028a67536d54f114f5f5df4985aac98f6ecd5d26f7f9988ad0fc\": container with ID starting with 6c0a0c293277028a67536d54f114f5f5df4985aac98f6ecd5d26f7f9988ad0fc not found: ID does not exist" containerID="6c0a0c293277028a67536d54f114f5f5df4985aac98f6ecd5d26f7f9988ad0fc"
Apr 20 20:25:20.619984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.619860 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c0a0c293277028a67536d54f114f5f5df4985aac98f6ecd5d26f7f9988ad0fc"} err="failed to get container status \"6c0a0c293277028a67536d54f114f5f5df4985aac98f6ecd5d26f7f9988ad0fc\": rpc error: code = NotFound desc = could not find container \"6c0a0c293277028a67536d54f114f5f5df4985aac98f6ecd5d26f7f9988ad0fc\": container with ID starting with 6c0a0c293277028a67536d54f114f5f5df4985aac98f6ecd5d26f7f9988ad0fc not found: ID does not exist"
Apr 20 20:25:20.619984 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.619884 2571 scope.go:117] "RemoveContainer" containerID="091cf9d25c8d86f366561c2cb15b396176e4f0341a3c7a3c881db6719b28a6ce"
Apr 20 20:25:20.620232 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:25:20.620204 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"091cf9d25c8d86f366561c2cb15b396176e4f0341a3c7a3c881db6719b28a6ce\": container with ID starting with 091cf9d25c8d86f366561c2cb15b396176e4f0341a3c7a3c881db6719b28a6ce not found: ID does not exist" containerID="091cf9d25c8d86f366561c2cb15b396176e4f0341a3c7a3c881db6719b28a6ce"
Apr 20 20:25:20.620333 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.620240 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"091cf9d25c8d86f366561c2cb15b396176e4f0341a3c7a3c881db6719b28a6ce"} err="failed to get container status \"091cf9d25c8d86f366561c2cb15b396176e4f0341a3c7a3c881db6719b28a6ce\": rpc error: code = NotFound desc = could not find container \"091cf9d25c8d86f366561c2cb15b396176e4f0341a3c7a3c881db6719b28a6ce\": container with ID starting with 091cf9d25c8d86f366561c2cb15b396176e4f0341a3c7a3c881db6719b28a6ce not found: ID does not exist"
Apr 20 20:25:20.620333 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.620264 2571 scope.go:117] "RemoveContainer" containerID="aa104aac1d796bdc4188269414569535fbbcf6909427443ebf2db4fb699c51fc"
Apr 20 20:25:20.620677 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:25:20.620656 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa104aac1d796bdc4188269414569535fbbcf6909427443ebf2db4fb699c51fc\": container with ID starting with aa104aac1d796bdc4188269414569535fbbcf6909427443ebf2db4fb699c51fc not found: ID does not exist" containerID="aa104aac1d796bdc4188269414569535fbbcf6909427443ebf2db4fb699c51fc"
Apr 20 20:25:20.620749 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.620686 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa104aac1d796bdc4188269414569535fbbcf6909427443ebf2db4fb699c51fc"} err="failed to get container status \"aa104aac1d796bdc4188269414569535fbbcf6909427443ebf2db4fb699c51fc\": rpc error: code = NotFound desc = could not find container \"aa104aac1d796bdc4188269414569535fbbcf6909427443ebf2db4fb699c51fc\": container with ID starting with aa104aac1d796bdc4188269414569535fbbcf6909427443ebf2db4fb699c51fc not found: ID does not exist"
Apr 20 20:25:20.621843 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.621800 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj"]
Apr 20 20:25:20.624082 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.624059 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-7f229f-predictor-7bd7856b4d-q5sgj"]
Apr 20 20:25:20.684332 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.684300 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7af65ce-7427-4518-b4d1-39db352355fb-kserve-provision-location\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:25:20.684332 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.684335 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7af65ce-7427-4518-b4d1-39db352355fb-proxy-tls\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:25:20.684613 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.684353 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-7f229f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a7af65ce-7427-4518-b4d1-39db352355fb-isvc-primary-7f229f-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:25:20.684613 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:20.684368 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4rt2t\" (UniqueName: \"kubernetes.io/projected/a7af65ce-7427-4518-b4d1-39db352355fb-kube-api-access-4rt2t\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:25:21.054139 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.054044 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt"]
Apr 20 20:25:21.189814 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.189780 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb"]
Apr 20 20:25:21.190232 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.190211 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f731566-8fe5-4ec1-8d1d-e8fa572f4efc" containerName="storage-initializer"
Apr 20 20:25:21.190349 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.190235 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f731566-8fe5-4ec1-8d1d-e8fa572f4efc" containerName="storage-initializer"
Apr 20 20:25:21.190349 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.190247 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7af65ce-7427-4518-b4d1-39db352355fb" containerName="kube-rbac-proxy"
Apr 20 20:25:21.190349 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.190254 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7af65ce-7427-4518-b4d1-39db352355fb" containerName="kube-rbac-proxy"
Apr 20 20:25:21.190349 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.190276 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7af65ce-7427-4518-b4d1-39db352355fb" containerName="kserve-container"
Apr 20 20:25:21.190349 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.190285 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7af65ce-7427-4518-b4d1-39db352355fb" containerName="kserve-container"
Apr 20 20:25:21.190349 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.190296 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7af65ce-7427-4518-b4d1-39db352355fb" containerName="storage-initializer"
Apr 20 20:25:21.190349 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.190304 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7af65ce-7427-4518-b4d1-39db352355fb" containerName="storage-initializer"
Apr 20 20:25:21.190787 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.190386 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7af65ce-7427-4518-b4d1-39db352355fb" containerName="kube-rbac-proxy"
Apr 20 20:25:21.190787 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.190400 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f731566-8fe5-4ec1-8d1d-e8fa572f4efc" containerName="storage-initializer"
Apr 20 20:25:21.190787 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.190410 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7af65ce-7427-4518-b4d1-39db352355fb" containerName="kserve-container"
Apr 20 20:25:21.190787 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.190556 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f731566-8fe5-4ec1-8d1d-e8fa572f4efc" containerName="storage-initializer"
Apr 20 20:25:21.190787 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.190567 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f731566-8fe5-4ec1-8d1d-e8fa572f4efc" containerName="storage-initializer"
Apr 20 20:25:21.190787 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.190648 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f731566-8fe5-4ec1-8d1d-e8fa572f4efc" containerName="storage-initializer"
Apr 20 20:25:21.193978 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.193956 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb"
Apr 20 20:25:21.196406 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.196387 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-8db3a-predictor-serving-cert\""
Apr 20 20:25:21.196560 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.196498 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-8db3a-kube-rbac-proxy-sar-config\""
Apr 20 20:25:21.196560 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.196504 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lk7n4\""
Apr 20 20:25:21.205072 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.205047 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb"]
Apr 20 20:25:21.289212 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.289176 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed-kserve-provision-location\") pod \"raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb\" (UID: \"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed\") " pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb"
Apr 20 20:25:21.289416 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.289236 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-8db3a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed-raw-sklearn-8db3a-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb\" (UID: \"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed\") " pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb"
Apr 20 20:25:21.289416 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.289334 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9279x\" (UniqueName: \"kubernetes.io/projected/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed-kube-api-access-9279x\") pod \"raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb\" (UID: \"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed\") " pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb"
Apr 20 20:25:21.289416 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.289382 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed-proxy-tls\") pod \"raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb\" (UID: \"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed\") " pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb"
Apr 20 20:25:21.390154 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.390067 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed-kserve-provision-location\") pod \"raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb\" (UID: \"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed\") " pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb"
Apr 20 20:25:21.390154 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.390138 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-8db3a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed-raw-sklearn-8db3a-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb\" (UID: \"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed\") " pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb"
Apr 20 20:25:21.390363 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.390189 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9279x\" (UniqueName: \"kubernetes.io/projected/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed-kube-api-access-9279x\") pod \"raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb\" (UID: \"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed\") " pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb"
Apr 20 20:25:21.390363 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.390229 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed-proxy-tls\") pod \"raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb\" (UID: \"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed\") " pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb"
Apr 20 20:25:21.390571 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.390548 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed-kserve-provision-location\") pod \"raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb\" (UID: \"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed\") " pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb"
Apr 20 20:25:21.390825 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.390800 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-8db3a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed-raw-sklearn-8db3a-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb\" (UID: \"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed\") " pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb"
Apr 20 20:25:21.392838 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.392817 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed-proxy-tls\") pod \"raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb\" (UID: \"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed\") " pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb"
Apr 20 20:25:21.398811 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.398787 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9279x\" (UniqueName: \"kubernetes.io/projected/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed-kube-api-access-9279x\") pod \"raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb\" (UID: \"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed\") " pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb"
Apr 20 20:25:21.505808 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.505769 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb"
Apr 20 20:25:21.528544 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.528512 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7af65ce-7427-4518-b4d1-39db352355fb" path="/var/lib/kubelet/pods/a7af65ce-7427-4518-b4d1-39db352355fb/volumes"
Apr 20 20:25:21.596897 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.596870 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt_fda7d385-7ff2-4f30-91cf-1a1c678a0f25/storage-initializer/0.log"
Apr 20 20:25:21.597085 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.597007 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt" event={"ID":"fda7d385-7ff2-4f30-91cf-1a1c678a0f25","Type":"ContainerStarted","Data":"8b3f186884467fd8b55a6b610e52906f2a4688bc40ca1851696baadd04ee1cf1"}
Apr 20 20:25:21.597158 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.597115 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt" podUID="fda7d385-7ff2-4f30-91cf-1a1c678a0f25" containerName="storage-initializer" containerID="cri-o://8b3f186884467fd8b55a6b610e52906f2a4688bc40ca1851696baadd04ee1cf1" gracePeriod=30
Apr 20 20:25:21.634053 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:21.634028 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb"]
Apr 20 20:25:21.635957 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:25:21.635930 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b6adc77_1f4e_4265_8bc3_a61a9cabcfed.slice/crio-d408ef04b1b9ee4b638b6ba38881c6e3e8626fa1a7d384ddfa9ef797c1b9051b WatchSource:0}: Error finding container d408ef04b1b9ee4b638b6ba38881c6e3e8626fa1a7d384ddfa9ef797c1b9051b: Status 404 returned error can't find the container with id d408ef04b1b9ee4b638b6ba38881c6e3e8626fa1a7d384ddfa9ef797c1b9051b
Apr 20 20:25:22.603472 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:22.603417 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb" event={"ID":"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed","Type":"ContainerStarted","Data":"eb941bb494d62f2b06322cb1a86f9f4e3a07c1641b62312faef21f5e45ceaec5"}
Apr 20 20:25:22.603472 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:22.603472 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb" event={"ID":"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed","Type":"ContainerStarted","Data":"d408ef04b1b9ee4b638b6ba38881c6e3e8626fa1a7d384ddfa9ef797c1b9051b"}
Apr 20 20:25:24.149309 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.149286 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt_fda7d385-7ff2-4f30-91cf-1a1c678a0f25/storage-initializer/1.log"
Apr 20 20:25:24.149658 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.149641 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt_fda7d385-7ff2-4f30-91cf-1a1c678a0f25/storage-initializer/0.log"
Apr 20 20:25:24.149723 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.149710 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt"
Apr 20 20:25:24.215103 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.215014 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-proxy-tls\") pod \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\" (UID: \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\") "
Apr 20 20:25:24.215103 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.215054 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-cabundle-cert\") pod \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\" (UID: \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\") "
Apr 20 20:25:24.215103 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.215083 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-292tw\" (UniqueName: \"kubernetes.io/projected/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-kube-api-access-292tw\") pod \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\" (UID: \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\") "
Apr 20 20:25:24.215372 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.215108 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-kserve-provision-location\") pod \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\" (UID: \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\") "
Apr 20 20:25:24.215372 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.215182 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-a4d11b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-isvc-init-fail-a4d11b-kube-rbac-proxy-sar-config\") pod \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\" (UID: \"fda7d385-7ff2-4f30-91cf-1a1c678a0f25\") "
Apr 20 20:25:24.215505 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.215467 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fda7d385-7ff2-4f30-91cf-1a1c678a0f25" (UID: "fda7d385-7ff2-4f30-91cf-1a1c678a0f25"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:25:24.215558 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.215525 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "fda7d385-7ff2-4f30-91cf-1a1c678a0f25" (UID: "fda7d385-7ff2-4f30-91cf-1a1c678a0f25"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:25:24.215642 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.215615 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-isvc-init-fail-a4d11b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-a4d11b-kube-rbac-proxy-sar-config") pod "fda7d385-7ff2-4f30-91cf-1a1c678a0f25" (UID: "fda7d385-7ff2-4f30-91cf-1a1c678a0f25"). InnerVolumeSpecName "isvc-init-fail-a4d11b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:25:24.217318 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.217297 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda7d385-7ff2-4f30-91cf-1a1c678a0f25" (UID: "fda7d385-7ff2-4f30-91cf-1a1c678a0f25"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:25:24.217406 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.217373 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-kube-api-access-292tw" (OuterVolumeSpecName: "kube-api-access-292tw") pod "fda7d385-7ff2-4f30-91cf-1a1c678a0f25" (UID: "fda7d385-7ff2-4f30-91cf-1a1c678a0f25"). InnerVolumeSpecName "kube-api-access-292tw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:25:24.315882 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.315850 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-proxy-tls\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:25:24.315882 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.315876 2571 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-cabundle-cert\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:25:24.315882 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.315885 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-292tw\" (UniqueName: \"kubernetes.io/projected/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-kube-api-access-292tw\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:25:24.316142 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.315895 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-kserve-provision-location\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:25:24.316142 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.315905 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-a4d11b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fda7d385-7ff2-4f30-91cf-1a1c678a0f25-isvc-init-fail-a4d11b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\""
Apr 20 20:25:24.610246 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.610216 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt_fda7d385-7ff2-4f30-91cf-1a1c678a0f25/storage-initializer/1.log"
Apr 20 20:25:24.610615 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.610600 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt_fda7d385-7ff2-4f30-91cf-1a1c678a0f25/storage-initializer/0.log"
Apr 20 20:25:24.610681 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.610637 2571 generic.go:358] "Generic (PLEG): container finished" podID="fda7d385-7ff2-4f30-91cf-1a1c678a0f25" containerID="8b3f186884467fd8b55a6b610e52906f2a4688bc40ca1851696baadd04ee1cf1" exitCode=1
Apr 20 20:25:24.610753 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.610677 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt" event={"ID":"fda7d385-7ff2-4f30-91cf-1a1c678a0f25","Type":"ContainerDied","Data":"8b3f186884467fd8b55a6b610e52906f2a4688bc40ca1851696baadd04ee1cf1"}
Apr 20 20:25:24.610753 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.610709 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt" event={"ID":"fda7d385-7ff2-4f30-91cf-1a1c678a0f25","Type":"ContainerDied","Data":"82fd209ec73d87dab80ec746febe0f0675516bfcd15aaa153fb5779fa85b84cb"}
Apr 20 20:25:24.610753 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.610728 2571 scope.go:117] "RemoveContainer" containerID="8b3f186884467fd8b55a6b610e52906f2a4688bc40ca1851696baadd04ee1cf1"
Apr 20 20:25:24.610850 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.610726 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt"
Apr 20 20:25:24.620033 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.619967 2571 scope.go:117] "RemoveContainer" containerID="6f42ab70abb8090dcb16effbf5c34855b8bf691daa6c67584d9b81d5f0027de0"
Apr 20 20:25:24.627858 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.627840 2571 scope.go:117] "RemoveContainer" containerID="8b3f186884467fd8b55a6b610e52906f2a4688bc40ca1851696baadd04ee1cf1"
Apr 20 20:25:24.628113 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:25:24.628094 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b3f186884467fd8b55a6b610e52906f2a4688bc40ca1851696baadd04ee1cf1\": container with ID starting with 8b3f186884467fd8b55a6b610e52906f2a4688bc40ca1851696baadd04ee1cf1 not found: ID does not exist" containerID="8b3f186884467fd8b55a6b610e52906f2a4688bc40ca1851696baadd04ee1cf1"
Apr 20 20:25:24.628157 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.628123 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b3f186884467fd8b55a6b610e52906f2a4688bc40ca1851696baadd04ee1cf1"} err="failed to get container status \"8b3f186884467fd8b55a6b610e52906f2a4688bc40ca1851696baadd04ee1cf1\": rpc error: code = NotFound desc = could not find container \"8b3f186884467fd8b55a6b610e52906f2a4688bc40ca1851696baadd04ee1cf1\": container with ID starting with 8b3f186884467fd8b55a6b610e52906f2a4688bc40ca1851696baadd04ee1cf1 not found: ID does not exist"
Apr 20 20:25:24.628157 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.628142 2571 scope.go:117] "RemoveContainer" containerID="6f42ab70abb8090dcb16effbf5c34855b8bf691daa6c67584d9b81d5f0027de0"
Apr 20 20:25:24.628379 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:25:24.628360 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f42ab70abb8090dcb16effbf5c34855b8bf691daa6c67584d9b81d5f0027de0\": container with ID starting with 6f42ab70abb8090dcb16effbf5c34855b8bf691daa6c67584d9b81d5f0027de0 not found: ID does not exist" containerID="6f42ab70abb8090dcb16effbf5c34855b8bf691daa6c67584d9b81d5f0027de0"
Apr 20 20:25:24.628465 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.628384 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f42ab70abb8090dcb16effbf5c34855b8bf691daa6c67584d9b81d5f0027de0"} err="failed to get container status \"6f42ab70abb8090dcb16effbf5c34855b8bf691daa6c67584d9b81d5f0027de0\": rpc error: code = NotFound desc = could not find container \"6f42ab70abb8090dcb16effbf5c34855b8bf691daa6c67584d9b81d5f0027de0\": container with ID starting with 6f42ab70abb8090dcb16effbf5c34855b8bf691daa6c67584d9b81d5f0027de0 not found: ID does not exist"
Apr 20 20:25:24.651139 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.651107 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt"]
Apr 20 20:25:24.657988 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:24.657964 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-a4d11b-predictor-67b5fcc647-bv5dt"]
Apr 20 20:25:25.528236 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:25.528203 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda7d385-7ff2-4f30-91cf-1a1c678a0f25" path="/var/lib/kubelet/pods/fda7d385-7ff2-4f30-91cf-1a1c678a0f25/volumes"
Apr 20 20:25:26.619408 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:26.619376 2571 generic.go:358] "Generic (PLEG): container finished" podID="2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" containerID="eb941bb494d62f2b06322cb1a86f9f4e3a07c1641b62312faef21f5e45ceaec5" exitCode=0
Apr 20 20:25:26.619878 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:26.619452 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb" event={"ID":"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed","Type":"ContainerDied","Data":"eb941bb494d62f2b06322cb1a86f9f4e3a07c1641b62312faef21f5e45ceaec5"}
Apr 20 20:25:27.624474 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:27.624415 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb" event={"ID":"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed","Type":"ContainerStarted","Data":"97651bbf6a70eba4126a72a4381680602fdf53799712b18c1e68cf9c088d5b8b"}
Apr 20 20:25:27.624894 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:27.624482 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb" event={"ID":"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed","Type":"ContainerStarted","Data":"ab1482fe9cb9a7dd86a61c40ea3937efb8433be2bb8533f664a4927b69762537"}
Apr 20 20:25:27.624894 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:27.624813 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb"
Apr 20 20:25:27.624994 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:27.624955 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb"
Apr 20 20:25:27.626462 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:27.626415 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb" podUID="2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 20 20:25:27.644026 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:27.643964 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb" podStartSLOduration=6.643951257 podStartE2EDuration="6.643951257s" podCreationTimestamp="2026-04-20 20:25:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:25:27.641755382 +0000 UTC m=+1168.681244938" watchObservedRunningTime="2026-04-20 20:25:27.643951257 +0000 UTC m=+1168.683440799"
Apr 20 20:25:28.628071 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:28.628029 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb" podUID="2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 20 20:25:33.633382 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:33.633352 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb"
Apr 20 20:25:33.633883 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:33.633858 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb" podUID="2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 20 20:25:43.633870 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:43.633823 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb" podUID="2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 20 20:25:53.633941 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:53.633899 2571 prober.go:120] "Probe failed" probeType="Readiness"
pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb" podUID="2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 20 20:25:59.489966 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:59.489928 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kldp5_997e9539-5288-4af5-92f4-55d8ccefbbf7/ovn-acl-logging/0.log" Apr 20 20:25:59.492399 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:25:59.492377 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kldp5_997e9539-5288-4af5-92f4-55d8ccefbbf7/ovn-acl-logging/0.log" Apr 20 20:26:03.634470 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:03.634412 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb" podUID="2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 20 20:26:13.634379 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:13.634338 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb" podUID="2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 20 20:26:23.634565 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:23.634523 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb" podUID="2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 20 20:26:33.634589 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:33.634556 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb" Apr 20 20:26:41.314993 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.314956 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb"] Apr 20 20:26:41.315540 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.315508 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb" podUID="2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" containerName="kserve-container" containerID="cri-o://ab1482fe9cb9a7dd86a61c40ea3937efb8433be2bb8533f664a4927b69762537" gracePeriod=30 Apr 20 20:26:41.315660 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.315628 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb" podUID="2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" containerName="kube-rbac-proxy" containerID="cri-o://97651bbf6a70eba4126a72a4381680602fdf53799712b18c1e68cf9c088d5b8b" gracePeriod=30 Apr 20 20:26:41.384765 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.384728 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54"] Apr 20 20:26:41.385071 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.385060 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fda7d385-7ff2-4f30-91cf-1a1c678a0f25" containerName="storage-initializer" Apr 20 20:26:41.385116 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.385073 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="fda7d385-7ff2-4f30-91cf-1a1c678a0f25" containerName="storage-initializer" Apr 20 20:26:41.385116 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.385099 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fda7d385-7ff2-4f30-91cf-1a1c678a0f25" 
containerName="storage-initializer" Apr 20 20:26:41.385116 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.385105 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="fda7d385-7ff2-4f30-91cf-1a1c678a0f25" containerName="storage-initializer" Apr 20 20:26:41.385247 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.385232 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="fda7d385-7ff2-4f30-91cf-1a1c678a0f25" containerName="storage-initializer" Apr 20 20:26:41.385286 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.385254 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="fda7d385-7ff2-4f30-91cf-1a1c678a0f25" containerName="storage-initializer" Apr 20 20:26:41.387523 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.387502 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" Apr 20 20:26:41.389884 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.389857 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-146a8-predictor-serving-cert\"" Apr 20 20:26:41.390009 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.389909 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-146a8-kube-rbac-proxy-sar-config\"" Apr 20 20:26:41.398485 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.398463 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54"] Apr 20 20:26:41.562261 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.562230 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b84ff881-c782-4982-9d8a-9af8c5ed94b5-kserve-provision-location\") pod 
\"raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54\" (UID: \"b84ff881-c782-4982-9d8a-9af8c5ed94b5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" Apr 20 20:26:41.562472 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.562281 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-runtime-146a8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b84ff881-c782-4982-9d8a-9af8c5ed94b5-raw-sklearn-runtime-146a8-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54\" (UID: \"b84ff881-c782-4982-9d8a-9af8c5ed94b5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" Apr 20 20:26:41.562472 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.562339 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wngs\" (UniqueName: \"kubernetes.io/projected/b84ff881-c782-4982-9d8a-9af8c5ed94b5-kube-api-access-6wngs\") pod \"raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54\" (UID: \"b84ff881-c782-4982-9d8a-9af8c5ed94b5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" Apr 20 20:26:41.562472 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.562370 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b84ff881-c782-4982-9d8a-9af8c5ed94b5-proxy-tls\") pod \"raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54\" (UID: \"b84ff881-c782-4982-9d8a-9af8c5ed94b5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" Apr 20 20:26:41.663714 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.663627 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b84ff881-c782-4982-9d8a-9af8c5ed94b5-proxy-tls\") pod 
\"raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54\" (UID: \"b84ff881-c782-4982-9d8a-9af8c5ed94b5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" Apr 20 20:26:41.663714 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.663687 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b84ff881-c782-4982-9d8a-9af8c5ed94b5-kserve-provision-location\") pod \"raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54\" (UID: \"b84ff881-c782-4982-9d8a-9af8c5ed94b5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" Apr 20 20:26:41.663943 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.663722 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-runtime-146a8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b84ff881-c782-4982-9d8a-9af8c5ed94b5-raw-sklearn-runtime-146a8-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54\" (UID: \"b84ff881-c782-4982-9d8a-9af8c5ed94b5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" Apr 20 20:26:41.663943 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:26:41.663779 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-serving-cert: secret "raw-sklearn-runtime-146a8-predictor-serving-cert" not found Apr 20 20:26:41.663943 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.663803 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wngs\" (UniqueName: \"kubernetes.io/projected/b84ff881-c782-4982-9d8a-9af8c5ed94b5-kube-api-access-6wngs\") pod \"raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54\" (UID: \"b84ff881-c782-4982-9d8a-9af8c5ed94b5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" Apr 20 20:26:41.663943 ip-10-0-135-184 
kubenswrapper[2571]: E0420 20:26:41.663853 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b84ff881-c782-4982-9d8a-9af8c5ed94b5-proxy-tls podName:b84ff881-c782-4982-9d8a-9af8c5ed94b5 nodeName:}" failed. No retries permitted until 2026-04-20 20:26:42.163831466 +0000 UTC m=+1243.203321006 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b84ff881-c782-4982-9d8a-9af8c5ed94b5-proxy-tls") pod "raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" (UID: "b84ff881-c782-4982-9d8a-9af8c5ed94b5") : secret "raw-sklearn-runtime-146a8-predictor-serving-cert" not found Apr 20 20:26:41.664156 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.664073 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b84ff881-c782-4982-9d8a-9af8c5ed94b5-kserve-provision-location\") pod \"raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54\" (UID: \"b84ff881-c782-4982-9d8a-9af8c5ed94b5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" Apr 20 20:26:41.664362 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.664340 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-runtime-146a8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b84ff881-c782-4982-9d8a-9af8c5ed94b5-raw-sklearn-runtime-146a8-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54\" (UID: \"b84ff881-c782-4982-9d8a-9af8c5ed94b5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" Apr 20 20:26:41.674580 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.674550 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wngs\" (UniqueName: \"kubernetes.io/projected/b84ff881-c782-4982-9d8a-9af8c5ed94b5-kube-api-access-6wngs\") pod 
\"raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54\" (UID: \"b84ff881-c782-4982-9d8a-9af8c5ed94b5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" Apr 20 20:26:41.878909 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.878873 2571 generic.go:358] "Generic (PLEG): container finished" podID="2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" containerID="97651bbf6a70eba4126a72a4381680602fdf53799712b18c1e68cf9c088d5b8b" exitCode=2 Apr 20 20:26:41.879078 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:41.878951 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb" event={"ID":"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed","Type":"ContainerDied","Data":"97651bbf6a70eba4126a72a4381680602fdf53799712b18c1e68cf9c088d5b8b"} Apr 20 20:26:42.168284 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:42.168252 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b84ff881-c782-4982-9d8a-9af8c5ed94b5-proxy-tls\") pod \"raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54\" (UID: \"b84ff881-c782-4982-9d8a-9af8c5ed94b5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" Apr 20 20:26:42.170887 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:42.170851 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b84ff881-c782-4982-9d8a-9af8c5ed94b5-proxy-tls\") pod \"raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54\" (UID: \"b84ff881-c782-4982-9d8a-9af8c5ed94b5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" Apr 20 20:26:42.298622 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:42.298566 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" Apr 20 20:26:42.423266 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:42.423241 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54"] Apr 20 20:26:42.425557 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:26:42.425527 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb84ff881_c782_4982_9d8a_9af8c5ed94b5.slice/crio-7b6ff6862e1b2782b9121b239d5db57b18962c17491159ff604b27356a1c023e WatchSource:0}: Error finding container 7b6ff6862e1b2782b9121b239d5db57b18962c17491159ff604b27356a1c023e: Status 404 returned error can't find the container with id 7b6ff6862e1b2782b9121b239d5db57b18962c17491159ff604b27356a1c023e Apr 20 20:26:42.427646 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:42.427629 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:26:42.890077 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:42.890022 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" event={"ID":"b84ff881-c782-4982-9d8a-9af8c5ed94b5","Type":"ContainerStarted","Data":"93641725ef31d8d591e216c4a3cc86b3ce783e78b94e0d5fe66d233c194ca65b"} Apr 20 20:26:42.890077 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:42.890081 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" event={"ID":"b84ff881-c782-4982-9d8a-9af8c5ed94b5","Type":"ContainerStarted","Data":"7b6ff6862e1b2782b9121b239d5db57b18962c17491159ff604b27356a1c023e"} Apr 20 20:26:43.629323 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:43.629273 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb" podUID="2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.43:8643/healthz\": dial tcp 10.133.0.43:8643: connect: connection refused" Apr 20 20:26:43.634746 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:43.634718 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb" podUID="2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 20 20:26:45.756713 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:45.756686 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb" Apr 20 20:26:45.902184 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:45.902093 2571 generic.go:358] "Generic (PLEG): container finished" podID="2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" containerID="ab1482fe9cb9a7dd86a61c40ea3937efb8433be2bb8533f664a4927b69762537" exitCode=0 Apr 20 20:26:45.902184 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:45.902151 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb" event={"ID":"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed","Type":"ContainerDied","Data":"ab1482fe9cb9a7dd86a61c40ea3937efb8433be2bb8533f664a4927b69762537"} Apr 20 20:26:45.902448 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:45.902183 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb" Apr 20 20:26:45.902448 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:45.902203 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb" event={"ID":"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed","Type":"ContainerDied","Data":"d408ef04b1b9ee4b638b6ba38881c6e3e8626fa1a7d384ddfa9ef797c1b9051b"} Apr 20 20:26:45.902448 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:45.902223 2571 scope.go:117] "RemoveContainer" containerID="97651bbf6a70eba4126a72a4381680602fdf53799712b18c1e68cf9c088d5b8b" Apr 20 20:26:45.904483 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:45.904463 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-8db3a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed-raw-sklearn-8db3a-kube-rbac-proxy-sar-config\") pod \"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed\" (UID: \"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed\") " Apr 20 20:26:45.904571 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:45.904544 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed-proxy-tls\") pod \"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed\" (UID: \"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed\") " Apr 20 20:26:45.904617 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:45.904593 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed-kserve-provision-location\") pod \"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed\" (UID: \"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed\") " Apr 20 20:26:45.904661 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:45.904619 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-9279x\" (UniqueName: \"kubernetes.io/projected/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed-kube-api-access-9279x\") pod \"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed\" (UID: \"2b6adc77-1f4e-4265-8bc3-a61a9cabcfed\") " Apr 20 20:26:45.904855 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:45.904820 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed-raw-sklearn-8db3a-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-8db3a-kube-rbac-proxy-sar-config") pod "2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" (UID: "2b6adc77-1f4e-4265-8bc3-a61a9cabcfed"). InnerVolumeSpecName "raw-sklearn-8db3a-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:26:45.904976 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:45.904951 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" (UID: "2b6adc77-1f4e-4265-8bc3-a61a9cabcfed"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:26:45.905045 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:45.904962 2571 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-8db3a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed-raw-sklearn-8db3a-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:26:45.906700 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:45.906662 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" (UID: "2b6adc77-1f4e-4265-8bc3-a61a9cabcfed"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:26:45.907176 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:45.907156 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed-kube-api-access-9279x" (OuterVolumeSpecName: "kube-api-access-9279x") pod "2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" (UID: "2b6adc77-1f4e-4265-8bc3-a61a9cabcfed"). InnerVolumeSpecName "kube-api-access-9279x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:26:45.921352 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:45.921329 2571 scope.go:117] "RemoveContainer" containerID="ab1482fe9cb9a7dd86a61c40ea3937efb8433be2bb8533f664a4927b69762537" Apr 20 20:26:45.928916 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:45.928900 2571 scope.go:117] "RemoveContainer" containerID="eb941bb494d62f2b06322cb1a86f9f4e3a07c1641b62312faef21f5e45ceaec5" Apr 20 20:26:45.936037 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:45.936019 2571 scope.go:117] "RemoveContainer" containerID="97651bbf6a70eba4126a72a4381680602fdf53799712b18c1e68cf9c088d5b8b" Apr 20 20:26:45.936297 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:26:45.936274 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97651bbf6a70eba4126a72a4381680602fdf53799712b18c1e68cf9c088d5b8b\": container with ID starting with 97651bbf6a70eba4126a72a4381680602fdf53799712b18c1e68cf9c088d5b8b not found: ID does not exist" containerID="97651bbf6a70eba4126a72a4381680602fdf53799712b18c1e68cf9c088d5b8b" Apr 20 20:26:45.936364 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:45.936310 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97651bbf6a70eba4126a72a4381680602fdf53799712b18c1e68cf9c088d5b8b"} err="failed to get container status 
\"97651bbf6a70eba4126a72a4381680602fdf53799712b18c1e68cf9c088d5b8b\": rpc error: code = NotFound desc = could not find container \"97651bbf6a70eba4126a72a4381680602fdf53799712b18c1e68cf9c088d5b8b\": container with ID starting with 97651bbf6a70eba4126a72a4381680602fdf53799712b18c1e68cf9c088d5b8b not found: ID does not exist" Apr 20 20:26:45.936364 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:45.936331 2571 scope.go:117] "RemoveContainer" containerID="ab1482fe9cb9a7dd86a61c40ea3937efb8433be2bb8533f664a4927b69762537" Apr 20 20:26:45.936557 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:26:45.936544 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab1482fe9cb9a7dd86a61c40ea3937efb8433be2bb8533f664a4927b69762537\": container with ID starting with ab1482fe9cb9a7dd86a61c40ea3937efb8433be2bb8533f664a4927b69762537 not found: ID does not exist" containerID="ab1482fe9cb9a7dd86a61c40ea3937efb8433be2bb8533f664a4927b69762537" Apr 20 20:26:45.936598 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:45.936561 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab1482fe9cb9a7dd86a61c40ea3937efb8433be2bb8533f664a4927b69762537"} err="failed to get container status \"ab1482fe9cb9a7dd86a61c40ea3937efb8433be2bb8533f664a4927b69762537\": rpc error: code = NotFound desc = could not find container \"ab1482fe9cb9a7dd86a61c40ea3937efb8433be2bb8533f664a4927b69762537\": container with ID starting with ab1482fe9cb9a7dd86a61c40ea3937efb8433be2bb8533f664a4927b69762537 not found: ID does not exist" Apr 20 20:26:45.936598 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:45.936574 2571 scope.go:117] "RemoveContainer" containerID="eb941bb494d62f2b06322cb1a86f9f4e3a07c1641b62312faef21f5e45ceaec5" Apr 20 20:26:45.936806 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:26:45.936786 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"eb941bb494d62f2b06322cb1a86f9f4e3a07c1641b62312faef21f5e45ceaec5\": container with ID starting with eb941bb494d62f2b06322cb1a86f9f4e3a07c1641b62312faef21f5e45ceaec5 not found: ID does not exist" containerID="eb941bb494d62f2b06322cb1a86f9f4e3a07c1641b62312faef21f5e45ceaec5" Apr 20 20:26:45.936856 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:45.936808 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb941bb494d62f2b06322cb1a86f9f4e3a07c1641b62312faef21f5e45ceaec5"} err="failed to get container status \"eb941bb494d62f2b06322cb1a86f9f4e3a07c1641b62312faef21f5e45ceaec5\": rpc error: code = NotFound desc = could not find container \"eb941bb494d62f2b06322cb1a86f9f4e3a07c1641b62312faef21f5e45ceaec5\": container with ID starting with eb941bb494d62f2b06322cb1a86f9f4e3a07c1641b62312faef21f5e45ceaec5 not found: ID does not exist" Apr 20 20:26:46.005944 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:46.005900 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed-proxy-tls\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:26:46.005944 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:46.005934 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed-kserve-provision-location\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:26:46.005944 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:46.005945 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9279x\" (UniqueName: \"kubernetes.io/projected/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed-kube-api-access-9279x\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:26:46.224141 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:46.224108 2571 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb"] Apr 20 20:26:46.227991 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:46.227966 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-8db3a-predictor-5b77d9bc95-2m5hb"] Apr 20 20:26:46.906435 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:46.906334 2571 generic.go:358] "Generic (PLEG): container finished" podID="b84ff881-c782-4982-9d8a-9af8c5ed94b5" containerID="93641725ef31d8d591e216c4a3cc86b3ce783e78b94e0d5fe66d233c194ca65b" exitCode=0 Apr 20 20:26:46.906891 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:46.906410 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" event={"ID":"b84ff881-c782-4982-9d8a-9af8c5ed94b5","Type":"ContainerDied","Data":"93641725ef31d8d591e216c4a3cc86b3ce783e78b94e0d5fe66d233c194ca65b"} Apr 20 20:26:47.528885 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:47.528847 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" path="/var/lib/kubelet/pods/2b6adc77-1f4e-4265-8bc3-a61a9cabcfed/volumes" Apr 20 20:26:47.913527 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:47.913408 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" event={"ID":"b84ff881-c782-4982-9d8a-9af8c5ed94b5","Type":"ContainerStarted","Data":"e28ce06185fae7676172ff6e94d7743b86787a04050c666358b45e3464a58a51"} Apr 20 20:26:47.913527 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:47.913479 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" event={"ID":"b84ff881-c782-4982-9d8a-9af8c5ed94b5","Type":"ContainerStarted","Data":"527bd0a3d75273cb337cb4d8e665c2423f7725a85c0dffb79b751479eee558df"} Apr 20 20:26:47.914042 ip-10-0-135-184 kubenswrapper[2571]: 
I0420 20:26:47.913747 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" Apr 20 20:26:47.936102 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:47.936056 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" podStartSLOduration=6.936043289 podStartE2EDuration="6.936043289s" podCreationTimestamp="2026-04-20 20:26:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:26:47.934173685 +0000 UTC m=+1248.973663231" watchObservedRunningTime="2026-04-20 20:26:47.936043289 +0000 UTC m=+1248.975532836" Apr 20 20:26:48.917260 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:48.917175 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" Apr 20 20:26:48.918512 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:48.918476 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" podUID="b84ff881-c782-4982-9d8a-9af8c5ed94b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 20 20:26:49.921011 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:49.920974 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" podUID="b84ff881-c782-4982-9d8a-9af8c5ed94b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 20 20:26:54.925588 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:54.925557 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" Apr 20 20:26:54.926131 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:26:54.926098 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" podUID="b84ff881-c782-4982-9d8a-9af8c5ed94b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 20 20:27:04.926322 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:27:04.926275 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" podUID="b84ff881-c782-4982-9d8a-9af8c5ed94b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 20 20:27:14.926538 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:27:14.926495 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" podUID="b84ff881-c782-4982-9d8a-9af8c5ed94b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 20 20:27:24.927018 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:27:24.926967 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" podUID="b84ff881-c782-4982-9d8a-9af8c5ed94b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 20 20:27:34.926973 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:27:34.926930 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" podUID="b84ff881-c782-4982-9d8a-9af8c5ed94b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 20 
20:27:44.926299 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:27:44.926257 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" podUID="b84ff881-c782-4982-9d8a-9af8c5ed94b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 20 20:27:54.926609 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:27:54.926569 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" Apr 20 20:28:01.462553 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:01.462520 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54"] Apr 20 20:28:01.463051 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:01.462878 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" podUID="b84ff881-c782-4982-9d8a-9af8c5ed94b5" containerName="kserve-container" containerID="cri-o://527bd0a3d75273cb337cb4d8e665c2423f7725a85c0dffb79b751479eee558df" gracePeriod=30 Apr 20 20:28:01.463051 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:01.462987 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" podUID="b84ff881-c782-4982-9d8a-9af8c5ed94b5" containerName="kube-rbac-proxy" containerID="cri-o://e28ce06185fae7676172ff6e94d7743b86787a04050c666358b45e3464a58a51" gracePeriod=30 Apr 20 20:28:02.164474 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:02.164442 2571 generic.go:358] "Generic (PLEG): container finished" podID="b84ff881-c782-4982-9d8a-9af8c5ed94b5" containerID="e28ce06185fae7676172ff6e94d7743b86787a04050c666358b45e3464a58a51" exitCode=2 Apr 20 20:28:02.164474 ip-10-0-135-184 kubenswrapper[2571]: I0420 
20:28:02.164456 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" event={"ID":"b84ff881-c782-4982-9d8a-9af8c5ed94b5","Type":"ContainerDied","Data":"e28ce06185fae7676172ff6e94d7743b86787a04050c666358b45e3464a58a51"} Apr 20 20:28:04.921631 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:04.921577 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" podUID="b84ff881-c782-4982-9d8a-9af8c5ed94b5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.44:8643/healthz\": dial tcp 10.133.0.44:8643: connect: connection refused" Apr 20 20:28:04.926226 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:04.926201 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" podUID="b84ff881-c782-4982-9d8a-9af8c5ed94b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 20 20:28:06.007224 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.007200 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" Apr 20 20:28:06.097759 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.097670 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b84ff881-c782-4982-9d8a-9af8c5ed94b5-kserve-provision-location\") pod \"b84ff881-c782-4982-9d8a-9af8c5ed94b5\" (UID: \"b84ff881-c782-4982-9d8a-9af8c5ed94b5\") " Apr 20 20:28:06.097759 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.097734 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b84ff881-c782-4982-9d8a-9af8c5ed94b5-proxy-tls\") pod \"b84ff881-c782-4982-9d8a-9af8c5ed94b5\" (UID: \"b84ff881-c782-4982-9d8a-9af8c5ed94b5\") " Apr 20 20:28:06.098006 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.097776 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-runtime-146a8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b84ff881-c782-4982-9d8a-9af8c5ed94b5-raw-sklearn-runtime-146a8-kube-rbac-proxy-sar-config\") pod \"b84ff881-c782-4982-9d8a-9af8c5ed94b5\" (UID: \"b84ff881-c782-4982-9d8a-9af8c5ed94b5\") " Apr 20 20:28:06.098006 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.097792 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wngs\" (UniqueName: \"kubernetes.io/projected/b84ff881-c782-4982-9d8a-9af8c5ed94b5-kube-api-access-6wngs\") pod \"b84ff881-c782-4982-9d8a-9af8c5ed94b5\" (UID: \"b84ff881-c782-4982-9d8a-9af8c5ed94b5\") " Apr 20 20:28:06.098192 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.098030 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b84ff881-c782-4982-9d8a-9af8c5ed94b5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"b84ff881-c782-4982-9d8a-9af8c5ed94b5" (UID: "b84ff881-c782-4982-9d8a-9af8c5ed94b5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:28:06.098192 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.098149 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b84ff881-c782-4982-9d8a-9af8c5ed94b5-raw-sklearn-runtime-146a8-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-runtime-146a8-kube-rbac-proxy-sar-config") pod "b84ff881-c782-4982-9d8a-9af8c5ed94b5" (UID: "b84ff881-c782-4982-9d8a-9af8c5ed94b5"). InnerVolumeSpecName "raw-sklearn-runtime-146a8-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:28:06.100014 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.099992 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b84ff881-c782-4982-9d8a-9af8c5ed94b5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b84ff881-c782-4982-9d8a-9af8c5ed94b5" (UID: "b84ff881-c782-4982-9d8a-9af8c5ed94b5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:28:06.100014 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.100005 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b84ff881-c782-4982-9d8a-9af8c5ed94b5-kube-api-access-6wngs" (OuterVolumeSpecName: "kube-api-access-6wngs") pod "b84ff881-c782-4982-9d8a-9af8c5ed94b5" (UID: "b84ff881-c782-4982-9d8a-9af8c5ed94b5"). InnerVolumeSpecName "kube-api-access-6wngs". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:28:06.179536 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.179503 2571 generic.go:358] "Generic (PLEG): container finished" podID="b84ff881-c782-4982-9d8a-9af8c5ed94b5" containerID="527bd0a3d75273cb337cb4d8e665c2423f7725a85c0dffb79b751479eee558df" exitCode=0 Apr 20 20:28:06.179698 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.179560 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" event={"ID":"b84ff881-c782-4982-9d8a-9af8c5ed94b5","Type":"ContainerDied","Data":"527bd0a3d75273cb337cb4d8e665c2423f7725a85c0dffb79b751479eee558df"} Apr 20 20:28:06.179698 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.179588 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" event={"ID":"b84ff881-c782-4982-9d8a-9af8c5ed94b5","Type":"ContainerDied","Data":"7b6ff6862e1b2782b9121b239d5db57b18962c17491159ff604b27356a1c023e"} Apr 20 20:28:06.179698 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.179587 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54" Apr 20 20:28:06.179698 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.179603 2571 scope.go:117] "RemoveContainer" containerID="e28ce06185fae7676172ff6e94d7743b86787a04050c666358b45e3464a58a51" Apr 20 20:28:06.188504 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.188486 2571 scope.go:117] "RemoveContainer" containerID="527bd0a3d75273cb337cb4d8e665c2423f7725a85c0dffb79b751479eee558df" Apr 20 20:28:06.195961 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.195942 2571 scope.go:117] "RemoveContainer" containerID="93641725ef31d8d591e216c4a3cc86b3ce783e78b94e0d5fe66d233c194ca65b" Apr 20 20:28:06.199116 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.199095 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b84ff881-c782-4982-9d8a-9af8c5ed94b5-kserve-provision-location\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:28:06.199213 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.199118 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b84ff881-c782-4982-9d8a-9af8c5ed94b5-proxy-tls\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:28:06.199213 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.199128 2571 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-runtime-146a8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b84ff881-c782-4982-9d8a-9af8c5ed94b5-raw-sklearn-runtime-146a8-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:28:06.199213 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.199138 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6wngs\" (UniqueName: \"kubernetes.io/projected/b84ff881-c782-4982-9d8a-9af8c5ed94b5-kube-api-access-6wngs\") on node 
\"ip-10-0-135-184.ec2.internal\" DevicePath \"\"" Apr 20 20:28:06.201870 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.201845 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54"] Apr 20 20:28:06.204564 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.204545 2571 scope.go:117] "RemoveContainer" containerID="e28ce06185fae7676172ff6e94d7743b86787a04050c666358b45e3464a58a51" Apr 20 20:28:06.204853 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:28:06.204828 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e28ce06185fae7676172ff6e94d7743b86787a04050c666358b45e3464a58a51\": container with ID starting with e28ce06185fae7676172ff6e94d7743b86787a04050c666358b45e3464a58a51 not found: ID does not exist" containerID="e28ce06185fae7676172ff6e94d7743b86787a04050c666358b45e3464a58a51" Apr 20 20:28:06.204923 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.204858 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e28ce06185fae7676172ff6e94d7743b86787a04050c666358b45e3464a58a51"} err="failed to get container status \"e28ce06185fae7676172ff6e94d7743b86787a04050c666358b45e3464a58a51\": rpc error: code = NotFound desc = could not find container \"e28ce06185fae7676172ff6e94d7743b86787a04050c666358b45e3464a58a51\": container with ID starting with e28ce06185fae7676172ff6e94d7743b86787a04050c666358b45e3464a58a51 not found: ID does not exist" Apr 20 20:28:06.204923 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.204879 2571 scope.go:117] "RemoveContainer" containerID="527bd0a3d75273cb337cb4d8e665c2423f7725a85c0dffb79b751479eee558df" Apr 20 20:28:06.205154 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:28:06.205138 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"527bd0a3d75273cb337cb4d8e665c2423f7725a85c0dffb79b751479eee558df\": container with ID starting with 527bd0a3d75273cb337cb4d8e665c2423f7725a85c0dffb79b751479eee558df not found: ID does not exist" containerID="527bd0a3d75273cb337cb4d8e665c2423f7725a85c0dffb79b751479eee558df" Apr 20 20:28:06.205212 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.205159 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"527bd0a3d75273cb337cb4d8e665c2423f7725a85c0dffb79b751479eee558df"} err="failed to get container status \"527bd0a3d75273cb337cb4d8e665c2423f7725a85c0dffb79b751479eee558df\": rpc error: code = NotFound desc = could not find container \"527bd0a3d75273cb337cb4d8e665c2423f7725a85c0dffb79b751479eee558df\": container with ID starting with 527bd0a3d75273cb337cb4d8e665c2423f7725a85c0dffb79b751479eee558df not found: ID does not exist" Apr 20 20:28:06.205212 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.205181 2571 scope.go:117] "RemoveContainer" containerID="93641725ef31d8d591e216c4a3cc86b3ce783e78b94e0d5fe66d233c194ca65b" Apr 20 20:28:06.205450 ip-10-0-135-184 kubenswrapper[2571]: E0420 20:28:06.205404 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93641725ef31d8d591e216c4a3cc86b3ce783e78b94e0d5fe66d233c194ca65b\": container with ID starting with 93641725ef31d8d591e216c4a3cc86b3ce783e78b94e0d5fe66d233c194ca65b not found: ID does not exist" containerID="93641725ef31d8d591e216c4a3cc86b3ce783e78b94e0d5fe66d233c194ca65b" Apr 20 20:28:06.205503 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.205460 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93641725ef31d8d591e216c4a3cc86b3ce783e78b94e0d5fe66d233c194ca65b"} err="failed to get container status \"93641725ef31d8d591e216c4a3cc86b3ce783e78b94e0d5fe66d233c194ca65b\": rpc error: code = NotFound desc = could not find container 
\"93641725ef31d8d591e216c4a3cc86b3ce783e78b94e0d5fe66d233c194ca65b\": container with ID starting with 93641725ef31d8d591e216c4a3cc86b3ce783e78b94e0d5fe66d233c194ca65b not found: ID does not exist" Apr 20 20:28:06.205848 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:06.205831 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-146a8-predictor-57d4c64645-rvg54"] Apr 20 20:28:07.527931 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:07.527893 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b84ff881-c782-4982-9d8a-9af8c5ed94b5" path="/var/lib/kubelet/pods/b84ff881-c782-4982-9d8a-9af8c5ed94b5/volumes" Apr 20 20:28:31.102703 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:31.102617 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-kk6nn_5266cc03-b601-4ec1-a024-fac19615b5da/global-pull-secret-syncer/0.log" Apr 20 20:28:31.276548 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:31.276515 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-mckzb_37568615-5b75-4d85-aad5-7bfdbb676856/konnectivity-agent/0.log" Apr 20 20:28:31.322692 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:31.322651 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-184.ec2.internal_d93550a1d0c6abebbeb4587739ed181c/haproxy/0.log" Apr 20 20:28:34.472869 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:34.472836 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fff0c0d6-bd89-4736-a984-a965f262948b/alertmanager/0.log" Apr 20 20:28:34.505780 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:34.505752 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fff0c0d6-bd89-4736-a984-a965f262948b/config-reloader/0.log" Apr 20 20:28:34.531861 ip-10-0-135-184 kubenswrapper[2571]: 
I0420 20:28:34.531834 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fff0c0d6-bd89-4736-a984-a965f262948b/kube-rbac-proxy-web/0.log" Apr 20 20:28:34.566291 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:34.566263 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fff0c0d6-bd89-4736-a984-a965f262948b/kube-rbac-proxy/0.log" Apr 20 20:28:34.603589 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:34.603563 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fff0c0d6-bd89-4736-a984-a965f262948b/kube-rbac-proxy-metric/0.log" Apr 20 20:28:34.635884 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:34.635857 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fff0c0d6-bd89-4736-a984-a965f262948b/prom-label-proxy/0.log" Apr 20 20:28:34.664362 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:34.664337 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fff0c0d6-bd89-4736-a984-a965f262948b/init-config-reloader/0.log" Apr 20 20:28:34.743549 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:34.743476 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-pw58h_d422197f-777e-4035-a81e-8dcf213b4d0a/kube-state-metrics/0.log" Apr 20 20:28:34.769145 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:34.769114 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-pw58h_d422197f-777e-4035-a81e-8dcf213b4d0a/kube-rbac-proxy-main/0.log" Apr 20 20:28:34.793489 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:34.793457 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-pw58h_d422197f-777e-4035-a81e-8dcf213b4d0a/kube-rbac-proxy-self/0.log" Apr 
20 20:28:34.899061 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:34.899033 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8s5lf_b5d9430a-17b7-4994-9576-0dd1247ec436/node-exporter/0.log" Apr 20 20:28:34.921164 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:34.921142 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8s5lf_b5d9430a-17b7-4994-9576-0dd1247ec436/kube-rbac-proxy/0.log" Apr 20 20:28:34.956245 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:34.956224 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8s5lf_b5d9430a-17b7-4994-9576-0dd1247ec436/init-textfile/0.log" Apr 20 20:28:35.148895 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:35.148865 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-7vfx2_e48ea207-1b2d-460f-b13a-46721f677a63/kube-rbac-proxy-main/0.log" Apr 20 20:28:35.171310 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:35.171281 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-7vfx2_e48ea207-1b2d-460f-b13a-46721f677a63/kube-rbac-proxy-self/0.log" Apr 20 20:28:35.195579 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:35.195552 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-7vfx2_e48ea207-1b2d-460f-b13a-46721f677a63/openshift-state-metrics/0.log" Apr 20 20:28:35.416995 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:35.416909 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-g5v6w_f9646e3a-c3e6-4243-9d07-190e48cfd49e/prometheus-operator/0.log" Apr 20 20:28:35.437583 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:35.437553 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-g5v6w_f9646e3a-c3e6-4243-9d07-190e48cfd49e/kube-rbac-proxy/0.log" Apr 20 20:28:37.733764 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:37.733734 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b5dbdd44-g2gvj_a1c7430d-63f9-45d5-986a-3c4c7e8b6ef6/console/0.log" Apr 20 20:28:38.927356 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.927314 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l"] Apr 20 20:28:38.927874 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.927850 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" containerName="storage-initializer" Apr 20 20:28:38.927952 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.927878 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" containerName="storage-initializer" Apr 20 20:28:38.927952 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.927890 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" containerName="kserve-container" Apr 20 20:28:38.927952 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.927898 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" containerName="kserve-container" Apr 20 20:28:38.927952 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.927927 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b84ff881-c782-4982-9d8a-9af8c5ed94b5" containerName="storage-initializer" Apr 20 20:28:38.927952 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.927937 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b84ff881-c782-4982-9d8a-9af8c5ed94b5" containerName="storage-initializer" Apr 20 20:28:38.927952 ip-10-0-135-184 
kubenswrapper[2571]: I0420 20:28:38.927948 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b84ff881-c782-4982-9d8a-9af8c5ed94b5" containerName="kube-rbac-proxy" Apr 20 20:28:38.928288 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.927957 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b84ff881-c782-4982-9d8a-9af8c5ed94b5" containerName="kube-rbac-proxy" Apr 20 20:28:38.928288 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.927967 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" containerName="kube-rbac-proxy" Apr 20 20:28:38.928288 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.927976 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" containerName="kube-rbac-proxy" Apr 20 20:28:38.928288 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.927991 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b84ff881-c782-4982-9d8a-9af8c5ed94b5" containerName="kserve-container" Apr 20 20:28:38.928288 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.927999 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b84ff881-c782-4982-9d8a-9af8c5ed94b5" containerName="kserve-container" Apr 20 20:28:38.928288 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.928079 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b84ff881-c782-4982-9d8a-9af8c5ed94b5" containerName="kube-rbac-proxy" Apr 20 20:28:38.928288 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.928095 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" containerName="kube-rbac-proxy" Apr 20 20:28:38.928288 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.928107 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b6adc77-1f4e-4265-8bc3-a61a9cabcfed" containerName="kserve-container" Apr 20 20:28:38.928288 
ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.928118 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b84ff881-c782-4982-9d8a-9af8c5ed94b5" containerName="kserve-container" Apr 20 20:28:38.931405 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.931382 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l" Apr 20 20:28:38.934057 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.934032 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kxgfw\"/\"kube-root-ca.crt\"" Apr 20 20:28:38.934176 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.934098 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kxgfw\"/\"openshift-service-ca.crt\"" Apr 20 20:28:38.934176 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.934131 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kxgfw\"/\"default-dockercfg-f8tg6\"" Apr 20 20:28:38.939682 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.939660 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l"] Apr 20 20:28:38.966230 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.966194 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215-sys\") pod \"perf-node-gather-daemonset-s994l\" (UID: \"79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l" Apr 20 20:28:38.966417 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.966242 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215-podres\") pod \"perf-node-gather-daemonset-s994l\" (UID: \"79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l" Apr 20 20:28:38.966417 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.966315 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj6ff\" (UniqueName: \"kubernetes.io/projected/79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215-kube-api-access-lj6ff\") pod \"perf-node-gather-daemonset-s994l\" (UID: \"79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l" Apr 20 20:28:38.966417 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.966375 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215-lib-modules\") pod \"perf-node-gather-daemonset-s994l\" (UID: \"79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l" Apr 20 20:28:38.966417 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.966403 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215-proc\") pod \"perf-node-gather-daemonset-s994l\" (UID: \"79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l" Apr 20 20:28:38.978244 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.978221 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-n8mvh_e50d24aa-c678-44f8-ba98-19a30a72720c/dns/0.log" Apr 20 20:28:38.998357 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:38.998334 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-n8mvh_e50d24aa-c678-44f8-ba98-19a30a72720c/kube-rbac-proxy/0.log" Apr 20 20:28:39.043897 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:39.043865 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7bzrm_e8fae2ab-f747-4b27-b9a3-55be9806fb45/dns-node-resolver/0.log" Apr 20 20:28:39.067490 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:39.067458 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215-sys\") pod \"perf-node-gather-daemonset-s994l\" (UID: \"79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l" Apr 20 20:28:39.067677 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:39.067498 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215-podres\") pod \"perf-node-gather-daemonset-s994l\" (UID: \"79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l" Apr 20 20:28:39.067677 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:39.067527 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lj6ff\" (UniqueName: \"kubernetes.io/projected/79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215-kube-api-access-lj6ff\") pod \"perf-node-gather-daemonset-s994l\" (UID: \"79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l" Apr 20 20:28:39.067677 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:39.067554 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215-lib-modules\") pod \"perf-node-gather-daemonset-s994l\" (UID: \"79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215\") " 
pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l" Apr 20 20:28:39.067677 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:39.067576 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215-proc\") pod \"perf-node-gather-daemonset-s994l\" (UID: \"79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l" Apr 20 20:28:39.067677 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:39.067596 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215-sys\") pod \"perf-node-gather-daemonset-s994l\" (UID: \"79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l" Apr 20 20:28:39.067677 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:39.067616 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215-podres\") pod \"perf-node-gather-daemonset-s994l\" (UID: \"79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l" Apr 20 20:28:39.067677 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:39.067650 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215-proc\") pod \"perf-node-gather-daemonset-s994l\" (UID: \"79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l" Apr 20 20:28:39.067949 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:39.067718 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215-lib-modules\") pod 
\"perf-node-gather-daemonset-s994l\" (UID: \"79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l" Apr 20 20:28:39.075107 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:39.075081 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj6ff\" (UniqueName: \"kubernetes.io/projected/79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215-kube-api-access-lj6ff\") pod \"perf-node-gather-daemonset-s994l\" (UID: \"79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l" Apr 20 20:28:39.242646 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:39.242542 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l" Apr 20 20:28:39.370249 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:39.370220 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l"] Apr 20 20:28:39.373032 ip-10-0-135-184 kubenswrapper[2571]: W0420 20:28:39.373006 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod79f4d47b_3dd6_4a7c_88a8_e0a6b8d4b215.slice/crio-a64e4bf4be5ce1dd2bfc3080876ec7b943b1c023668bc5356dc886c4380557e2 WatchSource:0}: Error finding container a64e4bf4be5ce1dd2bfc3080876ec7b943b1c023668bc5356dc886c4380557e2: Status 404 returned error can't find the container with id a64e4bf4be5ce1dd2bfc3080876ec7b943b1c023668bc5356dc886c4380557e2 Apr 20 20:28:39.534028 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:39.533998 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ntdcl_43405a48-098c-49ef-95e3-3544654522ad/node-ca/0.log" Apr 20 20:28:40.225141 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:40.225109 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-6697d777bb-g2vc6_89fd0c53-c978-4418-a79a-46ee5bab209d/router/0.log" Apr 20 20:28:40.299999 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:40.299962 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l" event={"ID":"79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215","Type":"ContainerStarted","Data":"15424614c9314bfb6f14c8f2c79d329eaedf378ae669d9c7f911c87d0d8aa87e"} Apr 20 20:28:40.299999 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:40.299996 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l" event={"ID":"79f4d47b-3dd6-4a7c-88a8-e0a6b8d4b215","Type":"ContainerStarted","Data":"a64e4bf4be5ce1dd2bfc3080876ec7b943b1c023668bc5356dc886c4380557e2"} Apr 20 20:28:40.300209 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:40.300119 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l" Apr 20 20:28:40.316960 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:40.316917 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l" podStartSLOduration=2.316903607 podStartE2EDuration="2.316903607s" podCreationTimestamp="2026-04-20 20:28:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:28:40.315508833 +0000 UTC m=+1361.354998588" watchObservedRunningTime="2026-04-20 20:28:40.316903607 +0000 UTC m=+1361.356393154" Apr 20 20:28:40.555375 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:40.555347 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-n74j9_6b4f78da-4307-441b-8a96-07e157e132e8/serve-healthcheck-canary/0.log" Apr 20 20:28:40.906842 ip-10-0-135-184 kubenswrapper[2571]: I0420 
20:28:40.906763 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-8ftcl_0b604c7c-dbda-486e-9ca5-fd23ee10bc87/insights-operator/1.log" Apr 20 20:28:40.906842 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:40.906825 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-8ftcl_0b604c7c-dbda-486e-9ca5-fd23ee10bc87/insights-operator/0.log" Apr 20 20:28:41.060297 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:41.060270 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-q2q25_10dc391b-44ea-4dcb-8d49-9210e125ae69/kube-rbac-proxy/0.log" Apr 20 20:28:41.080104 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:41.080081 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-q2q25_10dc391b-44ea-4dcb-8d49-9210e125ae69/exporter/0.log" Apr 20 20:28:41.101060 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:41.101033 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-q2q25_10dc391b-44ea-4dcb-8d49-9210e125ae69/extractor/0.log" Apr 20 20:28:42.988334 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:42.988300 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-6f655776dd-gthrt_e64c308a-f9e2-49d3-990e-86dc9f689be4/manager/0.log" Apr 20 20:28:43.006955 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:43.006932 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-8zbhw_6e7cfa8d-c464-4632-8bd5-63a6a9a8bf66/manager/0.log" Apr 20 20:28:43.027376 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:43.027351 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-lgtp4_c3e845f7-93bc-431d-9e1c-3a863b9719ce/server/0.log" Apr 20 20:28:43.230433 ip-10-0-135-184 
kubenswrapper[2571]: I0420 20:28:43.230404 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-xfmm8_14c1f75f-12c2-4d61-86c7-49b01cae6c40/seaweedfs/0.log" Apr 20 20:28:46.313538 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:46.313506 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-s994l" Apr 20 20:28:46.781614 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:46.781581 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-7wcw2_75cdcb83-e8e3-4d26-90d4-d4278f6d7672/migrator/0.log" Apr 20 20:28:46.801465 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:46.801439 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-7wcw2_75cdcb83-e8e3-4d26-90d4-d4278f6d7672/graceful-termination/0.log" Apr 20 20:28:48.123450 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:48.123346 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-24vt9_7cc9261c-1baf-4d71-aae3-b734d559681b/kube-multus/0.log" Apr 20 20:28:48.201823 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:48.201780 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5g757_4e90b560-013d-4eb3-83bf-d19971d4fd0c/kube-multus-additional-cni-plugins/0.log" Apr 20 20:28:48.223775 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:48.223749 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5g757_4e90b560-013d-4eb3-83bf-d19971d4fd0c/egress-router-binary-copy/0.log" Apr 20 20:28:48.247493 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:48.247470 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5g757_4e90b560-013d-4eb3-83bf-d19971d4fd0c/cni-plugins/0.log" Apr 
20 20:28:48.267761 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:48.267737 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5g757_4e90b560-013d-4eb3-83bf-d19971d4fd0c/bond-cni-plugin/0.log" Apr 20 20:28:48.287146 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:48.287121 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5g757_4e90b560-013d-4eb3-83bf-d19971d4fd0c/routeoverride-cni/0.log" Apr 20 20:28:48.306729 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:48.306703 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5g757_4e90b560-013d-4eb3-83bf-d19971d4fd0c/whereabouts-cni-bincopy/0.log" Apr 20 20:28:48.327330 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:48.327306 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5g757_4e90b560-013d-4eb3-83bf-d19971d4fd0c/whereabouts-cni/0.log" Apr 20 20:28:48.688472 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:48.688446 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9sbrz_012dcd86-26f0-4115-bd86-d5066c900541/network-metrics-daemon/0.log" Apr 20 20:28:48.709616 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:48.709539 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9sbrz_012dcd86-26f0-4115-bd86-d5066c900541/kube-rbac-proxy/0.log" Apr 20 20:28:50.110179 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:50.110141 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kldp5_997e9539-5288-4af5-92f4-55d8ccefbbf7/ovn-controller/0.log" Apr 20 20:28:50.130458 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:50.130412 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kldp5_997e9539-5288-4af5-92f4-55d8ccefbbf7/ovn-acl-logging/0.log" Apr 20 20:28:50.136264 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:50.136243 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kldp5_997e9539-5288-4af5-92f4-55d8ccefbbf7/ovn-acl-logging/1.log" Apr 20 20:28:50.157860 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:50.157833 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kldp5_997e9539-5288-4af5-92f4-55d8ccefbbf7/kube-rbac-proxy-node/0.log" Apr 20 20:28:50.180830 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:50.180805 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kldp5_997e9539-5288-4af5-92f4-55d8ccefbbf7/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 20:28:50.199855 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:50.199774 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kldp5_997e9539-5288-4af5-92f4-55d8ccefbbf7/northd/0.log" Apr 20 20:28:50.226332 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:50.226308 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kldp5_997e9539-5288-4af5-92f4-55d8ccefbbf7/nbdb/0.log" Apr 20 20:28:50.250165 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:50.250141 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kldp5_997e9539-5288-4af5-92f4-55d8ccefbbf7/sbdb/0.log" Apr 20 20:28:50.353484 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:50.353450 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kldp5_997e9539-5288-4af5-92f4-55d8ccefbbf7/ovnkube-controller/0.log" Apr 20 20:28:51.285849 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:51.285813 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-diagnostics_network-check-target-9bjr7_800b4dad-a669-433c-8963-4c9f630913b5/network-check-target-container/0.log" Apr 20 20:28:52.229345 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:52.229316 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-rm9m6_2940fed8-94a7-4975-8584-3fcd4e6a7933/iptables-alerter/0.log" Apr 20 20:28:52.879154 ip-10-0-135-184 kubenswrapper[2571]: I0420 20:28:52.879125 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-89gqs_07ec338a-d16f-4d81-9472-f216291c9dba/tuned/0.log"