Apr 16 22:13:24.340031 ip-10-0-130-227 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 22:13:24.805308 ip-10-0-130-227 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:24.805308 ip-10-0-130-227 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 22:13:24.805308 ip-10-0-130-227 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:24.805308 ip-10-0-130-227 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 22:13:24.805308 ip-10-0-130-227 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:24.808527 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.808436 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
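The deprecation warnings above all point the same way: these values belong in the KubeletConfiguration file named by --config (here /etc/kubernetes/kubelet.conf, per the FLAG dump further down). A minimal sketch of the equivalent config-file stanza, assuming the k8s.io/kubelet and sigs.k8s.io/yaml modules are available; the values are copied from this log's FLAG dump, not from the node's actual rendered config:

```go
package main

import (
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	kubeletconfig "k8s.io/kubelet/config/v1beta1"
	"sigs.k8s.io/yaml"
)

func main() {
	cfg := kubeletconfig.KubeletConfiguration{
		TypeMeta: metav1.TypeMeta{
			APIVersion: "kubelet.config.k8s.io/v1beta1",
			Kind:       "KubeletConfiguration",
		},
		// --container-runtime-endpoint (value taken from the FLAG dump below).
		ContainerRuntimeEndpoint: "/var/run/crio/crio.sock",
		// --volume-plugin-dir
		VolumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec",
		// --system-reserved
		SystemReserved: map[string]string{
			"cpu":               "500m",
			"ephemeral-storage": "1Gi",
			"memory":            "1Gi",
		},
		// Note: --minimum-container-ttl-duration has no direct config field;
		// the warning says to express the policy via evictionHard/evictionSoft.
		// --pod-infra-container-image is likewise not a KubeletConfiguration
		// field; the sandbox image is configured in the CRI runtime (CRI-O).
	}
	out, err := yaml.Marshal(&cfg)
	if err != nil {
		panic(err)
	}
	fmt.Print(string(out)) // YAML stanza to merge into the --config file
}
```

On OpenShift the kubelet config file is rendered by the machine-config operator, so this is illustrative rather than something to hand-edit on the node.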
Apr 16 22:13:24.816536 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816510 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:24.816536 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816530 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:24.816536 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816534 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:24.816536 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816538 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:24.816536 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816543 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:24.816769 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816548 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:24.816769 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816552 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:24.816769 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816556 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:24.816769 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816558 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:24.816769 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816561 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:24.816769 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816564 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:24.816769 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816567 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:24.816769 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816570 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:24.816769 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816572 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:24.816769 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816575 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:24.816769 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816578 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:24.816769 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816581 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:24.816769 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816583 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:24.816769 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816586 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:24.816769 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816589 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:24.816769 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816591 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:24.816769 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816594 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:24.816769 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816597 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:24.816769 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816599 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:24.816769 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816602 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:24.816769 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816604 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:24.817262 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816614 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:24.817262 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816617 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:24.817262 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816619 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:24.817262 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816622 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:24.817262 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816625 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:24.817262 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816627 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:24.817262 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816630 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:24.817262 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816632 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:24.817262 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816636 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:24.817262 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816640 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:24.817262 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816643 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:24.817262 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816646 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:24.817262 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816650 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:24.817262 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816652 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:24.817262 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816655 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:24.817262 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816658 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:24.817262 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816660 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:24.817262 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816663 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:24.817262 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816666 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:24.817262 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816668 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:24.817784 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816684 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:24.817784 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816687 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:24.817784 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816691 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:24.817784 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816694 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:24.817784 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816697 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:24.817784 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816700 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:24.817784 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816702 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:24.817784 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816706 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:24.817784 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816709 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:24.817784 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816712 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:24.817784 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816714 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:24.817784 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816723 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:24.817784 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816727 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:24.817784 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816729 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:24.817784 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816732 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:24.817784 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816735 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:24.817784 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816738 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:24.817784 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816740 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:24.817784 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816743 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:24.818244 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816746 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:24.818244 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816748 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:24.818244 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816751 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:24.818244 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816754 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:24.818244 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816757 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:24.818244 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816760 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:24.818244 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816763 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:24.818244 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816766 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:24.818244 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816768 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:24.818244 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816771 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:24.818244 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816774 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:24.818244 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816777 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:24.818244 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816780 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:24.818244 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816784 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:24.818244 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816786 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:24.818244 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816789 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:24.818244 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816792 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:24.818244 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816794 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:24.818244 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816797 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:24.818244 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816800 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:24.818801 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.816802 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:24.818801 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817209 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:24.818801 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817213 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:24.818801 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817217 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:24.818801 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817220 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:24.818801 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817222 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:24.818801 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817225 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:24.818801 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817228 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:24.818801 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817231 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:24.818801 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817233 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:24.818801 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817236 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:24.818801 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817239 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:24.818801 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817242 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:24.818801 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817244 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:24.818801 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817247 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:24.818801 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817250 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:24.818801 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817252 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:24.818801 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817257 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:24.818801 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817259 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:24.818801 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817262 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:24.819288 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817265 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:24.819288 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817267 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:24.819288 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817270 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:24.819288 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817273 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:24.819288 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817276 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:24.819288 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817279 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:24.819288 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817282 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:24.819288 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817285 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:24.819288 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817288 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:24.819288 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817290 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:24.819288 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817293 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:24.819288 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817295 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:24.819288 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817298 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:24.819288 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817300 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:24.819288 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817303 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:24.819288 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817305 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:24.819288 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817308 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:24.819288 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817317 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:24.819288 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817320 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:24.819288 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817323 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:24.819805 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817327 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:24.819805 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817331 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:24.819805 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817334 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:24.819805 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817337 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:24.819805 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817341 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:24.819805 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817344 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:24.819805 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817347 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:24.819805 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817350 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:24.819805 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817353 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:24.819805 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817356 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:24.819805 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817359 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:24.819805 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817362 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:24.819805 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817365 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:24.819805 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817367 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:24.819805 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817370 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:24.819805 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817373 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:24.819805 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817375 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:24.819805 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817378 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:24.819805 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817381 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:24.820265 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817383 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:24.820265 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817386 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:24.820265 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817389 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:24.820265 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817391 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:24.820265 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817394 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:24.820265 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817396 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:24.820265 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817399 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:24.820265 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817402 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:24.820265 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817405 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:24.820265 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817407 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:24.820265 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817410 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:24.820265 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817413 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:24.820265 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817415 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:24.820265 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817418 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:24.820265 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817421 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:24.820265 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817423 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:24.820265 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817426 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:24.820265 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817428 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:24.820265 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817431 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:24.820826 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817434 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:24.820826 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817436 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:24.820826 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817439 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:24.820826 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817441 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:24.820826 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817444 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:24.820826 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817447 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:24.820826 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817450 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:24.820826 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817452 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:24.820826 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.817455 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:24.820826 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818620 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 22:13:24.820826 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818630 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 22:13:24.820826 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818637 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 22:13:24.820826 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818642 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 22:13:24.820826 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818648 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 22:13:24.820826 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818652 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 22:13:24.820826 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818656 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 22:13:24.820826 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818661 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 22:13:24.820826 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818665 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 22:13:24.820826 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818668 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 22:13:24.820826 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818683 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 22:13:24.820826 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818687 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818690 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818693 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818701 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818704 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818707 2576 flags.go:64] FLAG: --cloud-config=""
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818710 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818713 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818717 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818720 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818724 2576 flags.go:64] FLAG: --config-dir=""
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818727 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818730 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818734 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818737 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818741 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818744 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818748 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818751 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818754 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818757 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818760 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818765 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818768 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818771 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 22:13:24.821336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818774 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818778 2576 flags.go:64] FLAG: --enable-server="true"
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818781 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818786 2576 flags.go:64] FLAG: --event-burst="100"
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818790 2576 flags.go:64] FLAG: --event-qps="50"
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818793 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818797 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818800 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818804 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818807 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818812 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818815 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818818 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818821 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818824 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818827 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818831 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818834 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818837 2576 flags.go:64] FLAG: --feature-gates=""
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818841 2576 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818844 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818847 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818851 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818854 2576 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818857 2576 flags.go:64] FLAG: --help="false"
Apr 16 22:13:24.821944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818860 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-130-227.ec2.internal"
Apr 16 22:13:24.822561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818863 2576 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 22:13:24.822561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818867 2576 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 22:13:24.822561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818870 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 22:13:24.822561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818873 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 22:13:24.822561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818876 2576 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 22:13:24.822561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818879 2576 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 22:13:24.822561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818882 2576 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 22:13:24.822561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818886 2576 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 22:13:24.822561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818889 2576 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 22:13:24.822561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818892 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 22:13:24.822561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818896 2576 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 22:13:24.822561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818899 2576 flags.go:64] FLAG: --kube-reserved=""
Apr 16 22:13:24.822561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818902 2576 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 22:13:24.822561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818905 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 22:13:24.822561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818908 2576 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 22:13:24.822561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818911 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 22:13:24.822561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818914 2576 flags.go:64] FLAG: --lock-file=""
Apr 16 22:13:24.822561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818917 2576 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 22:13:24.822561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818920 2576 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 22:13:24.822561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818923 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 22:13:24.822561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818929 2576 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 22:13:24.822561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818931 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 22:13:24.822561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818934 2576 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 22:13:24.822561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818937 2576 flags.go:64] FLAG: --logging-format="text"
Apr 16 22:13:24.823157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818940 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 22:13:24.823157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818943 2576 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 22:13:24.823157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818946 2576 flags.go:64] FLAG: --manifest-url=""
Apr 16 22:13:24.823157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818949 2576 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 22:13:24.823157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818954 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 22:13:24.823157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818957 2576 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 22:13:24.823157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818961 2576 flags.go:64] FLAG: --max-pods="110"
Apr 16 22:13:24.823157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818964 2576 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 22:13:24.823157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818968 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 22:13:24.823157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818971 2576 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 22:13:24.823157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818974 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 22:13:24.823157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818977 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 22:13:24.823157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818980 2576 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 22:13:24.823157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818983 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 22:13:24.823157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818992 2576 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 22:13:24.823157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818996 2576 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 22:13:24.823157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.818999 2576 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 22:13:24.823157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819002 2576 flags.go:64] FLAG: --pod-cidr=""
Apr 16 22:13:24.823157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819005 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 16 22:13:24.823157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819011 2576 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 22:13:24.823157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819014 2576 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 22:13:24.823157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819017 2576 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 22:13:24.823157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819019 2576 flags.go:64] FLAG: --port="10250"
Apr 16 22:13:24.823157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819023 2576 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819026 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e22423658a3aa633"
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819029 2576 flags.go:64] FLAG: --qos-reserved=""
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819032 2576 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819036 2576 flags.go:64] FLAG: --register-node="true"
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819039 2576 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819042 2576 flags.go:64] FLAG: --register-with-taints=""
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819045 2576 flags.go:64] FLAG: --registry-burst="10"
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819048 2576 flags.go:64] FLAG: --registry-qps="5"
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819051 2576 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819054 2576 flags.go:64] FLAG: --reserved-memory=""
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819063 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819066 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819070 2576 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819073 2576 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819076 2576 flags.go:64] FLAG: --runonce="false"
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819079 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819082 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819086 2576 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819089 2576 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819092 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819095 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819099 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819102 2576 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819105 2576 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819108 2576 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 22:13:24.823778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819111 2576 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 22:13:24.824391 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819117 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 22:13:24.824391 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819121 2576 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 22:13:24.824391 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819124 2576 flags.go:64] FLAG: --system-cgroups=""
Apr 16 22:13:24.824391 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819127 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 22:13:24.824391 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819132 2576 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 22:13:24.824391 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819135 2576 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 22:13:24.824391 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819138 2576 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 22:13:24.824391 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819144 2576 flags.go:64] FLAG: --tls-min-version=""
Apr 16 22:13:24.824391 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819147 2576 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 22:13:24.824391 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819150 2576 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 22:13:24.824391 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819153 2576 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 22:13:24.824391 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819156 2576 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 22:13:24.824391 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819159 2576 flags.go:64] FLAG: --v="2"
Apr 16 22:13:24.824391 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819163 2576 flags.go:64] FLAG: --version="false"
Apr 16 22:13:24.824391 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819168 2576 flags.go:64] FLAG: --vmodule=""
Apr 16 22:13:24.824391 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819172 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 22:13:24.824391 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819175 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
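The FLAG dump above is the kubelet echoing every flag it parsed (flags.go:64), which is handy for diffing a node's effective command line against what the machine config was supposed to render. A small sketch that folds those echo lines into a sorted name=value list; the program and its regex are mine, not part of the kubelet, and it assumes the journal text is piped on stdin:

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"sort"
)

// Collect the kubelet's `FLAG: --name="value"` echo lines from journal
// output on stdin and reprint them one per line, sorted by flag name.
func main() {
	re := regexp.MustCompile(`flags\.go:\d+\] FLAG: (--[\w-]+)=(.*)$`)
	flags := map[string]string{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		if m := re.FindStringSubmatch(sc.Text()); m != nil {
			flags[m[1]] = m[2]
		}
	}
	names := make([]string, 0, len(flags))
	for n := range flags {
		names = append(names, n)
	}
	sort.Strings(names)
	for _, n := range names {
		fmt.Printf("%s=%s\n", n, flags[n])
	}
}
```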
Apr 16 22:13:24.824391 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819269 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:24.824391 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819272 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:24.824391 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819276 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:24.824391 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819279 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:24.824391 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819282 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:24.824391 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819285 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:24.824391 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819288 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:24.825030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819292 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:24.825030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819295 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:24.825030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819298 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:24.825030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819301 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:24.825030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819303 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:24.825030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819307 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:24.825030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819310 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:24.825030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819313 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:24.825030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819316 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:24.825030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819319 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:24.825030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819322 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:24.825030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819325 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:24.825030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819327 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:24.825030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819330 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:24.825030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819333 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:24.825030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819337 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:24.825030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819340 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:24.825030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819343 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:24.825030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819347 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:24.825507 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819350 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:24.825507 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819352 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:24.825507 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819355 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:24.825507 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819358 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:24.825507 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819361 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:24.825507 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819364 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:24.825507 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819366 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:24.825507 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819371 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:24.825507 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819374 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:24.825507 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819376 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:24.825507 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819379 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:24.825507 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819381 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:24.825507 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819384 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:24.825507 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819386 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:24.825507 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819389 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:24.825507 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819392 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:24.825507 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819394 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:24.825507 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819397 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:24.825507 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819399 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:24.825507 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819402 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:24.826006 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819404 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:24.826006 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819407 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:24.826006 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819409 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:24.826006 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819417 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:24.826006 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819420 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:24.826006 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819424 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:24.826006 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819427 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:24.826006 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819430 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:24.826006 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819432 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:24.826006 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819435 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:24.826006 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819437 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:24.826006 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819440 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:24.826006 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819443 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:24.826006 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819445 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:24.826006 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819448 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:24.826006 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819451 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:24.826006 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819453 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:24.826006 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819456 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:24.826006 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819458 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:24.826481 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819462 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:24.826481 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819465 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:24.826481 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819467 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:24.826481 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819470 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:24.826481 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819472 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:24.826481 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819475 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:24.826481 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819477 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:24.826481 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819480 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:24.826481 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819482 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:24.826481 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819485 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:24.826481 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819488 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:24.826481 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819490 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:24.826481 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819493 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:24.826481 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819495 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:24.826481 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819498 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:24.826481 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819500 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:24.826481 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819503 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:24.826481 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819511 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:24.826481 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819515 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:24.826481 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819517 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:24.827030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.819520 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:24.827030 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.819529 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 22:13:24.827030 ip-10-0-130-227 kubenswrapper[2576]: I0416
22:13:24.825861 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 22:13:24.827030 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.825878 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 22:13:24.827030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825923 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:13:24.827030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825929 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:13:24.827030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825932 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:13:24.827030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825935 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:13:24.827030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825939 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:13:24.827030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825941 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:13:24.827030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825944 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:13:24.827030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825947 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:13:24.827030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825950 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:13:24.827030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825952 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:13:24.827030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825955 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:13:24.827030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825958 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:13:24.827432 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825960 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:13:24.827432 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825963 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:13:24.827432 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825965 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:13:24.827432 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825968 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:13:24.827432 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825972 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:13:24.827432 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825974 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:13:24.827432 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825977 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:13:24.827432 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825980 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:13:24.827432 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825983 2576 feature_gate.go:328] unrecognized feature gate: 
ConsolePluginContentSecurityPolicy Apr 16 22:13:24.827432 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825986 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:13:24.827432 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825988 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:13:24.827432 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825991 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:13:24.827432 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825994 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:13:24.827432 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825997 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:13:24.827432 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.825999 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:13:24.827432 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826002 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:13:24.827432 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826004 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:13:24.827432 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826008 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:13:24.827432 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826010 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:13:24.827432 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826015 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:13:24.828011 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826018 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:13:24.828011 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826021 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:13:24.828011 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826023 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:13:24.828011 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826026 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:13:24.828011 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826029 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:13:24.828011 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826032 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:13:24.828011 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826036 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 22:13:24.828011 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826040 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:13:24.828011 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826043 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:13:24.828011 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826046 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:13:24.828011 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826048 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:13:24.828011 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826051 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:13:24.828011 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826054 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:13:24.828011 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826057 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:13:24.828011 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826059 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:13:24.828011 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826062 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:13:24.828011 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826065 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:13:24.828011 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826068 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:13:24.828011 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826070 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:13:24.828011 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826073 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:13:24.828495 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826075 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:13:24.828495 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826078 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:13:24.828495 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826080 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:13:24.828495 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826083 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:13:24.828495 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826085 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:13:24.828495 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826088 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:13:24.828495 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826090 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:13:24.828495 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826093 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:13:24.828495 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826096 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:13:24.828495 ip-10-0-130-227 kubenswrapper[2576]: 
W0416 22:13:24.826099 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:13:24.828495 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826101 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:13:24.828495 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826104 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:13:24.828495 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826108 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:13:24.828495 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826111 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:13:24.828495 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826113 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:13:24.828495 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826116 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:13:24.828495 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826118 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:13:24.828495 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826121 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:13:24.828495 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826124 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:13:24.828978 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826126 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:13:24.828978 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826130 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
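For operators, the authoritative lines here are the feature_gate.go:384 summaries (`feature gates: {map[...]}`) rather than the warning storm. One way to pull those pairs out of captured journal text, a sketch assuming the map prints as `Name:bool` tokens exactly as seen in this log:

```go
package main

import (
	"fmt"
	"regexp"
)

// Matches Name:true / Name:false tokens inside a
// `feature gates: {map[...]}` summary line.
var pair = regexp.MustCompile(`(\w+):(true|false)`)

func main() {
	line := `feature gates: {map[ImageVolume:true KMSv1:true NodeSwap:false ServiceAccountTokenNodeBinding:true]}`
	for _, m := range pair.FindAllStringSubmatch(line, -1) {
		fmt.Printf("%s=%s\n", m[1], m[2])
	}
}
```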
Apr 16 22:13:24.828978 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826134 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:13:24.828978 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826137 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:13:24.828978 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826140 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:13:24.828978 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826142 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:13:24.828978 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826145 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:13:24.828978 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826147 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:13:24.828978 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826150 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:13:24.828978 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826153 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:13:24.828978 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826155 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:13:24.828978 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826158 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:13:24.828978 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826160 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:13:24.828978 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826163 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:13:24.828978 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826166 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:13:24.828978 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.826171 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 22:13:24.829374 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826277 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:13:24.829374 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826283 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:13:24.829374 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826286 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:13:24.829374 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826290 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:13:24.829374 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826293 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:13:24.829374 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826295 2576 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesvSphere Apr 16 22:13:24.829374 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826299 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:13:24.829374 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826301 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:13:24.829374 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826304 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:13:24.829374 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826307 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:13:24.829374 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826310 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:13:24.829374 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826313 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:13:24.829374 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826316 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:13:24.829374 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826319 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:13:24.829374 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826321 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:13:24.829374 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826324 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:13:24.829374 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826326 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:13:24.829374 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826329 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:13:24.829374 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826332 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:13:24.829884 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826334 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:13:24.829884 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826337 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:13:24.829884 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826339 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:13:24.829884 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826342 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:13:24.829884 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826344 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:13:24.829884 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826347 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:13:24.829884 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826349 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:13:24.829884 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826352 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:13:24.829884 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826354 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:13:24.829884 
ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826357 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:13:24.829884 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826360 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:13:24.829884 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826362 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:13:24.829884 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826365 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:13:24.829884 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826367 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:13:24.829884 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826370 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:13:24.829884 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826373 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:13:24.829884 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826375 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:13:24.829884 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826377 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:13:24.829884 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826380 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:13:24.829884 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826383 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:13:24.830371 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826385 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:13:24.830371 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826389 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
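The same warning storm recurs several times in this capture because each configuration parse pass re-applies the full gate list, re-reporting every unknown name. A small filter that collapses the repetition by counting each gate name on stdin (assumption: one journal entry per line, as journalctl normally emits; on jammed lines only the first warning per line is counted):

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

func main() {
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	// journal lines can be long; enlarge the scanner buffer
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
	for sc.Scan() {
		if _, rest, ok := strings.Cut(sc.Text(), "unrecognized feature gate: "); ok {
			// gate names contain no spaces, so the first field is the name
			if f := strings.Fields(rest); len(f) > 0 {
				counts[f[0]]++
			}
		}
	}
	for name, n := range counts {
		fmt.Printf("%4d %s\n", n, name)
	}
}
```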
Apr 16 22:13:24.830371 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826393 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:13:24.830371 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826396 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:13:24.830371 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826406 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:13:24.830371 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826409 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:13:24.830371 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826412 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:13:24.830371 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826416 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:13:24.830371 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826419 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:13:24.830371 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826421 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:13:24.830371 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826424 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:13:24.830371 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826427 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:13:24.830371 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826429 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:13:24.830371 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826432 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:13:24.830371 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826434 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:13:24.830371 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826437 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:13:24.830371 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826440 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:13:24.830371 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826442 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:13:24.830371 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826445 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:13:24.831030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826447 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:13:24.831030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826449 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:13:24.831030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826452 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:13:24.831030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826455 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:13:24.831030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826457 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:13:24.831030 ip-10-0-130-227 kubenswrapper[2576]: W0416 
22:13:24.826460 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:13:24.831030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826462 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:13:24.831030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826465 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:13:24.831030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826468 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:13:24.831030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826470 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:13:24.831030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826473 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:13:24.831030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826475 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:13:24.831030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826478 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:13:24.831030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826481 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:13:24.831030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826483 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:13:24.831030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826486 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:13:24.831030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826489 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:13:24.831030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826491 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:13:24.831030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826494 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:13:24.831030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826498 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:13:24.831647 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826501 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:13:24.831647 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826503 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:13:24.831647 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826507 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 22:13:24.831647 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826510 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:13:24.831647 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826513 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:13:24.831647 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826515 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:13:24.831647 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826518 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:13:24.831647 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:24.826521 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:13:24.831647 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.826526 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 22:13:24.831647 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.827232 2576 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 22:13:24.831647 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.829476 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 22:13:24.831647 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.830571 2576 server.go:1019] "Starting client certificate rotation" Apr 16 22:13:24.831647 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.831118 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 22:13:24.831647 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.831173 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 22:13:24.858473 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.858451 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 22:13:24.864353 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.864336 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 22:13:24.883369 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.883338 2576 log.go:25] "Validated CRI v1 runtime API" Apr 16 22:13:24.888212 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.888184 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 22:13:24.889364 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.889347 2576 log.go:25] "Validated CRI v1 image API" Apr 16 22:13:24.890503 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.890484 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 22:13:24.894940 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.894913 2576 fs.go:135] Filesystem UUIDs: 
map[7B77-95E7:/dev/nvme0n1p2 ac47c0ec-1b49-4e77-b601-2cce338fd605:/dev/nvme0n1p4 e3d81b41-1873-4344-b7fa-2e4aaad386b9:/dev/nvme0n1p3] Apr 16 22:13:24.895008 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.894939 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 22:13:24.900561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.900451 2576 manager.go:217] Machine: {Timestamp:2026-04-16 22:13:24.898610181 +0000 UTC m=+0.431951044 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3102923 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec25588f459961d92f73bd911553a3a7 SystemUUID:ec25588f-4599-61d9-2f73-bd911553a3a7 BootID:50556bfd-1c65-4d76-82cf-7ca06211e174 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:96:16:5b:0a:77 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:96:16:5b:0a:77 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:9e:f8:8b:d6:4b:e7 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 22:13:24.900561 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.900556 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
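The fs.go:135/136 entries above are cadvisor's filesystem probe mapping UUIDs and mount points to device and fsType before the manager.go:217 Machine summary is printed. A rough stdlib-only analogue that lists device-backed mounts from /proc/self/mountinfo (a sketch only; cadvisor filters and enriches these entries considerably):

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

func main() {
	mi, err := os.Open("/proc/self/mountinfo")
	if err != nil {
		panic(err)
	}
	defer mi.Close()
	sc := bufio.NewScanner(mi)
	for sc.Scan() {
		fields := strings.Fields(sc.Text())
		// mountinfo format: ... mountpoint(index 4) ... "-" fstype source ...
		sep := -1
		for i, f := range fields {
			if f == "-" {
				sep = i
				break
			}
		}
		if sep < 5 || sep+2 >= len(fields) {
			continue
		}
		mnt, fstype, src := fields[4], fields[sep+1], fields[sep+2]
		if strings.HasPrefix(src, "/dev/") {
			fmt.Printf("%s -> %s (%s)\n", src, mnt, fstype)
		}
	}
}
```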
Apr 16 22:13:24.900703 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.900691 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 22:13:24.903166 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.903139 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 22:13:24.903304 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.903168 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-227.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 22:13:24.903351 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.903313 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 22:13:24.903351 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.903325 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 22:13:24.903351 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.903338 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 22:13:24.903428 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.903353 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 22:13:24.904664 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.904653 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 22:13:24.904788 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.904779 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 22:13:24.907560 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.907550 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 16 22:13:24.907602 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.907565 2576 kubelet.go:386] "Adding static pod 
path" path="/etc/kubernetes/manifests" Apr 16 22:13:24.907602 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.907576 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 22:13:24.907602 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.907587 2576 kubelet.go:397] "Adding apiserver pod source" Apr 16 22:13:24.907602 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.907596 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 22:13:24.909302 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.909290 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 22:13:24.909348 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.909309 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 22:13:24.912907 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.912892 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 22:13:24.914162 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.914148 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 22:13:24.915632 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.915617 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 22:13:24.915716 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.915636 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 22:13:24.915716 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.915645 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 22:13:24.915716 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.915654 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 22:13:24.915716 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.915661 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 22:13:24.915716 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.915667 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 22:13:24.915716 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.915684 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 22:13:24.915716 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.915690 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 22:13:24.915716 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.915697 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 22:13:24.915716 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.915703 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 22:13:24.915716 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.915720 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 22:13:24.915987 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.915729 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 22:13:24.916513 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.916500 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 22:13:24.916513 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.916509 2576 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 16 22:13:24.922423 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:24.921067 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 22:13:24.922423 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:24.921068 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-227.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 22:13:24.922423 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.921242 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 22:13:24.922423 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.921307 2576 server.go:1295] "Started kubelet" Apr 16 22:13:24.922423 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.921590 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 22:13:24.922423 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.921655 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 22:13:24.922423 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.922036 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 22:13:24.922342 ip-10-0-130-227 systemd[1]: Started Kubernetes Kubelet. Apr 16 22:13:24.922968 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.922848 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 22:13:24.925383 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.925369 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 16 22:13:24.927528 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.927508 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-227.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 22:13:24.930646 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.930621 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dvm8w" Apr 16 22:13:24.931033 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.931011 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 22:13:24.931033 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.931028 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 22:13:24.931427 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:24.929936 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-227.ec2.internal.18a6f605204a81e6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-227.ec2.internal,UID:ip-10-0-130-227.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-227.ec2.internal,},FirstTimestamp:2026-04-16 22:13:24.921262566 +0000 UTC m=+0.454603410,LastTimestamp:2026-04-16 22:13:24.921262566 +0000 UTC m=+0.454603410,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-227.ec2.internal,}" Apr 16 22:13:24.931606 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.931593 2576 factory.go:55] Registering systemd factory Apr 16 22:13:24.931653 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.931645 2576 factory.go:223] Registration of the systemd container factory successfully Apr 16 22:13:24.931730 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.931654 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 22:13:24.931730 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.931644 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 22:13:24.931730 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.931694 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 22:13:24.931861 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.931831 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 16 22:13:24.931861 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.931840 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 16 22:13:24.931861 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:24.931851 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 22:13:24.932751 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:24.932716 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-227.ec2.internal\" not found" Apr 16 22:13:24.932894 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.932879 2576 factory.go:153] Registering CRI-O factory Apr 16 22:13:24.932952 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.932897 2576 factory.go:223] Registration of the crio container factory successfully Apr 16 22:13:24.932952 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.932947 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 22:13:24.933077 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.932970 2576 factory.go:103] Registering Raw factory Apr 16 22:13:24.933077 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.932984 2576 manager.go:1196] Started watching for new ooms in manager Apr 16 22:13:24.933402 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.933378 2576 manager.go:319] Starting recovery of all containers Apr 16 22:13:24.937390 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:24.937362 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 22:13:24.937525 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:24.937503 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io 
\"ip-10-0-130-227.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 22:13:24.937645 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.937623 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dvm8w" Apr 16 22:13:24.944653 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.944635 2576 manager.go:324] Recovery completed Apr 16 22:13:24.951041 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.951026 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:24.953550 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.953534 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:24.953623 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.953563 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:24.953623 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.953574 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:24.954100 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.954084 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 22:13:24.954100 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.954099 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 22:13:24.954185 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.954134 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 22:13:24.956227 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.956215 2576 policy_none.go:49] "None policy: Start" Apr 16 22:13:24.956272 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.956231 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 22:13:24.956272 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.956240 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 16 22:13:24.991460 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.987240 2576 manager.go:341] "Starting Device Plugin manager" Apr 16 22:13:24.991460 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:24.987270 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 22:13:24.991460 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.987280 2576 server.go:85] "Starting device plugin registration server" Apr 16 22:13:24.991460 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.987530 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 22:13:24.991460 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.987543 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 22:13:24.991460 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.987619 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 22:13:24.991460 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.987725 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 22:13:24.991460 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:24.987732 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 
Apr 16 22:13:24.991460 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:24.988329 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 22:13:24.991460 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:24.988372 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-227.ec2.internal\" not found"
Apr 16 22:13:25.013374 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.013337 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 22:13:25.014551 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.014536 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 22:13:25.014654 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.014564 2576 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 22:13:25.014654 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.014589 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 22:13:25.014654 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.014598 2576 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 22:13:25.014654 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:25.014638 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 22:13:25.016924 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.016907 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:25.088085 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.088005 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:25.089238 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.089221 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:25.089327 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.089253 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:25.089327 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.089263 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:25.089327 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.089288 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-227.ec2.internal"
Apr 16 22:13:25.098417 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.098397 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-227.ec2.internal"
Apr 16 22:13:25.098467 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:25.098423 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-227.ec2.internal\": node \"ip-10-0-130-227.ec2.internal\" not found"
Apr 16 22:13:25.114300 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:25.114281 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-227.ec2.internal\" not found"
Apr 16 22:13:25.115288 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.115269 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-227.ec2.internal"]
Apr 16 22:13:25.115345 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.115336 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:25.116647 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.116632 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:25.116744 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.116660 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:25.116744 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.116685 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:25.117956 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.117943 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:25.118146 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.118125 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal"
Apr 16 22:13:25.118220 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.118169 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:25.118631 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.118612 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:25.118741 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.118637 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:25.118741 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.118646 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:25.118832 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.118803 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:25.118832 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.118829 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:25.118904 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.118843 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:25.120020 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.120007 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-227.ec2.internal"
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-227.ec2.internal" Apr 16 22:13:25.120079 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.120029 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:25.120654 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.120637 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:25.120756 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.120689 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:25.120756 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.120704 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:25.133277 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.133251 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1e12d5376c1eb684628dd9cf17dac4a9-config\") pod \"kube-apiserver-proxy-ip-10-0-130-227.ec2.internal\" (UID: \"1e12d5376c1eb684628dd9cf17dac4a9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-227.ec2.internal" Apr 16 22:13:25.133383 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.133283 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0de14f2d8a312df1fd0f43b1fd02a43e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal\" (UID: \"0de14f2d8a312df1fd0f43b1fd02a43e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" Apr 16 22:13:25.133383 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.133304 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0de14f2d8a312df1fd0f43b1fd02a43e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal\" (UID: \"0de14f2d8a312df1fd0f43b1fd02a43e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" Apr 16 22:13:25.145639 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:25.145618 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-227.ec2.internal\" not found" node="ip-10-0-130-227.ec2.internal" Apr 16 22:13:25.150215 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:25.150198 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-227.ec2.internal\" not found" node="ip-10-0-130-227.ec2.internal" Apr 16 22:13:25.214749 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:25.214718 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-227.ec2.internal\" not found" Apr 16 22:13:25.233512 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.233488 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1e12d5376c1eb684628dd9cf17dac4a9-config\") pod \"kube-apiserver-proxy-ip-10-0-130-227.ec2.internal\" (UID: \"1e12d5376c1eb684628dd9cf17dac4a9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-227.ec2.internal" Apr 16 22:13:25.233614 ip-10-0-130-227 
kubenswrapper[2576]: I0416 22:13:25.233522 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1e12d5376c1eb684628dd9cf17dac4a9-config\") pod \"kube-apiserver-proxy-ip-10-0-130-227.ec2.internal\" (UID: \"1e12d5376c1eb684628dd9cf17dac4a9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-227.ec2.internal" Apr 16 22:13:25.233614 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.233582 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0de14f2d8a312df1fd0f43b1fd02a43e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal\" (UID: \"0de14f2d8a312df1fd0f43b1fd02a43e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" Apr 16 22:13:25.233738 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.233618 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0de14f2d8a312df1fd0f43b1fd02a43e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal\" (UID: \"0de14f2d8a312df1fd0f43b1fd02a43e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" Apr 16 22:13:25.233738 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.233652 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0de14f2d8a312df1fd0f43b1fd02a43e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal\" (UID: \"0de14f2d8a312df1fd0f43b1fd02a43e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" Apr 16 22:13:25.233738 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.233656 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0de14f2d8a312df1fd0f43b1fd02a43e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal\" (UID: \"0de14f2d8a312df1fd0f43b1fd02a43e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" Apr 16 22:13:25.315635 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:25.315605 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-227.ec2.internal\" not found" Apr 16 22:13:25.416394 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:25.416318 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-227.ec2.internal\" not found" Apr 16 22:13:25.448933 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.448902 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" Apr 16 22:13:25.452494 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.452475 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-227.ec2.internal" Apr 16 22:13:25.517342 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:25.517304 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-227.ec2.internal\" not found" Apr 16 22:13:25.617868 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:25.617829 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-227.ec2.internal\" not found" Apr 16 22:13:25.718375 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:25.718296 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-227.ec2.internal\" not found" Apr 16 22:13:25.740439 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.740414 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:25.799896 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.799869 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:25.829849 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.829824 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 22:13:25.830453 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.829951 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 22:13:25.830453 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.829971 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 22:13:25.830453 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.829975 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 22:13:25.831843 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.831827 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" Apr 16 22:13:25.875452 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.875419 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 22:13:25.876439 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.876417 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-227.ec2.internal" Apr 16 22:13:25.885397 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.885382 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 22:13:25.908525 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.908493 2576 apiserver.go:52] "Watching apiserver" Apr 16 22:13:25.917956 ip-10-0-130-227 kubenswrapper[2576]: 
Apr 16 22:13:25.918288 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.918266 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-6lpzp","kube-system/kube-apiserver-proxy-ip-10-0-130-227.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk","openshift-cluster-node-tuning-operator/tuned-jdbf4","openshift-image-registry/node-ca-xr5zn","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal","openshift-multus/multus-additional-cni-plugins-xn6mm","openshift-multus/multus-hqm5z","openshift-multus/network-metrics-daemon-lpwn6","openshift-network-diagnostics/network-check-target-4kjg5","openshift-network-operator/iptables-alerter-7mhr4","openshift-ovn-kubernetes/ovnkube-node-k24wz"]
Apr 16 22:13:25.921057 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.921037 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6lpzp"
Apr 16 22:13:25.922213 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.922192 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk"
Apr 16 22:13:25.922393 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.922345 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jdbf4"
Apr 16 22:13:25.923357 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.923342 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xr5zn"
Apr 16 22:13:25.923874 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.923854 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-k5xc6\""
Apr 16 22:13:25.923957 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.923926 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 22:13:25.924082 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.924066 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 22:13:25.924700 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.924663 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xn6mm"
Apr 16 22:13:25.924767 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.924754 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 22:13:25.924818 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.924799 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 22:13:25.924867 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.924820 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-2mrwm\""
Apr 16 22:13:25.925184 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.925161 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 22:13:25.925271 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.925256 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 22:13:25.925393 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.925379 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-tqlck\""
Apr 16 22:13:25.925433 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.925419 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 22:13:25.925604 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.925590 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 22:13:25.925747 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.925728 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 22:13:25.925865 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.925825 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 22:13:25.925988 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.925967 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hqm5z"
Apr 16 22:13:25.926387 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.926368 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vpmb5\""
Apr 16 22:13:25.926900 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.926886 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 22:13:25.927086 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.927069 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lpwn6"
Apr 16 22:13:25.927231 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:25.927189 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lpwn6" podUID="ddd2c4f1-e24c-431c-a59a-9936d01e4667"
Apr 16 22:13:25.927361 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.927284 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 22:13:25.927792 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.927457 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 22:13:25.927792 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.927602 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-fqgp6\""
Apr 16 22:13:25.927792 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.927647 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 22:13:25.927952 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.927800 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 22:13:25.928027 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.928010 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 22:13:25.930576 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.928601 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-lff65\""
Apr 16 22:13:25.931256 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.931239 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 22:13:25.932052 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.932037 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjg5"
Apr 16 22:13:25.932111 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:25.932092 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjg5" podUID="42680cd4-0a5c-4123-aae1-963237fa5b60"
Apr 16 22:13:25.932156 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.932136 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7mhr4"
Apr 16 22:13:25.933427 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.933413 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k24wz"
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:25.934539 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.934522 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 22:13:25.934624 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.934547 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:13:25.934624 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.934565 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-9sh2z\"" Apr 16 22:13:25.934782 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.934763 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 22:13:25.934925 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:25.934899 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0de14f2d8a312df1fd0f43b1fd02a43e.slice/crio-c406eb147ba499327884e60caaa9df54de97d7a60c17dccf0b37c122a993d890 WatchSource:0}: Error finding container c406eb147ba499327884e60caaa9df54de97d7a60c17dccf0b37c122a993d890: Status 404 returned error can't find the container with id c406eb147ba499327884e60caaa9df54de97d7a60c17dccf0b37c122a993d890 Apr 16 22:13:25.935138 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:25.935120 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e12d5376c1eb684628dd9cf17dac4a9.slice/crio-43836f6a4118c25c840b6111a5cbdf27f9e8b7ebc20ae2e08d477a01cb183097 WatchSource:0}: Error finding container 43836f6a4118c25c840b6111a5cbdf27f9e8b7ebc20ae2e08d477a01cb183097: Status 404 returned error can't find the container with id 43836f6a4118c25c840b6111a5cbdf27f9e8b7ebc20ae2e08d477a01cb183097 Apr 16 22:13:25.935557 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.935538 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 22:13:25.935666 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.935649 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 22:13:25.935739 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.935634 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 22:13:25.935939 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.935922 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 22:13:25.936014 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.935947 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pppvr\"" Apr 16 22:13:25.936014 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.935994 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 22:13:25.936091 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.936046 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 22:13:25.937867 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.937765 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/77967758-2942-4cb4-aa90-9b16761c46b3-multus-daemon-config\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:25.937867 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.937793 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/365e92c8-bcc3-46c2-9512-0fa95057726c-host-slash\") pod \"iptables-alerter-7mhr4\" (UID: \"365e92c8-bcc3-46c2-9512-0fa95057726c\") " pod="openshift-network-operator/iptables-alerter-7mhr4" Apr 16 22:13:25.937867 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.937812 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d3ae4fa7-c3ae-4639-be2a-f39b4666fae0-agent-certs\") pod \"konnectivity-agent-6lpzp\" (UID: \"d3ae4fa7-c3ae-4639-be2a-f39b4666fae0\") " pod="kube-system/konnectivity-agent-6lpzp" Apr 16 22:13:25.937867 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.937837 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2ssr\" (UniqueName: \"kubernetes.io/projected/fab36813-c9b9-4c2c-aa71-55346680c966-kube-api-access-b2ssr\") pod \"multus-additional-cni-plugins-xn6mm\" (UID: \"fab36813-c9b9-4c2c-aa71-55346680c966\") " pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:25.938102 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.937872 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-cnibin\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:25.938102 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.937907 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-lib-modules\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:25.938102 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.937983 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fab36813-c9b9-4c2c-aa71-55346680c966-cnibin\") pod \"multus-additional-cni-plugins-xn6mm\" (UID: \"fab36813-c9b9-4c2c-aa71-55346680c966\") " pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:25.938102 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938033 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/365e92c8-bcc3-46c2-9512-0fa95057726c-iptables-alerter-script\") pod \"iptables-alerter-7mhr4\" (UID: \"365e92c8-bcc3-46c2-9512-0fa95057726c\") " pod="openshift-network-operator/iptables-alerter-7mhr4" Apr 16 22:13:25.938102 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938076 
2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq9hk\" (UniqueName: \"kubernetes.io/projected/f9a6741d-2664-481e-bd4b-f57d10d85ede-kube-api-access-vq9hk\") pod \"aws-ebs-csi-driver-node-pkdtk\" (UID: \"f9a6741d-2664-481e-bd4b-f57d10d85ede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" Apr 16 22:13:25.938330 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938112 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fab36813-c9b9-4c2c-aa71-55346680c966-cni-binary-copy\") pod \"multus-additional-cni-plugins-xn6mm\" (UID: \"fab36813-c9b9-4c2c-aa71-55346680c966\") " pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:25.938330 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938218 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a72c732d-41ba-4c13-bf18-c21f9bf94968-tmp\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:25.938330 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938265 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fab36813-c9b9-4c2c-aa71-55346680c966-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xn6mm\" (UID: \"fab36813-c9b9-4c2c-aa71-55346680c966\") " pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:25.938330 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938303 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fab36813-c9b9-4c2c-aa71-55346680c966-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xn6mm\" (UID: \"fab36813-c9b9-4c2c-aa71-55346680c966\") " pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:25.938511 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938333 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjm7m\" (UniqueName: \"kubernetes.io/projected/ddd2c4f1-e24c-431c-a59a-9936d01e4667-kube-api-access-zjm7m\") pod \"network-metrics-daemon-lpwn6\" (UID: \"ddd2c4f1-e24c-431c-a59a-9936d01e4667\") " pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:25.938511 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938358 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-os-release\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:25.938511 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938383 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-host-var-lib-cni-bin\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:25.938511 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938435 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft7zv\" (UniqueName: \"kubernetes.io/projected/365e92c8-bcc3-46c2-9512-0fa95057726c-kube-api-access-ft7zv\") pod \"iptables-alerter-7mhr4\" (UID: \"365e92c8-bcc3-46c2-9512-0fa95057726c\") " pod="openshift-network-operator/iptables-alerter-7mhr4" Apr 16 22:13:25.938511 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938472 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-etc-systemd\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:25.938511 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938505 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f9a6741d-2664-481e-bd4b-f57d10d85ede-etc-selinux\") pod \"aws-ebs-csi-driver-node-pkdtk\" (UID: \"f9a6741d-2664-481e-bd4b-f57d10d85ede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" Apr 16 22:13:25.938996 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938539 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-etc-sysctl-d\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:25.938996 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938582 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-run\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:25.938996 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938607 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5fgg\" (UniqueName: \"kubernetes.io/projected/a72c732d-41ba-4c13-bf18-c21f9bf94968-kube-api-access-k5fgg\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:25.938996 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938630 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-host-run-netns\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:25.938996 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938656 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-host-var-lib-cni-multus\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:25.938996 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938695 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/f9a6741d-2664-481e-bd4b-f57d10d85ede-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pkdtk\" (UID: \"f9a6741d-2664-481e-bd4b-f57d10d85ede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" Apr 16 22:13:25.938996 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938719 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f9a6741d-2664-481e-bd4b-f57d10d85ede-registration-dir\") pod \"aws-ebs-csi-driver-node-pkdtk\" (UID: \"f9a6741d-2664-481e-bd4b-f57d10d85ede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" Apr 16 22:13:25.938996 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938743 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-etc-sysctl-conf\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:25.938996 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938765 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f9a6741d-2664-481e-bd4b-f57d10d85ede-device-dir\") pod \"aws-ebs-csi-driver-node-pkdtk\" (UID: \"f9a6741d-2664-481e-bd4b-f57d10d85ede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" Apr 16 22:13:25.938996 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938802 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-etc-modprobe-d\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:25.938996 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938835 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-host\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:25.938996 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938860 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-multus-conf-dir\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:25.938996 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938884 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-etc-kubernetes\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:25.938996 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938908 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d3ae4fa7-c3ae-4639-be2a-f39b4666fae0-konnectivity-ca\") pod \"konnectivity-agent-6lpzp\" (UID: 
\"d3ae4fa7-c3ae-4639-be2a-f39b4666fae0\") " pod="kube-system/konnectivity-agent-6lpzp" Apr 16 22:13:25.938996 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938930 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07d69a7c-22b7-44a4-8aea-024e47f7912b-host\") pod \"node-ca-xr5zn\" (UID: \"07d69a7c-22b7-44a4-8aea-024e47f7912b\") " pod="openshift-image-registry/node-ca-xr5zn" Apr 16 22:13:25.938996 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938953 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/07d69a7c-22b7-44a4-8aea-024e47f7912b-serviceca\") pod \"node-ca-xr5zn\" (UID: \"07d69a7c-22b7-44a4-8aea-024e47f7912b\") " pod="openshift-image-registry/node-ca-xr5zn" Apr 16 22:13:25.938996 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.938996 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-hostroot\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:25.939880 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.939027 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs\") pod \"network-metrics-daemon-lpwn6\" (UID: \"ddd2c4f1-e24c-431c-a59a-9936d01e4667\") " pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:25.939880 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.939051 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fab36813-c9b9-4c2c-aa71-55346680c966-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xn6mm\" (UID: \"fab36813-c9b9-4c2c-aa71-55346680c966\") " pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:25.939880 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.939070 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-multus-cni-dir\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:25.939880 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.939086 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-host-var-lib-kubelet\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:25.939880 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.939107 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrltt\" (UniqueName: \"kubernetes.io/projected/77967758-2942-4cb4-aa90-9b16761c46b3-kube-api-access-nrltt\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:25.939880 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.939132 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f9a6741d-2664-481e-bd4b-f57d10d85ede-sys-fs\") pod \"aws-ebs-csi-driver-node-pkdtk\" (UID: \"f9a6741d-2664-481e-bd4b-f57d10d85ede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" Apr 16 22:13:25.939880 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.939153 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-sys\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:25.939880 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.939171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-etc-kubernetes\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:25.939880 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.939186 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fab36813-c9b9-4c2c-aa71-55346680c966-system-cni-dir\") pod \"multus-additional-cni-plugins-xn6mm\" (UID: \"fab36813-c9b9-4c2c-aa71-55346680c966\") " pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:25.939880 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.939209 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/77967758-2942-4cb4-aa90-9b16761c46b3-cni-binary-copy\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:25.939880 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.939232 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-host-run-multus-certs\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:25.939880 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.939264 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f9a6741d-2664-481e-bd4b-f57d10d85ede-socket-dir\") pod \"aws-ebs-csi-driver-node-pkdtk\" (UID: \"f9a6741d-2664-481e-bd4b-f57d10d85ede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" Apr 16 22:13:25.939880 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.939290 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-system-cni-dir\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:25.939880 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.939312 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58tlk\" (UniqueName: \"kubernetes.io/projected/42680cd4-0a5c-4123-aae1-963237fa5b60-kube-api-access-58tlk\") pod \"network-check-target-4kjg5\" (UID: 
\"42680cd4-0a5c-4123-aae1-963237fa5b60\") " pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:25.939880 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.939355 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a72c732d-41ba-4c13-bf18-c21f9bf94968-etc-tuned\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:25.939880 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.939382 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9mlz\" (UniqueName: \"kubernetes.io/projected/07d69a7c-22b7-44a4-8aea-024e47f7912b-kube-api-access-d9mlz\") pod \"node-ca-xr5zn\" (UID: \"07d69a7c-22b7-44a4-8aea-024e47f7912b\") " pod="openshift-image-registry/node-ca-xr5zn" Apr 16 22:13:25.940523 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.939410 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-etc-sysconfig\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:25.940523 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.939419 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 22:08:24 +0000 UTC" deadline="2027-12-15 08:56:22.032665999 +0000 UTC" Apr 16 22:13:25.940523 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.939443 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14578h42m56.093226465s" Apr 16 22:13:25.940523 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.939445 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-var-lib-kubelet\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:25.940523 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.939470 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fab36813-c9b9-4c2c-aa71-55346680c966-os-release\") pod \"multus-additional-cni-plugins-xn6mm\" (UID: \"fab36813-c9b9-4c2c-aa71-55346680c966\") " pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:25.940523 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.939505 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-multus-socket-dir-parent\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:25.940523 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.939538 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-host-run-k8s-cni-cncf-io\") pod \"multus-hqm5z\" (UID: 
\"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:25.940523 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.939588 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:13:25.943971 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.943950 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 22:13:25.964470 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.964453 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-b8fpm" Apr 16 22:13:25.977711 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:25.977665 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-b8fpm" Apr 16 22:13:26.017885 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.017836 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-227.ec2.internal" event={"ID":"1e12d5376c1eb684628dd9cf17dac4a9","Type":"ContainerStarted","Data":"43836f6a4118c25c840b6111a5cbdf27f9e8b7ebc20ae2e08d477a01cb183097"} Apr 16 22:13:26.018742 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.018713 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" event={"ID":"0de14f2d8a312df1fd0f43b1fd02a43e","Type":"ContainerStarted","Data":"c406eb147ba499327884e60caaa9df54de97d7a60c17dccf0b37c122a993d890"} Apr 16 22:13:26.032532 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.032513 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 22:13:26.040122 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040097 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a72c732d-41ba-4c13-bf18-c21f9bf94968-tmp\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.040226 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040125 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fab36813-c9b9-4c2c-aa71-55346680c966-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xn6mm\" (UID: \"fab36813-c9b9-4c2c-aa71-55346680c966\") " pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:26.040226 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fab36813-c9b9-4c2c-aa71-55346680c966-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xn6mm\" (UID: \"fab36813-c9b9-4c2c-aa71-55346680c966\") " pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:26.040226 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040159 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjm7m\" (UniqueName: \"kubernetes.io/projected/ddd2c4f1-e24c-431c-a59a-9936d01e4667-kube-api-access-zjm7m\") pod \"network-metrics-daemon-lpwn6\" (UID: \"ddd2c4f1-e24c-431c-a59a-9936d01e4667\") " 
pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:26.040226 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040179 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-etc-openvswitch\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.040226 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040196 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-os-release\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.040226 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040215 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-host-var-lib-cni-bin\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.040437 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040239 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ft7zv\" (UniqueName: \"kubernetes.io/projected/365e92c8-bcc3-46c2-9512-0fa95057726c-kube-api-access-ft7zv\") pod \"iptables-alerter-7mhr4\" (UID: \"365e92c8-bcc3-46c2-9512-0fa95057726c\") " pod="openshift-network-operator/iptables-alerter-7mhr4" Apr 16 22:13:26.040437 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040263 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/47402cab-7cba-42f8-be72-8d67c31c2e7c-ovnkube-script-lib\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.040437 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040290 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-etc-systemd\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.040437 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040314 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f9a6741d-2664-481e-bd4b-f57d10d85ede-etc-selinux\") pod \"aws-ebs-csi-driver-node-pkdtk\" (UID: \"f9a6741d-2664-481e-bd4b-f57d10d85ede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" Apr 16 22:13:26.040437 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040317 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-os-release\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.040437 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040338 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-host-run-ovn-kubernetes\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.040437 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040383 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-etc-systemd\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.040437 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-etc-sysctl-d\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.040437 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040404 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-host-var-lib-cni-bin\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.040437 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040438 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f9a6741d-2664-481e-bd4b-f57d10d85ede-etc-selinux\") pod \"aws-ebs-csi-driver-node-pkdtk\" (UID: \"f9a6741d-2664-481e-bd4b-f57d10d85ede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" Apr 16 22:13:26.040905 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040488 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-run\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.040905 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040497 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 22:13:26.040905 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040533 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5fgg\" (UniqueName: \"kubernetes.io/projected/a72c732d-41ba-4c13-bf18-c21f9bf94968-kube-api-access-k5fgg\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.040905 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040557 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-etc-sysctl-d\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.040905 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040561 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-host-run-netns\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.040905 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040577 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-run\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.040905 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040599 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-host-run-netns\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.040905 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040688 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-host-var-lib-cni-multus\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.040905 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040711 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f9a6741d-2664-481e-bd4b-f57d10d85ede-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pkdtk\" (UID: \"f9a6741d-2664-481e-bd4b-f57d10d85ede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" Apr 16 22:13:26.040905 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040736 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f9a6741d-2664-481e-bd4b-f57d10d85ede-registration-dir\") pod \"aws-ebs-csi-driver-node-pkdtk\" (UID: \"f9a6741d-2664-481e-bd4b-f57d10d85ede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" Apr 16 22:13:26.040905 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040761 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-host-var-lib-cni-multus\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.040905 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040767 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-host-run-netns\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.040905 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040789 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f9a6741d-2664-481e-bd4b-f57d10d85ede-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pkdtk\" (UID: \"f9a6741d-2664-481e-bd4b-f57d10d85ede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" Apr 16 22:13:26.040905 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040801 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-etc-sysctl-conf\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.040905 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040808 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f9a6741d-2664-481e-bd4b-f57d10d85ede-registration-dir\") pod \"aws-ebs-csi-driver-node-pkdtk\" (UID: \"f9a6741d-2664-481e-bd4b-f57d10d85ede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" Apr 16 22:13:26.040905 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040827 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f9a6741d-2664-481e-bd4b-f57d10d85ede-device-dir\") pod \"aws-ebs-csi-driver-node-pkdtk\" (UID: \"f9a6741d-2664-481e-bd4b-f57d10d85ede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" Apr 16 22:13:26.040905 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040854 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/47402cab-7cba-42f8-be72-8d67c31c2e7c-ovn-node-metrics-cert\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.040905 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040877 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-etc-modprobe-d\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.041624 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040898 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-host\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.041624 ip-10-0-130-227 
kubenswrapper[2576]: I0416 22:13:26.040913 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f9a6741d-2664-481e-bd4b-f57d10d85ede-device-dir\") pod \"aws-ebs-csi-driver-node-pkdtk\" (UID: \"f9a6741d-2664-481e-bd4b-f57d10d85ede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" Apr 16 22:13:26.041624 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040926 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-etc-sysctl-conf\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.041624 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040930 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-multus-conf-dir\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.041624 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040954 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-etc-kubernetes\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.041624 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040979 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-host\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.041624 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040986 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-multus-conf-dir\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.041624 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.040981 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-node-log\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.041624 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041006 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-etc-modprobe-d\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.041624 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041031 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d3ae4fa7-c3ae-4639-be2a-f39b4666fae0-konnectivity-ca\") pod \"konnectivity-agent-6lpzp\" (UID: \"d3ae4fa7-c3ae-4639-be2a-f39b4666fae0\") " pod="kube-system/konnectivity-agent-6lpzp" Apr 16 22:13:26.041624 ip-10-0-130-227 
kubenswrapper[2576]: I0416 22:13:26.041047 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07d69a7c-22b7-44a4-8aea-024e47f7912b-host\") pod \"node-ca-xr5zn\" (UID: \"07d69a7c-22b7-44a4-8aea-024e47f7912b\") " pod="openshift-image-registry/node-ca-xr5zn" Apr 16 22:13:26.041624 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041052 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-etc-kubernetes\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.041624 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041093 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/07d69a7c-22b7-44a4-8aea-024e47f7912b-serviceca\") pod \"node-ca-xr5zn\" (UID: \"07d69a7c-22b7-44a4-8aea-024e47f7912b\") " pod="openshift-image-registry/node-ca-xr5zn" Apr 16 22:13:26.041624 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041112 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07d69a7c-22b7-44a4-8aea-024e47f7912b-host\") pod \"node-ca-xr5zn\" (UID: \"07d69a7c-22b7-44a4-8aea-024e47f7912b\") " pod="openshift-image-registry/node-ca-xr5zn" Apr 16 22:13:26.041624 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041158 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-hostroot\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.041624 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041184 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs\") pod \"network-metrics-daemon-lpwn6\" (UID: \"ddd2c4f1-e24c-431c-a59a-9936d01e4667\") " pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:26.041624 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041211 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-run-openvswitch\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.041624 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041212 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-hostroot\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.042354 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041243 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fab36813-c9b9-4c2c-aa71-55346680c966-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xn6mm\" (UID: \"fab36813-c9b9-4c2c-aa71-55346680c966\") " pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:26.042354 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041260 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fab36813-c9b9-4c2c-aa71-55346680c966-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xn6mm\" (UID: \"fab36813-c9b9-4c2c-aa71-55346680c966\") " pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:26.042354 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041323 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fab36813-c9b9-4c2c-aa71-55346680c966-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xn6mm\" (UID: \"fab36813-c9b9-4c2c-aa71-55346680c966\") " pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:26.042354 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041360 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-multus-cni-dir\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.042354 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:26.041371 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:26.042354 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041391 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-host-var-lib-kubelet\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.042354 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041416 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrltt\" (UniqueName: \"kubernetes.io/projected/77967758-2942-4cb4-aa90-9b16761c46b3-kube-api-access-nrltt\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.042354 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041417 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-multus-cni-dir\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.042354 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041437 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fab36813-c9b9-4c2c-aa71-55346680c966-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xn6mm\" (UID: \"fab36813-c9b9-4c2c-aa71-55346680c966\") " pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:26.042354 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041442 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f9a6741d-2664-481e-bd4b-f57d10d85ede-sys-fs\") pod \"aws-ebs-csi-driver-node-pkdtk\" (UID: \"f9a6741d-2664-481e-bd4b-f57d10d85ede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" Apr 16 22:13:26.042354 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041475 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-host-slash\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.042354 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041490 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f9a6741d-2664-481e-bd4b-f57d10d85ede-sys-fs\") pod \"aws-ebs-csi-driver-node-pkdtk\" (UID: \"f9a6741d-2664-481e-bd4b-f57d10d85ede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" Apr 16 22:13:26.042354 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041499 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/47402cab-7cba-42f8-be72-8d67c31c2e7c-ovnkube-config\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.042354 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:26.041538 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs podName:ddd2c4f1-e24c-431c-a59a-9936d01e4667 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:26.541506037 +0000 UTC m=+2.074846882 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs") pod "network-metrics-daemon-lpwn6" (UID: "ddd2c4f1-e24c-431c-a59a-9936d01e4667") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:26.042354 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041545 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-host-var-lib-kubelet\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.042354 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041606 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-sys\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.042354 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041606 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/07d69a7c-22b7-44a4-8aea-024e47f7912b-serviceca\") pod \"node-ca-xr5zn\" (UID: \"07d69a7c-22b7-44a4-8aea-024e47f7912b\") " pod="openshift-image-registry/node-ca-xr5zn" Apr 16 22:13:26.043019 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041636 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-log-socket\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.043019 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041655 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-sys\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.043019 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041711 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-etc-kubernetes\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.043019 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041729 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fab36813-c9b9-4c2c-aa71-55346680c966-system-cni-dir\") pod \"multus-additional-cni-plugins-xn6mm\" (UID: \"fab36813-c9b9-4c2c-aa71-55346680c966\") " pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:26.043019 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041759 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/77967758-2942-4cb4-aa90-9b16761c46b3-cni-binary-copy\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.043019 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041774 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-host-run-multus-certs\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.043019 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041779 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-etc-kubernetes\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.043019 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f9a6741d-2664-481e-bd4b-f57d10d85ede-socket-dir\") pod \"aws-ebs-csi-driver-node-pkdtk\" (UID: \"f9a6741d-2664-481e-bd4b-f57d10d85ede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" Apr 16 22:13:26.043019 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041830 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/47402cab-7cba-42f8-be72-8d67c31c2e7c-env-overrides\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.043019 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041842 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-host-run-multus-certs\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.043019 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041848 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-system-cni-dir\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.043019 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041880 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58tlk\" (UniqueName: \"kubernetes.io/projected/42680cd4-0a5c-4123-aae1-963237fa5b60-kube-api-access-58tlk\") pod \"network-check-target-4kjg5\" (UID: \"42680cd4-0a5c-4123-aae1-963237fa5b60\") " pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:26.043019 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041923 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a72c732d-41ba-4c13-bf18-c21f9bf94968-etc-tuned\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.043019 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041929 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fab36813-c9b9-4c2c-aa71-55346680c966-system-cni-dir\") pod \"multus-additional-cni-plugins-xn6mm\" (UID: \"fab36813-c9b9-4c2c-aa71-55346680c966\") " pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:26.043019 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041896 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-system-cni-dir\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.043019 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.041972 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f9a6741d-2664-481e-bd4b-f57d10d85ede-socket-dir\") pod \"aws-ebs-csi-driver-node-pkdtk\" (UID: \"f9a6741d-2664-481e-bd4b-f57d10d85ede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" Apr 16 22:13:26.043019 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042000 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d3ae4fa7-c3ae-4639-be2a-f39b4666fae0-konnectivity-ca\") pod \"konnectivity-agent-6lpzp\" (UID: \"d3ae4fa7-c3ae-4639-be2a-f39b4666fae0\") " pod="kube-system/konnectivity-agent-6lpzp" Apr 16 22:13:26.043546 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042084 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d9mlz\" (UniqueName: \"kubernetes.io/projected/07d69a7c-22b7-44a4-8aea-024e47f7912b-kube-api-access-d9mlz\") pod \"node-ca-xr5zn\" (UID: \"07d69a7c-22b7-44a4-8aea-024e47f7912b\") " pod="openshift-image-registry/node-ca-xr5zn" Apr 16 22:13:26.043546 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042111 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-etc-sysconfig\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.043546 ip-10-0-130-227 
kubenswrapper[2576]: I0416 22:13:26.042162 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-etc-sysconfig\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.043546 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042245 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-var-lib-kubelet\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.043546 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fab36813-c9b9-4c2c-aa71-55346680c966-os-release\") pod \"multus-additional-cni-plugins-xn6mm\" (UID: \"fab36813-c9b9-4c2c-aa71-55346680c966\") " pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:26.043546 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042293 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-multus-socket-dir-parent\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.043546 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042315 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/77967758-2942-4cb4-aa90-9b16761c46b3-cni-binary-copy\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.043546 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-host-run-k8s-cni-cncf-io\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.043546 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042351 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-var-lib-kubelet\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.043546 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042355 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-multus-socket-dir-parent\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.043546 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042356 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/77967758-2942-4cb4-aa90-9b16761c46b3-multus-daemon-config\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 
16 22:13:26.043546 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042358 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fab36813-c9b9-4c2c-aa71-55346680c966-os-release\") pod \"multus-additional-cni-plugins-xn6mm\" (UID: \"fab36813-c9b9-4c2c-aa71-55346680c966\") " pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:26.043546 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042391 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-host-run-k8s-cni-cncf-io\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.043546 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042442 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/365e92c8-bcc3-46c2-9512-0fa95057726c-host-slash\") pod \"iptables-alerter-7mhr4\" (UID: \"365e92c8-bcc3-46c2-9512-0fa95057726c\") " pod="openshift-network-operator/iptables-alerter-7mhr4" Apr 16 22:13:26.043546 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042475 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-systemd-units\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.043546 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042512 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d3ae4fa7-c3ae-4639-be2a-f39b4666fae0-agent-certs\") pod \"konnectivity-agent-6lpzp\" (UID: \"d3ae4fa7-c3ae-4639-be2a-f39b4666fae0\") " pod="kube-system/konnectivity-agent-6lpzp" Apr 16 22:13:26.043546 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042519 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/365e92c8-bcc3-46c2-9512-0fa95057726c-host-slash\") pod \"iptables-alerter-7mhr4\" (UID: \"365e92c8-bcc3-46c2-9512-0fa95057726c\") " pod="openshift-network-operator/iptables-alerter-7mhr4" Apr 16 22:13:26.044067 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042541 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2ssr\" (UniqueName: \"kubernetes.io/projected/fab36813-c9b9-4c2c-aa71-55346680c966-kube-api-access-b2ssr\") pod \"multus-additional-cni-plugins-xn6mm\" (UID: \"fab36813-c9b9-4c2c-aa71-55346680c966\") " pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:26.044067 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042571 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-cnibin\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.044067 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042632 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/77967758-2942-4cb4-aa90-9b16761c46b3-cnibin\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " 
pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.044067 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042716 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-host-cni-bin\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.044067 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042751 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.044067 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042811 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9vg2\" (UniqueName: \"kubernetes.io/projected/47402cab-7cba-42f8-be72-8d67c31c2e7c-kube-api-access-s9vg2\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.044067 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042838 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/77967758-2942-4cb4-aa90-9b16761c46b3-multus-daemon-config\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.044067 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042847 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-lib-modules\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.044067 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042868 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fab36813-c9b9-4c2c-aa71-55346680c966-cnibin\") pod \"multus-additional-cni-plugins-xn6mm\" (UID: \"fab36813-c9b9-4c2c-aa71-55346680c966\") " pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:26.044067 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042891 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/365e92c8-bcc3-46c2-9512-0fa95057726c-iptables-alerter-script\") pod \"iptables-alerter-7mhr4\" (UID: \"365e92c8-bcc3-46c2-9512-0fa95057726c\") " pod="openshift-network-operator/iptables-alerter-7mhr4" Apr 16 22:13:26.044067 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042912 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vq9hk\" (UniqueName: \"kubernetes.io/projected/f9a6741d-2664-481e-bd4b-f57d10d85ede-kube-api-access-vq9hk\") pod \"aws-ebs-csi-driver-node-pkdtk\" (UID: \"f9a6741d-2664-481e-bd4b-f57d10d85ede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" Apr 16 22:13:26.044067 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042947 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fab36813-c9b9-4c2c-aa71-55346680c966-cnibin\") pod \"multus-additional-cni-plugins-xn6mm\" (UID: \"fab36813-c9b9-4c2c-aa71-55346680c966\") " pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:26.044067 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042950 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-host-kubelet\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.044067 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042989 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-var-lib-openvswitch\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.044067 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.043017 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-run-ovn\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.044067 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.042951 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a72c732d-41ba-4c13-bf18-c21f9bf94968-lib-modules\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.044067 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.043045 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-host-cni-netd\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.044511 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.043078 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fab36813-c9b9-4c2c-aa71-55346680c966-cni-binary-copy\") pod \"multus-additional-cni-plugins-xn6mm\" (UID: \"fab36813-c9b9-4c2c-aa71-55346680c966\") " pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:26.044511 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.043157 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-run-systemd\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.044511 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.043468 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/365e92c8-bcc3-46c2-9512-0fa95057726c-iptables-alerter-script\") pod \"iptables-alerter-7mhr4\" (UID: 
\"365e92c8-bcc3-46c2-9512-0fa95057726c\") " pod="openshift-network-operator/iptables-alerter-7mhr4" Apr 16 22:13:26.044511 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.043485 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a72c732d-41ba-4c13-bf18-c21f9bf94968-tmp\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.044511 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.043875 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fab36813-c9b9-4c2c-aa71-55346680c966-cni-binary-copy\") pod \"multus-additional-cni-plugins-xn6mm\" (UID: \"fab36813-c9b9-4c2c-aa71-55346680c966\") " pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:26.044511 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.044233 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a72c732d-41ba-4c13-bf18-c21f9bf94968-etc-tuned\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.044828 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.044804 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d3ae4fa7-c3ae-4639-be2a-f39b4666fae0-agent-certs\") pod \"konnectivity-agent-6lpzp\" (UID: \"d3ae4fa7-c3ae-4639-be2a-f39b4666fae0\") " pod="kube-system/konnectivity-agent-6lpzp" Apr 16 22:13:26.048331 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.048312 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjm7m\" (UniqueName: \"kubernetes.io/projected/ddd2c4f1-e24c-431c-a59a-9936d01e4667-kube-api-access-zjm7m\") pod \"network-metrics-daemon-lpwn6\" (UID: \"ddd2c4f1-e24c-431c-a59a-9936d01e4667\") " pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:26.049082 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.049065 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5fgg\" (UniqueName: \"kubernetes.io/projected/a72c732d-41ba-4c13-bf18-c21f9bf94968-kube-api-access-k5fgg\") pod \"tuned-jdbf4\" (UID: \"a72c732d-41ba-4c13-bf18-c21f9bf94968\") " pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.050577 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.050561 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft7zv\" (UniqueName: \"kubernetes.io/projected/365e92c8-bcc3-46c2-9512-0fa95057726c-kube-api-access-ft7zv\") pod \"iptables-alerter-7mhr4\" (UID: \"365e92c8-bcc3-46c2-9512-0fa95057726c\") " pod="openshift-network-operator/iptables-alerter-7mhr4" Apr 16 22:13:26.053462 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:26.053443 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:26.053462 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:26.053462 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:26.053611 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:26.053473 2576 projected.go:194] Error preparing data 
for projected volume kube-api-access-58tlk for pod openshift-network-diagnostics/network-check-target-4kjg5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:26.053611 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:26.053530 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42680cd4-0a5c-4123-aae1-963237fa5b60-kube-api-access-58tlk podName:42680cd4-0a5c-4123-aae1-963237fa5b60 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:26.553513336 +0000 UTC m=+2.086854189 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-58tlk" (UniqueName: "kubernetes.io/projected/42680cd4-0a5c-4123-aae1-963237fa5b60-kube-api-access-58tlk") pod "network-check-target-4kjg5" (UID: "42680cd4-0a5c-4123-aae1-963237fa5b60") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:26.054278 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.054255 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrltt\" (UniqueName: \"kubernetes.io/projected/77967758-2942-4cb4-aa90-9b16761c46b3-kube-api-access-nrltt\") pod \"multus-hqm5z\" (UID: \"77967758-2942-4cb4-aa90-9b16761c46b3\") " pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.055073 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.055052 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9mlz\" (UniqueName: \"kubernetes.io/projected/07d69a7c-22b7-44a4-8aea-024e47f7912b-kube-api-access-d9mlz\") pod \"node-ca-xr5zn\" (UID: \"07d69a7c-22b7-44a4-8aea-024e47f7912b\") " pod="openshift-image-registry/node-ca-xr5zn" Apr 16 22:13:26.055222 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.055206 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2ssr\" (UniqueName: \"kubernetes.io/projected/fab36813-c9b9-4c2c-aa71-55346680c966-kube-api-access-b2ssr\") pod \"multus-additional-cni-plugins-xn6mm\" (UID: \"fab36813-c9b9-4c2c-aa71-55346680c966\") " pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:26.056039 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.056023 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq9hk\" (UniqueName: \"kubernetes.io/projected/f9a6741d-2664-481e-bd4b-f57d10d85ede-kube-api-access-vq9hk\") pod \"aws-ebs-csi-driver-node-pkdtk\" (UID: \"f9a6741d-2664-481e-bd4b-f57d10d85ede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" Apr 16 22:13:26.144482 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.144443 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-node-log\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.144693 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.144500 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-run-openvswitch\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.144693 
ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.144527 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-host-slash\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.144693 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.144549 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/47402cab-7cba-42f8-be72-8d67c31c2e7c-ovnkube-config\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.144693 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.144561 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-node-log\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.144693 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.144561 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-run-openvswitch\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.144693 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.144604 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-log-socket\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.144693 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.144597 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-host-slash\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.144693 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.144570 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-log-socket\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.144693 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.144656 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/47402cab-7cba-42f8-be72-8d67c31c2e7c-env-overrides\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145140 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.144725 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-systemd-units\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145140 
ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.144751 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-host-cni-bin\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145140 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.144778 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145140 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.144805 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9vg2\" (UniqueName: \"kubernetes.io/projected/47402cab-7cba-42f8-be72-8d67c31c2e7c-kube-api-access-s9vg2\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145140 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.144833 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-host-cni-bin\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145140 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.144850 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145140 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.144889 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-systemd-units\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145140 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.144894 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-host-kubelet\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145140 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.144944 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-host-kubelet\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145140 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.144955 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-var-lib-openvswitch\") 
pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145140 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.144983 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-run-ovn\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145140 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.145009 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-var-lib-openvswitch\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145140 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.145011 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-host-cni-netd\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145140 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.145036 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-run-ovn\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145140 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.145035 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-run-systemd\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145140 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.145066 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-host-cni-netd\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145140 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.145070 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-run-systemd\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145855 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.145081 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-etc-openvswitch\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145855 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.145103 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-etc-openvswitch\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145855 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.145115 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/47402cab-7cba-42f8-be72-8d67c31c2e7c-env-overrides\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145855 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.145124 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/47402cab-7cba-42f8-be72-8d67c31c2e7c-ovnkube-script-lib\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145855 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.145153 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-host-run-ovn-kubernetes\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145855 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.145159 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/47402cab-7cba-42f8-be72-8d67c31c2e7c-ovnkube-config\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145855 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.145184 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-host-run-netns\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145855 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.145204 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/47402cab-7cba-42f8-be72-8d67c31c2e7c-ovn-node-metrics-cert\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145855 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.145235 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-host-run-netns\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145855 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.145239 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47402cab-7cba-42f8-be72-8d67c31c2e7c-host-run-ovn-kubernetes\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.145855 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.145645 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/47402cab-7cba-42f8-be72-8d67c31c2e7c-ovnkube-script-lib\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.147140 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.147124 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/47402cab-7cba-42f8-be72-8d67c31c2e7c-ovn-node-metrics-cert\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.153107 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.153081 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9vg2\" (UniqueName: \"kubernetes.io/projected/47402cab-7cba-42f8-be72-8d67c31c2e7c-kube-api-access-s9vg2\") pod \"ovnkube-node-k24wz\" (UID: \"47402cab-7cba-42f8-be72-8d67c31c2e7c\") " pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.158208 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.158190 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:26.242621 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.242397 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6lpzp" Apr 16 22:13:26.248312 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.248290 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" Apr 16 22:13:26.248669 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:26.248645 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3ae4fa7_c3ae_4639_be2a_f39b4666fae0.slice/crio-0c11b0b0f8d0dadee8863507c78a565f08391290bdbfd89ce495f189d56ac7b2 WatchSource:0}: Error finding container 0c11b0b0f8d0dadee8863507c78a565f08391290bdbfd89ce495f189d56ac7b2: Status 404 returned error can't find the container with id 0c11b0b0f8d0dadee8863507c78a565f08391290bdbfd89ce495f189d56ac7b2 Apr 16 22:13:26.254277 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.254251 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" Apr 16 22:13:26.254525 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:26.254504 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9a6741d_2664_481e_bd4b_f57d10d85ede.slice/crio-6e2878485bcbdd6db05e808a7b6399e8165ecb35e9b45eb116d4b5adc902dcc0 WatchSource:0}: Error finding container 6e2878485bcbdd6db05e808a7b6399e8165ecb35e9b45eb116d4b5adc902dcc0: Status 404 returned error can't find the container with id 6e2878485bcbdd6db05e808a7b6399e8165ecb35e9b45eb116d4b5adc902dcc0 Apr 16 22:13:26.259397 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.259373 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-xr5zn" Apr 16 22:13:26.259600 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:26.259581 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda72c732d_41ba_4c13_bf18_c21f9bf94968.slice/crio-720c5fd6bfac27d3c9d84cab1fdfa399076b91d6aa13d7b1734b8581437fb2c3 WatchSource:0}: Error finding container 720c5fd6bfac27d3c9d84cab1fdfa399076b91d6aa13d7b1734b8581437fb2c3: Status 404 returned error can't find the container with id 720c5fd6bfac27d3c9d84cab1fdfa399076b91d6aa13d7b1734b8581437fb2c3 Apr 16 22:13:26.265227 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.265208 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xn6mm" Apr 16 22:13:26.267562 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:26.267526 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07d69a7c_22b7_44a4_8aea_024e47f7912b.slice/crio-cf7531e277a12028f441e9db14c22f286cb37f6b15014d30c0acd3b0f5f3cf5a WatchSource:0}: Error finding container cf7531e277a12028f441e9db14c22f286cb37f6b15014d30c0acd3b0f5f3cf5a: Status 404 returned error can't find the container with id cf7531e277a12028f441e9db14c22f286cb37f6b15014d30c0acd3b0f5f3cf5a Apr 16 22:13:26.269217 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.269201 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hqm5z" Apr 16 22:13:26.272030 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:26.272003 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfab36813_c9b9_4c2c_aa71_55346680c966.slice/crio-174a06a394052caca22d8e02d6349c236c51212fe2d9665da6270c76b126139f WatchSource:0}: Error finding container 174a06a394052caca22d8e02d6349c236c51212fe2d9665da6270c76b126139f: Status 404 returned error can't find the container with id 174a06a394052caca22d8e02d6349c236c51212fe2d9665da6270c76b126139f Apr 16 22:13:26.276399 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:26.276375 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77967758_2942_4cb4_aa90_9b16761c46b3.slice/crio-a3b2535440d3162d46f8cecdb555b7b98179db959cc1b38b6c1a2892d5f37cbf WatchSource:0}: Error finding container a3b2535440d3162d46f8cecdb555b7b98179db959cc1b38b6c1a2892d5f37cbf: Status 404 returned error can't find the container with id a3b2535440d3162d46f8cecdb555b7b98179db959cc1b38b6c1a2892d5f37cbf Apr 16 22:13:26.277633 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.277618 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7mhr4" Apr 16 22:13:26.282148 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.282133 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:26.284269 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:26.284242 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod365e92c8_bcc3_46c2_9512_0fa95057726c.slice/crio-ad362a21b8315b2bd9f7abe5423af3a013e684efcb50973fb5c83654249184ce WatchSource:0}: Error finding container ad362a21b8315b2bd9f7abe5423af3a013e684efcb50973fb5c83654249184ce: Status 404 returned error can't find the container with id ad362a21b8315b2bd9f7abe5423af3a013e684efcb50973fb5c83654249184ce Apr 16 22:13:26.288979 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:26.288940 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47402cab_7cba_42f8_be72_8d67c31c2e7c.slice/crio-4a89138edb6db10e3d78e43463f3df07789171a3f3f6f13f59c17b0b813b52d7 WatchSource:0}: Error finding container 4a89138edb6db10e3d78e43463f3df07789171a3f3f6f13f59c17b0b813b52d7: Status 404 returned error can't find the container with id 4a89138edb6db10e3d78e43463f3df07789171a3f3f6f13f59c17b0b813b52d7 Apr 16 22:13:26.549841 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.549142 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs\") pod \"network-metrics-daemon-lpwn6\" (UID: \"ddd2c4f1-e24c-431c-a59a-9936d01e4667\") " pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:26.549841 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:26.549334 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:26.549841 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:26.549396 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs podName:ddd2c4f1-e24c-431c-a59a-9936d01e4667 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:27.549378249 +0000 UTC m=+3.082719103 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs") pod "network-metrics-daemon-lpwn6" (UID: "ddd2c4f1-e24c-431c-a59a-9936d01e4667") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:26.649617 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.649572 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58tlk\" (UniqueName: \"kubernetes.io/projected/42680cd4-0a5c-4123-aae1-963237fa5b60-kube-api-access-58tlk\") pod \"network-check-target-4kjg5\" (UID: \"42680cd4-0a5c-4123-aae1-963237fa5b60\") " pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:26.649785 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:26.649741 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:26.649785 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:26.649761 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:26.649785 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:26.649774 2576 projected.go:194] Error preparing data for projected volume kube-api-access-58tlk for pod openshift-network-diagnostics/network-check-target-4kjg5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:26.649922 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:26.649841 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42680cd4-0a5c-4123-aae1-963237fa5b60-kube-api-access-58tlk podName:42680cd4-0a5c-4123-aae1-963237fa5b60 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:27.649823117 +0000 UTC m=+3.183163962 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-58tlk" (UniqueName: "kubernetes.io/projected/42680cd4-0a5c-4123-aae1-963237fa5b60-kube-api-access-58tlk") pod "network-check-target-4kjg5" (UID: "42680cd4-0a5c-4123-aae1-963237fa5b60") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:26.936686 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.936592 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:26.979236 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.979197 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 22:08:25 +0000 UTC" deadline="2028-01-22 20:35:14.587208392 +0000 UTC" Apr 16 22:13:26.979236 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:26.979234 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15502h21m47.607978323s" Apr 16 22:13:27.035519 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:27.035468 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" event={"ID":"a72c732d-41ba-4c13-bf18-c21f9bf94968","Type":"ContainerStarted","Data":"720c5fd6bfac27d3c9d84cab1fdfa399076b91d6aa13d7b1734b8581437fb2c3"} Apr 16 22:13:27.042273 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:27.042238 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" event={"ID":"f9a6741d-2664-481e-bd4b-f57d10d85ede","Type":"ContainerStarted","Data":"6e2878485bcbdd6db05e808a7b6399e8165ecb35e9b45eb116d4b5adc902dcc0"} Apr 16 22:13:27.054045 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:27.054005 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" event={"ID":"47402cab-7cba-42f8-be72-8d67c31c2e7c","Type":"ContainerStarted","Data":"4a89138edb6db10e3d78e43463f3df07789171a3f3f6f13f59c17b0b813b52d7"} Apr 16 22:13:27.068790 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:27.068752 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7mhr4" event={"ID":"365e92c8-bcc3-46c2-9512-0fa95057726c","Type":"ContainerStarted","Data":"ad362a21b8315b2bd9f7abe5423af3a013e684efcb50973fb5c83654249184ce"} Apr 16 22:13:27.080515 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:27.080479 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6lpzp" event={"ID":"d3ae4fa7-c3ae-4639-be2a-f39b4666fae0","Type":"ContainerStarted","Data":"0c11b0b0f8d0dadee8863507c78a565f08391290bdbfd89ce495f189d56ac7b2"} Apr 16 22:13:27.096009 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:27.095965 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hqm5z" event={"ID":"77967758-2942-4cb4-aa90-9b16761c46b3","Type":"ContainerStarted","Data":"a3b2535440d3162d46f8cecdb555b7b98179db959cc1b38b6c1a2892d5f37cbf"} Apr 16 22:13:27.101928 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:27.101889 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xn6mm" event={"ID":"fab36813-c9b9-4c2c-aa71-55346680c966","Type":"ContainerStarted","Data":"174a06a394052caca22d8e02d6349c236c51212fe2d9665da6270c76b126139f"} Apr 16 22:13:27.109713 ip-10-0-130-227 
kubenswrapper[2576]: I0416 22:13:27.109667 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xr5zn" event={"ID":"07d69a7c-22b7-44a4-8aea-024e47f7912b","Type":"ContainerStarted","Data":"cf7531e277a12028f441e9db14c22f286cb37f6b15014d30c0acd3b0f5f3cf5a"} Apr 16 22:13:27.557764 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:27.557144 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs\") pod \"network-metrics-daemon-lpwn6\" (UID: \"ddd2c4f1-e24c-431c-a59a-9936d01e4667\") " pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:27.557764 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:27.557284 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:27.557764 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:27.557349 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs podName:ddd2c4f1-e24c-431c-a59a-9936d01e4667 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:29.55733097 +0000 UTC m=+5.090671816 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs") pod "network-metrics-daemon-lpwn6" (UID: "ddd2c4f1-e24c-431c-a59a-9936d01e4667") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:27.658738 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:27.658085 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58tlk\" (UniqueName: \"kubernetes.io/projected/42680cd4-0a5c-4123-aae1-963237fa5b60-kube-api-access-58tlk\") pod \"network-check-target-4kjg5\" (UID: \"42680cd4-0a5c-4123-aae1-963237fa5b60\") " pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:27.658738 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:27.658283 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:27.658738 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:27.658302 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:27.658738 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:27.658314 2576 projected.go:194] Error preparing data for projected volume kube-api-access-58tlk for pod openshift-network-diagnostics/network-check-target-4kjg5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:27.658738 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:27.658381 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42680cd4-0a5c-4123-aae1-963237fa5b60-kube-api-access-58tlk podName:42680cd4-0a5c-4123-aae1-963237fa5b60 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:29.658363473 +0000 UTC m=+5.191704310 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-58tlk" (UniqueName: "kubernetes.io/projected/42680cd4-0a5c-4123-aae1-963237fa5b60-kube-api-access-58tlk") pod "network-check-target-4kjg5" (UID: "42680cd4-0a5c-4123-aae1-963237fa5b60") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:27.731936 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:27.731893 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:27.907780 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:27.907301 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:27.980437 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:27.980366 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 22:08:25 +0000 UTC" deadline="2027-12-11 12:11:17.552966937 +0000 UTC" Apr 16 22:13:27.980437 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:27.980407 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14485h57m49.572565033s" Apr 16 22:13:28.015957 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:28.015884 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:28.016150 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:28.015983 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:28.016150 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:28.016105 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjg5" podUID="42680cd4-0a5c-4123-aae1-963237fa5b60" Apr 16 22:13:28.016610 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:28.016417 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lpwn6" podUID="ddd2c4f1-e24c-431c-a59a-9936d01e4667" Apr 16 22:13:29.573739 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:29.573632 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs\") pod \"network-metrics-daemon-lpwn6\" (UID: \"ddd2c4f1-e24c-431c-a59a-9936d01e4667\") " pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:29.574194 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:29.573804 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:29.574194 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:29.573873 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs podName:ddd2c4f1-e24c-431c-a59a-9936d01e4667 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:33.573853528 +0000 UTC m=+9.107194424 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs") pod "network-metrics-daemon-lpwn6" (UID: "ddd2c4f1-e24c-431c-a59a-9936d01e4667") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:29.674429 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:29.674395 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58tlk\" (UniqueName: \"kubernetes.io/projected/42680cd4-0a5c-4123-aae1-963237fa5b60-kube-api-access-58tlk\") pod \"network-check-target-4kjg5\" (UID: \"42680cd4-0a5c-4123-aae1-963237fa5b60\") " pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:29.674610 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:29.674570 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:29.674610 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:29.674604 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:29.674744 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:29.674617 2576 projected.go:194] Error preparing data for projected volume kube-api-access-58tlk for pod openshift-network-diagnostics/network-check-target-4kjg5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:29.674796 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:29.674768 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42680cd4-0a5c-4123-aae1-963237fa5b60-kube-api-access-58tlk podName:42680cd4-0a5c-4123-aae1-963237fa5b60 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:33.674657664 +0000 UTC m=+9.207998510 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-58tlk" (UniqueName: "kubernetes.io/projected/42680cd4-0a5c-4123-aae1-963237fa5b60-kube-api-access-58tlk") pod "network-check-target-4kjg5" (UID: "42680cd4-0a5c-4123-aae1-963237fa5b60") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:30.015074 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:30.014994 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:30.015233 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:30.015094 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjg5" podUID="42680cd4-0a5c-4123-aae1-963237fa5b60" Apr 16 22:13:30.015233 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:30.015007 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:30.015352 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:30.015230 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lpwn6" podUID="ddd2c4f1-e24c-431c-a59a-9936d01e4667" Apr 16 22:13:32.014982 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:32.014950 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:32.015426 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:32.015071 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjg5" podUID="42680cd4-0a5c-4123-aae1-963237fa5b60" Apr 16 22:13:32.015426 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:32.014950 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:32.015426 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:32.015221 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lpwn6" podUID="ddd2c4f1-e24c-431c-a59a-9936d01e4667" Apr 16 22:13:32.498265 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:32.497532 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-9vfz7"] Apr 16 22:13:32.503846 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:32.503815 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-9vfz7" Apr 16 22:13:32.506819 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:32.506599 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 22:13:32.506819 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:32.506688 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 22:13:32.506819 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:32.506599 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-czb2r\"" Apr 16 22:13:32.598109 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:32.598062 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/97863583-437e-48ec-8c04-9cb36c6d5a89-hosts-file\") pod \"node-resolver-9vfz7\" (UID: \"97863583-437e-48ec-8c04-9cb36c6d5a89\") " pod="openshift-dns/node-resolver-9vfz7" Apr 16 22:13:32.598269 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:32.598121 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb4sr\" (UniqueName: \"kubernetes.io/projected/97863583-437e-48ec-8c04-9cb36c6d5a89-kube-api-access-vb4sr\") pod \"node-resolver-9vfz7\" (UID: \"97863583-437e-48ec-8c04-9cb36c6d5a89\") " pod="openshift-dns/node-resolver-9vfz7" Apr 16 22:13:32.598269 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:32.598162 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/97863583-437e-48ec-8c04-9cb36c6d5a89-tmp-dir\") pod \"node-resolver-9vfz7\" (UID: \"97863583-437e-48ec-8c04-9cb36c6d5a89\") " pod="openshift-dns/node-resolver-9vfz7" Apr 16 22:13:32.698808 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:32.698626 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vb4sr\" (UniqueName: \"kubernetes.io/projected/97863583-437e-48ec-8c04-9cb36c6d5a89-kube-api-access-vb4sr\") pod \"node-resolver-9vfz7\" (UID: \"97863583-437e-48ec-8c04-9cb36c6d5a89\") " pod="openshift-dns/node-resolver-9vfz7" Apr 16 22:13:32.698808 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:32.698705 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/97863583-437e-48ec-8c04-9cb36c6d5a89-tmp-dir\") pod \"node-resolver-9vfz7\" (UID: \"97863583-437e-48ec-8c04-9cb36c6d5a89\") " pod="openshift-dns/node-resolver-9vfz7" Apr 16 22:13:32.699093 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:32.698992 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/97863583-437e-48ec-8c04-9cb36c6d5a89-hosts-file\") pod \"node-resolver-9vfz7\" (UID: \"97863583-437e-48ec-8c04-9cb36c6d5a89\") " pod="openshift-dns/node-resolver-9vfz7" Apr 16 22:13:32.699093 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:32.699015 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/97863583-437e-48ec-8c04-9cb36c6d5a89-tmp-dir\") pod \"node-resolver-9vfz7\" (UID: \"97863583-437e-48ec-8c04-9cb36c6d5a89\") " pod="openshift-dns/node-resolver-9vfz7" Apr 16 22:13:32.699189 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:32.699090 
2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/97863583-437e-48ec-8c04-9cb36c6d5a89-hosts-file\") pod \"node-resolver-9vfz7\" (UID: \"97863583-437e-48ec-8c04-9cb36c6d5a89\") " pod="openshift-dns/node-resolver-9vfz7" Apr 16 22:13:32.709022 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:32.708994 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb4sr\" (UniqueName: \"kubernetes.io/projected/97863583-437e-48ec-8c04-9cb36c6d5a89-kube-api-access-vb4sr\") pod \"node-resolver-9vfz7\" (UID: \"97863583-437e-48ec-8c04-9cb36c6d5a89\") " pod="openshift-dns/node-resolver-9vfz7" Apr 16 22:13:32.816434 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:32.816352 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9vfz7" Apr 16 22:13:33.607419 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:33.607380 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs\") pod \"network-metrics-daemon-lpwn6\" (UID: \"ddd2c4f1-e24c-431c-a59a-9936d01e4667\") " pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:33.607898 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:33.607528 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:33.607898 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:33.607605 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs podName:ddd2c4f1-e24c-431c-a59a-9936d01e4667 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:41.607585692 +0000 UTC m=+17.140926534 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs") pod "network-metrics-daemon-lpwn6" (UID: "ddd2c4f1-e24c-431c-a59a-9936d01e4667") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:33.654655 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:33.654602 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-76zkl"] Apr 16 22:13:33.656578 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:33.656550 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:33.656709 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:33.656626 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-76zkl" podUID="4074edbf-bd19-4b41-b90e-a27b5cc066e7" Apr 16 22:13:33.708751 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:33.708701 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58tlk\" (UniqueName: \"kubernetes.io/projected/42680cd4-0a5c-4123-aae1-963237fa5b60-kube-api-access-58tlk\") pod \"network-check-target-4kjg5\" (UID: \"42680cd4-0a5c-4123-aae1-963237fa5b60\") " pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:33.708931 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:33.708764 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4074edbf-bd19-4b41-b90e-a27b5cc066e7-dbus\") pod \"global-pull-secret-syncer-76zkl\" (UID: \"4074edbf-bd19-4b41-b90e-a27b5cc066e7\") " pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:33.708931 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:33.708804 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4074edbf-bd19-4b41-b90e-a27b5cc066e7-kubelet-config\") pod \"global-pull-secret-syncer-76zkl\" (UID: \"4074edbf-bd19-4b41-b90e-a27b5cc066e7\") " pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:33.708931 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:33.708856 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4074edbf-bd19-4b41-b90e-a27b5cc066e7-original-pull-secret\") pod \"global-pull-secret-syncer-76zkl\" (UID: \"4074edbf-bd19-4b41-b90e-a27b5cc066e7\") " pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:33.709094 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:33.709029 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:33.709094 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:33.709048 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:33.709094 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:33.709063 2576 projected.go:194] Error preparing data for projected volume kube-api-access-58tlk for pod openshift-network-diagnostics/network-check-target-4kjg5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:33.709226 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:33.709199 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42680cd4-0a5c-4123-aae1-963237fa5b60-kube-api-access-58tlk podName:42680cd4-0a5c-4123-aae1-963237fa5b60 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:41.709158104 +0000 UTC m=+17.242498949 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-58tlk" (UniqueName: "kubernetes.io/projected/42680cd4-0a5c-4123-aae1-963237fa5b60-kube-api-access-58tlk") pod "network-check-target-4kjg5" (UID: "42680cd4-0a5c-4123-aae1-963237fa5b60") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:33.810574 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:33.810141 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4074edbf-bd19-4b41-b90e-a27b5cc066e7-original-pull-secret\") pod \"global-pull-secret-syncer-76zkl\" (UID: \"4074edbf-bd19-4b41-b90e-a27b5cc066e7\") " pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:33.810574 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:33.810233 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4074edbf-bd19-4b41-b90e-a27b5cc066e7-dbus\") pod \"global-pull-secret-syncer-76zkl\" (UID: \"4074edbf-bd19-4b41-b90e-a27b5cc066e7\") " pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:33.810574 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:33.810273 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4074edbf-bd19-4b41-b90e-a27b5cc066e7-kubelet-config\") pod \"global-pull-secret-syncer-76zkl\" (UID: \"4074edbf-bd19-4b41-b90e-a27b5cc066e7\") " pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:33.810574 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:33.810285 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:33.810574 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:33.810363 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4074edbf-bd19-4b41-b90e-a27b5cc066e7-original-pull-secret podName:4074edbf-bd19-4b41-b90e-a27b5cc066e7 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:34.310343142 +0000 UTC m=+9.843683987 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4074edbf-bd19-4b41-b90e-a27b5cc066e7-original-pull-secret") pod "global-pull-secret-syncer-76zkl" (UID: "4074edbf-bd19-4b41-b90e-a27b5cc066e7") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:33.810574 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:33.810392 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4074edbf-bd19-4b41-b90e-a27b5cc066e7-kubelet-config\") pod \"global-pull-secret-syncer-76zkl\" (UID: \"4074edbf-bd19-4b41-b90e-a27b5cc066e7\") " pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:33.810574 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:33.810541 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4074edbf-bd19-4b41-b90e-a27b5cc066e7-dbus\") pod \"global-pull-secret-syncer-76zkl\" (UID: \"4074edbf-bd19-4b41-b90e-a27b5cc066e7\") " pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:34.015491 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:34.015452 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:34.015691 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:34.015511 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:34.015691 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:34.015640 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lpwn6" podUID="ddd2c4f1-e24c-431c-a59a-9936d01e4667" Apr 16 22:13:34.015806 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:34.015762 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjg5" podUID="42680cd4-0a5c-4123-aae1-963237fa5b60" Apr 16 22:13:34.314588 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:34.314550 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4074edbf-bd19-4b41-b90e-a27b5cc066e7-original-pull-secret\") pod \"global-pull-secret-syncer-76zkl\" (UID: \"4074edbf-bd19-4b41-b90e-a27b5cc066e7\") " pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:34.314780 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:34.314720 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:34.314860 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:34.314782 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4074edbf-bd19-4b41-b90e-a27b5cc066e7-original-pull-secret podName:4074edbf-bd19-4b41-b90e-a27b5cc066e7 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:35.314763006 +0000 UTC m=+10.848103852 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4074edbf-bd19-4b41-b90e-a27b5cc066e7-original-pull-secret") pod "global-pull-secret-syncer-76zkl" (UID: "4074edbf-bd19-4b41-b90e-a27b5cc066e7") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:35.016205 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:35.016072 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:35.016205 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:35.016193 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-76zkl" podUID="4074edbf-bd19-4b41-b90e-a27b5cc066e7" Apr 16 22:13:35.324243 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:35.324144 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4074edbf-bd19-4b41-b90e-a27b5cc066e7-original-pull-secret\") pod \"global-pull-secret-syncer-76zkl\" (UID: \"4074edbf-bd19-4b41-b90e-a27b5cc066e7\") " pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:35.324392 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:35.324317 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:35.324392 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:35.324382 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4074edbf-bd19-4b41-b90e-a27b5cc066e7-original-pull-secret podName:4074edbf-bd19-4b41-b90e-a27b5cc066e7 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:37.324366635 +0000 UTC m=+12.857707466 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4074edbf-bd19-4b41-b90e-a27b5cc066e7-original-pull-secret") pod "global-pull-secret-syncer-76zkl" (UID: "4074edbf-bd19-4b41-b90e-a27b5cc066e7") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:36.015624 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:36.015590 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:36.015807 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:36.015590 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:36.015807 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:36.015743 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lpwn6" podUID="ddd2c4f1-e24c-431c-a59a-9936d01e4667" Apr 16 22:13:36.015807 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:36.015762 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjg5" podUID="42680cd4-0a5c-4123-aae1-963237fa5b60" Apr 16 22:13:37.015077 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:37.015045 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:37.015488 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:37.015180 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-76zkl" podUID="4074edbf-bd19-4b41-b90e-a27b5cc066e7" Apr 16 22:13:37.341154 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:37.341057 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4074edbf-bd19-4b41-b90e-a27b5cc066e7-original-pull-secret\") pod \"global-pull-secret-syncer-76zkl\" (UID: \"4074edbf-bd19-4b41-b90e-a27b5cc066e7\") " pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:37.341326 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:37.341242 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:37.341326 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:37.341326 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4074edbf-bd19-4b41-b90e-a27b5cc066e7-original-pull-secret podName:4074edbf-bd19-4b41-b90e-a27b5cc066e7 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:41.341303608 +0000 UTC m=+16.874644457 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4074edbf-bd19-4b41-b90e-a27b5cc066e7-original-pull-secret") pod "global-pull-secret-syncer-76zkl" (UID: "4074edbf-bd19-4b41-b90e-a27b5cc066e7") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:38.015327 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:38.015298 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:38.015782 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:38.015301 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:38.015782 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:38.015429 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lpwn6" podUID="ddd2c4f1-e24c-431c-a59a-9936d01e4667" Apr 16 22:13:38.015782 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:38.015514 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjg5" podUID="42680cd4-0a5c-4123-aae1-963237fa5b60" Apr 16 22:13:39.014961 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:39.014913 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:39.015189 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:39.015099 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-76zkl" podUID="4074edbf-bd19-4b41-b90e-a27b5cc066e7" Apr 16 22:13:40.015527 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:40.015489 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:40.015987 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:40.015489 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:40.015987 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:40.015633 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjg5" podUID="42680cd4-0a5c-4123-aae1-963237fa5b60" Apr 16 22:13:40.015987 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:40.015716 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lpwn6" podUID="ddd2c4f1-e24c-431c-a59a-9936d01e4667" Apr 16 22:13:41.015061 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:41.015027 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:41.015220 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:41.015157 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-76zkl" podUID="4074edbf-bd19-4b41-b90e-a27b5cc066e7" Apr 16 22:13:41.374732 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:41.374637 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4074edbf-bd19-4b41-b90e-a27b5cc066e7-original-pull-secret\") pod \"global-pull-secret-syncer-76zkl\" (UID: \"4074edbf-bd19-4b41-b90e-a27b5cc066e7\") " pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:41.375207 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:41.374782 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:41.375207 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:41.374861 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4074edbf-bd19-4b41-b90e-a27b5cc066e7-original-pull-secret podName:4074edbf-bd19-4b41-b90e-a27b5cc066e7 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:49.374839776 +0000 UTC m=+24.908180815 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4074edbf-bd19-4b41-b90e-a27b5cc066e7-original-pull-secret") pod "global-pull-secret-syncer-76zkl" (UID: "4074edbf-bd19-4b41-b90e-a27b5cc066e7") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:41.677565 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:41.677490 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs\") pod \"network-metrics-daemon-lpwn6\" (UID: \"ddd2c4f1-e24c-431c-a59a-9936d01e4667\") " pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:41.677745 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:41.677649 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:41.677745 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:41.677735 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs podName:ddd2c4f1-e24c-431c-a59a-9936d01e4667 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:57.677714811 +0000 UTC m=+33.211055645 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs") pod "network-metrics-daemon-lpwn6" (UID: "ddd2c4f1-e24c-431c-a59a-9936d01e4667") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:41.778435 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:41.778402 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58tlk\" (UniqueName: \"kubernetes.io/projected/42680cd4-0a5c-4123-aae1-963237fa5b60-kube-api-access-58tlk\") pod \"network-check-target-4kjg5\" (UID: \"42680cd4-0a5c-4123-aae1-963237fa5b60\") " pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:41.778616 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:41.778551 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:41.778616 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:41.778569 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:41.778616 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:41.778580 2576 projected.go:194] Error preparing data for projected volume kube-api-access-58tlk for pod openshift-network-diagnostics/network-check-target-4kjg5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:41.778792 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:41.778641 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42680cd4-0a5c-4123-aae1-963237fa5b60-kube-api-access-58tlk podName:42680cd4-0a5c-4123-aae1-963237fa5b60 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:57.778622367 +0000 UTC m=+33.311963203 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-58tlk" (UniqueName: "kubernetes.io/projected/42680cd4-0a5c-4123-aae1-963237fa5b60-kube-api-access-58tlk") pod "network-check-target-4kjg5" (UID: "42680cd4-0a5c-4123-aae1-963237fa5b60") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:42.015821 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:42.015760 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:42.015995 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:42.015907 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjg5" podUID="42680cd4-0a5c-4123-aae1-963237fa5b60" Apr 16 22:13:42.015995 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:42.015939 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:42.016113 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:42.016084 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lpwn6" podUID="ddd2c4f1-e24c-431c-a59a-9936d01e4667" Apr 16 22:13:43.015528 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:43.015494 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:43.015973 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:43.015605 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-76zkl" podUID="4074edbf-bd19-4b41-b90e-a27b5cc066e7" Apr 16 22:13:43.946045 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:13:43.946006 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97863583_437e_48ec_8c04_9cb36c6d5a89.slice/crio-8af1bf7569bcf973e196a377b40335fb06cc6ba1bcdb788e26897c5eb369492a WatchSource:0}: Error finding container 8af1bf7569bcf973e196a377b40335fb06cc6ba1bcdb788e26897c5eb369492a: Status 404 returned error can't find the container with id 8af1bf7569bcf973e196a377b40335fb06cc6ba1bcdb788e26897c5eb369492a Apr 16 22:13:44.015685 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:44.015647 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:44.016014 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:44.015690 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:44.016014 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:44.015771 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lpwn6" podUID="ddd2c4f1-e24c-431c-a59a-9936d01e4667" Apr 16 22:13:44.016014 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:44.015899 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjg5" podUID="42680cd4-0a5c-4123-aae1-963237fa5b60" Apr 16 22:13:44.142461 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:44.142433 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9vfz7" event={"ID":"97863583-437e-48ec-8c04-9cb36c6d5a89","Type":"ContainerStarted","Data":"8af1bf7569bcf973e196a377b40335fb06cc6ba1bcdb788e26897c5eb369492a"} Apr 16 22:13:45.019262 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:45.017185 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:45.019262 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:45.017719 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-76zkl" podUID="4074edbf-bd19-4b41-b90e-a27b5cc066e7" Apr 16 22:13:45.026664 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:45.026634 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0de14f2d8a312df1fd0f43b1fd02a43e.slice/crio-56fef051a3306bdd268125614d61dc434aba07a31b9f5b58bd6ae0af4d9b6b2c.scope\": RecentStats: unable to find data in memory cache]" Apr 16 22:13:45.145387 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:45.145275 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-227.ec2.internal" event={"ID":"1e12d5376c1eb684628dd9cf17dac4a9","Type":"ContainerStarted","Data":"a5af24993de281b767780c1a2e55e94d8f84789a55919649f1675ed600dc58c4"} Apr 16 22:13:45.146728 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:45.146699 2576 generic.go:358] "Generic (PLEG): container finished" podID="0de14f2d8a312df1fd0f43b1fd02a43e" containerID="56fef051a3306bdd268125614d61dc434aba07a31b9f5b58bd6ae0af4d9b6b2c" exitCode=0 Apr 16 22:13:45.146850 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:45.146763 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" event={"ID":"0de14f2d8a312df1fd0f43b1fd02a43e","Type":"ContainerDied","Data":"56fef051a3306bdd268125614d61dc434aba07a31b9f5b58bd6ae0af4d9b6b2c"} Apr 16 22:13:45.148054 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:45.148035 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hqm5z" event={"ID":"77967758-2942-4cb4-aa90-9b16761c46b3","Type":"ContainerStarted","Data":"a6c23a4cd349cdcd7c075b593a6f78b5610f423b14ca94946daa48454ed5c766"} Apr 16 22:13:45.149438 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:45.149415 2576 generic.go:358] "Generic (PLEG): container finished" podID="fab36813-c9b9-4c2c-aa71-55346680c966" containerID="bdb4c06a41f6441a7e206b56cf66d2033e54a223590ce214aeeb883cc955bdb9" exitCode=0 Apr 16 22:13:45.149519 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:45.149448 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xn6mm" event={"ID":"fab36813-c9b9-4c2c-aa71-55346680c966","Type":"ContainerDied","Data":"bdb4c06a41f6441a7e206b56cf66d2033e54a223590ce214aeeb883cc955bdb9"} Apr 16 22:13:45.150841 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:45.150819 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xr5zn" event={"ID":"07d69a7c-22b7-44a4-8aea-024e47f7912b","Type":"ContainerStarted","Data":"550fc356a11813eb69b9f6f57f5201a188ac0247b701905dfa05ba3aa1633d3f"} Apr 16 22:13:45.152131 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:45.152104 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" event={"ID":"a72c732d-41ba-4c13-bf18-c21f9bf94968","Type":"ContainerStarted","Data":"30d54a00a4f68d4ab7bcce73ff05f4681c4b279aebc8c116eb76722a7604290c"} Apr 16 22:13:45.153491 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:45.153464 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" event={"ID":"f9a6741d-2664-481e-bd4b-f57d10d85ede","Type":"ContainerStarted","Data":"093ca30a875b1b20671cadb21830c755b4291c24fd1c0282fc1e5842b4a87d9a"} Apr 16 22:13:45.154706 ip-10-0-130-227 
kubenswrapper[2576]: I0416 22:13:45.154667 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9vfz7" event={"ID":"97863583-437e-48ec-8c04-9cb36c6d5a89","Type":"ContainerStarted","Data":"bc53bba385e5f9c81e05651a18d5aa5ec8ac49591da5ac541ace2c6d998d3678"} Apr 16 22:13:45.157241 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:45.157191 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-227.ec2.internal" podStartSLOduration=20.157180318 podStartE2EDuration="20.157180318s" podCreationTimestamp="2026-04-16 22:13:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:13:45.156798118 +0000 UTC m=+20.690138984" watchObservedRunningTime="2026-04-16 22:13:45.157180318 +0000 UTC m=+20.690521170" Apr 16 22:13:45.157348 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:45.157301 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" event={"ID":"47402cab-7cba-42f8-be72-8d67c31c2e7c","Type":"ContainerStarted","Data":"271fd83eff28c739be97cfc6fe7c737431333342876908761c120cfe74f27ddc"} Apr 16 22:13:45.157348 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:45.157324 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" event={"ID":"47402cab-7cba-42f8-be72-8d67c31c2e7c","Type":"ContainerStarted","Data":"5d6722ab880e0e70d1b8cbabb767495a7601087b80128f999f4462de66c5199f"} Apr 16 22:13:45.157348 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:45.157336 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" event={"ID":"47402cab-7cba-42f8-be72-8d67c31c2e7c","Type":"ContainerStarted","Data":"b89ff4ee63881ecad438e777b36e76cd5d287e1755b8479a65f6ac65ad2308b3"} Apr 16 22:13:45.157495 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:45.157350 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" event={"ID":"47402cab-7cba-42f8-be72-8d67c31c2e7c","Type":"ContainerStarted","Data":"1e62e20dba3a68efaef3e804b9e6eb6348b4b90b12636b3f2022e768b8d56a3f"} Apr 16 22:13:45.157495 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:45.157362 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" event={"ID":"47402cab-7cba-42f8-be72-8d67c31c2e7c","Type":"ContainerStarted","Data":"ddad731cbb36590bf9335c4ce15193bfe3cd2e5bce1592a3af27bc044003c83f"} Apr 16 22:13:45.157495 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:45.157375 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" event={"ID":"47402cab-7cba-42f8-be72-8d67c31c2e7c","Type":"ContainerStarted","Data":"f9544ea072fc308148799fa703f6b9a4c85d8675dbe0936f3a9a4b6a1fd71889"} Apr 16 22:13:45.163483 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:45.163458 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6lpzp" event={"ID":"d3ae4fa7-c3ae-4639-be2a-f39b4666fae0","Type":"ContainerStarted","Data":"b32638dfed6ef30c4e9873472b24dafb1b76d46dc24d2628377c33bb3171fd1f"} Apr 16 22:13:45.170034 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:45.169995 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xr5zn" podStartSLOduration=2.420832477 podStartE2EDuration="20.169984031s" 
podCreationTimestamp="2026-04-16 22:13:25 +0000 UTC" firstStartedPulling="2026-04-16 22:13:26.269303332 +0000 UTC m=+1.802644164" lastFinishedPulling="2026-04-16 22:13:44.018454872 +0000 UTC m=+19.551795718" observedRunningTime="2026-04-16 22:13:45.169606838 +0000 UTC m=+20.702947694" watchObservedRunningTime="2026-04-16 22:13:45.169984031 +0000 UTC m=+20.703324883" Apr 16 22:13:45.180860 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:45.180788 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9vfz7" podStartSLOduration=13.180773134 podStartE2EDuration="13.180773134s" podCreationTimestamp="2026-04-16 22:13:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:13:45.180227111 +0000 UTC m=+20.713567963" watchObservedRunningTime="2026-04-16 22:13:45.180773134 +0000 UTC m=+20.714113987" Apr 16 22:13:45.209807 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:45.209766 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-jdbf4" podStartSLOduration=2.451813626 podStartE2EDuration="20.209752238s" podCreationTimestamp="2026-04-16 22:13:25 +0000 UTC" firstStartedPulling="2026-04-16 22:13:26.261778771 +0000 UTC m=+1.795119616" lastFinishedPulling="2026-04-16 22:13:44.019717382 +0000 UTC m=+19.553058228" observedRunningTime="2026-04-16 22:13:45.209640995 +0000 UTC m=+20.742981850" watchObservedRunningTime="2026-04-16 22:13:45.209752238 +0000 UTC m=+20.743093089" Apr 16 22:13:45.259996 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:45.259956 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hqm5z" podStartSLOduration=2.50539813 podStartE2EDuration="20.259940943s" podCreationTimestamp="2026-04-16 22:13:25 +0000 UTC" firstStartedPulling="2026-04-16 22:13:26.27776953 +0000 UTC m=+1.811110361" lastFinishedPulling="2026-04-16 22:13:44.032312343 +0000 UTC m=+19.565653174" observedRunningTime="2026-04-16 22:13:45.245784999 +0000 UTC m=+20.779125865" watchObservedRunningTime="2026-04-16 22:13:45.259940943 +0000 UTC m=+20.793281796" Apr 16 22:13:46.015534 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:46.015499 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:46.015728 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:46.015499 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:46.015728 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:46.015637 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjg5" podUID="42680cd4-0a5c-4123-aae1-963237fa5b60" Apr 16 22:13:46.015728 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:46.015704 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lpwn6" podUID="ddd2c4f1-e24c-431c-a59a-9936d01e4667" Apr 16 22:13:46.149941 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:46.149919 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 22:13:46.166618 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:46.166576 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7mhr4" event={"ID":"365e92c8-bcc3-46c2-9512-0fa95057726c","Type":"ContainerStarted","Data":"35f57a6bd0cac23679d071d863a6218f49055a68868d20ac12281e1b040b3b29"} Apr 16 22:13:46.168647 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:46.168618 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" event={"ID":"0de14f2d8a312df1fd0f43b1fd02a43e","Type":"ContainerStarted","Data":"f2a9365f0535f81e5e83b39d139f961901e072ce5f97db19188ba3554f7bcc3b"} Apr 16 22:13:46.170471 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:46.170449 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" event={"ID":"f9a6741d-2664-481e-bd4b-f57d10d85ede","Type":"ContainerStarted","Data":"71da5b28f12606960ca1796f84b6e3c327500aa9930accb896987a7507c4152c"} Apr 16 22:13:46.179536 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:46.179487 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-7mhr4" podStartSLOduration=3.447354515 podStartE2EDuration="21.179472828s" podCreationTimestamp="2026-04-16 22:13:25 +0000 UTC" firstStartedPulling="2026-04-16 22:13:26.285816369 +0000 UTC m=+1.819157200" lastFinishedPulling="2026-04-16 22:13:44.017934669 +0000 UTC m=+19.551275513" observedRunningTime="2026-04-16 22:13:46.179194754 +0000 UTC m=+21.712535607" watchObservedRunningTime="2026-04-16 22:13:46.179472828 +0000 UTC m=+21.712813683" Apr 16 22:13:46.179943 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:46.179875 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-6lpzp" podStartSLOduration=3.411875144 podStartE2EDuration="21.179864344s" podCreationTimestamp="2026-04-16 22:13:25 +0000 UTC" firstStartedPulling="2026-04-16 22:13:26.250541329 +0000 UTC m=+1.783882160" lastFinishedPulling="2026-04-16 22:13:44.018530527 +0000 UTC m=+19.551871360" observedRunningTime="2026-04-16 22:13:45.259787248 +0000 UTC m=+20.793128122" watchObservedRunningTime="2026-04-16 22:13:46.179864344 +0000 UTC m=+21.713205198" Apr 16 22:13:46.191296 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:46.191253 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" podStartSLOduration=21.191238753 podStartE2EDuration="21.191238753s" podCreationTimestamp="2026-04-16 22:13:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:13:46.191114528 +0000 UTC m=+21.724455381" watchObservedRunningTime="2026-04-16 22:13:46.191238753 +0000 UTC m=+21.724579606" Apr 16 22:13:46.999225 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:46.999141 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T22:13:46.149937938Z","UUID":"4b2fa733-9611-4132-8dba-22d55a047c0c","Handler":null,"Name":"","Endpoint":""} Apr 16 22:13:47.002220 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:47.002199 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 22:13:47.002344 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:47.002228 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 22:13:47.015839 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:47.015701 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:47.015964 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:47.015943 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-76zkl" podUID="4074edbf-bd19-4b41-b90e-a27b5cc066e7" Apr 16 22:13:47.175290 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:47.175249 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" event={"ID":"47402cab-7cba-42f8-be72-8d67c31c2e7c","Type":"ContainerStarted","Data":"d91b3d78723cd2845c1ec97c87592c20a938be77e87cec94becb376dda97cb85"} Apr 16 22:13:48.015247 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:48.015210 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:48.015453 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:48.015210 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:48.015453 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:48.015328 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjg5" podUID="42680cd4-0a5c-4123-aae1-963237fa5b60" Apr 16 22:13:48.015453 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:48.015431 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lpwn6" podUID="ddd2c4f1-e24c-431c-a59a-9936d01e4667" Apr 16 22:13:48.179446 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:48.179410 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" event={"ID":"f9a6741d-2664-481e-bd4b-f57d10d85ede","Type":"ContainerStarted","Data":"edcc8ed796bff2452810fda8a029d318a5eb2d3804474a76d4ba68d24e1d6a2a"} Apr 16 22:13:49.015085 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:49.015054 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:49.015258 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:49.015158 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-76zkl" podUID="4074edbf-bd19-4b41-b90e-a27b5cc066e7" Apr 16 22:13:49.361880 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:49.361800 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-6lpzp" Apr 16 22:13:49.362509 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:49.362482 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-6lpzp" Apr 16 22:13:49.378838 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:49.378788 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pkdtk" podStartSLOduration=3.563171416 podStartE2EDuration="24.378773974s" podCreationTimestamp="2026-04-16 22:13:25 +0000 UTC" firstStartedPulling="2026-04-16 22:13:26.257021341 +0000 UTC m=+1.790362172" lastFinishedPulling="2026-04-16 22:13:47.072623812 +0000 UTC m=+22.605964730" observedRunningTime="2026-04-16 22:13:48.196630656 +0000 UTC m=+23.729971508" watchObservedRunningTime="2026-04-16 22:13:49.378773974 +0000 UTC m=+24.912114844" Apr 16 22:13:49.438176 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:49.438135 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4074edbf-bd19-4b41-b90e-a27b5cc066e7-original-pull-secret\") pod \"global-pull-secret-syncer-76zkl\" (UID: \"4074edbf-bd19-4b41-b90e-a27b5cc066e7\") " pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:49.438355 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:49.438289 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:49.438409 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:49.438362 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4074edbf-bd19-4b41-b90e-a27b5cc066e7-original-pull-secret podName:4074edbf-bd19-4b41-b90e-a27b5cc066e7 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:05.438343398 +0000 UTC m=+40.971684250 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4074edbf-bd19-4b41-b90e-a27b5cc066e7-original-pull-secret") pod "global-pull-secret-syncer-76zkl" (UID: "4074edbf-bd19-4b41-b90e-a27b5cc066e7") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:50.015003 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:50.014970 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:50.015185 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:50.014966 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:50.015185 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:50.015092 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lpwn6" podUID="ddd2c4f1-e24c-431c-a59a-9936d01e4667" Apr 16 22:13:50.015302 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:50.015181 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjg5" podUID="42680cd4-0a5c-4123-aae1-963237fa5b60" Apr 16 22:13:50.182217 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:50.182129 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-6lpzp" Apr 16 22:13:50.182595 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:50.182580 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-6lpzp" Apr 16 22:13:51.015566 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:51.015383 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:51.016331 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:51.015653 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-76zkl" podUID="4074edbf-bd19-4b41-b90e-a27b5cc066e7" Apr 16 22:13:51.186623 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:51.186588 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" event={"ID":"47402cab-7cba-42f8-be72-8d67c31c2e7c","Type":"ContainerStarted","Data":"96fb1247196e7ea09dc8a521b93366167becbe710797fab31c8b29a5cff8b143"} Apr 16 22:13:51.186862 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:51.186843 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:51.188270 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:51.188242 2576 generic.go:358] "Generic (PLEG): container finished" podID="fab36813-c9b9-4c2c-aa71-55346680c966" containerID="27336501b20e421cb72a8f71c69f449d7b6371a41f2bbebfa8add3ed11bb2253" exitCode=0 Apr 16 22:13:51.188375 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:51.188309 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xn6mm" event={"ID":"fab36813-c9b9-4c2c-aa71-55346680c966","Type":"ContainerDied","Data":"27336501b20e421cb72a8f71c69f449d7b6371a41f2bbebfa8add3ed11bb2253"} Apr 16 22:13:51.201863 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:51.201842 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:51.212721 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:51.212658 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" podStartSLOduration=8.118799887 podStartE2EDuration="26.212644191s" podCreationTimestamp="2026-04-16 22:13:25 +0000 UTC" firstStartedPulling="2026-04-16 22:13:26.290551457 +0000 UTC m=+1.823892287" lastFinishedPulling="2026-04-16 22:13:44.384395747 +0000 UTC m=+19.917736591" observedRunningTime="2026-04-16 22:13:51.211692397 +0000 UTC m=+26.745033250" watchObservedRunningTime="2026-04-16 22:13:51.212644191 +0000 UTC m=+26.745985038" Apr 16 22:13:52.015543 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:52.015506 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:52.015725 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:52.015506 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:52.015725 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:52.015651 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjg5" podUID="42680cd4-0a5c-4123-aae1-963237fa5b60" Apr 16 22:13:52.016075 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:52.015734 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lpwn6" podUID="ddd2c4f1-e24c-431c-a59a-9936d01e4667" Apr 16 22:13:52.192100 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:52.192065 2576 generic.go:358] "Generic (PLEG): container finished" podID="fab36813-c9b9-4c2c-aa71-55346680c966" containerID="516f2018f9c2abe68108cdf3d50442beb55f80aa4e890163e36aac1b477e9ac1" exitCode=0 Apr 16 22:13:52.192230 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:52.192123 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xn6mm" event={"ID":"fab36813-c9b9-4c2c-aa71-55346680c966","Type":"ContainerDied","Data":"516f2018f9c2abe68108cdf3d50442beb55f80aa4e890163e36aac1b477e9ac1"} Apr 16 22:13:52.192910 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:52.192887 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:52.192956 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:52.192916 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:52.207067 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:52.207041 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:13:52.240693 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:52.240616 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lpwn6"] Apr 16 22:13:52.240829 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:52.240744 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:52.240884 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:52.240828 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lpwn6" podUID="ddd2c4f1-e24c-431c-a59a-9936d01e4667" Apr 16 22:13:52.245040 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:52.245014 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-76zkl"] Apr 16 22:13:52.245154 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:52.245130 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:52.245219 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:52.245203 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-76zkl" podUID="4074edbf-bd19-4b41-b90e-a27b5cc066e7" Apr 16 22:13:52.245716 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:52.245697 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4kjg5"] Apr 16 22:13:52.245807 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:52.245794 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:52.245896 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:52.245872 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjg5" podUID="42680cd4-0a5c-4123-aae1-963237fa5b60" Apr 16 22:13:53.196598 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:53.196566 2576 generic.go:358] "Generic (PLEG): container finished" podID="fab36813-c9b9-4c2c-aa71-55346680c966" containerID="f3fb4e12844080016c11013e81de43a94b912fb6e09d8cc716bfb93fb049cb3f" exitCode=0 Apr 16 22:13:53.197054 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:53.196701 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xn6mm" event={"ID":"fab36813-c9b9-4c2c-aa71-55346680c966","Type":"ContainerDied","Data":"f3fb4e12844080016c11013e81de43a94b912fb6e09d8cc716bfb93fb049cb3f"} Apr 16 22:13:54.015757 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:54.015714 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:54.015973 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:54.015774 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:54.015973 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:54.015714 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:54.015973 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:54.015846 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjg5" podUID="42680cd4-0a5c-4123-aae1-963237fa5b60" Apr 16 22:13:54.015973 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:54.015933 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-76zkl" podUID="4074edbf-bd19-4b41-b90e-a27b5cc066e7" Apr 16 22:13:54.016174 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:54.016032 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lpwn6" podUID="ddd2c4f1-e24c-431c-a59a-9936d01e4667" Apr 16 22:13:56.015257 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:56.015218 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:56.015867 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:56.015290 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:56.015867 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:56.015325 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:56.015867 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:56.015447 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4kjg5" podUID="42680cd4-0a5c-4123-aae1-963237fa5b60" Apr 16 22:13:56.015867 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:56.015469 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-76zkl" podUID="4074edbf-bd19-4b41-b90e-a27b5cc066e7" Apr 16 22:13:56.015867 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:56.015534 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lpwn6" podUID="ddd2c4f1-e24c-431c-a59a-9936d01e4667" Apr 16 22:13:57.318869 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.318785 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeReady" Apr 16 22:13:57.319301 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.318935 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 22:13:57.355947 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.355906 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-85b77b7fd5-n94qh"] Apr 16 22:13:57.380385 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.379664 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fbgc2"] Apr 16 22:13:57.380385 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.379804 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:57.383008 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.382968 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 22:13:57.383127 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.383020 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 22:13:57.383127 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.383036 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8tzbx\"" Apr 16 22:13:57.383208 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.382976 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 22:13:57.389909 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.389493 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 22:13:57.390920 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.390899 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-s8bss"] Apr 16 22:13:57.391054 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.391039 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fbgc2" Apr 16 22:13:57.393895 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.393874 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 22:13:57.394015 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.393912 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 22:13:57.394084 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.394032 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-66k94\"" Apr 16 22:13:57.409349 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.409325 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-85b77b7fd5-n94qh"] Apr 16 22:13:57.409349 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.409357 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fbgc2"] Apr 16 22:13:57.409545 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.409372 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-s8bss"] Apr 16 22:13:57.409545 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.409487 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-s8bss" Apr 16 22:13:57.412113 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.412021 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 22:13:57.412274 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.412257 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 22:13:57.412381 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.412348 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 22:13:57.412481 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.412458 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-b4hjg\"" Apr 16 22:13:57.496777 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.496747 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-trusted-ca\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:57.496934 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.496822 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6nc6\" (UniqueName: \"kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-kube-api-access-s6nc6\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:57.496934 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.496869 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnh8z\" (UniqueName: \"kubernetes.io/projected/a290a4ed-5ccc-47be-bd46-836ba21fea56-kube-api-access-nnh8z\") pod \"dns-default-fbgc2\" (UID: \"a290a4ed-5ccc-47be-bd46-836ba21fea56\") " pod="openshift-dns/dns-default-fbgc2" Apr 16 22:13:57.496934 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.496924 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-ca-trust-extracted\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:57.497099 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.496957 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls\") pod \"dns-default-fbgc2\" (UID: \"a290a4ed-5ccc-47be-bd46-836ba21fea56\") " pod="openshift-dns/dns-default-fbgc2" Apr 16 22:13:57.497099 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.496989 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-image-registry-private-configuration\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " 
pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:57.497099 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.497016 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-certificates\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:57.497099 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.497040 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs2r9\" (UniqueName: \"kubernetes.io/projected/49f46636-9c8e-44e1-88ec-d5c207868f31-kube-api-access-cs2r9\") pod \"ingress-canary-s8bss\" (UID: \"49f46636-9c8e-44e1-88ec-d5c207868f31\") " pod="openshift-ingress-canary/ingress-canary-s8bss" Apr 16 22:13:57.497099 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.497075 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:57.497288 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.497101 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-installation-pull-secrets\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:57.497288 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.497125 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a290a4ed-5ccc-47be-bd46-836ba21fea56-tmp-dir\") pod \"dns-default-fbgc2\" (UID: \"a290a4ed-5ccc-47be-bd46-836ba21fea56\") " pod="openshift-dns/dns-default-fbgc2" Apr 16 22:13:57.497288 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.497168 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert\") pod \"ingress-canary-s8bss\" (UID: \"49f46636-9c8e-44e1-88ec-d5c207868f31\") " pod="openshift-ingress-canary/ingress-canary-s8bss" Apr 16 22:13:57.497288 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.497244 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-bound-sa-token\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:57.497288 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.497283 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a290a4ed-5ccc-47be-bd46-836ba21fea56-config-volume\") pod \"dns-default-fbgc2\" (UID: \"a290a4ed-5ccc-47be-bd46-836ba21fea56\") " pod="openshift-dns/dns-default-fbgc2" Apr 16 22:13:57.598701 
ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.598599 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6nc6\" (UniqueName: \"kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-kube-api-access-s6nc6\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:57.598701 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.598658 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nnh8z\" (UniqueName: \"kubernetes.io/projected/a290a4ed-5ccc-47be-bd46-836ba21fea56-kube-api-access-nnh8z\") pod \"dns-default-fbgc2\" (UID: \"a290a4ed-5ccc-47be-bd46-836ba21fea56\") " pod="openshift-dns/dns-default-fbgc2" Apr 16 22:13:57.598942 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.598705 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-ca-trust-extracted\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:57.598942 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.598730 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls\") pod \"dns-default-fbgc2\" (UID: \"a290a4ed-5ccc-47be-bd46-836ba21fea56\") " pod="openshift-dns/dns-default-fbgc2" Apr 16 22:13:57.598942 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.598756 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-image-registry-private-configuration\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:57.598942 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.598779 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-certificates\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:57.598942 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.598801 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cs2r9\" (UniqueName: \"kubernetes.io/projected/49f46636-9c8e-44e1-88ec-d5c207868f31-kube-api-access-cs2r9\") pod \"ingress-canary-s8bss\" (UID: \"49f46636-9c8e-44e1-88ec-d5c207868f31\") " pod="openshift-ingress-canary/ingress-canary-s8bss" Apr 16 22:13:57.598942 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.598871 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:57.598942 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:57.598916 2576 projected.go:264] Couldn't get secret 
openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:13:57.598942 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.598921 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-installation-pull-secrets\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:57.598942 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:57.598931 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85b77b7fd5-n94qh: secret "image-registry-tls" not found Apr 16 22:13:57.598942 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.598945 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a290a4ed-5ccc-47be-bd46-836ba21fea56-tmp-dir\") pod \"dns-default-fbgc2\" (UID: \"a290a4ed-5ccc-47be-bd46-836ba21fea56\") " pod="openshift-dns/dns-default-fbgc2" Apr 16 22:13:57.599472 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.598978 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert\") pod \"ingress-canary-s8bss\" (UID: \"49f46636-9c8e-44e1-88ec-d5c207868f31\") " pod="openshift-ingress-canary/ingress-canary-s8bss" Apr 16 22:13:57.599472 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:57.598998 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls podName:4af26ebf-8c98-4962-a2da-cdbdf212d8a7 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:58.098978218 +0000 UTC m=+33.632319064 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls") pod "image-registry-85b77b7fd5-n94qh" (UID: "4af26ebf-8c98-4962-a2da-cdbdf212d8a7") : secret "image-registry-tls" not found Apr 16 22:13:57.599472 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:57.599031 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:13:57.599472 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.599046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-bound-sa-token\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:57.599472 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:57.599064 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert podName:49f46636-9c8e-44e1-88ec-d5c207868f31 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:58.099052912 +0000 UTC m=+33.632393742 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert") pod "ingress-canary-s8bss" (UID: "49f46636-9c8e-44e1-88ec-d5c207868f31") : secret "canary-serving-cert" not found Apr 16 22:13:57.599472 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.599087 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a290a4ed-5ccc-47be-bd46-836ba21fea56-config-volume\") pod \"dns-default-fbgc2\" (UID: \"a290a4ed-5ccc-47be-bd46-836ba21fea56\") " pod="openshift-dns/dns-default-fbgc2" Apr 16 22:13:57.599472 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.599108 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-trusted-ca\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:57.599472 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.599169 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-ca-trust-extracted\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:57.599472 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:57.599322 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:13:57.599472 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:57.599364 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls podName:a290a4ed-5ccc-47be-bd46-836ba21fea56 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:58.099350385 +0000 UTC m=+33.632691231 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls") pod "dns-default-fbgc2" (UID: "a290a4ed-5ccc-47be-bd46-836ba21fea56") : secret "dns-default-metrics-tls" not found Apr 16 22:13:57.599472 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.599404 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a290a4ed-5ccc-47be-bd46-836ba21fea56-tmp-dir\") pod \"dns-default-fbgc2\" (UID: \"a290a4ed-5ccc-47be-bd46-836ba21fea56\") " pod="openshift-dns/dns-default-fbgc2" Apr 16 22:13:57.600013 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.599877 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a290a4ed-5ccc-47be-bd46-836ba21fea56-config-volume\") pod \"dns-default-fbgc2\" (UID: \"a290a4ed-5ccc-47be-bd46-836ba21fea56\") " pod="openshift-dns/dns-default-fbgc2" Apr 16 22:13:57.600394 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.600374 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-certificates\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:57.600510 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.600394 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-trusted-ca\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:57.603569 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.603483 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-image-registry-private-configuration\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:57.603569 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.603494 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-installation-pull-secrets\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:57.607265 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.607232 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6nc6\" (UniqueName: \"kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-kube-api-access-s6nc6\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:57.608929 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.608081 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnh8z\" (UniqueName: \"kubernetes.io/projected/a290a4ed-5ccc-47be-bd46-836ba21fea56-kube-api-access-nnh8z\") pod \"dns-default-fbgc2\" (UID: \"a290a4ed-5ccc-47be-bd46-836ba21fea56\") " 
pod="openshift-dns/dns-default-fbgc2" Apr 16 22:13:57.608929 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.608634 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-bound-sa-token\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:57.608929 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.608883 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs2r9\" (UniqueName: \"kubernetes.io/projected/49f46636-9c8e-44e1-88ec-d5c207868f31-kube-api-access-cs2r9\") pod \"ingress-canary-s8bss\" (UID: \"49f46636-9c8e-44e1-88ec-d5c207868f31\") " pod="openshift-ingress-canary/ingress-canary-s8bss" Apr 16 22:13:57.700506 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.700456 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs\") pod \"network-metrics-daemon-lpwn6\" (UID: \"ddd2c4f1-e24c-431c-a59a-9936d01e4667\") " pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:57.700657 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:57.700600 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:57.700747 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:57.700695 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs podName:ddd2c4f1-e24c-431c-a59a-9936d01e4667 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:29.700659876 +0000 UTC m=+65.234000707 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs") pod "network-metrics-daemon-lpwn6" (UID: "ddd2c4f1-e24c-431c-a59a-9936d01e4667") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:57.801205 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:57.801166 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58tlk\" (UniqueName: \"kubernetes.io/projected/42680cd4-0a5c-4123-aae1-963237fa5b60-kube-api-access-58tlk\") pod \"network-check-target-4kjg5\" (UID: \"42680cd4-0a5c-4123-aae1-963237fa5b60\") " pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:57.801387 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:57.801313 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:57.801387 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:57.801337 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:57.801387 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:57.801349 2576 projected.go:194] Error preparing data for projected volume kube-api-access-58tlk for pod openshift-network-diagnostics/network-check-target-4kjg5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:57.801498 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:57.801425 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42680cd4-0a5c-4123-aae1-963237fa5b60-kube-api-access-58tlk podName:42680cd4-0a5c-4123-aae1-963237fa5b60 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:29.801403931 +0000 UTC m=+65.334744782 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-58tlk" (UniqueName: "kubernetes.io/projected/42680cd4-0a5c-4123-aae1-963237fa5b60-kube-api-access-58tlk") pod "network-check-target-4kjg5" (UID: "42680cd4-0a5c-4123-aae1-963237fa5b60") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:58.014895 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:58.014855 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:13:58.015104 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:58.014855 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:13:58.015104 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:58.014855 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:13:58.017963 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:58.017940 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 22:13:58.017963 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:58.017957 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-vqjpn\"" Apr 16 22:13:58.018136 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:58.017959 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 22:13:58.018136 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:58.017940 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 22:13:58.018136 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:58.017959 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zhbqt\"" Apr 16 22:13:58.018136 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:58.017981 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 22:13:58.105174 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:58.105137 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls\") pod \"dns-default-fbgc2\" (UID: \"a290a4ed-5ccc-47be-bd46-836ba21fea56\") " pod="openshift-dns/dns-default-fbgc2" Apr 16 22:13:58.105174 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:58.105192 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:58.105432 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:58.105232 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert\") pod \"ingress-canary-s8bss\" (UID: \"49f46636-9c8e-44e1-88ec-d5c207868f31\") " pod="openshift-ingress-canary/ingress-canary-s8bss" Apr 16 22:13:58.105432 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:58.105303 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:13:58.105432 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:58.105339 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:13:58.105432 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:58.105356 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85b77b7fd5-n94qh: secret "image-registry-tls" not found Apr 16 22:13:58.105432 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:58.105367 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls podName:a290a4ed-5ccc-47be-bd46-836ba21fea56 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:13:59.105351504 +0000 UTC m=+34.638692335 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls") pod "dns-default-fbgc2" (UID: "a290a4ed-5ccc-47be-bd46-836ba21fea56") : secret "dns-default-metrics-tls" not found Apr 16 22:13:58.105432 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:58.105399 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls podName:4af26ebf-8c98-4962-a2da-cdbdf212d8a7 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:59.10538243 +0000 UTC m=+34.638723275 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls") pod "image-registry-85b77b7fd5-n94qh" (UID: "4af26ebf-8c98-4962-a2da-cdbdf212d8a7") : secret "image-registry-tls" not found Apr 16 22:13:58.105432 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:58.105338 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:13:58.105432 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:58.105434 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert podName:49f46636-9c8e-44e1-88ec-d5c207868f31 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:59.105424574 +0000 UTC m=+34.638765416 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert") pod "ingress-canary-s8bss" (UID: "49f46636-9c8e-44e1-88ec-d5c207868f31") : secret "canary-serving-cert" not found Apr 16 22:13:59.114283 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:59.114244 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls\") pod \"dns-default-fbgc2\" (UID: \"a290a4ed-5ccc-47be-bd46-836ba21fea56\") " pod="openshift-dns/dns-default-fbgc2" Apr 16 22:13:59.115045 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:59.114305 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:13:59.115045 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:13:59.114336 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert\") pod \"ingress-canary-s8bss\" (UID: \"49f46636-9c8e-44e1-88ec-d5c207868f31\") " pod="openshift-ingress-canary/ingress-canary-s8bss" Apr 16 22:13:59.115045 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:59.114424 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:13:59.115045 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:59.114435 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:13:59.115045 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:59.114456 2576 
projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85b77b7fd5-n94qh: secret "image-registry-tls" not found Apr 16 22:13:59.115045 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:59.114471 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls podName:a290a4ed-5ccc-47be-bd46-836ba21fea56 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:01.114456908 +0000 UTC m=+36.647797739 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls") pod "dns-default-fbgc2" (UID: "a290a4ed-5ccc-47be-bd46-836ba21fea56") : secret "dns-default-metrics-tls" not found Apr 16 22:13:59.115045 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:59.114511 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls podName:4af26ebf-8c98-4962-a2da-cdbdf212d8a7 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:01.11449233 +0000 UTC m=+36.647833162 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls") pod "image-registry-85b77b7fd5-n94qh" (UID: "4af26ebf-8c98-4962-a2da-cdbdf212d8a7") : secret "image-registry-tls" not found Apr 16 22:13:59.115045 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:59.114543 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:13:59.115045 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:13:59.114610 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert podName:49f46636-9c8e-44e1-88ec-d5c207868f31 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:01.114598384 +0000 UTC m=+36.647939214 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert") pod "ingress-canary-s8bss" (UID: "49f46636-9c8e-44e1-88ec-d5c207868f31") : secret "canary-serving-cert" not found Apr 16 22:14:00.213904 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:00.213726 2576 generic.go:358] "Generic (PLEG): container finished" podID="fab36813-c9b9-4c2c-aa71-55346680c966" containerID="5176087e7c8f95e67bc7e1c993a45fc3faa54f6ba0946b0341ac65de188a874f" exitCode=0 Apr 16 22:14:00.213904 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:00.213809 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xn6mm" event={"ID":"fab36813-c9b9-4c2c-aa71-55346680c966","Type":"ContainerDied","Data":"5176087e7c8f95e67bc7e1c993a45fc3faa54f6ba0946b0341ac65de188a874f"} Apr 16 22:14:01.130033 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:01.129994 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls\") pod \"dns-default-fbgc2\" (UID: \"a290a4ed-5ccc-47be-bd46-836ba21fea56\") " pod="openshift-dns/dns-default-fbgc2" Apr 16 22:14:01.130033 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:01.130040 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:14:01.130261 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:01.130136 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:14:01.130261 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:01.130147 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85b77b7fd5-n94qh: secret "image-registry-tls" not found Apr 16 22:14:01.130261 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:01.130146 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:01.130261 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:01.130182 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert\") pod \"ingress-canary-s8bss\" (UID: \"49f46636-9c8e-44e1-88ec-d5c207868f31\") " pod="openshift-ingress-canary/ingress-canary-s8bss" Apr 16 22:14:01.130261 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:01.130190 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls podName:4af26ebf-8c98-4962-a2da-cdbdf212d8a7 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:05.130177407 +0000 UTC m=+40.663518238 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls") pod "image-registry-85b77b7fd5-n94qh" (UID: "4af26ebf-8c98-4962-a2da-cdbdf212d8a7") : secret "image-registry-tls" not found Apr 16 22:14:01.130261 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:01.130222 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls podName:a290a4ed-5ccc-47be-bd46-836ba21fea56 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:05.130208103 +0000 UTC m=+40.663548945 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls") pod "dns-default-fbgc2" (UID: "a290a4ed-5ccc-47be-bd46-836ba21fea56") : secret "dns-default-metrics-tls" not found Apr 16 22:14:01.130261 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:01.130233 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:01.130261 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:01.130258 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert podName:49f46636-9c8e-44e1-88ec-d5c207868f31 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:05.130251729 +0000 UTC m=+40.663592559 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert") pod "ingress-canary-s8bss" (UID: "49f46636-9c8e-44e1-88ec-d5c207868f31") : secret "canary-serving-cert" not found Apr 16 22:14:01.217959 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:01.217929 2576 generic.go:358] "Generic (PLEG): container finished" podID="fab36813-c9b9-4c2c-aa71-55346680c966" containerID="f6521171b6eb7717fdf6fadaf5e7ac39b257a6c5e647899b000345f8eecaf30f" exitCode=0 Apr 16 22:14:01.218363 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:01.217984 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xn6mm" event={"ID":"fab36813-c9b9-4c2c-aa71-55346680c966","Type":"ContainerDied","Data":"f6521171b6eb7717fdf6fadaf5e7ac39b257a6c5e647899b000345f8eecaf30f"} Apr 16 22:14:02.222423 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:02.222343 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xn6mm" event={"ID":"fab36813-c9b9-4c2c-aa71-55346680c966","Type":"ContainerStarted","Data":"88c21b3c66332b9d4c69224a73c003a54af91b38b833ecc98c2948a6a028519a"} Apr 16 22:14:02.249723 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:02.249636 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xn6mm" podStartSLOduration=3.761043922 podStartE2EDuration="37.249616966s" podCreationTimestamp="2026-04-16 22:13:25 +0000 UTC" firstStartedPulling="2026-04-16 22:13:26.273517161 +0000 UTC m=+1.806857999" lastFinishedPulling="2026-04-16 22:13:59.762090213 +0000 UTC m=+35.295431043" observedRunningTime="2026-04-16 22:14:02.248111204 +0000 UTC m=+37.781452054" watchObservedRunningTime="2026-04-16 22:14:02.249616966 +0000 UTC m=+37.782957820" Apr 16 22:14:05.158630 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:05.158581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls\") pod \"dns-default-fbgc2\" (UID: \"a290a4ed-5ccc-47be-bd46-836ba21fea56\") " pod="openshift-dns/dns-default-fbgc2" Apr 16 22:14:05.158630 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:05.158632 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:14:05.159063 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:05.158745 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:14:05.159063 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:05.158759 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85b77b7fd5-n94qh: secret "image-registry-tls" not found Apr 16 22:14:05.159063 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:05.158758 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:05.159063 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:05.158760 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert\") pod \"ingress-canary-s8bss\" (UID: \"49f46636-9c8e-44e1-88ec-d5c207868f31\") " pod="openshift-ingress-canary/ingress-canary-s8bss" Apr 16 22:14:05.159063 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:05.158807 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls podName:4af26ebf-8c98-4962-a2da-cdbdf212d8a7 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:13.158791023 +0000 UTC m=+48.692131876 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls") pod "image-registry-85b77b7fd5-n94qh" (UID: "4af26ebf-8c98-4962-a2da-cdbdf212d8a7") : secret "image-registry-tls" not found Apr 16 22:14:05.159063 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:05.158817 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:05.159063 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:05.158839 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls podName:a290a4ed-5ccc-47be-bd46-836ba21fea56 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:13.158819478 +0000 UTC m=+48.692160320 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls") pod "dns-default-fbgc2" (UID: "a290a4ed-5ccc-47be-bd46-836ba21fea56") : secret "dns-default-metrics-tls" not found Apr 16 22:14:05.159063 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:05.158857 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert podName:49f46636-9c8e-44e1-88ec-d5c207868f31 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:14:13.158847903 +0000 UTC m=+48.692188738 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert") pod "ingress-canary-s8bss" (UID: "49f46636-9c8e-44e1-88ec-d5c207868f31") : secret "canary-serving-cert" not found Apr 16 22:14:05.460997 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:05.460907 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4074edbf-bd19-4b41-b90e-a27b5cc066e7-original-pull-secret\") pod \"global-pull-secret-syncer-76zkl\" (UID: \"4074edbf-bd19-4b41-b90e-a27b5cc066e7\") " pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:14:05.464701 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:05.464659 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4074edbf-bd19-4b41-b90e-a27b5cc066e7-original-pull-secret\") pod \"global-pull-secret-syncer-76zkl\" (UID: \"4074edbf-bd19-4b41-b90e-a27b5cc066e7\") " pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:14:05.534084 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:05.534040 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-76zkl" Apr 16 22:14:05.703506 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:05.703477 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-76zkl"] Apr 16 22:14:05.706621 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:14:05.706578 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4074edbf_bd19_4b41_b90e_a27b5cc066e7.slice/crio-2b9957d544cb2663b9a4927560e64a1616e8619ad626f457dc6685f059befc51 WatchSource:0}: Error finding container 2b9957d544cb2663b9a4927560e64a1616e8619ad626f457dc6685f059befc51: Status 404 returned error can't find the container with id 2b9957d544cb2663b9a4927560e64a1616e8619ad626f457dc6685f059befc51 Apr 16 22:14:06.231564 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:06.231518 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-76zkl" event={"ID":"4074edbf-bd19-4b41-b90e-a27b5cc066e7","Type":"ContainerStarted","Data":"2b9957d544cb2663b9a4927560e64a1616e8619ad626f457dc6685f059befc51"} Apr 16 22:14:10.240577 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:10.240541 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-76zkl" event={"ID":"4074edbf-bd19-4b41-b90e-a27b5cc066e7","Type":"ContainerStarted","Data":"8b20b9a4ceb86541a3fe10668d9b35030c7521e50e4b3fc36229b516fe7644e0"} Apr 16 22:14:10.256362 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:10.256312 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-76zkl" podStartSLOduration=33.370623496 podStartE2EDuration="37.256298247s" podCreationTimestamp="2026-04-16 22:13:33 +0000 UTC" firstStartedPulling="2026-04-16 22:14:05.708164466 +0000 UTC m=+41.241505301" lastFinishedPulling="2026-04-16 22:14:09.59383922 +0000 UTC m=+45.127180052" observedRunningTime="2026-04-16 22:14:10.255375602 +0000 UTC m=+45.788716455" watchObservedRunningTime="2026-04-16 22:14:10.256298247 +0000 UTC m=+45.789639099" Apr 16 22:14:13.226086 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:13.225986 
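Note: the "Observed pod startup duration" entries report two figures. podStartE2EDuration is pod creation to observed running; podStartSLOduration subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling). That is why multus-additional-cni-plugins-xn6mm shows 37.25s end to end but only about 3.76s against the SLO, while global-pull-secret-syncer-76zkl, with a short pull, keeps 33.37s of its 37.26s. A small sketch of that arithmetic using timestamps copied from the multus entry (the few-nanosecond residual versus the reported 3.761043922 comes from the tracker's exact reference points):

// startupslo.go — hedged sketch: SLO duration = end-to-end minus pull window.
package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	// Layout matches the timestamp format printed in the log entries.
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2026-04-16 22:13:25 +0000 UTC")              // podCreationTimestamp
	firstPull := mustParse("2026-04-16 22:13:26.273517161 +0000 UTC") // firstStartedPulling
	lastPull := mustParse("2026-04-16 22:13:59.762090213 +0000 UTC")  // lastFinishedPulling
	running := mustParse("2026-04-16 22:14:02.249616966 +0000 UTC")   // watchObservedRunningTime

	e2e := running.Sub(created)      // 37.249616966s == podStartE2EDuration
	pull := lastPull.Sub(firstPull)  // time spent pulling images
	fmt.Println(e2e, pull, e2e-pull) // e2e-pull ≈ 3.761s == podStartSLOduration
}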
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls\") pod \"dns-default-fbgc2\" (UID: \"a290a4ed-5ccc-47be-bd46-836ba21fea56\") " pod="openshift-dns/dns-default-fbgc2" Apr 16 22:14:13.226086 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:13.226042 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:14:13.226086 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:13.226070 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert\") pod \"ingress-canary-s8bss\" (UID: \"49f46636-9c8e-44e1-88ec-d5c207868f31\") " pod="openshift-ingress-canary/ingress-canary-s8bss" Apr 16 22:14:13.226549 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:13.226169 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:13.226549 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:13.226199 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:13.226549 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:13.226243 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls podName:a290a4ed-5ccc-47be-bd46-836ba21fea56 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:29.226222955 +0000 UTC m=+64.759563806 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls") pod "dns-default-fbgc2" (UID: "a290a4ed-5ccc-47be-bd46-836ba21fea56") : secret "dns-default-metrics-tls" not found Apr 16 22:14:13.226549 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:13.226201 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:14:13.226549 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:13.226290 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85b77b7fd5-n94qh: secret "image-registry-tls" not found Apr 16 22:14:13.226549 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:13.226274 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert podName:49f46636-9c8e-44e1-88ec-d5c207868f31 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:29.226257891 +0000 UTC m=+64.759598736 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert") pod "ingress-canary-s8bss" (UID: "49f46636-9c8e-44e1-88ec-d5c207868f31") : secret "canary-serving-cert" not found Apr 16 22:14:13.226549 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:13.226351 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls podName:4af26ebf-8c98-4962-a2da-cdbdf212d8a7 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:14:29.226338882 +0000 UTC m=+64.759679713 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls") pod "image-registry-85b77b7fd5-n94qh" (UID: "4af26ebf-8c98-4962-a2da-cdbdf212d8a7") : secret "image-registry-tls" not found Apr 16 22:14:24.209463 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:24.209434 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k24wz" Apr 16 22:14:29.236325 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:29.236282 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls\") pod \"dns-default-fbgc2\" (UID: \"a290a4ed-5ccc-47be-bd46-836ba21fea56\") " pod="openshift-dns/dns-default-fbgc2" Apr 16 22:14:29.236325 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:29.236329 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:14:29.236770 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:29.236349 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert\") pod \"ingress-canary-s8bss\" (UID: \"49f46636-9c8e-44e1-88ec-d5c207868f31\") " pod="openshift-ingress-canary/ingress-canary-s8bss" Apr 16 22:14:29.236770 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:29.236431 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:29.236770 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:29.236435 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:29.236770 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:29.236437 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:14:29.236770 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:29.236492 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert podName:49f46636-9c8e-44e1-88ec-d5c207868f31 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:01.236477662 +0000 UTC m=+96.769818492 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert") pod "ingress-canary-s8bss" (UID: "49f46636-9c8e-44e1-88ec-d5c207868f31") : secret "canary-serving-cert" not found Apr 16 22:14:29.236770 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:29.236496 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85b77b7fd5-n94qh: secret "image-registry-tls" not found Apr 16 22:14:29.236770 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:29.236506 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls podName:a290a4ed-5ccc-47be-bd46-836ba21fea56 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:01.236499307 +0000 UTC m=+96.769840138 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls") pod "dns-default-fbgc2" (UID: "a290a4ed-5ccc-47be-bd46-836ba21fea56") : secret "dns-default-metrics-tls" not found Apr 16 22:14:29.236770 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:29.236527 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls podName:4af26ebf-8c98-4962-a2da-cdbdf212d8a7 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:01.236513615 +0000 UTC m=+96.769854445 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls") pod "image-registry-85b77b7fd5-n94qh" (UID: "4af26ebf-8c98-4962-a2da-cdbdf212d8a7") : secret "image-registry-tls" not found Apr 16 22:14:29.738784 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:29.738753 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs\") pod \"network-metrics-daemon-lpwn6\" (UID: \"ddd2c4f1-e24c-431c-a59a-9936d01e4667\") " pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:14:29.741641 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:29.741621 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 22:14:29.749616 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:29.749597 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 22:14:29.749703 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:14:29.749662 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs podName:ddd2c4f1-e24c-431c-a59a-9936d01e4667 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:33.749642871 +0000 UTC m=+129.282983702 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs") pod "network-metrics-daemon-lpwn6" (UID: "ddd2c4f1-e24c-431c-a59a-9936d01e4667") : secret "metrics-daemon-secret" not found Apr 16 22:14:29.840088 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:29.840045 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58tlk\" (UniqueName: \"kubernetes.io/projected/42680cd4-0a5c-4123-aae1-963237fa5b60-kube-api-access-58tlk\") pod \"network-check-target-4kjg5\" (UID: \"42680cd4-0a5c-4123-aae1-963237fa5b60\") " pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:14:29.842749 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:29.842731 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 22:14:29.852542 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:29.852522 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 22:14:29.863664 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:29.863635 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58tlk\" (UniqueName: \"kubernetes.io/projected/42680cd4-0a5c-4123-aae1-963237fa5b60-kube-api-access-58tlk\") pod \"network-check-target-4kjg5\" (UID: \"42680cd4-0a5c-4123-aae1-963237fa5b60\") " pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:14:30.129853 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:30.129826 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-vqjpn\"" Apr 16 22:14:30.138029 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:30.138009 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:14:30.283770 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:30.283739 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4kjg5"] Apr 16 22:14:30.286640 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:14:30.286613 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42680cd4_0a5c_4123_aae1_963237fa5b60.slice/crio-847989d150a7a9b431692eefc5708a9fac3e12499ff2b1e4e7ea1e4e6f7a9495 WatchSource:0}: Error finding container 847989d150a7a9b431692eefc5708a9fac3e12499ff2b1e4e7ea1e4e6f7a9495: Status 404 returned error can't find the container with id 847989d150a7a9b431692eefc5708a9fac3e12499ff2b1e4e7ea1e4e6f7a9495 Apr 16 22:14:31.278462 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:31.278422 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4kjg5" event={"ID":"42680cd4-0a5c-4123-aae1-963237fa5b60","Type":"ContainerStarted","Data":"847989d150a7a9b431692eefc5708a9fac3e12499ff2b1e4e7ea1e4e6f7a9495"} Apr 16 22:14:33.283344 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:33.283309 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4kjg5" event={"ID":"42680cd4-0a5c-4123-aae1-963237fa5b60","Type":"ContainerStarted","Data":"dff8a9aa5b8fed47022e4efe68499e54f0e6aae6459cf6d56e8d25ab7f8b6d11"} Apr 16 22:14:33.283716 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:33.283448 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:14:33.298658 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:14:33.298609 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-4kjg5" podStartSLOduration=65.528591551 podStartE2EDuration="1m8.298596147s" podCreationTimestamp="2026-04-16 22:13:25 +0000 UTC" firstStartedPulling="2026-04-16 22:14:30.28842702 +0000 UTC m=+65.821767851" lastFinishedPulling="2026-04-16 22:14:33.058431595 +0000 UTC m=+68.591772447" observedRunningTime="2026-04-16 22:14:33.298463973 +0000 UTC m=+68.831804850" watchObservedRunningTime="2026-04-16 22:14:33.298596147 +0000 UTC m=+68.831936999" Apr 16 22:15:01.264977 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:01.264837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls\") pod \"dns-default-fbgc2\" (UID: \"a290a4ed-5ccc-47be-bd46-836ba21fea56\") " pod="openshift-dns/dns-default-fbgc2" Apr 16 22:15:01.264977 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:01.264901 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:15:01.264977 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:01.264925 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert\") pod \"ingress-canary-s8bss\" (UID: 
\"49f46636-9c8e-44e1-88ec-d5c207868f31\") " pod="openshift-ingress-canary/ingress-canary-s8bss" Apr 16 22:15:01.265599 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:15:01.264983 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:15:01.265599 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:15:01.265028 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:15:01.265599 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:15:01.265033 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:15:01.265599 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:15:01.265049 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85b77b7fd5-n94qh: secret "image-registry-tls" not found Apr 16 22:15:01.265599 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:15:01.265079 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls podName:a290a4ed-5ccc-47be-bd46-836ba21fea56 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:05.265055001 +0000 UTC m=+160.798395853 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls") pod "dns-default-fbgc2" (UID: "a290a4ed-5ccc-47be-bd46-836ba21fea56") : secret "dns-default-metrics-tls" not found Apr 16 22:15:01.265599 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:15:01.265098 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert podName:49f46636-9c8e-44e1-88ec-d5c207868f31 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:05.265089506 +0000 UTC m=+160.798430344 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert") pod "ingress-canary-s8bss" (UID: "49f46636-9c8e-44e1-88ec-d5c207868f31") : secret "canary-serving-cert" not found Apr 16 22:15:01.265599 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:15:01.265108 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls podName:4af26ebf-8c98-4962-a2da-cdbdf212d8a7 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:05.265102902 +0000 UTC m=+160.798443732 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls") pod "image-registry-85b77b7fd5-n94qh" (UID: "4af26ebf-8c98-4962-a2da-cdbdf212d8a7") : secret "image-registry-tls" not found Apr 16 22:15:04.287811 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:04.287782 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-4kjg5" Apr 16 22:15:33.796469 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:33.796397 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs\") pod \"network-metrics-daemon-lpwn6\" (UID: \"ddd2c4f1-e24c-431c-a59a-9936d01e4667\") " pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:15:33.797086 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:15:33.796569 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 22:15:33.797086 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:15:33.796668 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs podName:ddd2c4f1-e24c-431c-a59a-9936d01e4667 nodeName:}" failed. No retries permitted until 2026-04-16 22:17:35.796645547 +0000 UTC m=+251.329986395 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs") pod "network-metrics-daemon-lpwn6" (UID: "ddd2c4f1-e24c-431c-a59a-9936d01e4667") : secret "metrics-daemon-secret" not found Apr 16 22:15:59.635514 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.635476 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l94dt"] Apr 16 22:15:59.638188 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.638172 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l94dt" Apr 16 22:15:59.641888 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.641867 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:15:59.642136 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.642110 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-kczsz\"" Apr 16 22:15:59.642933 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.642915 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-7r6sf"] Apr 16 22:15:59.643429 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.643410 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 22:15:59.645398 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.645382 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k5vkd"] Apr 16 22:15:59.645518 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.645505 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf" Apr 16 22:15:59.648454 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.648430 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 22:15:59.648724 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.648707 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 22:15:59.648981 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.648722 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-ffcmp\"" Apr 16 22:15:59.649044 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.649008 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:15:59.649474 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.649416 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 22:15:59.650517 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.650236 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-rpsvs"] Apr 16 22:15:59.650789 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.650753 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k5vkd" Apr 16 22:15:59.653172 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.653151 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 22:15:59.653941 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.653567 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l94dt"] Apr 16 22:15:59.653941 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.653661 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k5vkd"] Apr 16 22:15:59.653941 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.653743 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-rpsvs" Apr 16 22:15:59.653941 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.653793 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 22:15:59.654179 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.654127 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-p79zb\"" Apr 16 22:15:59.654436 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.654416 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:15:59.655215 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.655197 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 22:15:59.657010 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.656991 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 22:15:59.657125 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.657072 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 22:15:59.657459 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.657436 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 22:15:59.657625 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.657601 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-bwgms\"" Apr 16 22:15:59.657927 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.657902 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 22:15:59.658122 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.658103 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-7r6sf"] Apr 16 22:15:59.663479 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.663328 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 22:15:59.667938 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.667919 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-rpsvs"] Apr 16 22:15:59.678442 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.678420 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2rf9\" (UniqueName: \"kubernetes.io/projected/79d26f89-119e-4c1e-a0a0-4d0e4e546efe-kube-api-access-j2rf9\") pod \"volume-data-source-validator-7c6cbb6c87-l94dt\" (UID: \"79d26f89-119e-4c1e-a0a0-4d0e4e546efe\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l94dt" Apr 16 22:15:59.741878 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.741840 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-749dh"] Apr 16 22:15:59.744937 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.744913 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-749dh" Apr 16 22:15:59.749569 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.749541 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-hgtdl\"" Apr 16 22:15:59.749746 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.749618 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:15:59.749828 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.749791 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 22:15:59.750063 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.750043 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 22:15:59.750357 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.750337 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 22:15:59.767708 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.767662 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-749dh"] Apr 16 22:15:59.779473 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.779448 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ed1e6036-1beb-445f-8b61-65e736181605-tmp\") pod \"insights-operator-585dfdc468-rpsvs\" (UID: \"ed1e6036-1beb-445f-8b61-65e736181605\") " pod="openshift-insights/insights-operator-585dfdc468-rpsvs" Apr 16 22:15:59.779609 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.779483 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2hk7\" (UniqueName: \"kubernetes.io/projected/1335948c-8695-4844-a3a5-6819848109ce-kube-api-access-m2hk7\") pod \"cluster-samples-operator-6dc5bdb6b4-k5vkd\" (UID: \"1335948c-8695-4844-a3a5-6819848109ce\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k5vkd" Apr 16 22:15:59.779609 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.779520 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88wfb\" (UniqueName: \"kubernetes.io/projected/ba403d08-6963-4274-8dcb-309016f31037-kube-api-access-88wfb\") pod \"console-operator-9d4b6777b-7r6sf\" (UID: \"ba403d08-6963-4274-8dcb-309016f31037\") " pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf" Apr 16 22:15:59.779609 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.779545 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed1e6036-1beb-445f-8b61-65e736181605-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-rpsvs\" (UID: \"ed1e6036-1beb-445f-8b61-65e736181605\") " pod="openshift-insights/insights-operator-585dfdc468-rpsvs" Apr 16 22:15:59.779609 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.779584 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba403d08-6963-4274-8dcb-309016f31037-serving-cert\") pod \"console-operator-9d4b6777b-7r6sf\" (UID: \"ba403d08-6963-4274-8dcb-309016f31037\") " pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf" Apr 16 22:15:59.779800 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.779630 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba403d08-6963-4274-8dcb-309016f31037-config\") pod \"console-operator-9d4b6777b-7r6sf\" (UID: \"ba403d08-6963-4274-8dcb-309016f31037\") " pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf" Apr 16 22:15:59.779800 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.779687 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ed1e6036-1beb-445f-8b61-65e736181605-snapshots\") pod \"insights-operator-585dfdc468-rpsvs\" (UID: \"ed1e6036-1beb-445f-8b61-65e736181605\") " pod="openshift-insights/insights-operator-585dfdc468-rpsvs" Apr 16 22:15:59.779800 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.779719 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2rf9\" (UniqueName: \"kubernetes.io/projected/79d26f89-119e-4c1e-a0a0-4d0e4e546efe-kube-api-access-j2rf9\") pod \"volume-data-source-validator-7c6cbb6c87-l94dt\" (UID: \"79d26f89-119e-4c1e-a0a0-4d0e4e546efe\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l94dt" Apr 16 22:15:59.779800 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.779737 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba403d08-6963-4274-8dcb-309016f31037-trusted-ca\") pod \"console-operator-9d4b6777b-7r6sf\" (UID: \"ba403d08-6963-4274-8dcb-309016f31037\") " pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf" Apr 16 22:15:59.779800 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.779764 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed1e6036-1beb-445f-8b61-65e736181605-service-ca-bundle\") pod \"insights-operator-585dfdc468-rpsvs\" (UID: \"ed1e6036-1beb-445f-8b61-65e736181605\") " pod="openshift-insights/insights-operator-585dfdc468-rpsvs" Apr 16 22:15:59.779800 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.779783 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed1e6036-1beb-445f-8b61-65e736181605-serving-cert\") pod \"insights-operator-585dfdc468-rpsvs\" (UID: \"ed1e6036-1beb-445f-8b61-65e736181605\") " pod="openshift-insights/insights-operator-585dfdc468-rpsvs" Apr 16 22:15:59.779800 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.779801 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1335948c-8695-4844-a3a5-6819848109ce-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k5vkd\" (UID: \"1335948c-8695-4844-a3a5-6819848109ce\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k5vkd" Apr 16 22:15:59.780102 ip-10-0-130-227 
kubenswrapper[2576]: I0416 22:15:59.779819 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thgkt\" (UniqueName: \"kubernetes.io/projected/ed1e6036-1beb-445f-8b61-65e736181605-kube-api-access-thgkt\") pod \"insights-operator-585dfdc468-rpsvs\" (UID: \"ed1e6036-1beb-445f-8b61-65e736181605\") " pod="openshift-insights/insights-operator-585dfdc468-rpsvs" Apr 16 22:15:59.791386 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.791360 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2rf9\" (UniqueName: \"kubernetes.io/projected/79d26f89-119e-4c1e-a0a0-4d0e4e546efe-kube-api-access-j2rf9\") pod \"volume-data-source-validator-7c6cbb6c87-l94dt\" (UID: \"79d26f89-119e-4c1e-a0a0-4d0e4e546efe\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l94dt" Apr 16 22:15:59.833172 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.833143 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vvx4n"] Apr 16 22:15:59.836088 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.836064 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vvx4n" Apr 16 22:15:59.836195 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.836121 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fcxmn"] Apr 16 22:15:59.838895 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.838873 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fcxmn" Apr 16 22:15:59.839095 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.839077 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-cqz7m\"" Apr 16 22:15:59.841367 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.841349 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 22:15:59.841494 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.841474 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-ms8ss\"" Apr 16 22:15:59.842101 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.842080 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 22:15:59.848368 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.848345 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vvx4n"] Apr 16 22:15:59.849560 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.849539 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fcxmn"] Apr 16 22:15:59.880934 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.880473 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2wl8\" (UniqueName: \"kubernetes.io/projected/cbfb5b76-e002-4f88-a22a-9d55fd6b9348-kube-api-access-f2wl8\") pod \"kube-storage-version-migrator-operator-6769c5d45-749dh\" (UID: \"cbfb5b76-e002-4f88-a22a-9d55fd6b9348\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-749dh" Apr 16 22:15:59.881127 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.880984 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ed1e6036-1beb-445f-8b61-65e736181605-snapshots\") pod \"insights-operator-585dfdc468-rpsvs\" (UID: \"ed1e6036-1beb-445f-8b61-65e736181605\") " pod="openshift-insights/insights-operator-585dfdc468-rpsvs" Apr 16 22:15:59.881127 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.881036 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbfb5b76-e002-4f88-a22a-9d55fd6b9348-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-749dh\" (UID: \"cbfb5b76-e002-4f88-a22a-9d55fd6b9348\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-749dh" Apr 16 22:15:59.881243 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.881134 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f8ba909f-a641-4832-b7a1-a11849ea7211-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fcxmn\" (UID: \"f8ba909f-a641-4832-b7a1-a11849ea7211\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fcxmn" Apr 16 22:15:59.881243 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.881203 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba403d08-6963-4274-8dcb-309016f31037-trusted-ca\") pod \"console-operator-9d4b6777b-7r6sf\" (UID: \"ba403d08-6963-4274-8dcb-309016f31037\") " pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf" Apr 16 22:15:59.881372 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.881248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed1e6036-1beb-445f-8b61-65e736181605-service-ca-bundle\") pod \"insights-operator-585dfdc468-rpsvs\" (UID: \"ed1e6036-1beb-445f-8b61-65e736181605\") " pod="openshift-insights/insights-operator-585dfdc468-rpsvs" Apr 16 22:15:59.881372 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.881282 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed1e6036-1beb-445f-8b61-65e736181605-serving-cert\") pod \"insights-operator-585dfdc468-rpsvs\" (UID: \"ed1e6036-1beb-445f-8b61-65e736181605\") " pod="openshift-insights/insights-operator-585dfdc468-rpsvs" Apr 16 22:15:59.881372 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.881317 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1335948c-8695-4844-a3a5-6819848109ce-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k5vkd\" (UID: \"1335948c-8695-4844-a3a5-6819848109ce\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k5vkd" Apr 16 22:15:59.881372 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.881345 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/f8ba909f-a641-4832-b7a1-a11849ea7211-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fcxmn\" (UID: \"f8ba909f-a641-4832-b7a1-a11849ea7211\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fcxmn" Apr 16 22:15:59.881556 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.881386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thgkt\" (UniqueName: \"kubernetes.io/projected/ed1e6036-1beb-445f-8b61-65e736181605-kube-api-access-thgkt\") pod \"insights-operator-585dfdc468-rpsvs\" (UID: \"ed1e6036-1beb-445f-8b61-65e736181605\") " pod="openshift-insights/insights-operator-585dfdc468-rpsvs" Apr 16 22:15:59.881556 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.881420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ed1e6036-1beb-445f-8b61-65e736181605-tmp\") pod \"insights-operator-585dfdc468-rpsvs\" (UID: \"ed1e6036-1beb-445f-8b61-65e736181605\") " pod="openshift-insights/insights-operator-585dfdc468-rpsvs" Apr 16 22:15:59.881556 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.881456 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2hk7\" (UniqueName: \"kubernetes.io/projected/1335948c-8695-4844-a3a5-6819848109ce-kube-api-access-m2hk7\") pod \"cluster-samples-operator-6dc5bdb6b4-k5vkd\" (UID: \"1335948c-8695-4844-a3a5-6819848109ce\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k5vkd" Apr 16 22:15:59.881556 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.881500 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbfb5b76-e002-4f88-a22a-9d55fd6b9348-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-749dh\" (UID: \"cbfb5b76-e002-4f88-a22a-9d55fd6b9348\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-749dh" Apr 16 22:15:59.881556 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.881535 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnmj6\" (UniqueName: \"kubernetes.io/projected/a7f138c6-266e-41cd-9d3e-1ad1d55c0770-kube-api-access-bnmj6\") pod \"network-check-source-8894fc9bd-vvx4n\" (UID: \"a7f138c6-266e-41cd-9d3e-1ad1d55c0770\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vvx4n" Apr 16 22:15:59.881814 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.881576 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88wfb\" (UniqueName: \"kubernetes.io/projected/ba403d08-6963-4274-8dcb-309016f31037-kube-api-access-88wfb\") pod \"console-operator-9d4b6777b-7r6sf\" (UID: \"ba403d08-6963-4274-8dcb-309016f31037\") " pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf" Apr 16 22:15:59.881814 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.881611 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed1e6036-1beb-445f-8b61-65e736181605-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-rpsvs\" (UID: \"ed1e6036-1beb-445f-8b61-65e736181605\") " pod="openshift-insights/insights-operator-585dfdc468-rpsvs" Apr 16 22:15:59.881814 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.881650 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba403d08-6963-4274-8dcb-309016f31037-serving-cert\") pod \"console-operator-9d4b6777b-7r6sf\" (UID: \"ba403d08-6963-4274-8dcb-309016f31037\") " pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf" Apr 16 22:15:59.881814 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.881752 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba403d08-6963-4274-8dcb-309016f31037-config\") pod \"console-operator-9d4b6777b-7r6sf\" (UID: \"ba403d08-6963-4274-8dcb-309016f31037\") " pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf" Apr 16 22:15:59.881814 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.881787 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ed1e6036-1beb-445f-8b61-65e736181605-snapshots\") pod \"insights-operator-585dfdc468-rpsvs\" (UID: \"ed1e6036-1beb-445f-8b61-65e736181605\") " pod="openshift-insights/insights-operator-585dfdc468-rpsvs" Apr 16 22:15:59.882357 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.882331 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed1e6036-1beb-445f-8b61-65e736181605-service-ca-bundle\") pod \"insights-operator-585dfdc468-rpsvs\" (UID: \"ed1e6036-1beb-445f-8b61-65e736181605\") " pod="openshift-insights/insights-operator-585dfdc468-rpsvs" Apr 16 22:15:59.883373 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:15:59.883350 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 22:15:59.883478 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:15:59.883449 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1335948c-8695-4844-a3a5-6819848109ce-samples-operator-tls podName:1335948c-8695-4844-a3a5-6819848109ce nodeName:}" failed. No retries permitted until 2026-04-16 22:16:00.383420734 +0000 UTC m=+155.916761639 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1335948c-8695-4844-a3a5-6819848109ce-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-k5vkd" (UID: "1335948c-8695-4844-a3a5-6819848109ce") : secret "samples-operator-tls" not found Apr 16 22:15:59.884161 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.884135 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed1e6036-1beb-445f-8b61-65e736181605-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-rpsvs\" (UID: \"ed1e6036-1beb-445f-8b61-65e736181605\") " pod="openshift-insights/insights-operator-585dfdc468-rpsvs" Apr 16 22:15:59.884502 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.884478 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba403d08-6963-4274-8dcb-309016f31037-config\") pod \"console-operator-9d4b6777b-7r6sf\" (UID: \"ba403d08-6963-4274-8dcb-309016f31037\") " pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf" Apr 16 22:15:59.884580 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.883356 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba403d08-6963-4274-8dcb-309016f31037-trusted-ca\") pod \"console-operator-9d4b6777b-7r6sf\" (UID: \"ba403d08-6963-4274-8dcb-309016f31037\") " pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf" Apr 16 22:15:59.884859 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.884821 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ed1e6036-1beb-445f-8b61-65e736181605-tmp\") pod \"insights-operator-585dfdc468-rpsvs\" (UID: \"ed1e6036-1beb-445f-8b61-65e736181605\") " pod="openshift-insights/insights-operator-585dfdc468-rpsvs" Apr 16 22:15:59.885313 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.885292 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed1e6036-1beb-445f-8b61-65e736181605-serving-cert\") pod \"insights-operator-585dfdc468-rpsvs\" (UID: \"ed1e6036-1beb-445f-8b61-65e736181605\") " pod="openshift-insights/insights-operator-585dfdc468-rpsvs" Apr 16 22:15:59.887557 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.887496 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba403d08-6963-4274-8dcb-309016f31037-serving-cert\") pod \"console-operator-9d4b6777b-7r6sf\" (UID: \"ba403d08-6963-4274-8dcb-309016f31037\") " pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf" Apr 16 22:15:59.892866 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.892842 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88wfb\" (UniqueName: \"kubernetes.io/projected/ba403d08-6963-4274-8dcb-309016f31037-kube-api-access-88wfb\") pod \"console-operator-9d4b6777b-7r6sf\" (UID: \"ba403d08-6963-4274-8dcb-309016f31037\") " pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf" Apr 16 22:15:59.894462 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.894443 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thgkt\" (UniqueName: \"kubernetes.io/projected/ed1e6036-1beb-445f-8b61-65e736181605-kube-api-access-thgkt\") pod \"insights-operator-585dfdc468-rpsvs\" (UID: 
\"ed1e6036-1beb-445f-8b61-65e736181605\") " pod="openshift-insights/insights-operator-585dfdc468-rpsvs" Apr 16 22:15:59.894704 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.894668 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2hk7\" (UniqueName: \"kubernetes.io/projected/1335948c-8695-4844-a3a5-6819848109ce-kube-api-access-m2hk7\") pod \"cluster-samples-operator-6dc5bdb6b4-k5vkd\" (UID: \"1335948c-8695-4844-a3a5-6819848109ce\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k5vkd" Apr 16 22:15:59.948011 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.947971 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l94dt" Apr 16 22:15:59.959832 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.959803 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf" Apr 16 22:15:59.972749 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.972722 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-rpsvs" Apr 16 22:15:59.982779 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.982696 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f8ba909f-a641-4832-b7a1-a11849ea7211-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fcxmn\" (UID: \"f8ba909f-a641-4832-b7a1-a11849ea7211\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fcxmn" Apr 16 22:15:59.982779 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.982768 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbfb5b76-e002-4f88-a22a-9d55fd6b9348-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-749dh\" (UID: \"cbfb5b76-e002-4f88-a22a-9d55fd6b9348\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-749dh" Apr 16 22:15:59.982964 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.982799 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bnmj6\" (UniqueName: \"kubernetes.io/projected/a7f138c6-266e-41cd-9d3e-1ad1d55c0770-kube-api-access-bnmj6\") pod \"network-check-source-8894fc9bd-vvx4n\" (UID: \"a7f138c6-266e-41cd-9d3e-1ad1d55c0770\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vvx4n" Apr 16 22:15:59.982964 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.982897 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2wl8\" (UniqueName: \"kubernetes.io/projected/cbfb5b76-e002-4f88-a22a-9d55fd6b9348-kube-api-access-f2wl8\") pod \"kube-storage-version-migrator-operator-6769c5d45-749dh\" (UID: \"cbfb5b76-e002-4f88-a22a-9d55fd6b9348\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-749dh" Apr 16 22:15:59.982964 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.982929 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbfb5b76-e002-4f88-a22a-9d55fd6b9348-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-749dh\" (UID: \"cbfb5b76-e002-4f88-a22a-9d55fd6b9348\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-749dh" Apr 16 22:15:59.982964 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.982959 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f8ba909f-a641-4832-b7a1-a11849ea7211-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fcxmn\" (UID: \"f8ba909f-a641-4832-b7a1-a11849ea7211\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fcxmn" Apr 16 22:15:59.983173 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:15:59.983066 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 22:15:59.983173 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:15:59.983134 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8ba909f-a641-4832-b7a1-a11849ea7211-networking-console-plugin-cert podName:f8ba909f-a641-4832-b7a1-a11849ea7211 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:00.483118198 +0000 UTC m=+156.016459033 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f8ba909f-a641-4832-b7a1-a11849ea7211-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fcxmn" (UID: "f8ba909f-a641-4832-b7a1-a11849ea7211") : secret "networking-console-plugin-cert" not found Apr 16 22:15:59.983373 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.983326 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbfb5b76-e002-4f88-a22a-9d55fd6b9348-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-749dh\" (UID: \"cbfb5b76-e002-4f88-a22a-9d55fd6b9348\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-749dh" Apr 16 22:15:59.983479 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.983399 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f8ba909f-a641-4832-b7a1-a11849ea7211-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fcxmn\" (UID: \"f8ba909f-a641-4832-b7a1-a11849ea7211\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fcxmn" Apr 16 22:15:59.986901 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.986877 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbfb5b76-e002-4f88-a22a-9d55fd6b9348-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-749dh\" (UID: \"cbfb5b76-e002-4f88-a22a-9d55fd6b9348\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-749dh" Apr 16 22:15:59.992326 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.992281 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2wl8\" (UniqueName: \"kubernetes.io/projected/cbfb5b76-e002-4f88-a22a-9d55fd6b9348-kube-api-access-f2wl8\") pod \"kube-storage-version-migrator-operator-6769c5d45-749dh\" (UID: \"cbfb5b76-e002-4f88-a22a-9d55fd6b9348\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-749dh" Apr 16 22:15:59.992966 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:15:59.992936 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnmj6\" (UniqueName: \"kubernetes.io/projected/a7f138c6-266e-41cd-9d3e-1ad1d55c0770-kube-api-access-bnmj6\") pod \"network-check-source-8894fc9bd-vvx4n\" (UID: \"a7f138c6-266e-41cd-9d3e-1ad1d55c0770\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vvx4n" Apr 16 22:16:00.054096 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:00.053798 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-749dh" Apr 16 22:16:00.085098 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:00.083944 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l94dt"] Apr 16 22:16:00.086651 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:16:00.086612 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79d26f89_119e_4c1e_a0a0_4d0e4e546efe.slice/crio-a1a3295a2f83d47c4d48dbd7bf204e17b80b650b5c826141b3703c1a41ede562 WatchSource:0}: Error finding container a1a3295a2f83d47c4d48dbd7bf204e17b80b650b5c826141b3703c1a41ede562: Status 404 returned error can't find the container with id a1a3295a2f83d47c4d48dbd7bf204e17b80b650b5c826141b3703c1a41ede562 Apr 16 22:16:00.105010 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:00.104963 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-7r6sf"] Apr 16 22:16:00.107923 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:16:00.107890 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba403d08_6963_4274_8dcb_309016f31037.slice/crio-9efbd5fc8f3c7a358343521d7b7536475dbea0e8c5a6410b8b5604460e810015 WatchSource:0}: Error finding container 9efbd5fc8f3c7a358343521d7b7536475dbea0e8c5a6410b8b5604460e810015: Status 404 returned error can't find the container with id 9efbd5fc8f3c7a358343521d7b7536475dbea0e8c5a6410b8b5604460e810015 Apr 16 22:16:00.124008 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:00.123981 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-rpsvs"] Apr 16 22:16:00.126603 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:16:00.126570 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded1e6036_1beb_445f_8b61_65e736181605.slice/crio-bcd63beaaedf440be5b7f25839f3fee8ea8d17987aab336da2b06927e16280dc WatchSource:0}: Error finding container bcd63beaaedf440be5b7f25839f3fee8ea8d17987aab336da2b06927e16280dc: Status 404 returned error can't find the container with id bcd63beaaedf440be5b7f25839f3fee8ea8d17987aab336da2b06927e16280dc Apr 16 22:16:00.145994 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:00.145968 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vvx4n" Apr 16 22:16:00.188330 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:00.188298 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-749dh"] Apr 16 22:16:00.190911 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:16:00.190861 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbfb5b76_e002_4f88_a22a_9d55fd6b9348.slice/crio-90b127060fb9cd67e7ca8cb7df8d7981b7fa5d80d2316ce270f046c37e90a324 WatchSource:0}: Error finding container 90b127060fb9cd67e7ca8cb7df8d7981b7fa5d80d2316ce270f046c37e90a324: Status 404 returned error can't find the container with id 90b127060fb9cd67e7ca8cb7df8d7981b7fa5d80d2316ce270f046c37e90a324 Apr 16 22:16:00.261281 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:00.261251 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vvx4n"] Apr 16 22:16:00.264412 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:16:00.264382 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7f138c6_266e_41cd_9d3e_1ad1d55c0770.slice/crio-17b04af4f69c86f03b11b4fb1fd5536ded35e90eccbe14d1d9ebc57bf3ffb7b8 WatchSource:0}: Error finding container 17b04af4f69c86f03b11b4fb1fd5536ded35e90eccbe14d1d9ebc57bf3ffb7b8: Status 404 returned error can't find the container with id 17b04af4f69c86f03b11b4fb1fd5536ded35e90eccbe14d1d9ebc57bf3ffb7b8 Apr 16 22:16:00.387532 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:00.387498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1335948c-8695-4844-a3a5-6819848109ce-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k5vkd\" (UID: \"1335948c-8695-4844-a3a5-6819848109ce\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k5vkd" Apr 16 22:16:00.387720 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:00.387684 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 22:16:00.387777 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:00.387766 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1335948c-8695-4844-a3a5-6819848109ce-samples-operator-tls podName:1335948c-8695-4844-a3a5-6819848109ce nodeName:}" failed. No retries permitted until 2026-04-16 22:16:01.387743802 +0000 UTC m=+156.921084637 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1335948c-8695-4844-a3a5-6819848109ce-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-k5vkd" (UID: "1335948c-8695-4844-a3a5-6819848109ce") : secret "samples-operator-tls" not found Apr 16 22:16:00.393609 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:00.393541 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" podUID="4af26ebf-8c98-4962-a2da-cdbdf212d8a7" Apr 16 22:16:00.401885 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:00.401864 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-fbgc2" podUID="a290a4ed-5ccc-47be-bd46-836ba21fea56" Apr 16 22:16:00.420307 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:00.420258 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-s8bss" podUID="49f46636-9c8e-44e1-88ec-d5c207868f31" Apr 16 22:16:00.446879 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:00.446849 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-rpsvs" event={"ID":"ed1e6036-1beb-445f-8b61-65e736181605","Type":"ContainerStarted","Data":"bcd63beaaedf440be5b7f25839f3fee8ea8d17987aab336da2b06927e16280dc"} Apr 16 22:16:00.447732 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:00.447706 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-749dh" event={"ID":"cbfb5b76-e002-4f88-a22a-9d55fd6b9348","Type":"ContainerStarted","Data":"90b127060fb9cd67e7ca8cb7df8d7981b7fa5d80d2316ce270f046c37e90a324"} Apr 16 22:16:00.448862 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:00.448841 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vvx4n" event={"ID":"a7f138c6-266e-41cd-9d3e-1ad1d55c0770","Type":"ContainerStarted","Data":"43e8757c9df4c2b3297a0d709b2de9b562eded30b96c027e791690bd648eb73f"} Apr 16 22:16:00.448950 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:00.448865 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vvx4n" event={"ID":"a7f138c6-266e-41cd-9d3e-1ad1d55c0770","Type":"ContainerStarted","Data":"17b04af4f69c86f03b11b4fb1fd5536ded35e90eccbe14d1d9ebc57bf3ffb7b8"} Apr 16 22:16:00.449821 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:00.449799 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf" event={"ID":"ba403d08-6963-4274-8dcb-309016f31037","Type":"ContainerStarted","Data":"9efbd5fc8f3c7a358343521d7b7536475dbea0e8c5a6410b8b5604460e810015"} Apr 16 22:16:00.450659 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:00.450641 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l94dt" event={"ID":"79d26f89-119e-4c1e-a0a0-4d0e4e546efe","Type":"ContainerStarted","Data":"a1a3295a2f83d47c4d48dbd7bf204e17b80b650b5c826141b3703c1a41ede562"} 
Apr 16 22:16:00.450752 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:00.450695 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fbgc2" Apr 16 22:16:00.450752 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:00.450720 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:16:00.467415 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:00.467373 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vvx4n" podStartSLOduration=1.467356078 podStartE2EDuration="1.467356078s" podCreationTimestamp="2026-04-16 22:15:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:16:00.466655123 +0000 UTC m=+155.999995978" watchObservedRunningTime="2026-04-16 22:16:00.467356078 +0000 UTC m=+156.000696971" Apr 16 22:16:00.488879 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:00.488845 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f8ba909f-a641-4832-b7a1-a11849ea7211-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fcxmn\" (UID: \"f8ba909f-a641-4832-b7a1-a11849ea7211\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fcxmn" Apr 16 22:16:00.489023 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:00.488986 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 22:16:00.489075 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:00.489050 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8ba909f-a641-4832-b7a1-a11849ea7211-networking-console-plugin-cert podName:f8ba909f-a641-4832-b7a1-a11849ea7211 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:01.489032742 +0000 UTC m=+157.022373586 (durationBeforeRetry 1s). 
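[Annotation] The "Error syncing pod, skipping ... context deadline exceeded" records above are the other side of the mount backoff: the pod worker waits for all of a pod's volumes to attach and mount under a deadline, and when the missing secrets still have not appeared it abandons that sync attempt and requeues the pod. A minimal standalone sketch of the wait-with-deadline pattern that yields this exact error string; the poll interval, the 2-second deadline, and the waitForMounts helper are illustration-only assumptions, not kubelet internals:

package main

import (
	"context"
	"fmt"
	"time"
)

// waitForMounts polls until all volumes report mounted or the context
// expires; the returned error is the literal "context deadline exceeded"
// seen in the pod_workers records above.
func waitForMounts(ctx context.Context, mountedOK func() bool) error {
	tick := time.NewTicker(100 * time.Millisecond)
	defer tick.Stop()
	for {
		select {
		case <-ctx.Done():
			return ctx.Err() // context.DeadlineExceeded
		case <-tick.C:
			if mountedOK() {
				return nil
			}
		}
	}
}

func main() {
	// Assumed deadline for illustration; the secret never shows up,
	// so the mount check below never succeeds.
	ctx, cancel := context.WithTimeout(context.Background(), 2*time.Second)
	defer cancel()
	err := waitForMounts(ctx, func() bool { return false })
	fmt.Println(err) // context deadline exceeded
}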
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f8ba909f-a641-4832-b7a1-a11849ea7211-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fcxmn" (UID: "f8ba909f-a641-4832-b7a1-a11849ea7211") : secret "networking-console-plugin-cert" not found Apr 16 22:16:01.038351 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:01.038309 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-lpwn6" podUID="ddd2c4f1-e24c-431c-a59a-9936d01e4667" Apr 16 22:16:01.398083 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:01.397366 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1335948c-8695-4844-a3a5-6819848109ce-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k5vkd\" (UID: \"1335948c-8695-4844-a3a5-6819848109ce\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k5vkd" Apr 16 22:16:01.398083 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:01.397569 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 22:16:01.398083 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:01.397649 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1335948c-8695-4844-a3a5-6819848109ce-samples-operator-tls podName:1335948c-8695-4844-a3a5-6819848109ce nodeName:}" failed. No retries permitted until 2026-04-16 22:16:03.397615709 +0000 UTC m=+158.930956543 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1335948c-8695-4844-a3a5-6819848109ce-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-k5vkd" (UID: "1335948c-8695-4844-a3a5-6819848109ce") : secret "samples-operator-tls" not found Apr 16 22:16:01.498695 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:01.498180 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f8ba909f-a641-4832-b7a1-a11849ea7211-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fcxmn\" (UID: \"f8ba909f-a641-4832-b7a1-a11849ea7211\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fcxmn" Apr 16 22:16:01.498695 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:01.498433 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 22:16:01.498695 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:01.498566 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8ba909f-a641-4832-b7a1-a11849ea7211-networking-console-plugin-cert podName:f8ba909f-a641-4832-b7a1-a11849ea7211 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:03.498542074 +0000 UTC m=+159.031882928 (durationBeforeRetry 2s). 
Apr 16 22:16:03.415316 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:03.415226 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1335948c-8695-4844-a3a5-6819848109ce-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k5vkd\" (UID: \"1335948c-8695-4844-a3a5-6819848109ce\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k5vkd"
Apr 16 22:16:03.415713 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:03.415414 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 22:16:03.415713 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:03.415493 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1335948c-8695-4844-a3a5-6819848109ce-samples-operator-tls podName:1335948c-8695-4844-a3a5-6819848109ce nodeName:}" failed. No retries permitted until 2026-04-16 22:16:07.415472149 +0000 UTC m=+162.948812983 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1335948c-8695-4844-a3a5-6819848109ce-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-k5vkd" (UID: "1335948c-8695-4844-a3a5-6819848109ce") : secret "samples-operator-tls" not found
Apr 16 22:16:03.464106 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:03.464081 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/0.log"
Apr 16 22:16:03.464266 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:03.464119 2576 generic.go:358] "Generic (PLEG): container finished" podID="ba403d08-6963-4274-8dcb-309016f31037" containerID="248ad060ffdfa4aaa50db5bc0b259dcdd62de8994d328e5fc52b41bf8843efcd" exitCode=255
Apr 16 22:16:03.464266 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:03.464183 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf" event={"ID":"ba403d08-6963-4274-8dcb-309016f31037","Type":"ContainerDied","Data":"248ad060ffdfa4aaa50db5bc0b259dcdd62de8994d328e5fc52b41bf8843efcd"}
Apr 16 22:16:03.464479 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:03.464451 2576 scope.go:117] "RemoveContainer" containerID="248ad060ffdfa4aaa50db5bc0b259dcdd62de8994d328e5fc52b41bf8843efcd"
Apr 16 22:16:03.465708 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:03.465663 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l94dt" event={"ID":"79d26f89-119e-4c1e-a0a0-4d0e4e546efe","Type":"ContainerStarted","Data":"f5cee5889a6c2375065bd36a3e11db28cb8dfd4148466a8696be28d2c8ddb27c"}
Apr 16 22:16:03.467132 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:03.467095 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-rpsvs" event={"ID":"ed1e6036-1beb-445f-8b61-65e736181605","Type":"ContainerStarted","Data":"e0a1ef7ee1dab85e2377285e579c21b2e5194196057fd9f6af73606f68dbcc30"}
Apr 16 22:16:03.468444 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:03.468409 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-749dh" event={"ID":"cbfb5b76-e002-4f88-a22a-9d55fd6b9348","Type":"ContainerStarted","Data":"801fc4d1204f094039b401257c94fc7657a8ea7214e3f38c66d0df6a9dc9f41c"}
Apr 16 22:16:03.517494 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:03.516733 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f8ba909f-a641-4832-b7a1-a11849ea7211-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fcxmn\" (UID: \"f8ba909f-a641-4832-b7a1-a11849ea7211\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fcxmn"
Apr 16 22:16:03.517494 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:03.517063 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 22:16:03.517494 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:03.517127 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8ba909f-a641-4832-b7a1-a11849ea7211-networking-console-plugin-cert podName:f8ba909f-a641-4832-b7a1-a11849ea7211 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:07.517108158 +0000 UTC m=+163.050449008 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f8ba909f-a641-4832-b7a1-a11849ea7211-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fcxmn" (UID: "f8ba909f-a641-4832-b7a1-a11849ea7211") : secret "networking-console-plugin-cert" not found
Apr 16 22:16:03.525024 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:03.524970 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l94dt" podStartSLOduration=1.459837112 podStartE2EDuration="4.524955045s" podCreationTimestamp="2026-04-16 22:15:59 +0000 UTC" firstStartedPulling="2026-04-16 22:16:00.088998695 +0000 UTC m=+155.622339528" lastFinishedPulling="2026-04-16 22:16:03.15411663 +0000 UTC m=+158.687457461" observedRunningTime="2026-04-16 22:16:03.523334642 +0000 UTC m=+159.056675494" watchObservedRunningTime="2026-04-16 22:16:03.524955045 +0000 UTC m=+159.058295898"
Apr 16 22:16:03.556057 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:03.555999 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-749dh" podStartSLOduration=1.589362108 podStartE2EDuration="4.555982412s" podCreationTimestamp="2026-04-16 22:15:59 +0000 UTC" firstStartedPulling="2026-04-16 22:16:00.193554784 +0000 UTC m=+155.726895617" lastFinishedPulling="2026-04-16 22:16:03.160175079 +0000 UTC m=+158.693515921" observedRunningTime="2026-04-16 22:16:03.554930842 +0000 UTC m=+159.088271696" watchObservedRunningTime="2026-04-16 22:16:03.555982412 +0000 UTC m=+159.089323269"
Apr 16 22:16:03.585936 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:03.585881 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-rpsvs" podStartSLOduration=1.558684647 podStartE2EDuration="4.585862698s" podCreationTimestamp="2026-04-16 22:15:59 +0000 UTC" firstStartedPulling="2026-04-16 22:16:00.130065026 +0000 UTC m=+155.663405857" lastFinishedPulling="2026-04-16 22:16:03.157243077 +0000 UTC m=+158.690583908" observedRunningTime="2026-04-16 22:16:03.58569182 +0000 UTC m=+159.119032677" watchObservedRunningTime="2026-04-16 22:16:03.585862698 +0000 UTC m=+159.119203553"
Apr 16 22:16:04.471661 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:04.471631 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/1.log"
Apr 16 22:16:04.472154 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:04.472004 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/0.log"
Apr 16 22:16:04.472154 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:04.472036 2576 generic.go:358] "Generic (PLEG): container finished" podID="ba403d08-6963-4274-8dcb-309016f31037" containerID="d65627dfe728ab084bdd26a1792d3e64b8096aa468c4bb2f304d3fb010895993" exitCode=255
Apr 16 22:16:04.472154 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:04.472064 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf" event={"ID":"ba403d08-6963-4274-8dcb-309016f31037","Type":"ContainerDied","Data":"d65627dfe728ab084bdd26a1792d3e64b8096aa468c4bb2f304d3fb010895993"}
Apr 16 22:16:04.472154 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:04.472102 2576 scope.go:117] "RemoveContainer" containerID="248ad060ffdfa4aaa50db5bc0b259dcdd62de8994d328e5fc52b41bf8843efcd"
Apr 16 22:16:04.472422 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:04.472407 2576 scope.go:117] "RemoveContainer" containerID="d65627dfe728ab084bdd26a1792d3e64b8096aa468c4bb2f304d3fb010895993"
Apr 16 22:16:04.472634 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:04.472598 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-7r6sf_openshift-console-operator(ba403d08-6963-4274-8dcb-309016f31037)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf" podUID="ba403d08-6963-4274-8dcb-309016f31037"
Apr 16 22:16:05.332327 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:05.332298 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls\") pod \"dns-default-fbgc2\" (UID: \"a290a4ed-5ccc-47be-bd46-836ba21fea56\") " pod="openshift-dns/dns-default-fbgc2"
Apr 16 22:16:05.332327 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:05.332336 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls\") pod \"image-registry-85b77b7fd5-n94qh\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh"
Apr 16 22:16:05.332617 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:05.332368 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert\") pod \"ingress-canary-s8bss\" (UID: \"49f46636-9c8e-44e1-88ec-d5c207868f31\") " pod="openshift-ingress-canary/ingress-canary-s8bss"
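[Editor's note] The "Observed pod startup duration" records carry two figures: podStartE2EDuration is wall-clock from creation to running, while podStartSLOduration subtracts the image-pull window. The insights-operator record above checks out exactly: 4.585862698s end-to-end minus the pull window (22:16:03.157243077 − 22:16:00.130065026 = 3.027178051s) leaves the reported 1.558684647s. A tiny verification, with the timestamps hard-coded from the log:

```go
// Recompute podStartSLOduration for insights-operator-585dfdc468-rpsvs
// from the podStartE2EDuration and pull timestamps logged above.
package main

import (
	"fmt"
	"time"
)

func main() {
	e2e := 4585862698 * time.Nanosecond // podStartE2EDuration="4.585862698s"
	firstPull, _ := time.Parse(time.RFC3339Nano, "2026-04-16T22:16:00.130065026Z")
	lastPull, _ := time.Parse(time.RFC3339Nano, "2026-04-16T22:16:03.157243077Z")
	slo := e2e - lastPull.Sub(firstPull)
	fmt.Println(slo) // 1.558684647s, matching podStartSLOduration
}
```

When nothing was pulled, the pull timestamps are the zero time (as for network-check-source earlier) and the two durations coincide.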
Apr 16 22:16:05.332617 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:05.332441 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:16:05.332617 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:05.332492 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls podName:a290a4ed-5ccc-47be-bd46-836ba21fea56 nodeName:}" failed. No retries permitted until 2026-04-16 22:18:07.332479745 +0000 UTC m=+282.865820575 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls") pod "dns-default-fbgc2" (UID: "a290a4ed-5ccc-47be-bd46-836ba21fea56") : secret "dns-default-metrics-tls" not found
Apr 16 22:16:05.332617 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:05.332444 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 22:16:05.332617 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:05.332527 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:16:05.332617 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:05.332534 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85b77b7fd5-n94qh: secret "image-registry-tls" not found
Apr 16 22:16:05.332617 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:05.332580 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert podName:49f46636-9c8e-44e1-88ec-d5c207868f31 nodeName:}" failed. No retries permitted until 2026-04-16 22:18:07.332566122 +0000 UTC m=+282.865906952 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert") pod "ingress-canary-s8bss" (UID: "49f46636-9c8e-44e1-88ec-d5c207868f31") : secret "canary-serving-cert" not found
Apr 16 22:16:05.332617 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:05.332600 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls podName:4af26ebf-8c98-4962-a2da-cdbdf212d8a7 nodeName:}" failed. No retries permitted until 2026-04-16 22:18:07.332591772 +0000 UTC m=+282.865932602 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls") pod "image-registry-85b77b7fd5-n94qh" (UID: "4af26ebf-8c98-4962-a2da-cdbdf212d8a7") : secret "image-registry-tls" not found
Apr 16 22:16:05.476014 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:05.475988 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/1.log"
Apr 16 22:16:05.476416 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:05.476321 2576 scope.go:117] "RemoveContainer" containerID="d65627dfe728ab084bdd26a1792d3e64b8096aa468c4bb2f304d3fb010895993"
Apr 16 22:16:05.476498 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:05.476480 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-7r6sf_openshift-console-operator(ba403d08-6963-4274-8dcb-309016f31037)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf" podUID="ba403d08-6963-4274-8dcb-309016f31037"
Apr 16 22:16:05.892338 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:05.892305 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-fvvh4"]
Apr 16 22:16:05.896553 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:05.896536 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fvvh4"
Apr 16 22:16:05.899657 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:05.899634 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 16 22:16:05.899805 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:05.899709 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-2wc9d\""
Apr 16 22:16:05.899805 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:05.899789 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 16 22:16:05.904345 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:05.904325 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-fvvh4"]
Apr 16 22:16:06.038569 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:06.038534 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rknn2\" (UniqueName: \"kubernetes.io/projected/3b80a9f7-2b18-4471-a955-6e15ca0536b3-kube-api-access-rknn2\") pod \"migrator-74bb7799d9-fvvh4\" (UID: \"3b80a9f7-2b18-4471-a955-6e15ca0536b3\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fvvh4"
Apr 16 22:16:06.086752 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:06.086724 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9vfz7_97863583-437e-48ec-8c04-9cb36c6d5a89/dns-node-resolver/0.log"
Apr 16 22:16:06.139288 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:06.139248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rknn2\" (UniqueName: \"kubernetes.io/projected/3b80a9f7-2b18-4471-a955-6e15ca0536b3-kube-api-access-rknn2\") pod \"migrator-74bb7799d9-fvvh4\" (UID: \"3b80a9f7-2b18-4471-a955-6e15ca0536b3\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fvvh4"
Apr 16 22:16:06.148548 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:06.148484 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rknn2\" (UniqueName: \"kubernetes.io/projected/3b80a9f7-2b18-4471-a955-6e15ca0536b3-kube-api-access-rknn2\") pod \"migrator-74bb7799d9-fvvh4\" (UID: \"3b80a9f7-2b18-4471-a955-6e15ca0536b3\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fvvh4"
Apr 16 22:16:06.205799 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:06.205763 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fvvh4"
Apr 16 22:16:06.322560 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:06.322497 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-fvvh4"]
Apr 16 22:16:06.325182 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:16:06.325155 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b80a9f7_2b18_4471_a955_6e15ca0536b3.slice/crio-bd18ba92d7a906727da8004d1658dcfb45bd174da85a03dbd86bd8bfdb5cc84b WatchSource:0}: Error finding container bd18ba92d7a906727da8004d1658dcfb45bd174da85a03dbd86bd8bfdb5cc84b: Status 404 returned error can't find the container with id bd18ba92d7a906727da8004d1658dcfb45bd174da85a03dbd86bd8bfdb5cc84b
Apr 16 22:16:06.479135 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:06.479046 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fvvh4" event={"ID":"3b80a9f7-2b18-4471-a955-6e15ca0536b3","Type":"ContainerStarted","Data":"bd18ba92d7a906727da8004d1658dcfb45bd174da85a03dbd86bd8bfdb5cc84b"}
Apr 16 22:16:07.451995 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:07.451949 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1335948c-8695-4844-a3a5-6819848109ce-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k5vkd\" (UID: \"1335948c-8695-4844-a3a5-6819848109ce\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k5vkd"
Apr 16 22:16:07.452158 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:07.452105 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 22:16:07.452198 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:07.452177 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1335948c-8695-4844-a3a5-6819848109ce-samples-operator-tls podName:1335948c-8695-4844-a3a5-6819848109ce nodeName:}" failed. No retries permitted until 2026-04-16 22:16:15.452158283 +0000 UTC m=+170.985499131 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1335948c-8695-4844-a3a5-6819848109ce-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-k5vkd" (UID: "1335948c-8695-4844-a3a5-6819848109ce") : secret "samples-operator-tls" not found
Apr 16 22:16:07.483255 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:07.483170 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fvvh4" event={"ID":"3b80a9f7-2b18-4471-a955-6e15ca0536b3","Type":"ContainerStarted","Data":"3e016f4d3c2f1957be3fdc466ab8b54e2d76e890732f2b11d7eeff2daf87df01"}
Apr 16 22:16:07.483255 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:07.483212 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fvvh4" event={"ID":"3b80a9f7-2b18-4471-a955-6e15ca0536b3","Type":"ContainerStarted","Data":"9d6f13fe25ba394c5c127dcfdace30420a1fd0c74de6344594456bbf359070e5"}
Apr 16 22:16:07.491227 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:07.491204 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xr5zn_07d69a7c-22b7-44a4-8aea-024e47f7912b/node-ca/0.log"
Apr 16 22:16:07.513192 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:07.513145 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-fvvh4" podStartSLOduration=1.629216613 podStartE2EDuration="2.513132059s" podCreationTimestamp="2026-04-16 22:16:05 +0000 UTC" firstStartedPulling="2026-04-16 22:16:06.327329726 +0000 UTC m=+161.860670556" lastFinishedPulling="2026-04-16 22:16:07.211245162 +0000 UTC m=+162.744586002" observedRunningTime="2026-04-16 22:16:07.51255246 +0000 UTC m=+163.045893313" watchObservedRunningTime="2026-04-16 22:16:07.513132059 +0000 UTC m=+163.046472911"
Apr 16 22:16:07.552871 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:07.552836 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f8ba909f-a641-4832-b7a1-a11849ea7211-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fcxmn\" (UID: \"f8ba909f-a641-4832-b7a1-a11849ea7211\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fcxmn"
Apr 16 22:16:07.553019 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:07.552969 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 22:16:07.553056 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:07.553035 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8ba909f-a641-4832-b7a1-a11849ea7211-networking-console-plugin-cert podName:f8ba909f-a641-4832-b7a1-a11849ea7211 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:15.553015609 +0000 UTC m=+171.086356463 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f8ba909f-a641-4832-b7a1-a11849ea7211-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fcxmn" (UID: "f8ba909f-a641-4832-b7a1-a11849ea7211") : secret "networking-console-plugin-cert" not found
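[Editor's note] The console-operator container keeps exiting with code 255 and the kubelet holds it in CrashLoopBackOff ("back-off 10s restarting failed container"). An illustrative schedule: the 10s initial delay is visible in the log, while the doubling factor and 5m ceiling are the commonly documented kubelet defaults, assumed here rather than read from this cluster's configuration:

```go
// Illustrative crash-loop backoff: 10s start (from the log), doubling per
// crash, 5m cap (assumed default).
package main

import (
	"fmt"
	"time"
)

func main() {
	const maxDelay = 5 * time.Minute
	delay := 10 * time.Second
	for crash := 1; crash <= 7; crash++ {
		fmt.Printf("restart %d delayed %v\n", crash, delay) // 10s 20s 40s 1m20s 2m40s 5m 5m
		if delay *= 2; delay > maxDelay {
			delay = maxDelay
		}
	}
}
```

Consistent with a 10s first delay, the restart attempt below lands at 22:16:09.96, and the container eventually stays up at 22:16:24.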
Apr 16 22:16:09.960420 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:09.960381 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf"
Apr 16 22:16:09.960420 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:09.960419 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf"
Apr 16 22:16:09.960864 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:09.960794 2576 scope.go:117] "RemoveContainer" containerID="d65627dfe728ab084bdd26a1792d3e64b8096aa468c4bb2f304d3fb010895993"
Apr 16 22:16:09.960988 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:09.960971 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-7r6sf_openshift-console-operator(ba403d08-6963-4274-8dcb-309016f31037)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf" podUID="ba403d08-6963-4274-8dcb-309016f31037"
Apr 16 22:16:13.015484 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:13.015449 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lpwn6"
Apr 16 22:16:13.015925 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:13.015636 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-s8bss"
Apr 16 22:16:15.519575 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:15.519484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1335948c-8695-4844-a3a5-6819848109ce-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k5vkd\" (UID: \"1335948c-8695-4844-a3a5-6819848109ce\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k5vkd"
Apr 16 22:16:15.521895 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:15.521875 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1335948c-8695-4844-a3a5-6819848109ce-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k5vkd\" (UID: \"1335948c-8695-4844-a3a5-6819848109ce\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k5vkd"
Apr 16 22:16:15.566984 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:15.566936 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k5vkd"
Apr 16 22:16:15.619989 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:15.619954 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f8ba909f-a641-4832-b7a1-a11849ea7211-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fcxmn\" (UID: \"f8ba909f-a641-4832-b7a1-a11849ea7211\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fcxmn"
Apr 16 22:16:15.620129 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:15.620094 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 22:16:15.620200 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:15.620153 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8ba909f-a641-4832-b7a1-a11849ea7211-networking-console-plugin-cert podName:f8ba909f-a641-4832-b7a1-a11849ea7211 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:31.620135561 +0000 UTC m=+187.153476393 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f8ba909f-a641-4832-b7a1-a11849ea7211-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fcxmn" (UID: "f8ba909f-a641-4832-b7a1-a11849ea7211") : secret "networking-console-plugin-cert" not found
Apr 16 22:16:15.682804 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:15.682782 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k5vkd"]
Apr 16 22:16:16.507795 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:16.507741 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k5vkd" event={"ID":"1335948c-8695-4844-a3a5-6819848109ce","Type":"ContainerStarted","Data":"855ee47deade343b502631a1134213640c529b09f9819fe814b454b67631edef"}
Apr 16 22:16:17.511909 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:17.511862 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k5vkd" event={"ID":"1335948c-8695-4844-a3a5-6819848109ce","Type":"ContainerStarted","Data":"f0ab2132e19b48471644e66b5a755a0da1266dda7f09533239223828559fe3ae"}
Apr 16 22:16:17.511909 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:17.511909 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k5vkd" event={"ID":"1335948c-8695-4844-a3a5-6819848109ce","Type":"ContainerStarted","Data":"ac3537356bd50e6a4317523f09a685663797819a1b0fe2933bc60677e8428e8b"}
Apr 16 22:16:17.542316 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:17.542265 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k5vkd" podStartSLOduration=17.06205832 podStartE2EDuration="18.542248913s" podCreationTimestamp="2026-04-16 22:15:59 +0000 UTC" firstStartedPulling="2026-04-16 22:16:15.720047052 +0000 UTC m=+171.253387883" lastFinishedPulling="2026-04-16 22:16:17.200237642 +0000 UTC m=+172.733578476" observedRunningTime="2026-04-16 22:16:17.542047947 +0000 UTC m=+173.075388800" watchObservedRunningTime="2026-04-16 22:16:17.542248913 +0000 UTC m=+173.075589765"
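[Editor's note] Every failed mount above bottoms out in the same secret.go:189 lookup: the secret simply does not exist yet (samples-operator-tls just got created and mounted at 22:16:15.52; networking-console-plugin-cert is still missing at the 16s retry). A minimal client-go sketch for checking the same thing from outside the node; the kubeconfig discovery and error handling are simplified assumptions:

```go
// Check whether the secret a pending volume mount is waiting for exists,
// mirroring the lookup that secret.go:189 keeps failing.
package main

import (
	"context"
	"fmt"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes the default ~/.kube/config location.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Namespace and name come straight from the failing mounts in the log.
	_, err = cs.CoreV1().Secrets("openshift-network-console").
		Get(context.TODO(), "networking-console-plugin-cert", metav1.GetOptions{})
	fmt.Println("missing:", apierrors.IsNotFound(err)) // true until the operator creates it
}
```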
Apr 16 22:16:24.015927 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:24.015894 2576 scope.go:117] "RemoveContainer" containerID="d65627dfe728ab084bdd26a1792d3e64b8096aa468c4bb2f304d3fb010895993"
Apr 16 22:16:24.535356 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:24.535328 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/1.log"
Apr 16 22:16:24.535517 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:24.535412 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf" event={"ID":"ba403d08-6963-4274-8dcb-309016f31037","Type":"ContainerStarted","Data":"2bd7f9f6441f5ff20aa3fe9f65224b7705d41d490debe19572c5d5f302359e0f"}
Apr 16 22:16:24.535706 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:24.535666 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf"
Apr 16 22:16:24.552967 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:24.552911 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf" podStartSLOduration=22.507865111 podStartE2EDuration="25.552897401s" podCreationTimestamp="2026-04-16 22:15:59 +0000 UTC" firstStartedPulling="2026-04-16 22:16:00.109821929 +0000 UTC m=+155.643162759" lastFinishedPulling="2026-04-16 22:16:03.154854218 +0000 UTC m=+158.688195049" observedRunningTime="2026-04-16 22:16:24.551945851 +0000 UTC m=+180.085286704" watchObservedRunningTime="2026-04-16 22:16:24.552897401 +0000 UTC m=+180.086238293"
Apr 16 22:16:24.911492 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:24.911422 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-7r6sf"
Apr 16 22:16:25.727492 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:25.727461 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-54956"]
Apr 16 22:16:25.733250 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:25.733227 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-54956"
Apr 16 22:16:25.735815 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:25.735787 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 22:16:25.736756 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:25.736729 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 22:16:25.737580 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:25.737552 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-hc5p5\""
Apr 16 22:16:25.751083 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:25.751059 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-54956"]
Apr 16 22:16:25.757535 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:25.757508 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t2x6j"]
Apr 16 22:16:25.761485 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:25.761464 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t2x6j"
Apr 16 22:16:25.764223 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:25.764205 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-5grgq\""
Apr 16 22:16:25.764416 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:25.764393 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 16 22:16:25.778182 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:25.778128 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t2x6j"]
Apr 16 22:16:25.780581 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:25.780559 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-4x9sz"]
Apr 16 22:16:25.784538 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:25.784522 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-4x9sz"
Apr 16 22:16:25.788027 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:25.788006 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-mnxwb\""
Apr 16 22:16:25.788267 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:25.788248 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 22:16:25.788577 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:25.788527 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 22:16:25.798059 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:25.798035 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-4x9sz"]
Apr 16 22:16:25.900113 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:25.900072 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1a9ba876-91ba-49b9-87b0-90be2057ee0a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-54956\" (UID: \"1a9ba876-91ba-49b9-87b0-90be2057ee0a\") " pod="openshift-insights/insights-runtime-extractor-54956"
Apr 16 22:16:25.900295 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:25.900120 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdjld\" (UniqueName: \"kubernetes.io/projected/1a9ba876-91ba-49b9-87b0-90be2057ee0a-kube-api-access-jdjld\") pod \"insights-runtime-extractor-54956\" (UID: \"1a9ba876-91ba-49b9-87b0-90be2057ee0a\") " pod="openshift-insights/insights-runtime-extractor-54956"
Apr 16 22:16:25.900295 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:25.900183 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqpvf\" (UniqueName: \"kubernetes.io/projected/20d3b92f-a244-4735-991f-c5e63025f301-kube-api-access-sqpvf\") pod \"downloads-6bcc868b7-4x9sz\" (UID: \"20d3b92f-a244-4735-991f-c5e63025f301\") " pod="openshift-console/downloads-6bcc868b7-4x9sz"
Apr 16 22:16:25.900295 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:25.900263 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1a9ba876-91ba-49b9-87b0-90be2057ee0a-data-volume\") pod \"insights-runtime-extractor-54956\" (UID: \"1a9ba876-91ba-49b9-87b0-90be2057ee0a\") " pod="openshift-insights/insights-runtime-extractor-54956"
Apr 16 22:16:25.900413 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:25.900319 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1a9ba876-91ba-49b9-87b0-90be2057ee0a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-54956\" (UID: \"1a9ba876-91ba-49b9-87b0-90be2057ee0a\") " pod="openshift-insights/insights-runtime-extractor-54956"
Apr 16 22:16:25.900413 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:25.900358 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1a9ba876-91ba-49b9-87b0-90be2057ee0a-crio-socket\") pod \"insights-runtime-extractor-54956\" (UID: \"1a9ba876-91ba-49b9-87b0-90be2057ee0a\") " pod="openshift-insights/insights-runtime-extractor-54956"
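[Editor's note] The "Caches populated" lines are the kubelet starting per-object reflectors: one watch per secret/configmap each new pod references, so volume contents arrive by subscription rather than polling. The same pattern is available from client-go directly; a sketch (setup simplified, names taken from the log's still-missing secret):

```go
// Subscribe to secret creation the way the kubelet's reflectors do,
// instead of polling with Get.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, _ := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	cs, _ := kubernetes.NewForConfig(cfg)

	factory := informers.NewSharedInformerFactoryWithOptions(cs, 0,
		informers.WithNamespace("openshift-network-console"))
	inf := factory.Core().V1().Secrets().Informer()
	inf.AddEventHandler(cache.ResourceEventHandlerFuncs{
		AddFunc: func(obj interface{}) {
			s := obj.(*corev1.Secret)
			if s.Name == "networking-console-plugin-cert" {
				fmt.Println("secret created; the pending mount can now succeed")
			}
		},
	})

	stop := make(chan struct{})
	factory.Start(stop) // runs the reflector; "Caches populated" is the kubelet analogue
	factory.WaitForCacheSync(stop)
	select {} // block; the handler fires once the operator creates the secret
}
```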
Apr 16 22:16:25.900413 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:25.900384 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f8c9f944-9faf-4679-8737-bf5a9333ec82-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-t2x6j\" (UID: \"f8c9f944-9faf-4679-8737-bf5a9333ec82\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t2x6j"
Apr 16 22:16:26.001303 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:26.001273 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1a9ba876-91ba-49b9-87b0-90be2057ee0a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-54956\" (UID: \"1a9ba876-91ba-49b9-87b0-90be2057ee0a\") " pod="openshift-insights/insights-runtime-extractor-54956"
Apr 16 22:16:26.001481 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:26.001392 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1a9ba876-91ba-49b9-87b0-90be2057ee0a-crio-socket\") pod \"insights-runtime-extractor-54956\" (UID: \"1a9ba876-91ba-49b9-87b0-90be2057ee0a\") " pod="openshift-insights/insights-runtime-extractor-54956"
Apr 16 22:16:26.001481 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:26.001418 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f8c9f944-9faf-4679-8737-bf5a9333ec82-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-t2x6j\" (UID: \"f8c9f944-9faf-4679-8737-bf5a9333ec82\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t2x6j"
Apr 16 22:16:26.001481 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:26.001446 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1a9ba876-91ba-49b9-87b0-90be2057ee0a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-54956\" (UID: \"1a9ba876-91ba-49b9-87b0-90be2057ee0a\") " pod="openshift-insights/insights-runtime-extractor-54956"
Apr 16 22:16:26.001644 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:26.001516 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1a9ba876-91ba-49b9-87b0-90be2057ee0a-crio-socket\") pod \"insights-runtime-extractor-54956\" (UID: \"1a9ba876-91ba-49b9-87b0-90be2057ee0a\") " pod="openshift-insights/insights-runtime-extractor-54956"
Apr 16 22:16:26.001644 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:26.001565 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdjld\" (UniqueName: \"kubernetes.io/projected/1a9ba876-91ba-49b9-87b0-90be2057ee0a-kube-api-access-jdjld\") pod \"insights-runtime-extractor-54956\" (UID: \"1a9ba876-91ba-49b9-87b0-90be2057ee0a\") " pod="openshift-insights/insights-runtime-extractor-54956"
Apr 16 22:16:26.001644 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:26.001619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqpvf\" (UniqueName: \"kubernetes.io/projected/20d3b92f-a244-4735-991f-c5e63025f301-kube-api-access-sqpvf\") pod \"downloads-6bcc868b7-4x9sz\" (UID: \"20d3b92f-a244-4735-991f-c5e63025f301\") " pod="openshift-console/downloads-6bcc868b7-4x9sz"
Apr 16 22:16:26.001833 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:26.001703 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1a9ba876-91ba-49b9-87b0-90be2057ee0a-data-volume\") pod \"insights-runtime-extractor-54956\" (UID: \"1a9ba876-91ba-49b9-87b0-90be2057ee0a\") " pod="openshift-insights/insights-runtime-extractor-54956"
Apr 16 22:16:26.001945 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:26.001927 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1a9ba876-91ba-49b9-87b0-90be2057ee0a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-54956\" (UID: \"1a9ba876-91ba-49b9-87b0-90be2057ee0a\") " pod="openshift-insights/insights-runtime-extractor-54956"
Apr 16 22:16:26.002055 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:26.002036 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1a9ba876-91ba-49b9-87b0-90be2057ee0a-data-volume\") pod \"insights-runtime-extractor-54956\" (UID: \"1a9ba876-91ba-49b9-87b0-90be2057ee0a\") " pod="openshift-insights/insights-runtime-extractor-54956"
Apr 16 22:16:26.003867 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:26.003847 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1a9ba876-91ba-49b9-87b0-90be2057ee0a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-54956\" (UID: \"1a9ba876-91ba-49b9-87b0-90be2057ee0a\") " pod="openshift-insights/insights-runtime-extractor-54956"
Apr 16 22:16:26.003964 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:26.003948 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f8c9f944-9faf-4679-8737-bf5a9333ec82-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-t2x6j\" (UID: \"f8c9f944-9faf-4679-8737-bf5a9333ec82\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t2x6j"
Apr 16 22:16:26.015881 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:26.015832 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdjld\" (UniqueName: \"kubernetes.io/projected/1a9ba876-91ba-49b9-87b0-90be2057ee0a-kube-api-access-jdjld\") pod \"insights-runtime-extractor-54956\" (UID: \"1a9ba876-91ba-49b9-87b0-90be2057ee0a\") " pod="openshift-insights/insights-runtime-extractor-54956"
Apr 16 22:16:26.029389 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:26.029301 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqpvf\" (UniqueName: \"kubernetes.io/projected/20d3b92f-a244-4735-991f-c5e63025f301-kube-api-access-sqpvf\") pod \"downloads-6bcc868b7-4x9sz\" (UID: \"20d3b92f-a244-4735-991f-c5e63025f301\") " pod="openshift-console/downloads-6bcc868b7-4x9sz"
Apr 16 22:16:26.047887 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:26.047856 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-54956"
Apr 16 22:16:26.071446 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:26.071411 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t2x6j"
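[Editor's note] This burst traces the volume manager's happy path end to end: reconciler_common.go:251 confirms the volume is attached, reconciler_common.go:224 starts the mount, and operation_generator.go:615 reports success. A compressed sketch of that control flow; the types and functions are stubs, and only the call sequence mirrors the log:

```go
// Stubbed sketch of the verify -> mount -> succeed sequence above.
package main

import "fmt"

type volume struct{ name, pod string }

func verifyControllerAttachedVolume(v volume) {}            // reconciler_common.go:251 analogue
func mountVolume(v volume) error              { return nil } // reconciler_common.go:224 analogue

func main() {
	for _, v := range []volume{
		{"insights-runtime-extractor-tls", "insights-runtime-extractor-54956"},
		{"tls-certificates", "prometheus-operator-admission-webhook-57cf98b594-t2x6j"},
	} {
		verifyControllerAttachedVolume(v)
		if err := mountVolume(v); err != nil {
			// a real failure lands in nestedpendingoperations.go:348 with a retry
			continue
		}
		fmt.Printf("MountVolume.SetUp succeeded for volume %q pod %q\n", v.name, v.pod)
	}
}
```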
Apr 16 22:16:26.093653 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:26.093619 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-4x9sz"
Apr 16 22:16:26.192364 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:26.192332 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-54956"]
Apr 16 22:16:26.197316 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:16:26.197287 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a9ba876_91ba_49b9_87b0_90be2057ee0a.slice/crio-f466f5b644e4ac25f80ab79b9833ab875ec78202577924e4021c28b1da2c12c6 WatchSource:0}: Error finding container f466f5b644e4ac25f80ab79b9833ab875ec78202577924e4021c28b1da2c12c6: Status 404 returned error can't find the container with id f466f5b644e4ac25f80ab79b9833ab875ec78202577924e4021c28b1da2c12c6
Apr 16 22:16:26.210500 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:26.210477 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t2x6j"]
Apr 16 22:16:26.213605 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:16:26.213579 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8c9f944_9faf_4679_8737_bf5a9333ec82.slice/crio-ad98213dd50f1003e0f3af6ca11489ef7c6d91af85ae2237bb63ab02ba24a76e WatchSource:0}: Error finding container ad98213dd50f1003e0f3af6ca11489ef7c6d91af85ae2237bb63ab02ba24a76e: Status 404 returned error can't find the container with id ad98213dd50f1003e0f3af6ca11489ef7c6d91af85ae2237bb63ab02ba24a76e
Apr 16 22:16:26.230824 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:26.230798 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-4x9sz"]
Apr 16 22:16:26.233882 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:16:26.233858 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20d3b92f_a244_4735_991f_c5e63025f301.slice/crio-2deab0293c5954291a72052aa257afe680591aacaf3e5b3fce6678775cc44f00 WatchSource:0}: Error finding container 2deab0293c5954291a72052aa257afe680591aacaf3e5b3fce6678775cc44f00: Status 404 returned error can't find the container with id 2deab0293c5954291a72052aa257afe680591aacaf3e5b3fce6678775cc44f00
Apr 16 22:16:26.542426 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:26.542332 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-54956" event={"ID":"1a9ba876-91ba-49b9-87b0-90be2057ee0a","Type":"ContainerStarted","Data":"cd068bc39f849c0a6802d7d4b2067826e5e0af6e9366a721c77eb1edecc06b78"}
Apr 16 22:16:26.542426 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:26.542372 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-54956" event={"ID":"1a9ba876-91ba-49b9-87b0-90be2057ee0a","Type":"ContainerStarted","Data":"f466f5b644e4ac25f80ab79b9833ab875ec78202577924e4021c28b1da2c12c6"}
Apr 16 22:16:26.543258 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:26.543229 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-4x9sz" event={"ID":"20d3b92f-a244-4735-991f-c5e63025f301","Type":"ContainerStarted","Data":"2deab0293c5954291a72052aa257afe680591aacaf3e5b3fce6678775cc44f00"}
Apr 16 22:16:26.544138 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:26.544120 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t2x6j" event={"ID":"f8c9f944-9faf-4679-8737-bf5a9333ec82","Type":"ContainerStarted","Data":"ad98213dd50f1003e0f3af6ca11489ef7c6d91af85ae2237bb63ab02ba24a76e"}
Apr 16 22:16:27.551114 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:27.551066 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-54956" event={"ID":"1a9ba876-91ba-49b9-87b0-90be2057ee0a","Type":"ContainerStarted","Data":"c1c7d7670f26e94515785b645d6cb35fc90b80d46130f94cbde84c1d58d06a38"}
Apr 16 22:16:28.555289 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:28.555259 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t2x6j" event={"ID":"f8c9f944-9faf-4679-8737-bf5a9333ec82","Type":"ContainerStarted","Data":"c4e0a1475216353e85c1b26016ffa7b4e0d214d70157a84234819fa1b843a2d5"}
Apr 16 22:16:28.557078 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:28.557056 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t2x6j"
Apr 16 22:16:28.562860 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:28.562839 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t2x6j"
Apr 16 22:16:28.574873 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:28.574814 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t2x6j" podStartSLOduration=2.318945764 podStartE2EDuration="3.574795067s" podCreationTimestamp="2026-04-16 22:16:25 +0000 UTC" firstStartedPulling="2026-04-16 22:16:26.215737757 +0000 UTC m=+181.749078606" lastFinishedPulling="2026-04-16 22:16:27.471587071 +0000 UTC m=+183.004927909" observedRunningTime="2026-04-16 22:16:28.573868311 +0000 UTC m=+184.107209164" watchObservedRunningTime="2026-04-16 22:16:28.574795067 +0000 UTC m=+184.108135920"
Apr 16 22:16:29.560543 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:29.560500 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-54956" event={"ID":"1a9ba876-91ba-49b9-87b0-90be2057ee0a","Type":"ContainerStarted","Data":"eb311f33542df19b69bd01f19f3b017720ce2704664b8299bbe674133c4f4d7d"}
Apr 16 22:16:29.585773 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:29.585719 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-54956" podStartSLOduration=2.3080726 podStartE2EDuration="4.585698861s" podCreationTimestamp="2026-04-16 22:16:25 +0000 UTC" firstStartedPulling="2026-04-16 22:16:26.274037609 +0000 UTC m=+181.807378440" lastFinishedPulling="2026-04-16 22:16:28.551663855 +0000 UTC m=+184.085004701" observedRunningTime="2026-04-16 22:16:29.584666267 +0000 UTC m=+185.118007121" watchObservedRunningTime="2026-04-16 22:16:29.585698861 +0000 UTC m=+185.119039714"
Apr 16 22:16:31.653128 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:31.653088 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f8ba909f-a641-4832-b7a1-a11849ea7211-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fcxmn\" (UID: \"f8ba909f-a641-4832-b7a1-a11849ea7211\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fcxmn"
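[Editor's note] Each "SyncLoop (PLEG): event for pod" line serializes one pod lifecycle event from the kubelet's Pod Lifecycle Event Generator. A sketch of the shape implied by the event={"ID":...,"Type":...,"Data":...} output; the field names are read off the log and the types are assumptions (kubelet's real definitions live in pkg/kubelet/pleg):

```go
package main

import "fmt"

// PodLifeCycleEventType enumerates the Type values visible in this log.
type PodLifeCycleEventType string

const (
	ContainerStarted PodLifeCycleEventType = "ContainerStarted"
	ContainerDied    PodLifeCycleEventType = "ContainerDied"
)

// PodLifecycleEvent mirrors the event={"ID":...,"Type":...,"Data":...}
// structure printed by kubelet.go:2569; Data carries a container ID here.
type PodLifecycleEvent struct {
	ID   string // pod UID
	Type PodLifeCycleEventType
	Data interface{}
}

func main() {
	// The admission-webhook ContainerStarted line above, rebuilt as a value.
	e := PodLifecycleEvent{
		ID:   "f8c9f944-9faf-4679-8737-bf5a9333ec82",
		Type: ContainerStarted,
		Data: "ad98213dd50f1003e0f3af6ca11489ef7c6d91af85ae2237bb63ab02ba24a76e",
	}
	fmt.Printf("event=%+v\n", e)
}
```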
Apr 16 22:16:31.655909 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:31.655879 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f8ba909f-a641-4832-b7a1-a11849ea7211-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fcxmn\" (UID: \"f8ba909f-a641-4832-b7a1-a11849ea7211\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fcxmn"
Apr 16 22:16:31.953734 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:31.953641 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-ms8ss\""
Apr 16 22:16:31.961755 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:31.961728 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fcxmn"
Apr 16 22:16:32.088898 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:32.088708 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fcxmn"]
Apr 16 22:16:32.091806 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:16:32.091761 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8ba909f_a641_4832_b7a1_a11849ea7211.slice/crio-0bebd89158b7ddcd81887ae551a001269f77bde5b822d69c26a5f9c8e06f4c88 WatchSource:0}: Error finding container 0bebd89158b7ddcd81887ae551a001269f77bde5b822d69c26a5f9c8e06f4c88: Status 404 returned error can't find the container with id 0bebd89158b7ddcd81887ae551a001269f77bde5b822d69c26a5f9c8e06f4c88
Apr 16 22:16:32.570919 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:32.570877 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fcxmn" event={"ID":"f8ba909f-a641-4832-b7a1-a11849ea7211","Type":"ContainerStarted","Data":"0bebd89158b7ddcd81887ae551a001269f77bde5b822d69c26a5f9c8e06f4c88"}
Apr 16 22:16:33.343433 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.343402 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-r5jqt"]
Apr 16 22:16:33.364790 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.364758 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-r5jqt"]
Apr 16 22:16:33.364790 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.364791 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-mrwb7"]
Apr 16 22:16:33.365036 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.364963 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r5jqt"
Apr 16 22:16:33.367881 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.367856 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 16 22:16:33.368205 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.368180 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-9zjvd\""
Apr 16 22:16:33.369091 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.369070 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 16 22:16:33.370861 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.370762 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 22:16:33.370861 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.370783 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 22:16:33.370861 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.370788 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 22:16:33.383440 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.383418 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-mrwb7"
Apr 16 22:16:33.385729 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.385706 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mfcw8\""
Apr 16 22:16:33.385850 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.385792 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 22:16:33.385950 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.385929 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 22:16:33.386245 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.386230 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 22:16:33.470037 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.470002 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/36a7c67c-1021-4738-8021-e4430bef3530-node-exporter-tls\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7"
Apr 16 22:16:33.470251 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.470133 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/36a7c67c-1021-4738-8021-e4430bef3530-node-exporter-wtmp\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7"
Apr 16 22:16:33.470251 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.470185 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36a7c67c-1021-4738-8021-e4430bef3530-metrics-client-ca\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7"
Apr 16 22:16:33.470251 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.470217 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/36a7c67c-1021-4738-8021-e4430bef3530-node-exporter-textfile\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7"
Apr 16 22:16:33.470251 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.470248 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-r5jqt\" (UID: \"9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r5jqt"
Apr 16 22:16:33.470559 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.470267 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/36a7c67c-1021-4738-8021-e4430bef3530-node-exporter-accelerators-collector-config\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7"
Apr 16 22:16:33.470559 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.470290 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-r5jqt\" (UID: \"9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r5jqt"
Apr 16 22:16:33.470559 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.470319 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-r5jqt\" (UID: \"9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r5jqt"
Apr 16 22:16:33.470559 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.470343 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4zql\" (UniqueName: \"kubernetes.io/projected/9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7-kube-api-access-j4zql\") pod \"openshift-state-metrics-9d44df66c-r5jqt\" (UID: \"9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r5jqt"
Apr 16 22:16:33.470559 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.470366 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/36a7c67c-1021-4738-8021-e4430bef3530-root\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7"
Apr 16 22:16:33.470559 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.470442 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/36a7c67c-1021-4738-8021-e4430bef3530-sys\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7"
Apr 16 22:16:33.470559 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.470500 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/36a7c67c-1021-4738-8021-e4430bef3530-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7"
Apr 16 22:16:33.470559 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.470531 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch4rg\" (UniqueName: \"kubernetes.io/projected/36a7c67c-1021-4738-8021-e4430bef3530-kube-api-access-ch4rg\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7"
Apr 16 22:16:33.571783 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.571576 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/36a7c67c-1021-4738-8021-e4430bef3530-node-exporter-textfile\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7"
Apr 16 22:16:33.571783 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.571638 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-r5jqt\" (UID: \"9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r5jqt"
Apr 16 22:16:33.571783 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.571669 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/36a7c67c-1021-4738-8021-e4430bef3530-node-exporter-accelerators-collector-config\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7"
Apr 16 22:16:33.571783 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.571717 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-r5jqt\" (UID: \"9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r5jqt"
Apr 16 22:16:33.571783 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.571748 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-r5jqt\" (UID: \"9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r5jqt"
Apr 16
22:16:33.571783 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.571773 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4zql\" (UniqueName: \"kubernetes.io/projected/9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7-kube-api-access-j4zql\") pod \"openshift-state-metrics-9d44df66c-r5jqt\" (UID: \"9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r5jqt" Apr 16 22:16:33.572148 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.571795 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/36a7c67c-1021-4738-8021-e4430bef3530-root\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7" Apr 16 22:16:33.572148 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.571832 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/36a7c67c-1021-4738-8021-e4430bef3530-sys\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7" Apr 16 22:16:33.572148 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.571865 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/36a7c67c-1021-4738-8021-e4430bef3530-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7" Apr 16 22:16:33.572148 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.571890 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ch4rg\" (UniqueName: \"kubernetes.io/projected/36a7c67c-1021-4738-8021-e4430bef3530-kube-api-access-ch4rg\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7" Apr 16 22:16:33.572148 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.571913 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/36a7c67c-1021-4738-8021-e4430bef3530-node-exporter-textfile\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7" Apr 16 22:16:33.572148 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.571935 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/36a7c67c-1021-4738-8021-e4430bef3530-node-exporter-tls\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7" Apr 16 22:16:33.572148 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.572042 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/36a7c67c-1021-4738-8021-e4430bef3530-node-exporter-wtmp\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7" Apr 16 22:16:33.572148 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.572081 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/36a7c67c-1021-4738-8021-e4430bef3530-metrics-client-ca\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7" Apr 16 22:16:33.573235 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.572662 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36a7c67c-1021-4738-8021-e4430bef3530-metrics-client-ca\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7" Apr 16 22:16:33.573235 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.572701 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/36a7c67c-1021-4738-8021-e4430bef3530-node-exporter-accelerators-collector-config\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7" Apr 16 22:16:33.573235 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.573205 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/36a7c67c-1021-4738-8021-e4430bef3530-root\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7" Apr 16 22:16:33.573235 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.573230 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-r5jqt\" (UID: \"9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r5jqt" Apr 16 22:16:33.573501 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.573264 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/36a7c67c-1021-4738-8021-e4430bef3530-sys\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7" Apr 16 22:16:33.573501 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.573369 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/36a7c67c-1021-4738-8021-e4430bef3530-node-exporter-wtmp\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7" Apr 16 22:16:33.574691 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.574644 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-r5jqt\" (UID: \"9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r5jqt" Apr 16 22:16:33.576324 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.576294 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fcxmn" event={"ID":"f8ba909f-a641-4832-b7a1-a11849ea7211","Type":"ContainerStarted","Data":"f04dc4d49f04b45acac96863169070fec554325dd07a099f4d6241d59522204f"} Apr 16 22:16:33.576422 ip-10-0-130-227 
kubenswrapper[2576]: I0416 22:16:33.576398 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-r5jqt\" (UID: \"9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r5jqt" Apr 16 22:16:33.580128 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.580105 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4zql\" (UniqueName: \"kubernetes.io/projected/9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7-kube-api-access-j4zql\") pod \"openshift-state-metrics-9d44df66c-r5jqt\" (UID: \"9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r5jqt" Apr 16 22:16:33.584071 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.584043 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/36a7c67c-1021-4738-8021-e4430bef3530-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7" Apr 16 22:16:33.584168 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.584149 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/36a7c67c-1021-4738-8021-e4430bef3530-node-exporter-tls\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7" Apr 16 22:16:33.586638 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.586550 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch4rg\" (UniqueName: \"kubernetes.io/projected/36a7c67c-1021-4738-8021-e4430bef3530-kube-api-access-ch4rg\") pod \"node-exporter-mrwb7\" (UID: \"36a7c67c-1021-4738-8021-e4430bef3530\") " pod="openshift-monitoring/node-exporter-mrwb7" Apr 16 22:16:33.676278 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.676196 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r5jqt" Apr 16 22:16:33.694188 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.694158 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-mrwb7" Apr 16 22:16:33.708928 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:16:33.708879 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36a7c67c_1021_4738_8021_e4430bef3530.slice/crio-9dd015e15635913d5180323021ba9c75971171f53668330c44b38cfefa359bfd WatchSource:0}: Error finding container 9dd015e15635913d5180323021ba9c75971171f53668330c44b38cfefa359bfd: Status 404 returned error can't find the container with id 9dd015e15635913d5180323021ba9c75971171f53668330c44b38cfefa359bfd Apr 16 22:16:33.814650 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:33.814622 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-r5jqt"] Apr 16 22:16:33.817200 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:16:33.817173 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a6dac50_06b7_4f4a_80b3_ef0ce5e3d8f7.slice/crio-b1998d5a96f01e484c95e7bb459d1b2a15e447da23741374e5f8468bc7a1d594 WatchSource:0}: Error finding container b1998d5a96f01e484c95e7bb459d1b2a15e447da23741374e5f8468bc7a1d594: Status 404 returned error can't find the container with id b1998d5a96f01e484c95e7bb459d1b2a15e447da23741374e5f8468bc7a1d594 Apr 16 22:16:34.016559 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.016512 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c95c6cb97-cljn7"] Apr 16 22:16:34.032634 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.032593 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c95c6cb97-cljn7"] Apr 16 22:16:34.032823 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.032734 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:16:34.035656 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.035593 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 22:16:34.036032 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.035866 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 22:16:34.036032 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.035924 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 22:16:34.036032 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.036026 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 22:16:34.036224 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.036041 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-qmtph\"" Apr 16 22:16:34.036615 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.036591 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 22:16:34.178778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.178739 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-295kt\" (UniqueName: \"kubernetes.io/projected/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-kube-api-access-295kt\") pod \"console-6c95c6cb97-cljn7\" (UID: \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\") " pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:16:34.178986 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.178868 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-oauth-serving-cert\") pod \"console-6c95c6cb97-cljn7\" (UID: \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\") " pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:16:34.178986 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.178907 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-console-oauth-config\") pod \"console-6c95c6cb97-cljn7\" (UID: \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\") " pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:16:34.178986 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.178977 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-console-config\") pod \"console-6c95c6cb97-cljn7\" (UID: \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\") " pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:16:34.179173 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.179009 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-service-ca\") pod \"console-6c95c6cb97-cljn7\" (UID: \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\") " pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:16:34.179173 ip-10-0-130-227 kubenswrapper[2576]: I0416 
22:16:34.179145 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-console-serving-cert\") pod \"console-6c95c6cb97-cljn7\" (UID: \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\") " pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:16:34.279932 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.279657 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-295kt\" (UniqueName: \"kubernetes.io/projected/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-kube-api-access-295kt\") pod \"console-6c95c6cb97-cljn7\" (UID: \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\") " pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:16:34.279932 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.279806 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-oauth-serving-cert\") pod \"console-6c95c6cb97-cljn7\" (UID: \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\") " pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:16:34.279932 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.279844 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-console-oauth-config\") pod \"console-6c95c6cb97-cljn7\" (UID: \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\") " pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:16:34.279932 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.279883 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-console-config\") pod \"console-6c95c6cb97-cljn7\" (UID: \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\") " pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:16:34.279932 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.279916 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-service-ca\") pod \"console-6c95c6cb97-cljn7\" (UID: \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\") " pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:16:34.280299 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.279987 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-console-serving-cert\") pod \"console-6c95c6cb97-cljn7\" (UID: \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\") " pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:16:34.280700 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.280636 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-oauth-serving-cert\") pod \"console-6c95c6cb97-cljn7\" (UID: \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\") " pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:16:34.280700 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.280693 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-service-ca\") pod \"console-6c95c6cb97-cljn7\" 
(UID: \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\") " pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:16:34.283836 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.283811 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-console-oauth-config\") pod \"console-6c95c6cb97-cljn7\" (UID: \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\") " pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:16:34.284277 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.284232 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-console-serving-cert\") pod \"console-6c95c6cb97-cljn7\" (UID: \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\") " pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:16:34.290177 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.290152 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-console-config\") pod \"console-6c95c6cb97-cljn7\" (UID: \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\") " pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:16:34.292064 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.292035 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-295kt\" (UniqueName: \"kubernetes.io/projected/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-kube-api-access-295kt\") pod \"console-6c95c6cb97-cljn7\" (UID: \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\") " pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:16:34.345871 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.345824 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:16:34.484613 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.484578 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c95c6cb97-cljn7"] Apr 16 22:16:34.563925 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:16:34.563885 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6bd0cb3_d839_4036_ad36_1b9f7849d1fa.slice/crio-399dd8896df518eb8eae52847389e36d53586c05d0642a160183a5395ef63533 WatchSource:0}: Error finding container 399dd8896df518eb8eae52847389e36d53586c05d0642a160183a5395ef63533: Status 404 returned error can't find the container with id 399dd8896df518eb8eae52847389e36d53586c05d0642a160183a5395ef63533 Apr 16 22:16:34.580543 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.580511 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r5jqt" event={"ID":"9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7","Type":"ContainerStarted","Data":"67c551f9dc08333c3726cd769ed4d297182d7fe68830f00d1b6521f4249e86a3"} Apr 16 22:16:34.580693 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.580553 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r5jqt" event={"ID":"9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7","Type":"ContainerStarted","Data":"c555a512b921903ccc02aa18601f0c9277bfd5f8416019dde04d26ea9fb84cf7"} Apr 16 22:16:34.580693 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.580567 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r5jqt" event={"ID":"9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7","Type":"ContainerStarted","Data":"b1998d5a96f01e484c95e7bb459d1b2a15e447da23741374e5f8468bc7a1d594"} Apr 16 22:16:34.581737 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.581707 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mrwb7" event={"ID":"36a7c67c-1021-4738-8021-e4430bef3530","Type":"ContainerStarted","Data":"9dd015e15635913d5180323021ba9c75971171f53668330c44b38cfefa359bfd"} Apr 16 22:16:34.582952 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.582913 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c95c6cb97-cljn7" event={"ID":"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa","Type":"ContainerStarted","Data":"399dd8896df518eb8eae52847389e36d53586c05d0642a160183a5395ef63533"} Apr 16 22:16:34.601211 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:34.601164 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fcxmn" podStartSLOduration=34.249802367 podStartE2EDuration="35.601146102s" podCreationTimestamp="2026-04-16 22:15:59 +0000 UTC" firstStartedPulling="2026-04-16 22:16:32.094218623 +0000 UTC m=+187.627559453" lastFinishedPulling="2026-04-16 22:16:33.445562343 +0000 UTC m=+188.978903188" observedRunningTime="2026-04-16 22:16:34.600204732 +0000 UTC m=+190.133545586" watchObservedRunningTime="2026-04-16 22:16:34.601146102 +0000 UTC m=+190.134486958" Apr 16 22:16:35.587068 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:35.587017 2576 generic.go:358] "Generic (PLEG): container finished" podID="36a7c67c-1021-4738-8021-e4430bef3530" containerID="14651202edb3fee6cb2afc0ec8ef58976fce83be0b2a707fdaed1f0a38823833" exitCode=0 Apr 16 22:16:35.587531 ip-10-0-130-227 
kubenswrapper[2576]: I0416 22:16:35.587091 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mrwb7" event={"ID":"36a7c67c-1021-4738-8021-e4430bef3530","Type":"ContainerDied","Data":"14651202edb3fee6cb2afc0ec8ef58976fce83be0b2a707fdaed1f0a38823833"} Apr 16 22:16:39.736964 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.736928 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:16:39.741565 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.741539 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.753551 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.753527 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 22:16:39.753860 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.753829 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 22:16:39.754445 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.754429 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 22:16:39.754990 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.754970 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 22:16:39.755931 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.755911 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 22:16:39.756168 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.756154 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-hqvcv\"" Apr 16 22:16:39.756388 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.756238 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 22:16:39.756388 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.756262 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 22:16:39.756388 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.756383 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 22:16:39.756627 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.756607 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-569v80od7lahq\"" Apr 16 22:16:39.768777 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.768748 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 22:16:39.782453 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.782381 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 22:16:39.782453 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.782435 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 22:16:39.782771 
ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.782539 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 22:16:39.811313 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.811285 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 22:16:39.829167 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.829134 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0a49b216-e0ad-48d0-8f67-c06fe0627ece-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.829337 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.829185 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.829337 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.829216 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.829337 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.829242 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0a49b216-e0ad-48d0-8f67-c06fe0627ece-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.829337 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.829322 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.829700 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.829386 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.829700 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.829410 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mm9j\" (UniqueName: \"kubernetes.io/projected/0a49b216-e0ad-48d0-8f67-c06fe0627ece-kube-api-access-5mm9j\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.829700 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.829470 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.829700 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.829532 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0a49b216-e0ad-48d0-8f67-c06fe0627ece-config-out\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.829700 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.829561 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-config\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.829700 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.829587 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.829700 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.829629 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.829700 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.829653 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.830077 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.829747 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.830077 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.829782 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.830077 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.829809 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.830077 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.829855 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.830077 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.829909 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-web-config\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.931277 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.931230 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.931277 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.931276 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.931518 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.931319 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.931518 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.931358 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.931518 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.931434 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-web-config\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.931518 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.931465 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0a49b216-e0ad-48d0-8f67-c06fe0627ece-prometheus-k8s-db\") pod 
\"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.931518 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.931499 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.931801 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.931523 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.931801 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.931552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0a49b216-e0ad-48d0-8f67-c06fe0627ece-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.931801 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.931602 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.931801 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.931711 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.931801 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.931742 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5mm9j\" (UniqueName: \"kubernetes.io/projected/0a49b216-e0ad-48d0-8f67-c06fe0627ece-kube-api-access-5mm9j\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.931801 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.931791 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.932091 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.931844 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0a49b216-e0ad-48d0-8f67-c06fe0627ece-config-out\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.932091 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.931878 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-config\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.932091 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.931903 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.932091 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.931944 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.932091 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.931970 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.932333 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.932278 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.932448 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.932422 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.933487 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.932858 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.934620 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.934321 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.935187 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.935167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.936267 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.936104 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0a49b216-e0ad-48d0-8f67-c06fe0627ece-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.936376 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.936308 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.937120 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.937095 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0a49b216-e0ad-48d0-8f67-c06fe0627ece-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.937421 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.937273 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-config\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.937597 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.937549 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.938113 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.938091 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.938209 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.938158 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.938333 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.938302 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0a49b216-e0ad-48d0-8f67-c06fe0627ece-config-out\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.938508 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.938486 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: 
\"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.938600 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.938542 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.938866 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.938830 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-web-config\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.938949 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.938899 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.963784 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.963753 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mm9j\" (UniqueName: \"kubernetes.io/projected/0a49b216-e0ad-48d0-8f67-c06fe0627ece-kube-api-access-5mm9j\") pod \"prometheus-k8s-0\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:39.964755 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:39.964719 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:16:40.052535 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:40.052492 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:43.532160 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:43.531623 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:16:43.538242 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:16:43.538206 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a49b216_e0ad_48d0_8f67_c06fe0627ece.slice/crio-6c5dfaebe1585b4263f0019a1b93764f5f139c3757ad6425e500284b120633fd WatchSource:0}: Error finding container 6c5dfaebe1585b4263f0019a1b93764f5f139c3757ad6425e500284b120633fd: Status 404 returned error can't find the container with id 6c5dfaebe1585b4263f0019a1b93764f5f139c3757ad6425e500284b120633fd Apr 16 22:16:43.618844 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:43.618774 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mrwb7" event={"ID":"36a7c67c-1021-4738-8021-e4430bef3530","Type":"ContainerStarted","Data":"9753e4bb7649fb5b397afe0d4a232366c93fe1febfc1f9066bab5444c90cf2a0"} Apr 16 22:16:43.618844 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:43.618818 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mrwb7" event={"ID":"36a7c67c-1021-4738-8021-e4430bef3530","Type":"ContainerStarted","Data":"4b424e1f46414d628ac1facbc9830469cb9106c514d1545876920a8e8a4fef0a"} Apr 16 22:16:43.620076 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:43.620023 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-4x9sz" event={"ID":"20d3b92f-a244-4735-991f-c5e63025f301","Type":"ContainerStarted","Data":"e14d217a3ecd2555767b6319c1860a5d54538b7773a5d38e0a0e336afab8f62b"} Apr 16 22:16:43.620267 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:43.620254 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-4x9sz" Apr 16 22:16:43.621053 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:43.621018 2576 patch_prober.go:28] interesting pod/downloads-6bcc868b7-4x9sz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.132.0.19:8080/\": dial tcp 10.132.0.19:8080: connect: connection refused" start-of-body= Apr 16 22:16:43.621151 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:43.621087 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-4x9sz" podUID="20d3b92f-a244-4735-991f-c5e63025f301" containerName="download-server" probeResult="failure" output="Get \"http://10.132.0.19:8080/\": dial tcp 10.132.0.19:8080: connect: connection refused" Apr 16 22:16:43.621698 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:43.621636 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c95c6cb97-cljn7" event={"ID":"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa","Type":"ContainerStarted","Data":"d529d8070d6af1b1aa6d4c92ff14238db2c3b8d6b69ff2ff83c13a4d0629b10f"} Apr 16 22:16:43.624102 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:43.624076 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r5jqt" event={"ID":"9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7","Type":"ContainerStarted","Data":"21a5c4420f51eebd88f921edb57c0857e3a344edce66c45dd7c59469ea5d7496"} Apr 16 22:16:43.625243 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:43.625223 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a49b216-e0ad-48d0-8f67-c06fe0627ece","Type":"ContainerStarted","Data":"6c5dfaebe1585b4263f0019a1b93764f5f139c3757ad6425e500284b120633fd"} Apr 16 22:16:43.640052 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:43.640009 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-mrwb7" podStartSLOduration=9.734152749 podStartE2EDuration="10.639995775s" podCreationTimestamp="2026-04-16 22:16:33 +0000 UTC" firstStartedPulling="2026-04-16 22:16:33.711054506 +0000 UTC m=+189.244395339" lastFinishedPulling="2026-04-16 22:16:34.616897524 +0000 UTC m=+190.150238365" observedRunningTime="2026-04-16 22:16:43.63884222 +0000 UTC m=+199.172183086" watchObservedRunningTime="2026-04-16 22:16:43.639995775 +0000 UTC m=+199.173336628" Apr 16 22:16:43.658309 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:43.658244 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-r5jqt" podStartSLOduration=1.222661341 podStartE2EDuration="10.658229508s" podCreationTimestamp="2026-04-16 22:16:33 +0000 UTC" firstStartedPulling="2026-04-16 22:16:33.959623237 +0000 UTC m=+189.492964081" lastFinishedPulling="2026-04-16 22:16:43.395191403 +0000 UTC m=+198.928532248" observedRunningTime="2026-04-16 22:16:43.657624234 +0000 UTC m=+199.190965087" watchObservedRunningTime="2026-04-16 22:16:43.658229508 +0000 UTC m=+199.191570362" Apr 16 22:16:43.675725 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:43.675657 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c95c6cb97-cljn7" podStartSLOduration=1.833062945 podStartE2EDuration="10.675639921s" podCreationTimestamp="2026-04-16 22:16:33 +0000 UTC" firstStartedPulling="2026-04-16 22:16:34.566205856 +0000 UTC m=+190.099546700" lastFinishedPulling="2026-04-16 22:16:43.408782831 +0000 UTC m=+198.942123676" observedRunningTime="2026-04-16 22:16:43.675411538 +0000 UTC m=+199.208752387" watchObservedRunningTime="2026-04-16 22:16:43.675639921 +0000 UTC m=+199.208980775" Apr 16 22:16:43.693632 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:43.693584 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-4x9sz" podStartSLOduration=1.483836151 podStartE2EDuration="18.693570423s" podCreationTimestamp="2026-04-16 22:16:25 +0000 UTC" firstStartedPulling="2026-04-16 22:16:26.235756535 +0000 UTC m=+181.769097367" lastFinishedPulling="2026-04-16 22:16:43.445490808 +0000 UTC m=+198.978831639" observedRunningTime="2026-04-16 22:16:43.693095955 +0000 UTC m=+199.226436810" watchObservedRunningTime="2026-04-16 22:16:43.693570423 +0000 UTC m=+199.226911324" Apr 16 22:16:44.346466 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:44.346426 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:16:44.346757 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:44.346733 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:16:44.353007 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:44.352972 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:16:44.630748 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:44.630709 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a49b216-e0ad-48d0-8f67-c06fe0627ece","Type":"ContainerStarted","Data":"39d63e25e9efbfb936a1a5d61398d43a47acc78ccfef08e6a94fa6d7aedb46fd"} Apr 16 22:16:44.636783 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:44.636757 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:16:44.641654 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:44.641628 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-4x9sz" Apr 16 22:16:45.634967 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:45.634928 2576 generic.go:358] "Generic (PLEG): container finished" podID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerID="39d63e25e9efbfb936a1a5d61398d43a47acc78ccfef08e6a94fa6d7aedb46fd" exitCode=0 Apr 16 22:16:45.635481 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:45.635011 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a49b216-e0ad-48d0-8f67-c06fe0627ece","Type":"ContainerDied","Data":"39d63e25e9efbfb936a1a5d61398d43a47acc78ccfef08e6a94fa6d7aedb46fd"} Apr 16 22:16:48.044931 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:48.044885 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-85b77b7fd5-n94qh"] Apr 16 22:16:48.045418 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:16:48.045188 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" podUID="4af26ebf-8c98-4962-a2da-cdbdf212d8a7" Apr 16 22:16:48.645273 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:48.645238 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:16:48.651059 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:48.651034 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:16:48.733995 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:48.733957 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-image-registry-private-configuration\") pod \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " Apr 16 22:16:48.733995 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:48.734001 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-installation-pull-secrets\") pod \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " Apr 16 22:16:48.734225 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:48.734034 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-trusted-ca\") pod \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " Apr 16 22:16:48.734225 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:48.734055 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-bound-sa-token\") pod \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " Apr 16 22:16:48.734225 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:48.734095 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-certificates\") pod \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " Apr 16 22:16:48.734225 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:48.734125 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6nc6\" (UniqueName: \"kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-kube-api-access-s6nc6\") pod \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " Apr 16 22:16:48.734225 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:48.734148 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-ca-trust-extracted\") pod \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\" (UID: \"4af26ebf-8c98-4962-a2da-cdbdf212d8a7\") " Apr 16 22:16:48.734751 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:48.734520 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4af26ebf-8c98-4962-a2da-cdbdf212d8a7" (UID: "4af26ebf-8c98-4962-a2da-cdbdf212d8a7"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:16:48.734751 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:48.734611 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4af26ebf-8c98-4962-a2da-cdbdf212d8a7" (UID: "4af26ebf-8c98-4962-a2da-cdbdf212d8a7"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:16:48.734751 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:48.734620 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4af26ebf-8c98-4962-a2da-cdbdf212d8a7" (UID: "4af26ebf-8c98-4962-a2da-cdbdf212d8a7"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:16:48.736966 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:48.736935 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-kube-api-access-s6nc6" (OuterVolumeSpecName: "kube-api-access-s6nc6") pod "4af26ebf-8c98-4962-a2da-cdbdf212d8a7" (UID: "4af26ebf-8c98-4962-a2da-cdbdf212d8a7"). InnerVolumeSpecName "kube-api-access-s6nc6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:16:48.736966 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:48.736943 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "4af26ebf-8c98-4962-a2da-cdbdf212d8a7" (UID: "4af26ebf-8c98-4962-a2da-cdbdf212d8a7"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:16:48.737181 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:48.736999 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4af26ebf-8c98-4962-a2da-cdbdf212d8a7" (UID: "4af26ebf-8c98-4962-a2da-cdbdf212d8a7"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:16:48.737444 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:48.737421 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4af26ebf-8c98-4962-a2da-cdbdf212d8a7" (UID: "4af26ebf-8c98-4962-a2da-cdbdf212d8a7"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:16:48.835548 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:48.835510 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s6nc6\" (UniqueName: \"kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-kube-api-access-s6nc6\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:16:48.835548 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:48.835541 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-ca-trust-extracted\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:16:48.835548 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:48.835557 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-image-registry-private-configuration\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:16:48.835817 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:48.835574 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-installation-pull-secrets\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:16:48.835817 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:48.835587 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-trusted-ca\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:16:48.835817 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:48.835602 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-bound-sa-token\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:16:48.835817 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:48.835615 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-certificates\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:16:49.650893 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:49.650799 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a49b216-e0ad-48d0-8f67-c06fe0627ece","Type":"ContainerStarted","Data":"220fea67850c7a871fa9d731d4f35dd83481a94ea4902195cd0b1ac251de37e6"} Apr 16 22:16:49.650893 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:49.650839 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-85b77b7fd5-n94qh" Apr 16 22:16:49.651390 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:49.650846 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a49b216-e0ad-48d0-8f67-c06fe0627ece","Type":"ContainerStarted","Data":"1f3fd500dd70aa2ccdd3b953d606e98bb45406a9c9abc1652852ac39b1bb0b33"} Apr 16 22:16:49.699910 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:49.699875 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-85b77b7fd5-n94qh"] Apr 16 22:16:49.709388 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:49.709354 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-85b77b7fd5-n94qh"] Apr 16 22:16:49.845730 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:49.845691 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4af26ebf-8c98-4962-a2da-cdbdf212d8a7-registry-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:16:51.021325 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:51.021283 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4af26ebf-8c98-4962-a2da-cdbdf212d8a7" path="/var/lib/kubelet/pods/4af26ebf-8c98-4962-a2da-cdbdf212d8a7/volumes" Apr 16 22:16:52.086857 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:52.086828 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c95c6cb97-cljn7"] Apr 16 22:16:52.664226 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:52.664185 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a49b216-e0ad-48d0-8f67-c06fe0627ece","Type":"ContainerStarted","Data":"54ca758bcc5e27b9d0b3954c43bf8ca8db130c3fdb4316533b4bfcd632a98626"} Apr 16 22:16:52.664226 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:52.664232 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a49b216-e0ad-48d0-8f67-c06fe0627ece","Type":"ContainerStarted","Data":"4d030f618f2dab789403ad05c2f713d9da7ba887bf4c8ca9a77dda1657187a87"} Apr 16 22:16:52.664462 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:52.664245 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a49b216-e0ad-48d0-8f67-c06fe0627ece","Type":"ContainerStarted","Data":"340f3ad5f5cbc8749f846dc157ebb946b68570d45dcdb5c9d3fe348258708ba5"} Apr 16 22:16:52.664462 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:52.664259 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a49b216-e0ad-48d0-8f67-c06fe0627ece","Type":"ContainerStarted","Data":"7b42a84ea1c4bb6d7b905e7aa997f60c43e4d3c73b6c9f8123ebc3d5a55428bb"} Apr 16 22:16:52.699602 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:52.699551 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=5.467279672 podStartE2EDuration="13.69953499s" podCreationTimestamp="2026-04-16 22:16:39 +0000 UTC" firstStartedPulling="2026-04-16 22:16:43.54104961 +0000 UTC m=+199.074390459" lastFinishedPulling="2026-04-16 22:16:51.773304943 +0000 UTC m=+207.306645777" observedRunningTime="2026-04-16 22:16:52.697110509 +0000 UTC m=+208.230451383" watchObservedRunningTime="2026-04-16 22:16:52.69953499 +0000 UTC 
m=+208.232875844" Apr 16 22:16:55.052927 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:16:55.052892 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:17.113444 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.113379 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6c95c6cb97-cljn7" podUID="c6bd0cb3-d839-4036-ad36-1b9f7849d1fa" containerName="console" containerID="cri-o://d529d8070d6af1b1aa6d4c92ff14238db2c3b8d6b69ff2ff83c13a4d0629b10f" gracePeriod=15 Apr 16 22:17:17.393945 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.393923 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c95c6cb97-cljn7_c6bd0cb3-d839-4036-ad36-1b9f7849d1fa/console/0.log" Apr 16 22:17:17.394073 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.393983 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:17:17.492735 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.492700 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-oauth-serving-cert\") pod \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\" (UID: \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\") " Apr 16 22:17:17.492925 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.492746 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-console-config\") pod \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\" (UID: \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\") " Apr 16 22:17:17.492925 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.492773 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-console-serving-cert\") pod \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\" (UID: \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\") " Apr 16 22:17:17.492925 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.492800 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-service-ca\") pod \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\" (UID: \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\") " Apr 16 22:17:17.492925 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.492848 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-console-oauth-config\") pod \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\" (UID: \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\") " Apr 16 22:17:17.492925 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.492886 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-295kt\" (UniqueName: \"kubernetes.io/projected/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-kube-api-access-295kt\") pod \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\" (UID: \"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa\") " Apr 16 22:17:17.493234 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.493190 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c6bd0cb3-d839-4036-ad36-1b9f7849d1fa" (UID: "c6bd0cb3-d839-4036-ad36-1b9f7849d1fa"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:17.493389 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.493357 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-service-ca" (OuterVolumeSpecName: "service-ca") pod "c6bd0cb3-d839-4036-ad36-1b9f7849d1fa" (UID: "c6bd0cb3-d839-4036-ad36-1b9f7849d1fa"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:17.493516 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.493468 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-console-config" (OuterVolumeSpecName: "console-config") pod "c6bd0cb3-d839-4036-ad36-1b9f7849d1fa" (UID: "c6bd0cb3-d839-4036-ad36-1b9f7849d1fa"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:17.495326 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.495299 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-kube-api-access-295kt" (OuterVolumeSpecName: "kube-api-access-295kt") pod "c6bd0cb3-d839-4036-ad36-1b9f7849d1fa" (UID: "c6bd0cb3-d839-4036-ad36-1b9f7849d1fa"). InnerVolumeSpecName "kube-api-access-295kt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:17:17.495436 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.495398 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c6bd0cb3-d839-4036-ad36-1b9f7849d1fa" (UID: "c6bd0cb3-d839-4036-ad36-1b9f7849d1fa"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:17.495436 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.495412 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c6bd0cb3-d839-4036-ad36-1b9f7849d1fa" (UID: "c6bd0cb3-d839-4036-ad36-1b9f7849d1fa"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:17.594360 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.594326 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-console-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:17:17.594360 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.594353 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-console-serving-cert\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:17:17.594360 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.594364 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-service-ca\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:17:17.594360 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.594372 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-console-oauth-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:17:17.594620 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.594381 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-295kt\" (UniqueName: \"kubernetes.io/projected/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-kube-api-access-295kt\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:17:17.594620 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.594390 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa-oauth-serving-cert\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:17:17.738814 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.738735 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c95c6cb97-cljn7_c6bd0cb3-d839-4036-ad36-1b9f7849d1fa/console/0.log" Apr 16 22:17:17.738814 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.738774 2576 generic.go:358] "Generic (PLEG): container finished" podID="c6bd0cb3-d839-4036-ad36-1b9f7849d1fa" containerID="d529d8070d6af1b1aa6d4c92ff14238db2c3b8d6b69ff2ff83c13a4d0629b10f" exitCode=2 Apr 16 22:17:17.739030 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.738811 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c95c6cb97-cljn7" event={"ID":"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa","Type":"ContainerDied","Data":"d529d8070d6af1b1aa6d4c92ff14238db2c3b8d6b69ff2ff83c13a4d0629b10f"} Apr 16 22:17:17.739030 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.738859 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c95c6cb97-cljn7" event={"ID":"c6bd0cb3-d839-4036-ad36-1b9f7849d1fa","Type":"ContainerDied","Data":"399dd8896df518eb8eae52847389e36d53586c05d0642a160183a5395ef63533"} Apr 16 22:17:17.739030 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.738858 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c95c6cb97-cljn7" Apr 16 22:17:17.739030 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.738874 2576 scope.go:117] "RemoveContainer" containerID="d529d8070d6af1b1aa6d4c92ff14238db2c3b8d6b69ff2ff83c13a4d0629b10f" Apr 16 22:17:17.752708 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.752684 2576 scope.go:117] "RemoveContainer" containerID="d529d8070d6af1b1aa6d4c92ff14238db2c3b8d6b69ff2ff83c13a4d0629b10f" Apr 16 22:17:17.753004 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:17:17.752982 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d529d8070d6af1b1aa6d4c92ff14238db2c3b8d6b69ff2ff83c13a4d0629b10f\": container with ID starting with d529d8070d6af1b1aa6d4c92ff14238db2c3b8d6b69ff2ff83c13a4d0629b10f not found: ID does not exist" containerID="d529d8070d6af1b1aa6d4c92ff14238db2c3b8d6b69ff2ff83c13a4d0629b10f" Apr 16 22:17:17.753074 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.753013 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d529d8070d6af1b1aa6d4c92ff14238db2c3b8d6b69ff2ff83c13a4d0629b10f"} err="failed to get container status \"d529d8070d6af1b1aa6d4c92ff14238db2c3b8d6b69ff2ff83c13a4d0629b10f\": rpc error: code = NotFound desc = could not find container \"d529d8070d6af1b1aa6d4c92ff14238db2c3b8d6b69ff2ff83c13a4d0629b10f\": container with ID starting with d529d8070d6af1b1aa6d4c92ff14238db2c3b8d6b69ff2ff83c13a4d0629b10f not found: ID does not exist" Apr 16 22:17:17.763413 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.763390 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c95c6cb97-cljn7"] Apr 16 22:17:17.767701 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:17.767668 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6c95c6cb97-cljn7"] Apr 16 22:17:19.019023 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:19.018990 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6bd0cb3-d839-4036-ad36-1b9f7849d1fa" path="/var/lib/kubelet/pods/c6bd0cb3-d839-4036-ad36-1b9f7849d1fa/volumes" Apr 16 22:17:24.760502 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:24.760472 2576 generic.go:358] "Generic (PLEG): container finished" podID="cbfb5b76-e002-4f88-a22a-9d55fd6b9348" containerID="801fc4d1204f094039b401257c94fc7657a8ea7214e3f38c66d0df6a9dc9f41c" exitCode=0 Apr 16 22:17:24.761036 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:24.760549 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-749dh" event={"ID":"cbfb5b76-e002-4f88-a22a-9d55fd6b9348","Type":"ContainerDied","Data":"801fc4d1204f094039b401257c94fc7657a8ea7214e3f38c66d0df6a9dc9f41c"} Apr 16 22:17:24.761036 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:24.761001 2576 scope.go:117] "RemoveContainer" containerID="801fc4d1204f094039b401257c94fc7657a8ea7214e3f38c66d0df6a9dc9f41c" Apr 16 22:17:25.764822 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:25.764789 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-749dh" event={"ID":"cbfb5b76-e002-4f88-a22a-9d55fd6b9348","Type":"ContainerStarted","Data":"736dacf78c820551c18992dcbf3e2c60e1c7979481b5ba8ef62b7b005fc9686d"} Apr 16 22:17:33.787237 ip-10-0-130-227 kubenswrapper[2576]: I0416 
22:17:33.787156 2576 generic.go:358] "Generic (PLEG): container finished" podID="ed1e6036-1beb-445f-8b61-65e736181605" containerID="e0a1ef7ee1dab85e2377285e579c21b2e5194196057fd9f6af73606f68dbcc30" exitCode=0 Apr 16 22:17:33.787237 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:33.787197 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-rpsvs" event={"ID":"ed1e6036-1beb-445f-8b61-65e736181605","Type":"ContainerDied","Data":"e0a1ef7ee1dab85e2377285e579c21b2e5194196057fd9f6af73606f68dbcc30"} Apr 16 22:17:33.787603 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:33.787530 2576 scope.go:117] "RemoveContainer" containerID="e0a1ef7ee1dab85e2377285e579c21b2e5194196057fd9f6af73606f68dbcc30" Apr 16 22:17:34.795990 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:34.795954 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-rpsvs" event={"ID":"ed1e6036-1beb-445f-8b61-65e736181605","Type":"ContainerStarted","Data":"4efba2cad488b34875b57b64cc50764bab477212b56b449b1da1a12bb6ca399f"} Apr 16 22:17:35.853144 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:35.853106 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs\") pod \"network-metrics-daemon-lpwn6\" (UID: \"ddd2c4f1-e24c-431c-a59a-9936d01e4667\") " pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:17:35.855336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:35.855316 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ddd2c4f1-e24c-431c-a59a-9936d01e4667-metrics-certs\") pod \"network-metrics-daemon-lpwn6\" (UID: \"ddd2c4f1-e24c-431c-a59a-9936d01e4667\") " pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:17:36.119093 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:36.119004 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zhbqt\"" Apr 16 22:17:36.127386 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:36.127357 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lpwn6" Apr 16 22:17:36.242091 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:36.242060 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lpwn6"] Apr 16 22:17:36.244508 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:17:36.244474 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddd2c4f1_e24c_431c_a59a_9936d01e4667.slice/crio-7ff3e3b57bc53b843f440c95f860a1aaa2cf0cc7c8e603dafd89e8f3a6f5a0b0 WatchSource:0}: Error finding container 7ff3e3b57bc53b843f440c95f860a1aaa2cf0cc7c8e603dafd89e8f3a6f5a0b0: Status 404 returned error can't find the container with id 7ff3e3b57bc53b843f440c95f860a1aaa2cf0cc7c8e603dafd89e8f3a6f5a0b0 Apr 16 22:17:36.803381 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:36.803342 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lpwn6" event={"ID":"ddd2c4f1-e24c-431c-a59a-9936d01e4667","Type":"ContainerStarted","Data":"7ff3e3b57bc53b843f440c95f860a1aaa2cf0cc7c8e603dafd89e8f3a6f5a0b0"} Apr 16 22:17:37.809914 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:37.809880 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lpwn6" event={"ID":"ddd2c4f1-e24c-431c-a59a-9936d01e4667","Type":"ContainerStarted","Data":"72c2efe2a498fca6bac34e01a8566cf4abd48718f8daba56b43bf9b702226720"} Apr 16 22:17:37.809914 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:37.809916 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lpwn6" event={"ID":"ddd2c4f1-e24c-431c-a59a-9936d01e4667","Type":"ContainerStarted","Data":"c74301c93b8abe89cf392a1c5111decbd07107e4acf076f7872d39e5c070b64c"} Apr 16 22:17:37.826947 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:37.826249 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-lpwn6" podStartSLOduration=251.834377981 podStartE2EDuration="4m12.826229028s" podCreationTimestamp="2026-04-16 22:13:25 +0000 UTC" firstStartedPulling="2026-04-16 22:17:36.246838449 +0000 UTC m=+251.780179285" lastFinishedPulling="2026-04-16 22:17:37.238689482 +0000 UTC m=+252.772030332" observedRunningTime="2026-04-16 22:17:37.824813644 +0000 UTC m=+253.358154496" watchObservedRunningTime="2026-04-16 22:17:37.826229028 +0000 UTC m=+253.359569881" Apr 16 22:17:40.053657 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:40.053615 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:40.072575 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:40.072544 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:40.833539 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:40.833509 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:58.078026 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:58.077940 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:17:58.078494 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:58.078409 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" 
containerName="prometheus" containerID="cri-o://1f3fd500dd70aa2ccdd3b953d606e98bb45406a9c9abc1652852ac39b1bb0b33" gracePeriod=600 Apr 16 22:17:58.078494 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:58.078449 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="kube-rbac-proxy" containerID="cri-o://4d030f618f2dab789403ad05c2f713d9da7ba887bf4c8ca9a77dda1657187a87" gracePeriod=600 Apr 16 22:17:58.078603 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:58.078476 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="kube-rbac-proxy-web" containerID="cri-o://340f3ad5f5cbc8749f846dc157ebb946b68570d45dcdb5c9d3fe348258708ba5" gracePeriod=600 Apr 16 22:17:58.078603 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:58.078484 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="kube-rbac-proxy-thanos" containerID="cri-o://54ca758bcc5e27b9d0b3954c43bf8ca8db130c3fdb4316533b4bfcd632a98626" gracePeriod=600 Apr 16 22:17:58.078603 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:58.078514 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="config-reloader" containerID="cri-o://220fea67850c7a871fa9d731d4f35dd83481a94ea4902195cd0b1ac251de37e6" gracePeriod=600 Apr 16 22:17:58.078603 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:58.078469 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="thanos-sidecar" containerID="cri-o://7b42a84ea1c4bb6d7b905e7aa997f60c43e4d3c73b6c9f8123ebc3d5a55428bb" gracePeriod=600 Apr 16 22:17:58.874206 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:58.874171 2576 generic.go:358] "Generic (PLEG): container finished" podID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerID="54ca758bcc5e27b9d0b3954c43bf8ca8db130c3fdb4316533b4bfcd632a98626" exitCode=0 Apr 16 22:17:58.874206 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:58.874198 2576 generic.go:358] "Generic (PLEG): container finished" podID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerID="4d030f618f2dab789403ad05c2f713d9da7ba887bf4c8ca9a77dda1657187a87" exitCode=0 Apr 16 22:17:58.874206 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:58.874204 2576 generic.go:358] "Generic (PLEG): container finished" podID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerID="7b42a84ea1c4bb6d7b905e7aa997f60c43e4d3c73b6c9f8123ebc3d5a55428bb" exitCode=0 Apr 16 22:17:58.874206 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:58.874210 2576 generic.go:358] "Generic (PLEG): container finished" podID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerID="220fea67850c7a871fa9d731d4f35dd83481a94ea4902195cd0b1ac251de37e6" exitCode=0 Apr 16 22:17:58.874206 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:58.874215 2576 generic.go:358] "Generic (PLEG): container finished" podID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerID="1f3fd500dd70aa2ccdd3b953d606e98bb45406a9c9abc1652852ac39b1bb0b33" exitCode=0 Apr 16 22:17:58.874502 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:58.874241 2576 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a49b216-e0ad-48d0-8f67-c06fe0627ece","Type":"ContainerDied","Data":"54ca758bcc5e27b9d0b3954c43bf8ca8db130c3fdb4316533b4bfcd632a98626"} Apr 16 22:17:58.874502 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:58.874280 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a49b216-e0ad-48d0-8f67-c06fe0627ece","Type":"ContainerDied","Data":"4d030f618f2dab789403ad05c2f713d9da7ba887bf4c8ca9a77dda1657187a87"} Apr 16 22:17:58.874502 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:58.874293 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a49b216-e0ad-48d0-8f67-c06fe0627ece","Type":"ContainerDied","Data":"7b42a84ea1c4bb6d7b905e7aa997f60c43e4d3c73b6c9f8123ebc3d5a55428bb"} Apr 16 22:17:58.874502 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:58.874306 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a49b216-e0ad-48d0-8f67-c06fe0627ece","Type":"ContainerDied","Data":"220fea67850c7a871fa9d731d4f35dd83481a94ea4902195cd0b1ac251de37e6"} Apr 16 22:17:58.874502 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:58.874319 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a49b216-e0ad-48d0-8f67-c06fe0627ece","Type":"ContainerDied","Data":"1f3fd500dd70aa2ccdd3b953d606e98bb45406a9c9abc1652852ac39b1bb0b33"} Apr 16 22:17:59.323412 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.323390 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:59.467411 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.467319 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-grpc-tls\") pod \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " Apr 16 22:17:59.467411 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.467363 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0a49b216-e0ad-48d0-8f67-c06fe0627ece-tls-assets\") pod \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " Apr 16 22:17:59.467411 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.467382 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-config\") pod \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " Apr 16 22:17:59.467411 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.467401 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-kube-rbac-proxy\") pod \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " Apr 16 22:17:59.467773 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.467438 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-prometheus-trusted-ca-bundle\") pod 
\"0a49b216-e0ad-48d0-8f67-c06fe0627ece\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " Apr 16 22:17:59.467773 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.467470 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0a49b216-e0ad-48d0-8f67-c06fe0627ece-config-out\") pod \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " Apr 16 22:17:59.467773 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.467500 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0a49b216-e0ad-48d0-8f67-c06fe0627ece-prometheus-k8s-db\") pod \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " Apr 16 22:17:59.467773 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.467531 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-configmap-serving-certs-ca-bundle\") pod \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " Apr 16 22:17:59.467773 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.467556 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " Apr 16 22:17:59.467773 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.467596 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-metrics-client-certs\") pod \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " Apr 16 22:17:59.467773 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.467627 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-configmap-kubelet-serving-ca-bundle\") pod \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " Apr 16 22:17:59.467773 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.467656 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-web-config\") pod \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " Apr 16 22:17:59.467773 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.467713 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-prometheus-k8s-tls\") pod \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " Apr 16 22:17:59.467773 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.467741 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " Apr 16 22:17:59.468248 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.467783 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-prometheus-k8s-rulefiles-0\") pod \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " Apr 16 22:17:59.468248 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.467852 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mm9j\" (UniqueName: \"kubernetes.io/projected/0a49b216-e0ad-48d0-8f67-c06fe0627ece-kube-api-access-5mm9j\") pod \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " Apr 16 22:17:59.468248 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.467898 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-configmap-metrics-client-ca\") pod \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " Apr 16 22:17:59.468248 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.467927 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-thanos-prometheus-http-client-file\") pod \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\" (UID: \"0a49b216-e0ad-48d0-8f67-c06fe0627ece\") " Apr 16 22:17:59.468248 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.468094 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "0a49b216-e0ad-48d0-8f67-c06fe0627ece" (UID: "0a49b216-e0ad-48d0-8f67-c06fe0627ece"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:59.468248 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.468118 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "0a49b216-e0ad-48d0-8f67-c06fe0627ece" (UID: "0a49b216-e0ad-48d0-8f67-c06fe0627ece"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:59.468248 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.468231 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-prometheus-trusted-ca-bundle\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.468248 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.468251 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.470118 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.469983 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "0a49b216-e0ad-48d0-8f67-c06fe0627ece" (UID: "0a49b216-e0ad-48d0-8f67-c06fe0627ece"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:59.470118 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.469994 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "0a49b216-e0ad-48d0-8f67-c06fe0627ece" (UID: "0a49b216-e0ad-48d0-8f67-c06fe0627ece"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:59.470118 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.470083 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a49b216-e0ad-48d0-8f67-c06fe0627ece-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "0a49b216-e0ad-48d0-8f67-c06fe0627ece" (UID: "0a49b216-e0ad-48d0-8f67-c06fe0627ece"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:17:59.470366 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.470181 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-config" (OuterVolumeSpecName: "config") pod "0a49b216-e0ad-48d0-8f67-c06fe0627ece" (UID: "0a49b216-e0ad-48d0-8f67-c06fe0627ece"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:59.470903 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.470857 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "0a49b216-e0ad-48d0-8f67-c06fe0627ece" (UID: "0a49b216-e0ad-48d0-8f67-c06fe0627ece"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:59.470903 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.470860 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a49b216-e0ad-48d0-8f67-c06fe0627ece-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "0a49b216-e0ad-48d0-8f67-c06fe0627ece" (UID: "0a49b216-e0ad-48d0-8f67-c06fe0627ece"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:17:59.471108 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.471079 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "0a49b216-e0ad-48d0-8f67-c06fe0627ece" (UID: "0a49b216-e0ad-48d0-8f67-c06fe0627ece"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:59.471535 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.471492 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "0a49b216-e0ad-48d0-8f67-c06fe0627ece" (UID: "0a49b216-e0ad-48d0-8f67-c06fe0627ece"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:59.471624 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.471572 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "0a49b216-e0ad-48d0-8f67-c06fe0627ece" (UID: "0a49b216-e0ad-48d0-8f67-c06fe0627ece"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:59.471990 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.471957 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a49b216-e0ad-48d0-8f67-c06fe0627ece-config-out" (OuterVolumeSpecName: "config-out") pod "0a49b216-e0ad-48d0-8f67-c06fe0627ece" (UID: "0a49b216-e0ad-48d0-8f67-c06fe0627ece"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:17:59.472256 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.472236 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "0a49b216-e0ad-48d0-8f67-c06fe0627ece" (UID: "0a49b216-e0ad-48d0-8f67-c06fe0627ece"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:59.472597 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.472576 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "0a49b216-e0ad-48d0-8f67-c06fe0627ece" (UID: "0a49b216-e0ad-48d0-8f67-c06fe0627ece"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:59.472832 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.472810 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "0a49b216-e0ad-48d0-8f67-c06fe0627ece" (UID: "0a49b216-e0ad-48d0-8f67-c06fe0627ece"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:59.473556 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.473533 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "0a49b216-e0ad-48d0-8f67-c06fe0627ece" (UID: "0a49b216-e0ad-48d0-8f67-c06fe0627ece"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:59.473642 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.473547 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a49b216-e0ad-48d0-8f67-c06fe0627ece-kube-api-access-5mm9j" (OuterVolumeSpecName: "kube-api-access-5mm9j") pod "0a49b216-e0ad-48d0-8f67-c06fe0627ece" (UID: "0a49b216-e0ad-48d0-8f67-c06fe0627ece"). InnerVolumeSpecName "kube-api-access-5mm9j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:17:59.482519 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.482496 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-web-config" (OuterVolumeSpecName: "web-config") pod "0a49b216-e0ad-48d0-8f67-c06fe0627ece" (UID: "0a49b216-e0ad-48d0-8f67-c06fe0627ece"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:59.569221 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.569171 2576 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0a49b216-e0ad-48d0-8f67-c06fe0627ece-config-out\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.569221 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.569216 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0a49b216-e0ad-48d0-8f67-c06fe0627ece-prometheus-k8s-db\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.569221 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.569228 2576 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.569221 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.569241 2576 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-metrics-client-certs\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.569486 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.569250 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.569486 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.569260 2576 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-web-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.569486 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.569270 2576 
reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-prometheus-k8s-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.569486 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.569280 2576 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.569486 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.569288 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.569486 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.569298 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5mm9j\" (UniqueName: \"kubernetes.io/projected/0a49b216-e0ad-48d0-8f67-c06fe0627ece-kube-api-access-5mm9j\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.569486 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.569306 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a49b216-e0ad-48d0-8f67-c06fe0627ece-configmap-metrics-client-ca\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.569486 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.569315 2576 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-thanos-prometheus-http-client-file\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.569486 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.569324 2576 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-grpc-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.569486 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.569333 2576 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0a49b216-e0ad-48d0-8f67-c06fe0627ece-tls-assets\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.569486 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.569342 2576 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.569486 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.569351 2576 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0a49b216-e0ad-48d0-8f67-c06fe0627ece-secret-kube-rbac-proxy\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.879688 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.879645 2576 generic.go:358] "Generic (PLEG): container finished" podID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerID="340f3ad5f5cbc8749f846dc157ebb946b68570d45dcdb5c9d3fe348258708ba5" exitCode=0 Apr 16 22:17:59.879854 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.879733 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a49b216-e0ad-48d0-8f67-c06fe0627ece","Type":"ContainerDied","Data":"340f3ad5f5cbc8749f846dc157ebb946b68570d45dcdb5c9d3fe348258708ba5"} Apr 16 22:17:59.879854 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.879774 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0a49b216-e0ad-48d0-8f67-c06fe0627ece","Type":"ContainerDied","Data":"6c5dfaebe1585b4263f0019a1b93764f5f139c3757ad6425e500284b120633fd"} Apr 16 22:17:59.879854 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.879779 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:59.879854 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.879791 2576 scope.go:117] "RemoveContainer" containerID="54ca758bcc5e27b9d0b3954c43bf8ca8db130c3fdb4316533b4bfcd632a98626" Apr 16 22:17:59.887503 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.887409 2576 scope.go:117] "RemoveContainer" containerID="4d030f618f2dab789403ad05c2f713d9da7ba887bf4c8ca9a77dda1657187a87" Apr 16 22:17:59.894050 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.894034 2576 scope.go:117] "RemoveContainer" containerID="340f3ad5f5cbc8749f846dc157ebb946b68570d45dcdb5c9d3fe348258708ba5" Apr 16 22:17:59.900312 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.900293 2576 scope.go:117] "RemoveContainer" containerID="7b42a84ea1c4bb6d7b905e7aa997f60c43e4d3c73b6c9f8123ebc3d5a55428bb" Apr 16 22:17:59.904736 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.904712 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:17:59.907635 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.907533 2576 scope.go:117] "RemoveContainer" containerID="220fea67850c7a871fa9d731d4f35dd83481a94ea4902195cd0b1ac251de37e6" Apr 16 22:17:59.909335 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.909313 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:17:59.914296 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.914279 2576 scope.go:117] "RemoveContainer" containerID="1f3fd500dd70aa2ccdd3b953d606e98bb45406a9c9abc1652852ac39b1bb0b33" Apr 16 22:17:59.921010 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.920988 2576 scope.go:117] "RemoveContainer" containerID="39d63e25e9efbfb936a1a5d61398d43a47acc78ccfef08e6a94fa6d7aedb46fd" Apr 16 22:17:59.927192 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.927175 2576 scope.go:117] "RemoveContainer" containerID="54ca758bcc5e27b9d0b3954c43bf8ca8db130c3fdb4316533b4bfcd632a98626" Apr 16 22:17:59.927460 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:17:59.927432 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54ca758bcc5e27b9d0b3954c43bf8ca8db130c3fdb4316533b4bfcd632a98626\": container with ID starting with 54ca758bcc5e27b9d0b3954c43bf8ca8db130c3fdb4316533b4bfcd632a98626 not found: ID does not exist" containerID="54ca758bcc5e27b9d0b3954c43bf8ca8db130c3fdb4316533b4bfcd632a98626" Apr 16 22:17:59.927543 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.927459 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54ca758bcc5e27b9d0b3954c43bf8ca8db130c3fdb4316533b4bfcd632a98626"} err="failed to get container status 
\"54ca758bcc5e27b9d0b3954c43bf8ca8db130c3fdb4316533b4bfcd632a98626\": rpc error: code = NotFound desc = could not find container \"54ca758bcc5e27b9d0b3954c43bf8ca8db130c3fdb4316533b4bfcd632a98626\": container with ID starting with 54ca758bcc5e27b9d0b3954c43bf8ca8db130c3fdb4316533b4bfcd632a98626 not found: ID does not exist" Apr 16 22:17:59.927543 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.927478 2576 scope.go:117] "RemoveContainer" containerID="4d030f618f2dab789403ad05c2f713d9da7ba887bf4c8ca9a77dda1657187a87" Apr 16 22:17:59.927711 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:17:59.927694 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d030f618f2dab789403ad05c2f713d9da7ba887bf4c8ca9a77dda1657187a87\": container with ID starting with 4d030f618f2dab789403ad05c2f713d9da7ba887bf4c8ca9a77dda1657187a87 not found: ID does not exist" containerID="4d030f618f2dab789403ad05c2f713d9da7ba887bf4c8ca9a77dda1657187a87" Apr 16 22:17:59.927761 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.927718 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d030f618f2dab789403ad05c2f713d9da7ba887bf4c8ca9a77dda1657187a87"} err="failed to get container status \"4d030f618f2dab789403ad05c2f713d9da7ba887bf4c8ca9a77dda1657187a87\": rpc error: code = NotFound desc = could not find container \"4d030f618f2dab789403ad05c2f713d9da7ba887bf4c8ca9a77dda1657187a87\": container with ID starting with 4d030f618f2dab789403ad05c2f713d9da7ba887bf4c8ca9a77dda1657187a87 not found: ID does not exist" Apr 16 22:17:59.927761 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.927736 2576 scope.go:117] "RemoveContainer" containerID="340f3ad5f5cbc8749f846dc157ebb946b68570d45dcdb5c9d3fe348258708ba5" Apr 16 22:17:59.927969 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:17:59.927953 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"340f3ad5f5cbc8749f846dc157ebb946b68570d45dcdb5c9d3fe348258708ba5\": container with ID starting with 340f3ad5f5cbc8749f846dc157ebb946b68570d45dcdb5c9d3fe348258708ba5 not found: ID does not exist" containerID="340f3ad5f5cbc8749f846dc157ebb946b68570d45dcdb5c9d3fe348258708ba5" Apr 16 22:17:59.928006 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.927975 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"340f3ad5f5cbc8749f846dc157ebb946b68570d45dcdb5c9d3fe348258708ba5"} err="failed to get container status \"340f3ad5f5cbc8749f846dc157ebb946b68570d45dcdb5c9d3fe348258708ba5\": rpc error: code = NotFound desc = could not find container \"340f3ad5f5cbc8749f846dc157ebb946b68570d45dcdb5c9d3fe348258708ba5\": container with ID starting with 340f3ad5f5cbc8749f846dc157ebb946b68570d45dcdb5c9d3fe348258708ba5 not found: ID does not exist" Apr 16 22:17:59.928006 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.927990 2576 scope.go:117] "RemoveContainer" containerID="7b42a84ea1c4bb6d7b905e7aa997f60c43e4d3c73b6c9f8123ebc3d5a55428bb" Apr 16 22:17:59.928192 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:17:59.928167 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b42a84ea1c4bb6d7b905e7aa997f60c43e4d3c73b6c9f8123ebc3d5a55428bb\": container with ID starting with 7b42a84ea1c4bb6d7b905e7aa997f60c43e4d3c73b6c9f8123ebc3d5a55428bb not found: ID does not exist" 
containerID="7b42a84ea1c4bb6d7b905e7aa997f60c43e4d3c73b6c9f8123ebc3d5a55428bb" Apr 16 22:17:59.928236 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.928196 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b42a84ea1c4bb6d7b905e7aa997f60c43e4d3c73b6c9f8123ebc3d5a55428bb"} err="failed to get container status \"7b42a84ea1c4bb6d7b905e7aa997f60c43e4d3c73b6c9f8123ebc3d5a55428bb\": rpc error: code = NotFound desc = could not find container \"7b42a84ea1c4bb6d7b905e7aa997f60c43e4d3c73b6c9f8123ebc3d5a55428bb\": container with ID starting with 7b42a84ea1c4bb6d7b905e7aa997f60c43e4d3c73b6c9f8123ebc3d5a55428bb not found: ID does not exist" Apr 16 22:17:59.928236 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.928209 2576 scope.go:117] "RemoveContainer" containerID="220fea67850c7a871fa9d731d4f35dd83481a94ea4902195cd0b1ac251de37e6" Apr 16 22:17:59.928445 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:17:59.928428 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"220fea67850c7a871fa9d731d4f35dd83481a94ea4902195cd0b1ac251de37e6\": container with ID starting with 220fea67850c7a871fa9d731d4f35dd83481a94ea4902195cd0b1ac251de37e6 not found: ID does not exist" containerID="220fea67850c7a871fa9d731d4f35dd83481a94ea4902195cd0b1ac251de37e6" Apr 16 22:17:59.928559 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.928447 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"220fea67850c7a871fa9d731d4f35dd83481a94ea4902195cd0b1ac251de37e6"} err="failed to get container status \"220fea67850c7a871fa9d731d4f35dd83481a94ea4902195cd0b1ac251de37e6\": rpc error: code = NotFound desc = could not find container \"220fea67850c7a871fa9d731d4f35dd83481a94ea4902195cd0b1ac251de37e6\": container with ID starting with 220fea67850c7a871fa9d731d4f35dd83481a94ea4902195cd0b1ac251de37e6 not found: ID does not exist" Apr 16 22:17:59.928559 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.928460 2576 scope.go:117] "RemoveContainer" containerID="1f3fd500dd70aa2ccdd3b953d606e98bb45406a9c9abc1652852ac39b1bb0b33" Apr 16 22:17:59.928693 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:17:59.928658 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f3fd500dd70aa2ccdd3b953d606e98bb45406a9c9abc1652852ac39b1bb0b33\": container with ID starting with 1f3fd500dd70aa2ccdd3b953d606e98bb45406a9c9abc1652852ac39b1bb0b33 not found: ID does not exist" containerID="1f3fd500dd70aa2ccdd3b953d606e98bb45406a9c9abc1652852ac39b1bb0b33" Apr 16 22:17:59.928753 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.928700 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f3fd500dd70aa2ccdd3b953d606e98bb45406a9c9abc1652852ac39b1bb0b33"} err="failed to get container status \"1f3fd500dd70aa2ccdd3b953d606e98bb45406a9c9abc1652852ac39b1bb0b33\": rpc error: code = NotFound desc = could not find container \"1f3fd500dd70aa2ccdd3b953d606e98bb45406a9c9abc1652852ac39b1bb0b33\": container with ID starting with 1f3fd500dd70aa2ccdd3b953d606e98bb45406a9c9abc1652852ac39b1bb0b33 not found: ID does not exist" Apr 16 22:17:59.928753 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.928719 2576 scope.go:117] "RemoveContainer" containerID="39d63e25e9efbfb936a1a5d61398d43a47acc78ccfef08e6a94fa6d7aedb46fd" Apr 16 22:17:59.929000 ip-10-0-130-227 kubenswrapper[2576]: E0416 
22:17:59.928982 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39d63e25e9efbfb936a1a5d61398d43a47acc78ccfef08e6a94fa6d7aedb46fd\": container with ID starting with 39d63e25e9efbfb936a1a5d61398d43a47acc78ccfef08e6a94fa6d7aedb46fd not found: ID does not exist" containerID="39d63e25e9efbfb936a1a5d61398d43a47acc78ccfef08e6a94fa6d7aedb46fd" Apr 16 22:17:59.929072 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.929004 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39d63e25e9efbfb936a1a5d61398d43a47acc78ccfef08e6a94fa6d7aedb46fd"} err="failed to get container status \"39d63e25e9efbfb936a1a5d61398d43a47acc78ccfef08e6a94fa6d7aedb46fd\": rpc error: code = NotFound desc = could not find container \"39d63e25e9efbfb936a1a5d61398d43a47acc78ccfef08e6a94fa6d7aedb46fd\": container with ID starting with 39d63e25e9efbfb936a1a5d61398d43a47acc78ccfef08e6a94fa6d7aedb46fd not found: ID does not exist" Apr 16 22:17:59.937843 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.937821 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:17:59.938101 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.938088 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="init-config-reloader" Apr 16 22:17:59.938147 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.938113 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="init-config-reloader" Apr 16 22:17:59.938147 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.938127 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="thanos-sidecar" Apr 16 22:17:59.938147 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.938133 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="thanos-sidecar" Apr 16 22:17:59.938147 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.938138 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="kube-rbac-proxy-thanos" Apr 16 22:17:59.938147 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.938143 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="kube-rbac-proxy-thanos" Apr 16 22:17:59.938289 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.938150 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="kube-rbac-proxy" Apr 16 22:17:59.938289 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.938155 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="kube-rbac-proxy" Apr 16 22:17:59.938289 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.938161 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="config-reloader" Apr 16 22:17:59.938289 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.938166 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="config-reloader" Apr 16 22:17:59.938289 ip-10-0-130-227 kubenswrapper[2576]: I0416 
22:17:59.938174 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="kube-rbac-proxy-web" Apr 16 22:17:59.938289 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.938179 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="kube-rbac-proxy-web" Apr 16 22:17:59.938289 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.938187 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="prometheus" Apr 16 22:17:59.938289 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.938194 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="prometheus" Apr 16 22:17:59.938289 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.938200 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6bd0cb3-d839-4036-ad36-1b9f7849d1fa" containerName="console" Apr 16 22:17:59.938289 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.938206 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6bd0cb3-d839-4036-ad36-1b9f7849d1fa" containerName="console" Apr 16 22:17:59.938289 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.938254 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="kube-rbac-proxy" Apr 16 22:17:59.938289 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.938263 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="prometheus" Apr 16 22:17:59.938289 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.938269 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c6bd0cb3-d839-4036-ad36-1b9f7849d1fa" containerName="console" Apr 16 22:17:59.938289 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.938275 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="config-reloader" Apr 16 22:17:59.938289 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.938281 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="thanos-sidecar" Apr 16 22:17:59.938289 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.938287 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="kube-rbac-proxy-thanos" Apr 16 22:17:59.938289 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.938295 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" containerName="kube-rbac-proxy-web"
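These cpu_manager, state_mem, and memory_manager entries show the kubelet purging stale per-container resource-allocation state for pod UIDs that no longer exist (the deleted prometheus-k8s-0 pod and an older console pod) before admitting the replacement pod, which keeps the same StatefulSet pod name but a new UID. A small sketch, under the same stdin assumption as above, that tallies which (podUID, containerName) pairs each manager purged:

    import re, sys
    from collections import defaultdict

    # matches e.g.: cpu_manager.go:401] "RemoveStaleState: ..." podUID="..." containerName="..."
    pat = re.compile(r'(cpu_manager|memory_manager)\.go:\d+\] "[^"]*" '
                     r'podUID="([0-9a-f-]+)" containerName="([^"]+)"')
    purged = defaultdict(set)
    for mgr, uid, name in pat.findall(sys.stdin.read()):
        purged[mgr].add((uid, name))
    for mgr in sorted(purged):
        print(mgr, "purged", len(purged[mgr]), "stale container entries")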
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:59.946492 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.946474 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 22:17:59.946643 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.946479 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 22:17:59.946800 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.946757 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 22:17:59.946862 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.946840 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 22:17:59.946862 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.946849 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-569v80od7lahq\"" Apr 16 22:17:59.946994 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.946977 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 22:17:59.947103 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.947043 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 22:17:59.947103 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.946981 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 22:17:59.947317 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.947298 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 22:17:59.947361 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.947298 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 22:17:59.947564 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.947535 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 22:17:59.947612 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.947599 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-hqvcv\"" Apr 16 22:17:59.947926 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.947903 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 22:17:59.950438 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.950415 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 22:17:59.954295 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.954277 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 22:17:59.956644 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.956602 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:17:59.973283 
ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.973256 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56ec1120-485a-41be-b08a-da982885fb24-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:59.973408 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.973292 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:59.973408 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.973328 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:59.973408 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.973353 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-web-config\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:59.973408 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.973377 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj6zx\" (UniqueName: \"kubernetes.io/projected/56ec1120-485a-41be-b08a-da982885fb24-kube-api-access-hj6zx\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:59.973408 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.973401 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:59.973661 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.973427 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:59.973661 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.973466 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/56ec1120-485a-41be-b08a-da982885fb24-config-out\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:59.973661 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.973494 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:59.973661 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.973524 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56ec1120-485a-41be-b08a-da982885fb24-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:59.973661 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.973549 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/56ec1120-485a-41be-b08a-da982885fb24-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:59.973661 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.973586 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56ec1120-485a-41be-b08a-da982885fb24-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:59.973661 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.973633 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/56ec1120-485a-41be-b08a-da982885fb24-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:59.973964 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.973702 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-config\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:59.973964 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.973726 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/56ec1120-485a-41be-b08a-da982885fb24-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:59.973964 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.973748 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:59.973964 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.973776 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:59.973964 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:17:59.973844 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/56ec1120-485a-41be-b08a-da982885fb24-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.074974 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.074937 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-config\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.074974 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.074976 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/56ec1120-485a-41be-b08a-da982885fb24-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.075218 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.074995 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.075218 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.075016 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.075218 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.075052 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/56ec1120-485a-41be-b08a-da982885fb24-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.075218 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.075093 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56ec1120-485a-41be-b08a-da982885fb24-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.075218 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.075121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.075218 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.075151 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.075218 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.075174 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-web-config\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.075548 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.075300 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hj6zx\" (UniqueName: \"kubernetes.io/projected/56ec1120-485a-41be-b08a-da982885fb24-kube-api-access-hj6zx\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.075548 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.075347 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.075548 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.075378 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.075548 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.075416 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/56ec1120-485a-41be-b08a-da982885fb24-config-out\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.075548 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.075446 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.075548 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.075481 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56ec1120-485a-41be-b08a-da982885fb24-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.075548 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.075505 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/56ec1120-485a-41be-b08a-da982885fb24-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.075548 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.075540 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56ec1120-485a-41be-b08a-da982885fb24-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.075967 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.075595 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/56ec1120-485a-41be-b08a-da982885fb24-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.076976 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.076849 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56ec1120-485a-41be-b08a-da982885fb24-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.077329 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.077088 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/56ec1120-485a-41be-b08a-da982885fb24-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.077766 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.077742 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/56ec1120-485a-41be-b08a-da982885fb24-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.077927 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.077903 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56ec1120-485a-41be-b08a-da982885fb24-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.078443 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.078416 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.078852 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.078826 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-config\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.079377 ip-10-0-130-227 
kubenswrapper[2576]: I0416 22:18:00.079353 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.080062 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.079743 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/56ec1120-485a-41be-b08a-da982885fb24-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.080062 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.080017 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.080214 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.080066 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/56ec1120-485a-41be-b08a-da982885fb24-config-out\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.080214 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.080111 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.080214 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.080172 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56ec1120-485a-41be-b08a-da982885fb24-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.080328 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.080219 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.080328 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.080243 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0"
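For the replacement pod (new UID 56ec1120-485a-41be-b08a-da982885fb24) each volume walks the mount side of the same state machine: operationExecutor.VerifyControllerAttachedVolume started, then operationExecutor.MountVolume started, then MountVolume.SetUp succeeded (operation_generator.go:615). A sketch that builds a per-volume progress table and flags anything that never reached SetUp; it splits the fused journal text on the timestamp prefix seen in this capture (the "Apr 16" literal is specific to this log):

    import re, sys

    STAGES = ("VerifyControllerAttachedVolume started",
              "MountVolume started",
              "MountVolume.SetUp succeeded")
    vol = re.compile(r'for volume \\"([^"\\]+)\\"')
    state = {}
    for entry in re.split(r'(?=Apr 16 \d\d:\d\d:\d\d\.\d+ )', sys.stdin.read()):
        for i, stage in enumerate(STAGES):
            if stage in entry:
                m = vol.search(entry)
                if m:
                    state[m.group(1)] = max(state.get(m.group(1), -1), i)
    for name, i in sorted(state.items()):
        if i < 2:
            print(f"{name}: last seen at stage '{STAGES[i]}'")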
\"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.080416 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.080393 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.081373 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.081355 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/56ec1120-485a-41be-b08a-da982885fb24-web-config\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.089208 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.089186 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj6zx\" (UniqueName: \"kubernetes.io/projected/56ec1120-485a-41be-b08a-da982885fb24-kube-api-access-hj6zx\") pod \"prometheus-k8s-0\" (UID: \"56ec1120-485a-41be-b08a-da982885fb24\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.253230 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.253197 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:00.375834 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.375811 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:18:00.378768 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:18:00.378739 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56ec1120_485a_41be_b08a_da982885fb24.slice/crio-9f58c61b2547987626df2b2f159b381a527bbef00878279d7f8183c3b67716b6 WatchSource:0}: Error finding container 9f58c61b2547987626df2b2f159b381a527bbef00878279d7f8183c3b67716b6: Status 404 returned error can't find the container with id 9f58c61b2547987626df2b2f159b381a527bbef00878279d7f8183c3b67716b6 Apr 16 22:18:00.883894 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.883853 2576 generic.go:358] "Generic (PLEG): container finished" podID="56ec1120-485a-41be-b08a-da982885fb24" containerID="57b568383d339038faf8b2956888350da4e245babab42c968d79bcae549177ae" exitCode=0 Apr 16 22:18:00.884066 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.883937 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56ec1120-485a-41be-b08a-da982885fb24","Type":"ContainerDied","Data":"57b568383d339038faf8b2956888350da4e245babab42c968d79bcae549177ae"} Apr 16 22:18:00.884066 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:00.883973 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56ec1120-485a-41be-b08a-da982885fb24","Type":"ContainerStarted","Data":"9f58c61b2547987626df2b2f159b381a527bbef00878279d7f8183c3b67716b6"} Apr 16 22:18:01.019941 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:01.019910 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a49b216-e0ad-48d0-8f67-c06fe0627ece" path="/var/lib/kubelet/pods/0a49b216-e0ad-48d0-8f67-c06fe0627ece/volumes" Apr 16 22:18:01.890010 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:01.889977 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56ec1120-485a-41be-b08a-da982885fb24","Type":"ContainerStarted","Data":"0cfc5db552d44e0df21f351b1d5c8fad385126273c3d2149af38a9ccee7e7c54"} Apr 16 22:18:01.890010 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:01.890011 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56ec1120-485a-41be-b08a-da982885fb24","Type":"ContainerStarted","Data":"e0afa21dc7b5b1e5bc50b115c0bc8db27acc30685573e90f75ef4a9d9683c1e8"} Apr 16 22:18:01.890516 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:01.890023 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56ec1120-485a-41be-b08a-da982885fb24","Type":"ContainerStarted","Data":"f1528fe4205cb68c321de118bbfbd7be8cb8454a6452c6626b8ac59cdd31465a"} Apr 16 22:18:01.890516 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:01.890033 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56ec1120-485a-41be-b08a-da982885fb24","Type":"ContainerStarted","Data":"607c1696480bca8a05c250fb7b5b2f39d2e408c753cb85b18bae474212f5c5a2"} Apr 16 22:18:01.890516 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:01.890042 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56ec1120-485a-41be-b08a-da982885fb24","Type":"ContainerStarted","Data":"d6fd490f0c6b4658724a274c89c71dee0bb65ca7ccd8bcc023269fff1eaeba26"} Apr 16 22:18:01.890516 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:01.890050 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"56ec1120-485a-41be-b08a-da982885fb24","Type":"ContainerStarted","Data":"50c4486f7d0c0c6568f0252b2f1d1891c9bcbb527685e2865f4d7fd56d1bd205"} Apr 16 22:18:01.916281 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:01.916230 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.916214954 podStartE2EDuration="2.916214954s" podCreationTimestamp="2026-04-16 22:17:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:18:01.914798104 +0000 UTC m=+277.448138956" watchObservedRunningTime="2026-04-16 22:18:01.916214954 +0000 UTC m=+277.449555807" Apr 16 22:18:03.451458 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:18:03.451395 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-fbgc2" podUID="a290a4ed-5ccc-47be-bd46-836ba21fea56" Apr 16 22:18:03.896822 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:03.896783 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-fbgc2" Apr 16 22:18:05.253509 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:05.253471 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:07.337144 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:07.337097 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls\") pod \"dns-default-fbgc2\" (UID: \"a290a4ed-5ccc-47be-bd46-836ba21fea56\") " pod="openshift-dns/dns-default-fbgc2" Apr 16 22:18:07.337144 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:07.337147 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert\") pod \"ingress-canary-s8bss\" (UID: \"49f46636-9c8e-44e1-88ec-d5c207868f31\") " pod="openshift-ingress-canary/ingress-canary-s8bss" Apr 16 22:18:07.339458 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:07.339429 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a290a4ed-5ccc-47be-bd46-836ba21fea56-metrics-tls\") pod \"dns-default-fbgc2\" (UID: \"a290a4ed-5ccc-47be-bd46-836ba21fea56\") " pod="openshift-dns/dns-default-fbgc2" Apr 16 22:18:07.339574 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:07.339557 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/49f46636-9c8e-44e1-88ec-d5c207868f31-cert\") pod \"ingress-canary-s8bss\" (UID: \"49f46636-9c8e-44e1-88ec-d5c207868f31\") " pod="openshift-ingress-canary/ingress-canary-s8bss" Apr 16 22:18:07.500056 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:07.500022 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-66k94\"" Apr 16 22:18:07.508389 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:07.508350 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fbgc2" Apr 16 22:18:07.618962 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:07.618895 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-b4hjg\"" Apr 16 22:18:07.626742 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:07.626713 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-s8bss" Apr 16 22:18:07.629473 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:07.629456 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fbgc2"] Apr 16 22:18:07.633156 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:18:07.633133 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda290a4ed_5ccc_47be_bd46_836ba21fea56.slice/crio-6ac58cfd4f2b59106cc11ea08e56a599d57f11d413441953ef865531ff2b91a1 WatchSource:0}: Error finding container 6ac58cfd4f2b59106cc11ea08e56a599d57f11d413441953ef865531ff2b91a1: Status 404 returned error can't find the container with id 6ac58cfd4f2b59106cc11ea08e56a599d57f11d413441953ef865531ff2b91a1 Apr 16 22:18:07.744926 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:07.744876 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-s8bss"] Apr 16 22:18:07.747018 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:18:07.746990 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49f46636_9c8e_44e1_88ec_d5c207868f31.slice/crio-44f553c1dbdaebe973ae5f258d415c701577f6f6b219791792a9a781d29a5ca0 WatchSource:0}: Error finding container 44f553c1dbdaebe973ae5f258d415c701577f6f6b219791792a9a781d29a5ca0: Status 404 returned error can't find the container with id 44f553c1dbdaebe973ae5f258d415c701577f6f6b219791792a9a781d29a5ca0 Apr 16 22:18:07.911501 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:07.911405 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-s8bss" event={"ID":"49f46636-9c8e-44e1-88ec-d5c207868f31","Type":"ContainerStarted","Data":"44f553c1dbdaebe973ae5f258d415c701577f6f6b219791792a9a781d29a5ca0"} Apr 16 22:18:07.912392 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:07.912368 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fbgc2" event={"ID":"a290a4ed-5ccc-47be-bd46-836ba21fea56","Type":"ContainerStarted","Data":"6ac58cfd4f2b59106cc11ea08e56a599d57f11d413441953ef865531ff2b91a1"} Apr 16 22:18:09.918991 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:09.918955 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-s8bss" event={"ID":"49f46636-9c8e-44e1-88ec-d5c207868f31","Type":"ContainerStarted","Data":"3775bdd542e4d5e83af27d134814a848f1aa9306863995757d43461b86073a1e"} Apr 16 22:18:09.920502 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:09.920445 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fbgc2" event={"ID":"a290a4ed-5ccc-47be-bd46-836ba21fea56","Type":"ContainerStarted","Data":"f63ac63ead4cf18df689d32d9f9f531463eb0cd9cc63f6b75eeb20a29652106e"} Apr 16 22:18:09.920502 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:09.920476 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fbgc2" event={"ID":"a290a4ed-5ccc-47be-bd46-836ba21fea56","Type":"ContainerStarted","Data":"83b5ac7453b82fec7994f2b06d61f3f694e2fdf032b80866a7f878da65353d6b"} Apr 16 22:18:09.920635 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:09.920554 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-fbgc2" Apr 16 22:18:09.934786 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:09.934743 2576 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-ingress-canary/ingress-canary-s8bss" podStartSLOduration=251.265516016 podStartE2EDuration="4m12.934729765s" podCreationTimestamp="2026-04-16 22:13:57 +0000 UTC" firstStartedPulling="2026-04-16 22:18:07.74883557 +0000 UTC m=+283.282176400" lastFinishedPulling="2026-04-16 22:18:09.418049318 +0000 UTC m=+284.951390149" observedRunningTime="2026-04-16 22:18:09.933492289 +0000 UTC m=+285.466833143" watchObservedRunningTime="2026-04-16 22:18:09.934729765 +0000 UTC m=+285.468070618" Apr 16 22:18:09.949754 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:09.949699 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fbgc2" podStartSLOduration=251.170057085 podStartE2EDuration="4m12.9496599s" podCreationTimestamp="2026-04-16 22:13:57 +0000 UTC" firstStartedPulling="2026-04-16 22:18:07.635260812 +0000 UTC m=+283.168601642" lastFinishedPulling="2026-04-16 22:18:09.414863627 +0000 UTC m=+284.948204457" observedRunningTime="2026-04-16 22:18:09.948745132 +0000 UTC m=+285.482085986" watchObservedRunningTime="2026-04-16 22:18:09.9496599 +0000 UTC m=+285.483000754" Apr 16 22:18:19.925360 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:19.925327 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fbgc2" Apr 16 22:18:24.955951 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:24.955407 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/1.log" Apr 16 22:18:24.955951 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:24.955846 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/1.log" Apr 16 22:18:24.965201 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:18:24.965180 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 22:19:00.253947 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:19:00.253902 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:19:00.269639 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:19:00.269613 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:19:01.083121 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:19:01.083091 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:21:51.213876 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:21:51.213844 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-s8g9n"] Apr 16 22:21:51.217116 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:21:51.217099 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-s8g9n" Apr 16 22:21:51.219792 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:21:51.219771 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 22:21:51.219908 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:21:51.219794 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 22:21:51.219908 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:21:51.219807 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-8mtsr\"" Apr 16 22:21:51.220692 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:21:51.220658 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 22:21:51.229195 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:21:51.229174 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-s8g9n"] Apr 16 22:21:51.328488 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:21:51.328434 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/845f0fa7-49c2-4f4b-a017-9f376b6c1499-data\") pod \"seaweedfs-86cc847c5c-s8g9n\" (UID: \"845f0fa7-49c2-4f4b-a017-9f376b6c1499\") " pod="kserve/seaweedfs-86cc847c5c-s8g9n" Apr 16 22:21:51.328703 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:21:51.328573 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qhh9\" (UniqueName: \"kubernetes.io/projected/845f0fa7-49c2-4f4b-a017-9f376b6c1499-kube-api-access-4qhh9\") pod \"seaweedfs-86cc847c5c-s8g9n\" (UID: \"845f0fa7-49c2-4f4b-a017-9f376b6c1499\") " pod="kserve/seaweedfs-86cc847c5c-s8g9n" Apr 16 22:21:51.429416 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:21:51.429375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/845f0fa7-49c2-4f4b-a017-9f376b6c1499-data\") pod \"seaweedfs-86cc847c5c-s8g9n\" (UID: \"845f0fa7-49c2-4f4b-a017-9f376b6c1499\") " pod="kserve/seaweedfs-86cc847c5c-s8g9n" Apr 16 22:21:51.429608 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:21:51.429455 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qhh9\" (UniqueName: \"kubernetes.io/projected/845f0fa7-49c2-4f4b-a017-9f376b6c1499-kube-api-access-4qhh9\") pod \"seaweedfs-86cc847c5c-s8g9n\" (UID: \"845f0fa7-49c2-4f4b-a017-9f376b6c1499\") " pod="kserve/seaweedfs-86cc847c5c-s8g9n" Apr 16 22:21:51.429872 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:21:51.429852 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/845f0fa7-49c2-4f4b-a017-9f376b6c1499-data\") pod \"seaweedfs-86cc847c5c-s8g9n\" (UID: \"845f0fa7-49c2-4f4b-a017-9f376b6c1499\") " pod="kserve/seaweedfs-86cc847c5c-s8g9n" Apr 16 22:21:51.437555 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:21:51.437535 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qhh9\" (UniqueName: \"kubernetes.io/projected/845f0fa7-49c2-4f4b-a017-9f376b6c1499-kube-api-access-4qhh9\") pod \"seaweedfs-86cc847c5c-s8g9n\" (UID: \"845f0fa7-49c2-4f4b-a017-9f376b6c1499\") " pod="kserve/seaweedfs-86cc847c5c-s8g9n" Apr 16 22:21:51.525868 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:21:51.525838 2576 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-s8g9n" Apr 16 22:21:51.647815 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:21:51.647787 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-s8g9n"] Apr 16 22:21:51.650379 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:21:51.650339 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod845f0fa7_49c2_4f4b_a017_9f376b6c1499.slice/crio-4d8848ecfcbf723d589b74763086f393b6ec9f83630f0b8bad681fbd67babde8 WatchSource:0}: Error finding container 4d8848ecfcbf723d589b74763086f393b6ec9f83630f0b8bad681fbd67babde8: Status 404 returned error can't find the container with id 4d8848ecfcbf723d589b74763086f393b6ec9f83630f0b8bad681fbd67babde8 Apr 16 22:21:51.651907 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:21:51.651886 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:21:52.549268 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:21:52.549232 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-s8g9n" event={"ID":"845f0fa7-49c2-4f4b-a017-9f376b6c1499","Type":"ContainerStarted","Data":"4d8848ecfcbf723d589b74763086f393b6ec9f83630f0b8bad681fbd67babde8"} Apr 16 22:21:54.557197 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:21:54.557104 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-s8g9n" event={"ID":"845f0fa7-49c2-4f4b-a017-9f376b6c1499","Type":"ContainerStarted","Data":"2c76f769213e6dbccd4703da835c91863aa3770339a1e9c79646131ca313e3cb"} Apr 16 22:21:54.557554 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:21:54.557216 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-s8g9n" Apr 16 22:21:54.573917 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:21:54.573871 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-s8g9n" podStartSLOduration=0.951776573 podStartE2EDuration="3.573858413s" podCreationTimestamp="2026-04-16 22:21:51 +0000 UTC" firstStartedPulling="2026-04-16 22:21:51.652074823 +0000 UTC m=+507.185415668" lastFinishedPulling="2026-04-16 22:21:54.274156674 +0000 UTC m=+509.807497508" observedRunningTime="2026-04-16 22:21:54.572728409 +0000 UTC m=+510.106069264" watchObservedRunningTime="2026-04-16 22:21:54.573858413 +0000 UTC m=+510.107199319" Apr 16 22:22:00.561355 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:22:00.561322 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-s8g9n" Apr 16 22:23:01.821135 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:01.821097 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-59r9p"] Apr 16 22:23:01.824220 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:01.824202 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-59r9p" Apr 16 22:23:01.827287 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:01.827268 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-cldbc\"" Apr 16 22:23:01.828597 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:01.828582 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 22:23:01.839464 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:01.839439 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-59r9p"] Apr 16 22:23:01.840650 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:01.840623 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-qgtjz"] Apr 16 22:23:01.843860 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:01.843843 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-qgtjz" Apr 16 22:23:01.846594 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:01.846576 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 22:23:01.846690 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:01.846597 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-hrsfs\"" Apr 16 22:23:01.855331 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:01.855307 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-qgtjz"] Apr 16 22:23:02.003789 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:02.003752 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d876dfd-77c0-49a2-9c5a-eee24f383f55-cert\") pod \"odh-model-controller-696fc77849-qgtjz\" (UID: \"7d876dfd-77c0-49a2-9c5a-eee24f383f55\") " pod="kserve/odh-model-controller-696fc77849-qgtjz" Apr 16 22:23:02.003965 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:02.003796 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1aa1eaad-9f70-457a-8176-c63289f84a68-tls-certs\") pod \"model-serving-api-86f7b4b499-59r9p\" (UID: \"1aa1eaad-9f70-457a-8176-c63289f84a68\") " pod="kserve/model-serving-api-86f7b4b499-59r9p" Apr 16 22:23:02.003965 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:02.003909 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khmps\" (UniqueName: \"kubernetes.io/projected/7d876dfd-77c0-49a2-9c5a-eee24f383f55-kube-api-access-khmps\") pod \"odh-model-controller-696fc77849-qgtjz\" (UID: \"7d876dfd-77c0-49a2-9c5a-eee24f383f55\") " pod="kserve/odh-model-controller-696fc77849-qgtjz" Apr 16 22:23:02.003965 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:02.003937 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmmhc\" (UniqueName: \"kubernetes.io/projected/1aa1eaad-9f70-457a-8176-c63289f84a68-kube-api-access-tmmhc\") pod \"model-serving-api-86f7b4b499-59r9p\" (UID: \"1aa1eaad-9f70-457a-8176-c63289f84a68\") " pod="kserve/model-serving-api-86f7b4b499-59r9p" Apr 16 22:23:02.105322 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:02.105236 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-khmps\" (UniqueName: \"kubernetes.io/projected/7d876dfd-77c0-49a2-9c5a-eee24f383f55-kube-api-access-khmps\") pod \"odh-model-controller-696fc77849-qgtjz\" (UID: \"7d876dfd-77c0-49a2-9c5a-eee24f383f55\") " pod="kserve/odh-model-controller-696fc77849-qgtjz" Apr 16 22:23:02.105322 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:02.105276 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmmhc\" (UniqueName: \"kubernetes.io/projected/1aa1eaad-9f70-457a-8176-c63289f84a68-kube-api-access-tmmhc\") pod \"model-serving-api-86f7b4b499-59r9p\" (UID: \"1aa1eaad-9f70-457a-8176-c63289f84a68\") " pod="kserve/model-serving-api-86f7b4b499-59r9p" Apr 16 22:23:02.105322 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:02.105304 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d876dfd-77c0-49a2-9c5a-eee24f383f55-cert\") pod \"odh-model-controller-696fc77849-qgtjz\" (UID: \"7d876dfd-77c0-49a2-9c5a-eee24f383f55\") " pod="kserve/odh-model-controller-696fc77849-qgtjz" Apr 16 22:23:02.105546 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:02.105468 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1aa1eaad-9f70-457a-8176-c63289f84a68-tls-certs\") pod \"model-serving-api-86f7b4b499-59r9p\" (UID: \"1aa1eaad-9f70-457a-8176-c63289f84a68\") " pod="kserve/model-serving-api-86f7b4b499-59r9p" Apr 16 22:23:02.105623 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:23:02.105601 2576 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 16 22:23:02.105709 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:23:02.105699 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1aa1eaad-9f70-457a-8176-c63289f84a68-tls-certs podName:1aa1eaad-9f70-457a-8176-c63289f84a68 nodeName:}" failed. No retries permitted until 2026-04-16 22:23:02.605661053 +0000 UTC m=+578.139001891 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/1aa1eaad-9f70-457a-8176-c63289f84a68-tls-certs") pod "model-serving-api-86f7b4b499-59r9p" (UID: "1aa1eaad-9f70-457a-8176-c63289f84a68") : secret "model-serving-api-tls" not found Apr 16 22:23:02.107838 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:02.107821 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d876dfd-77c0-49a2-9c5a-eee24f383f55-cert\") pod \"odh-model-controller-696fc77849-qgtjz\" (UID: \"7d876dfd-77c0-49a2-9c5a-eee24f383f55\") " pod="kserve/odh-model-controller-696fc77849-qgtjz" Apr 16 22:23:02.114061 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:02.114037 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khmps\" (UniqueName: \"kubernetes.io/projected/7d876dfd-77c0-49a2-9c5a-eee24f383f55-kube-api-access-khmps\") pod \"odh-model-controller-696fc77849-qgtjz\" (UID: \"7d876dfd-77c0-49a2-9c5a-eee24f383f55\") " pod="kserve/odh-model-controller-696fc77849-qgtjz" Apr 16 22:23:02.114451 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:02.114430 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmmhc\" (UniqueName: \"kubernetes.io/projected/1aa1eaad-9f70-457a-8176-c63289f84a68-kube-api-access-tmmhc\") pod \"model-serving-api-86f7b4b499-59r9p\" (UID: \"1aa1eaad-9f70-457a-8176-c63289f84a68\") " pod="kserve/model-serving-api-86f7b4b499-59r9p" Apr 16 22:23:02.153879 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:02.153835 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-qgtjz" Apr 16 22:23:02.275228 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:02.275204 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-qgtjz"] Apr 16 22:23:02.277321 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:23:02.277293 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d876dfd_77c0_49a2_9c5a_eee24f383f55.slice/crio-84d107155ef5dcce4fe4c362db5eae8730edbd1c79d854605b990aa491cbc5ed WatchSource:0}: Error finding container 84d107155ef5dcce4fe4c362db5eae8730edbd1c79d854605b990aa491cbc5ed: Status 404 returned error can't find the container with id 84d107155ef5dcce4fe4c362db5eae8730edbd1c79d854605b990aa491cbc5ed Apr 16 22:23:02.610451 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:02.610410 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1aa1eaad-9f70-457a-8176-c63289f84a68-tls-certs\") pod \"model-serving-api-86f7b4b499-59r9p\" (UID: \"1aa1eaad-9f70-457a-8176-c63289f84a68\") " pod="kserve/model-serving-api-86f7b4b499-59r9p" Apr 16 22:23:02.612799 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:02.612780 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1aa1eaad-9f70-457a-8176-c63289f84a68-tls-certs\") pod \"model-serving-api-86f7b4b499-59r9p\" (UID: \"1aa1eaad-9f70-457a-8176-c63289f84a68\") " pod="kserve/model-serving-api-86f7b4b499-59r9p" Apr 16 22:23:02.733691 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:02.733635 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-59r9p" Apr 16 22:23:02.750720 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:02.750663 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-qgtjz" event={"ID":"7d876dfd-77c0-49a2-9c5a-eee24f383f55","Type":"ContainerStarted","Data":"84d107155ef5dcce4fe4c362db5eae8730edbd1c79d854605b990aa491cbc5ed"} Apr 16 22:23:02.856784 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:02.856756 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-59r9p"] Apr 16 22:23:02.859340 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:23:02.859304 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1aa1eaad_9f70_457a_8176_c63289f84a68.slice/crio-689b76f907220044f88f7d9cd2d1f4feb47c75e78c8b1afc44e35a1232a2f789 WatchSource:0}: Error finding container 689b76f907220044f88f7d9cd2d1f4feb47c75e78c8b1afc44e35a1232a2f789: Status 404 returned error can't find the container with id 689b76f907220044f88f7d9cd2d1f4feb47c75e78c8b1afc44e35a1232a2f789 Apr 16 22:23:03.756167 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:03.756094 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-59r9p" event={"ID":"1aa1eaad-9f70-457a-8176-c63289f84a68","Type":"ContainerStarted","Data":"689b76f907220044f88f7d9cd2d1f4feb47c75e78c8b1afc44e35a1232a2f789"} Apr 16 22:23:06.766511 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:06.766479 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-qgtjz" event={"ID":"7d876dfd-77c0-49a2-9c5a-eee24f383f55","Type":"ContainerStarted","Data":"7fc5275eb997ee4ecaf729674d0846968f046b0198d043e0a5cb299284ad2d8a"} Apr 16 22:23:06.766971 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:06.766561 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-qgtjz" Apr 16 22:23:06.767864 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:06.767843 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-59r9p" event={"ID":"1aa1eaad-9f70-457a-8176-c63289f84a68","Type":"ContainerStarted","Data":"9681de876281019375f0dbf9aeae35772a1c3b3e07281cff767a1a637a29192c"} Apr 16 22:23:06.767964 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:06.767950 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-59r9p" Apr 16 22:23:06.799640 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:06.799592 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-qgtjz" podStartSLOduration=2.068613696 podStartE2EDuration="5.79957951s" podCreationTimestamp="2026-04-16 22:23:01 +0000 UTC" firstStartedPulling="2026-04-16 22:23:02.278432743 +0000 UTC m=+577.811773574" lastFinishedPulling="2026-04-16 22:23:06.009398556 +0000 UTC m=+581.542739388" observedRunningTime="2026-04-16 22:23:06.798106257 +0000 UTC m=+582.331447111" watchObservedRunningTime="2026-04-16 22:23:06.79957951 +0000 UTC m=+582.332920363" Apr 16 22:23:06.846301 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:06.846246 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-59r9p" podStartSLOduration=2.642558262 podStartE2EDuration="5.846232571s" podCreationTimestamp="2026-04-16 22:23:01 +0000 
UTC" firstStartedPulling="2026-04-16 22:23:02.861294598 +0000 UTC m=+578.394635428" lastFinishedPulling="2026-04-16 22:23:06.064968904 +0000 UTC m=+581.598309737" observedRunningTime="2026-04-16 22:23:06.845498606 +0000 UTC m=+582.378839460" watchObservedRunningTime="2026-04-16 22:23:06.846232571 +0000 UTC m=+582.379573424" Apr 16 22:23:07.664004 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.663971 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-69bdffc497-pfzk8"] Apr 16 22:23:07.667442 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.667416 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69bdffc497-pfzk8" Apr 16 22:23:07.671400 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.671377 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 22:23:07.671740 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.671670 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 22:23:07.671914 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.671745 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-qmtph\"" Apr 16 22:23:07.672509 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.672481 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 22:23:07.672509 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.672498 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 22:23:07.672653 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.672640 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 22:23:07.677234 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.677214 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 22:23:07.680548 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.680530 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69bdffc497-pfzk8"] Apr 16 22:23:07.752309 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.752280 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be0d1b5c-ef47-43d8-8f95-2bf7ab155349-console-oauth-config\") pod \"console-69bdffc497-pfzk8\" (UID: \"be0d1b5c-ef47-43d8-8f95-2bf7ab155349\") " pod="openshift-console/console-69bdffc497-pfzk8" Apr 16 22:23:07.752495 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.752319 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be0d1b5c-ef47-43d8-8f95-2bf7ab155349-oauth-serving-cert\") pod \"console-69bdffc497-pfzk8\" (UID: \"be0d1b5c-ef47-43d8-8f95-2bf7ab155349\") " pod="openshift-console/console-69bdffc497-pfzk8" Apr 16 22:23:07.752495 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.752341 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w75s6\" (UniqueName: \"kubernetes.io/projected/be0d1b5c-ef47-43d8-8f95-2bf7ab155349-kube-api-access-w75s6\") pod 
\"console-69bdffc497-pfzk8\" (UID: \"be0d1b5c-ef47-43d8-8f95-2bf7ab155349\") " pod="openshift-console/console-69bdffc497-pfzk8" Apr 16 22:23:07.752495 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.752468 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be0d1b5c-ef47-43d8-8f95-2bf7ab155349-console-serving-cert\") pod \"console-69bdffc497-pfzk8\" (UID: \"be0d1b5c-ef47-43d8-8f95-2bf7ab155349\") " pod="openshift-console/console-69bdffc497-pfzk8" Apr 16 22:23:07.752645 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.752501 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be0d1b5c-ef47-43d8-8f95-2bf7ab155349-trusted-ca-bundle\") pod \"console-69bdffc497-pfzk8\" (UID: \"be0d1b5c-ef47-43d8-8f95-2bf7ab155349\") " pod="openshift-console/console-69bdffc497-pfzk8" Apr 16 22:23:07.752645 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.752580 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be0d1b5c-ef47-43d8-8f95-2bf7ab155349-service-ca\") pod \"console-69bdffc497-pfzk8\" (UID: \"be0d1b5c-ef47-43d8-8f95-2bf7ab155349\") " pod="openshift-console/console-69bdffc497-pfzk8" Apr 16 22:23:07.752645 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.752629 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be0d1b5c-ef47-43d8-8f95-2bf7ab155349-console-config\") pod \"console-69bdffc497-pfzk8\" (UID: \"be0d1b5c-ef47-43d8-8f95-2bf7ab155349\") " pod="openshift-console/console-69bdffc497-pfzk8" Apr 16 22:23:07.852975 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.852937 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be0d1b5c-ef47-43d8-8f95-2bf7ab155349-console-oauth-config\") pod \"console-69bdffc497-pfzk8\" (UID: \"be0d1b5c-ef47-43d8-8f95-2bf7ab155349\") " pod="openshift-console/console-69bdffc497-pfzk8" Apr 16 22:23:07.853448 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.852988 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be0d1b5c-ef47-43d8-8f95-2bf7ab155349-oauth-serving-cert\") pod \"console-69bdffc497-pfzk8\" (UID: \"be0d1b5c-ef47-43d8-8f95-2bf7ab155349\") " pod="openshift-console/console-69bdffc497-pfzk8" Apr 16 22:23:07.853448 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.853009 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w75s6\" (UniqueName: \"kubernetes.io/projected/be0d1b5c-ef47-43d8-8f95-2bf7ab155349-kube-api-access-w75s6\") pod \"console-69bdffc497-pfzk8\" (UID: \"be0d1b5c-ef47-43d8-8f95-2bf7ab155349\") " pod="openshift-console/console-69bdffc497-pfzk8" Apr 16 22:23:07.853448 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.853075 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be0d1b5c-ef47-43d8-8f95-2bf7ab155349-console-serving-cert\") pod \"console-69bdffc497-pfzk8\" (UID: \"be0d1b5c-ef47-43d8-8f95-2bf7ab155349\") " pod="openshift-console/console-69bdffc497-pfzk8" Apr 16 22:23:07.853448 ip-10-0-130-227 
kubenswrapper[2576]: I0416 22:23:07.853109 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be0d1b5c-ef47-43d8-8f95-2bf7ab155349-trusted-ca-bundle\") pod \"console-69bdffc497-pfzk8\" (UID: \"be0d1b5c-ef47-43d8-8f95-2bf7ab155349\") " pod="openshift-console/console-69bdffc497-pfzk8" Apr 16 22:23:07.853448 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.853162 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be0d1b5c-ef47-43d8-8f95-2bf7ab155349-service-ca\") pod \"console-69bdffc497-pfzk8\" (UID: \"be0d1b5c-ef47-43d8-8f95-2bf7ab155349\") " pod="openshift-console/console-69bdffc497-pfzk8" Apr 16 22:23:07.853448 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.853228 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be0d1b5c-ef47-43d8-8f95-2bf7ab155349-console-config\") pod \"console-69bdffc497-pfzk8\" (UID: \"be0d1b5c-ef47-43d8-8f95-2bf7ab155349\") " pod="openshift-console/console-69bdffc497-pfzk8" Apr 16 22:23:07.853978 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.853956 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/be0d1b5c-ef47-43d8-8f95-2bf7ab155349-oauth-serving-cert\") pod \"console-69bdffc497-pfzk8\" (UID: \"be0d1b5c-ef47-43d8-8f95-2bf7ab155349\") " pod="openshift-console/console-69bdffc497-pfzk8" Apr 16 22:23:07.854183 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.854165 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/be0d1b5c-ef47-43d8-8f95-2bf7ab155349-console-config\") pod \"console-69bdffc497-pfzk8\" (UID: \"be0d1b5c-ef47-43d8-8f95-2bf7ab155349\") " pod="openshift-console/console-69bdffc497-pfzk8" Apr 16 22:23:07.854254 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.854231 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be0d1b5c-ef47-43d8-8f95-2bf7ab155349-service-ca\") pod \"console-69bdffc497-pfzk8\" (UID: \"be0d1b5c-ef47-43d8-8f95-2bf7ab155349\") " pod="openshift-console/console-69bdffc497-pfzk8" Apr 16 22:23:07.854310 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.854295 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be0d1b5c-ef47-43d8-8f95-2bf7ab155349-trusted-ca-bundle\") pod \"console-69bdffc497-pfzk8\" (UID: \"be0d1b5c-ef47-43d8-8f95-2bf7ab155349\") " pod="openshift-console/console-69bdffc497-pfzk8" Apr 16 22:23:07.855640 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.855616 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/be0d1b5c-ef47-43d8-8f95-2bf7ab155349-console-oauth-config\") pod \"console-69bdffc497-pfzk8\" (UID: \"be0d1b5c-ef47-43d8-8f95-2bf7ab155349\") " pod="openshift-console/console-69bdffc497-pfzk8" Apr 16 22:23:07.855749 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.855711 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/be0d1b5c-ef47-43d8-8f95-2bf7ab155349-console-serving-cert\") pod \"console-69bdffc497-pfzk8\" (UID: 
\"be0d1b5c-ef47-43d8-8f95-2bf7ab155349\") " pod="openshift-console/console-69bdffc497-pfzk8" Apr 16 22:23:07.860875 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.860852 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w75s6\" (UniqueName: \"kubernetes.io/projected/be0d1b5c-ef47-43d8-8f95-2bf7ab155349-kube-api-access-w75s6\") pod \"console-69bdffc497-pfzk8\" (UID: \"be0d1b5c-ef47-43d8-8f95-2bf7ab155349\") " pod="openshift-console/console-69bdffc497-pfzk8" Apr 16 22:23:07.978246 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:07.978155 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69bdffc497-pfzk8" Apr 16 22:23:08.099899 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:08.099875 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69bdffc497-pfzk8"] Apr 16 22:23:08.102099 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:23:08.102067 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe0d1b5c_ef47_43d8_8f95_2bf7ab155349.slice/crio-a5a15c2c3a69b580c5014e3dc6950d7174b93780623f2c41a0c832e7b4c837aa WatchSource:0}: Error finding container a5a15c2c3a69b580c5014e3dc6950d7174b93780623f2c41a0c832e7b4c837aa: Status 404 returned error can't find the container with id a5a15c2c3a69b580c5014e3dc6950d7174b93780623f2c41a0c832e7b4c837aa Apr 16 22:23:08.774427 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:08.774391 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69bdffc497-pfzk8" event={"ID":"be0d1b5c-ef47-43d8-8f95-2bf7ab155349","Type":"ContainerStarted","Data":"e8dae84b2144b1136f3beaea3a4dea52262adc72a3710e90b90c908712177646"} Apr 16 22:23:08.774427 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:08.774429 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69bdffc497-pfzk8" event={"ID":"be0d1b5c-ef47-43d8-8f95-2bf7ab155349","Type":"ContainerStarted","Data":"a5a15c2c3a69b580c5014e3dc6950d7174b93780623f2c41a0c832e7b4c837aa"} Apr 16 22:23:08.792738 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:08.792686 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69bdffc497-pfzk8" podStartSLOduration=1.792655726 podStartE2EDuration="1.792655726s" podCreationTimestamp="2026-04-16 22:23:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:23:08.790483102 +0000 UTC m=+584.323823959" watchObservedRunningTime="2026-04-16 22:23:08.792655726 +0000 UTC m=+584.325996579" Apr 16 22:23:17.773083 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:17.773051 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-qgtjz" Apr 16 22:23:17.774771 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:17.774746 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-59r9p" Apr 16 22:23:17.978851 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:17.978792 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-69bdffc497-pfzk8" Apr 16 22:23:17.978851 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:17.978859 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69bdffc497-pfzk8" 
Apr 16 22:23:17.983665 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:17.983639 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-69bdffc497-pfzk8"
Apr 16 22:23:18.543910 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:18.543874 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-zdfks"]
Apr 16 22:23:18.547089 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:18.547048 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-zdfks"
Apr 16 22:23:18.552515 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:18.552489 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-zdfks"]
Apr 16 22:23:18.647534 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:18.647491 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-527cj\" (UniqueName: \"kubernetes.io/projected/7a9f6297-f5f9-4dff-9ef5-05f34355699d-kube-api-access-527cj\") pod \"s3-init-zdfks\" (UID: \"7a9f6297-f5f9-4dff-9ef5-05f34355699d\") " pod="kserve/s3-init-zdfks"
Apr 16 22:23:18.748967 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:18.748928 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-527cj\" (UniqueName: \"kubernetes.io/projected/7a9f6297-f5f9-4dff-9ef5-05f34355699d-kube-api-access-527cj\") pod \"s3-init-zdfks\" (UID: \"7a9f6297-f5f9-4dff-9ef5-05f34355699d\") " pod="kserve/s3-init-zdfks"
Apr 16 22:23:18.757177 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:18.757147 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-527cj\" (UniqueName: \"kubernetes.io/projected/7a9f6297-f5f9-4dff-9ef5-05f34355699d-kube-api-access-527cj\") pod \"s3-init-zdfks\" (UID: \"7a9f6297-f5f9-4dff-9ef5-05f34355699d\") " pod="kserve/s3-init-zdfks"
Apr 16 22:23:18.809605 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:18.809523 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-69bdffc497-pfzk8"
Apr 16 22:23:18.870898 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:18.870856 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-zdfks"
Apr 16 22:23:19.003768 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:19.003730 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-zdfks"]
Apr 16 22:23:19.006373 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:23:19.006347 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a9f6297_f5f9_4dff_9ef5_05f34355699d.slice/crio-6806a888faa9ec0a983abfec6b3b17ee4bd1b0dbe18d6fc743e284526508832e WatchSource:0}: Error finding container 6806a888faa9ec0a983abfec6b3b17ee4bd1b0dbe18d6fc743e284526508832e: Status 404 returned error can't find the container with id 6806a888faa9ec0a983abfec6b3b17ee4bd1b0dbe18d6fc743e284526508832e
Apr 16 22:23:19.810546 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:19.810499 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-zdfks" event={"ID":"7a9f6297-f5f9-4dff-9ef5-05f34355699d","Type":"ContainerStarted","Data":"6806a888faa9ec0a983abfec6b3b17ee4bd1b0dbe18d6fc743e284526508832e"}
Apr 16 22:23:23.824296 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:23.824261 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-zdfks" event={"ID":"7a9f6297-f5f9-4dff-9ef5-05f34355699d","Type":"ContainerStarted","Data":"cb2542ab63b6b5f825529de560d2446ada9109f6eb26a0be29184c5666870446"}
Apr 16 22:23:23.838810 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:23.838762 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-zdfks" podStartSLOduration=1.401123434 podStartE2EDuration="5.838748469s" podCreationTimestamp="2026-04-16 22:23:18 +0000 UTC" firstStartedPulling="2026-04-16 22:23:19.008275958 +0000 UTC m=+594.541616789" lastFinishedPulling="2026-04-16 22:23:23.445900982 +0000 UTC m=+598.979241824" observedRunningTime="2026-04-16 22:23:23.836797006 +0000 UTC m=+599.370137860" watchObservedRunningTime="2026-04-16 22:23:23.838748469 +0000 UTC m=+599.372089322"
Apr 16 22:23:24.987427 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:24.987390 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/1.log"
Apr 16 22:23:24.988315 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:24.988293 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/1.log"
Apr 16 22:23:26.834523 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:26.834488 2576 generic.go:358] "Generic (PLEG): container finished" podID="7a9f6297-f5f9-4dff-9ef5-05f34355699d" containerID="cb2542ab63b6b5f825529de560d2446ada9109f6eb26a0be29184c5666870446" exitCode=0
Apr 16 22:23:26.834911 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:26.834564 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-zdfks" event={"ID":"7a9f6297-f5f9-4dff-9ef5-05f34355699d","Type":"ContainerDied","Data":"cb2542ab63b6b5f825529de560d2446ada9109f6eb26a0be29184c5666870446"}
Apr 16 22:23:27.959451 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:27.959430 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-zdfks"
Apr 16 22:23:28.035499 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:28.035457 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-527cj\" (UniqueName: \"kubernetes.io/projected/7a9f6297-f5f9-4dff-9ef5-05f34355699d-kube-api-access-527cj\") pod \"7a9f6297-f5f9-4dff-9ef5-05f34355699d\" (UID: \"7a9f6297-f5f9-4dff-9ef5-05f34355699d\") "
Apr 16 22:23:28.037642 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:28.037620 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a9f6297-f5f9-4dff-9ef5-05f34355699d-kube-api-access-527cj" (OuterVolumeSpecName: "kube-api-access-527cj") pod "7a9f6297-f5f9-4dff-9ef5-05f34355699d" (UID: "7a9f6297-f5f9-4dff-9ef5-05f34355699d"). InnerVolumeSpecName "kube-api-access-527cj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:23:28.136915 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:28.136827 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-527cj\" (UniqueName: \"kubernetes.io/projected/7a9f6297-f5f9-4dff-9ef5-05f34355699d-kube-api-access-527cj\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\""
Apr 16 22:23:28.840737 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:28.840704 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-zdfks" event={"ID":"7a9f6297-f5f9-4dff-9ef5-05f34355699d","Type":"ContainerDied","Data":"6806a888faa9ec0a983abfec6b3b17ee4bd1b0dbe18d6fc743e284526508832e"}
Apr 16 22:23:28.840737 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:28.840739 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6806a888faa9ec0a983abfec6b3b17ee4bd1b0dbe18d6fc743e284526508832e"
Apr 16 22:23:28.840953 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:28.840717 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-zdfks"
Apr 16 22:23:38.198341 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:38.198291 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk"]
Apr 16 22:23:38.198802 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:38.198692 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a9f6297-f5f9-4dff-9ef5-05f34355699d" containerName="s3-init"
Apr 16 22:23:38.198802 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:38.198707 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9f6297-f5f9-4dff-9ef5-05f34355699d" containerName="s3-init"
Apr 16 22:23:38.198802 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:38.198760 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7a9f6297-f5f9-4dff-9ef5-05f34355699d" containerName="s3-init"
Apr 16 22:23:38.202068 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:38.202052 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk"
Apr 16 22:23:38.204572 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:38.204538 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-37a6e-predictor-serving-cert\""
Apr 16 22:23:38.204704 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:38.204574 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 22:23:38.204704 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:38.204629 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 22:23:38.205643 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:38.205626 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-37a6e-kube-rbac-proxy-sar-config\""
Apr 16 22:23:38.205763 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:38.205631 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4hqzk\""
Apr 16 22:23:38.210637 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:38.210614 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk"]
Apr 16 22:23:38.327813 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:38.327778 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-37a6e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8034d9e1-e8f8-4dbd-b1cf-111598e79d7d-error-404-isvc-37a6e-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk\" (UID: \"8034d9e1-e8f8-4dbd-b1cf-111598e79d7d\") " pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk"
Apr 16 22:23:38.327979 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:38.327843 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8034d9e1-e8f8-4dbd-b1cf-111598e79d7d-proxy-tls\") pod \"error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk\" (UID: \"8034d9e1-e8f8-4dbd-b1cf-111598e79d7d\") " pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk"
Apr 16 22:23:38.327979 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:38.327901 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtkz7\" (UniqueName: \"kubernetes.io/projected/8034d9e1-e8f8-4dbd-b1cf-111598e79d7d-kube-api-access-wtkz7\") pod \"error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk\" (UID: \"8034d9e1-e8f8-4dbd-b1cf-111598e79d7d\") " pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk"
Apr 16 22:23:38.428468 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:38.428431 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-37a6e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8034d9e1-e8f8-4dbd-b1cf-111598e79d7d-error-404-isvc-37a6e-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk\" (UID: \"8034d9e1-e8f8-4dbd-b1cf-111598e79d7d\") " pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk"
Apr 16 22:23:38.428644 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:38.428486 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8034d9e1-e8f8-4dbd-b1cf-111598e79d7d-proxy-tls\") pod \"error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk\" (UID: \"8034d9e1-e8f8-4dbd-b1cf-111598e79d7d\") " pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk"
Apr 16 22:23:38.428644 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:38.428520 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtkz7\" (UniqueName: \"kubernetes.io/projected/8034d9e1-e8f8-4dbd-b1cf-111598e79d7d-kube-api-access-wtkz7\") pod \"error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk\" (UID: \"8034d9e1-e8f8-4dbd-b1cf-111598e79d7d\") " pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk"
Apr 16 22:23:38.429183 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:38.429160 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-37a6e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8034d9e1-e8f8-4dbd-b1cf-111598e79d7d-error-404-isvc-37a6e-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk\" (UID: \"8034d9e1-e8f8-4dbd-b1cf-111598e79d7d\") " pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk"
Apr 16 22:23:38.430979 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:38.430961 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8034d9e1-e8f8-4dbd-b1cf-111598e79d7d-proxy-tls\") pod \"error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk\" (UID: \"8034d9e1-e8f8-4dbd-b1cf-111598e79d7d\") " pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk"
Apr 16 22:23:38.437256 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:38.437224 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtkz7\" (UniqueName: \"kubernetes.io/projected/8034d9e1-e8f8-4dbd-b1cf-111598e79d7d-kube-api-access-wtkz7\") pod \"error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk\" (UID: \"8034d9e1-e8f8-4dbd-b1cf-111598e79d7d\") " pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk"
Apr 16 22:23:38.514104 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:38.514065 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk"
Apr 16 22:23:38.638851 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:38.638812 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk"]
Apr 16 22:23:38.643220 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:23:38.643188 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8034d9e1_e8f8_4dbd_b1cf_111598e79d7d.slice/crio-09c952e914d02d4adf4ea68e15d04f5e876d14440089feae183a379360d91987 WatchSource:0}: Error finding container 09c952e914d02d4adf4ea68e15d04f5e876d14440089feae183a379360d91987: Status 404 returned error can't find the container with id 09c952e914d02d4adf4ea68e15d04f5e876d14440089feae183a379360d91987
Apr 16 22:23:38.872275 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:38.872187 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk" event={"ID":"8034d9e1-e8f8-4dbd-b1cf-111598e79d7d","Type":"ContainerStarted","Data":"09c952e914d02d4adf4ea68e15d04f5e876d14440089feae183a379360d91987"}
Apr 16 22:23:39.029160 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:39.029127 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr"]
Apr 16 22:23:39.096110 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:39.096074 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr"]
Apr 16 22:23:39.096272 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:39.096213 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr"
Apr 16 22:23:39.098958 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:39.098932 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\""
Apr 16 22:23:39.099080 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:39.099007 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-predictor-serving-cert\""
Apr 16 22:23:39.235254 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:39.235166 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bebb89fc-9a91-4f94-bf3f-3891b05410c3-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr\" (UID: \"bebb89fc-9a91-4f94-bf3f-3891b05410c3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr"
Apr 16 22:23:39.235720 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:39.235300 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bebb89fc-9a91-4f94-bf3f-3891b05410c3-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr\" (UID: \"bebb89fc-9a91-4f94-bf3f-3891b05410c3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr"
Apr 16 22:23:39.235720 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:39.235356 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw7m4\" (UniqueName: \"kubernetes.io/projected/bebb89fc-9a91-4f94-bf3f-3891b05410c3-kube-api-access-cw7m4\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr\" (UID: \"bebb89fc-9a91-4f94-bf3f-3891b05410c3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr"
Apr 16 22:23:39.235720 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:39.235401 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bebb89fc-9a91-4f94-bf3f-3891b05410c3-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr\" (UID: \"bebb89fc-9a91-4f94-bf3f-3891b05410c3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr"
Apr 16 22:23:39.336121 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:39.336082 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bebb89fc-9a91-4f94-bf3f-3891b05410c3-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr\" (UID: \"bebb89fc-9a91-4f94-bf3f-3891b05410c3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr"
Apr 16 22:23:39.336356 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:39.336167 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bebb89fc-9a91-4f94-bf3f-3891b05410c3-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr\" (UID: \"bebb89fc-9a91-4f94-bf3f-3891b05410c3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr"
Apr 16 22:23:39.336356 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:39.336212 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cw7m4\" (UniqueName: \"kubernetes.io/projected/bebb89fc-9a91-4f94-bf3f-3891b05410c3-kube-api-access-cw7m4\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr\" (UID: \"bebb89fc-9a91-4f94-bf3f-3891b05410c3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr"
Apr 16 22:23:39.336356 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:39.336252 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bebb89fc-9a91-4f94-bf3f-3891b05410c3-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr\" (UID: \"bebb89fc-9a91-4f94-bf3f-3891b05410c3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr"
Apr 16 22:23:39.336700 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:39.336655 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bebb89fc-9a91-4f94-bf3f-3891b05410c3-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr\" (UID: \"bebb89fc-9a91-4f94-bf3f-3891b05410c3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr"
Apr 16 22:23:39.336976 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:39.336952 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bebb89fc-9a91-4f94-bf3f-3891b05410c3-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr\" (UID: \"bebb89fc-9a91-4f94-bf3f-3891b05410c3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr"
Apr 16 22:23:39.339819 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:39.339779 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bebb89fc-9a91-4f94-bf3f-3891b05410c3-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr\" (UID: \"bebb89fc-9a91-4f94-bf3f-3891b05410c3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr"
Apr 16 22:23:39.345710 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:39.345655 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw7m4\" (UniqueName: \"kubernetes.io/projected/bebb89fc-9a91-4f94-bf3f-3891b05410c3-kube-api-access-cw7m4\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr\" (UID: \"bebb89fc-9a91-4f94-bf3f-3891b05410c3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr"
Apr 16 22:23:39.407588 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:39.407299 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" Apr 16 22:23:39.580116 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:39.577984 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr"] Apr 16 22:23:39.582797 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:23:39.582765 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbebb89fc_9a91_4f94_bf3f_3891b05410c3.slice/crio-379aa82dd4caca805f0079af02899ddafc7d66590c9cd9c579681fdd9cb28db9 WatchSource:0}: Error finding container 379aa82dd4caca805f0079af02899ddafc7d66590c9cd9c579681fdd9cb28db9: Status 404 returned error can't find the container with id 379aa82dd4caca805f0079af02899ddafc7d66590c9cd9c579681fdd9cb28db9 Apr 16 22:23:39.884889 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:39.884794 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" event={"ID":"bebb89fc-9a91-4f94-bf3f-3891b05410c3","Type":"ContainerStarted","Data":"379aa82dd4caca805f0079af02899ddafc7d66590c9cd9c579681fdd9cb28db9"} Apr 16 22:23:50.935374 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:50.935324 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" event={"ID":"bebb89fc-9a91-4f94-bf3f-3891b05410c3","Type":"ContainerStarted","Data":"1d2838cb75dbf849c3134de48372e9c261d4d528f5154d247f87ee57740306fa"} Apr 16 22:23:50.937583 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:50.937557 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk" event={"ID":"8034d9e1-e8f8-4dbd-b1cf-111598e79d7d","Type":"ContainerStarted","Data":"5c464710ad353e3abcc8ebe298769b3b11341c42b98aae74c694dc28999ed088"} Apr 16 22:23:52.951247 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:52.951212 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk" event={"ID":"8034d9e1-e8f8-4dbd-b1cf-111598e79d7d","Type":"ContainerStarted","Data":"777089d8a3fb42feb634b219be47bbadd7bf078b63cfc0af511c18710c6ede8b"} Apr 16 22:23:52.951706 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:52.951412 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk" Apr 16 22:23:52.972427 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:52.972379 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk" podStartSLOduration=0.844207211 podStartE2EDuration="14.972366644s" podCreationTimestamp="2026-04-16 22:23:38 +0000 UTC" firstStartedPulling="2026-04-16 22:23:38.645272758 +0000 UTC m=+614.178613603" lastFinishedPulling="2026-04-16 22:23:52.773432202 +0000 UTC m=+628.306773036" observedRunningTime="2026-04-16 22:23:52.970641003 +0000 UTC m=+628.503981856" watchObservedRunningTime="2026-04-16 22:23:52.972366644 +0000 UTC m=+628.505707497" Apr 16 22:23:53.954961 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:53.954921 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk" Apr 16 22:23:53.956041 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:53.956017 2576 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk" podUID="8034d9e1-e8f8-4dbd-b1cf-111598e79d7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 22:23:54.958551 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:54.958515 2576 generic.go:358] "Generic (PLEG): container finished" podID="bebb89fc-9a91-4f94-bf3f-3891b05410c3" containerID="1d2838cb75dbf849c3134de48372e9c261d4d528f5154d247f87ee57740306fa" exitCode=0 Apr 16 22:23:54.958952 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:54.958584 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" event={"ID":"bebb89fc-9a91-4f94-bf3f-3891b05410c3","Type":"ContainerDied","Data":"1d2838cb75dbf849c3134de48372e9c261d4d528f5154d247f87ee57740306fa"} Apr 16 22:23:54.959100 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:54.959073 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk" podUID="8034d9e1-e8f8-4dbd-b1cf-111598e79d7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 22:23:59.965354 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:59.965320 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk" Apr 16 22:23:59.965922 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:23:59.965896 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk" podUID="8034d9e1-e8f8-4dbd-b1cf-111598e79d7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 22:24:00.982803 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:24:00.982771 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" event={"ID":"bebb89fc-9a91-4f94-bf3f-3891b05410c3","Type":"ContainerStarted","Data":"1566c8bee5d8e9706f0c6b0e7127806a4b1a10f2bcd996a99c03288562e29d7e"} Apr 16 22:24:00.983165 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:24:00.982814 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" event={"ID":"bebb89fc-9a91-4f94-bf3f-3891b05410c3","Type":"ContainerStarted","Data":"8d60a8fe8c76bd10477c6670055ff132f0c50c1e4d952780916711b00aa0ffde"} Apr 16 22:24:00.983165 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:24:00.982999 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" Apr 16 22:24:01.000439 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:24:01.000395 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" podStartSLOduration=0.791518013 podStartE2EDuration="22.000382513s" podCreationTimestamp="2026-04-16 22:23:39 +0000 UTC" firstStartedPulling="2026-04-16 22:23:39.587117529 +0000 UTC m=+615.120458365" lastFinishedPulling="2026-04-16 22:24:00.795982034 +0000 UTC m=+636.329322865" observedRunningTime="2026-04-16 22:24:00.999287276 +0000 UTC m=+636.532628130" watchObservedRunningTime="2026-04-16 22:24:01.000382513 +0000 UTC m=+636.533723364" Apr 16 22:24:01.986525 
ip-10-0-130-227 kubenswrapper[2576]: I0416 22:24:01.986485 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" Apr 16 22:24:01.987757 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:24:01.987731 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" podUID="bebb89fc-9a91-4f94-bf3f-3891b05410c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 22:24:02.989723 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:24:02.989667 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" podUID="bebb89fc-9a91-4f94-bf3f-3891b05410c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 22:24:07.993790 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:24:07.993760 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" Apr 16 22:24:07.994322 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:24:07.994296 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" podUID="bebb89fc-9a91-4f94-bf3f-3891b05410c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 22:24:09.966412 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:24:09.966376 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk" podUID="8034d9e1-e8f8-4dbd-b1cf-111598e79d7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 22:24:17.995177 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:24:17.995135 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" podUID="bebb89fc-9a91-4f94-bf3f-3891b05410c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 22:24:19.966425 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:24:19.966380 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk" podUID="8034d9e1-e8f8-4dbd-b1cf-111598e79d7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 22:24:27.994629 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:24:27.994576 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" podUID="bebb89fc-9a91-4f94-bf3f-3891b05410c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 22:24:29.966016 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:24:29.965980 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk" podUID="8034d9e1-e8f8-4dbd-b1cf-111598e79d7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 22:24:37.994823 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:24:37.994786 2576 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" podUID="bebb89fc-9a91-4f94-bf3f-3891b05410c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 22:24:39.966552 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:24:39.966524 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk" Apr 16 22:24:47.994278 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:24:47.994238 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" podUID="bebb89fc-9a91-4f94-bf3f-3891b05410c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 22:24:57.995005 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:24:57.994962 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" podUID="bebb89fc-9a91-4f94-bf3f-3891b05410c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 22:25:07.994828 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:07.994795 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" Apr 16 22:25:12.314342 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:12.314311 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk"] Apr 16 22:25:12.314870 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:12.314574 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk" podUID="8034d9e1-e8f8-4dbd-b1cf-111598e79d7d" containerName="kserve-container" containerID="cri-o://5c464710ad353e3abcc8ebe298769b3b11341c42b98aae74c694dc28999ed088" gracePeriod=30 Apr 16 22:25:12.314870 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:12.314636 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk" podUID="8034d9e1-e8f8-4dbd-b1cf-111598e79d7d" containerName="kube-rbac-proxy" containerID="cri-o://777089d8a3fb42feb634b219be47bbadd7bf078b63cfc0af511c18710c6ede8b" gracePeriod=30 Apr 16 22:25:12.368622 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:12.368596 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d"] Apr 16 22:25:12.371438 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:12.371421 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" Apr 16 22:25:12.374034 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:12.374011 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-73a06-predictor-serving-cert\"" Apr 16 22:25:12.374121 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:12.374019 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-73a06-kube-rbac-proxy-sar-config\"" Apr 16 22:25:12.381162 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:12.381140 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d"] Apr 16 22:25:12.551715 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:12.551653 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j84gg\" (UniqueName: \"kubernetes.io/projected/df00fd8e-6611-4196-bf94-b543d085d8cf-kube-api-access-j84gg\") pod \"error-404-isvc-73a06-predictor-ccd458b55-5zv2d\" (UID: \"df00fd8e-6611-4196-bf94-b543d085d8cf\") " pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" Apr 16 22:25:12.551868 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:12.551801 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df00fd8e-6611-4196-bf94-b543d085d8cf-proxy-tls\") pod \"error-404-isvc-73a06-predictor-ccd458b55-5zv2d\" (UID: \"df00fd8e-6611-4196-bf94-b543d085d8cf\") " pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" Apr 16 22:25:12.551868 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:12.551838 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-73a06-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df00fd8e-6611-4196-bf94-b543d085d8cf-error-404-isvc-73a06-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-73a06-predictor-ccd458b55-5zv2d\" (UID: \"df00fd8e-6611-4196-bf94-b543d085d8cf\") " pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" Apr 16 22:25:12.652917 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:12.652834 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j84gg\" (UniqueName: \"kubernetes.io/projected/df00fd8e-6611-4196-bf94-b543d085d8cf-kube-api-access-j84gg\") pod \"error-404-isvc-73a06-predictor-ccd458b55-5zv2d\" (UID: \"df00fd8e-6611-4196-bf94-b543d085d8cf\") " pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" Apr 16 22:25:12.652917 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:12.652912 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df00fd8e-6611-4196-bf94-b543d085d8cf-proxy-tls\") pod \"error-404-isvc-73a06-predictor-ccd458b55-5zv2d\" (UID: \"df00fd8e-6611-4196-bf94-b543d085d8cf\") " pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" Apr 16 22:25:12.653114 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:12.652944 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-73a06-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df00fd8e-6611-4196-bf94-b543d085d8cf-error-404-isvc-73a06-kube-rbac-proxy-sar-config\") pod 
\"error-404-isvc-73a06-predictor-ccd458b55-5zv2d\" (UID: \"df00fd8e-6611-4196-bf94-b543d085d8cf\") " pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" Apr 16 22:25:12.653604 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:12.653580 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-73a06-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df00fd8e-6611-4196-bf94-b543d085d8cf-error-404-isvc-73a06-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-73a06-predictor-ccd458b55-5zv2d\" (UID: \"df00fd8e-6611-4196-bf94-b543d085d8cf\") " pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" Apr 16 22:25:12.655325 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:12.655303 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df00fd8e-6611-4196-bf94-b543d085d8cf-proxy-tls\") pod \"error-404-isvc-73a06-predictor-ccd458b55-5zv2d\" (UID: \"df00fd8e-6611-4196-bf94-b543d085d8cf\") " pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" Apr 16 22:25:12.661783 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:12.661764 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j84gg\" (UniqueName: \"kubernetes.io/projected/df00fd8e-6611-4196-bf94-b543d085d8cf-kube-api-access-j84gg\") pod \"error-404-isvc-73a06-predictor-ccd458b55-5zv2d\" (UID: \"df00fd8e-6611-4196-bf94-b543d085d8cf\") " pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" Apr 16 22:25:12.681706 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:12.681667 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" Apr 16 22:25:12.802300 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:12.802279 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d"] Apr 16 22:25:12.804623 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:25:12.804594 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf00fd8e_6611_4196_bf94_b543d085d8cf.slice/crio-1069b6c8458f7f2669506073d16142e782246a6a48e7a5729396f56dcca64782 WatchSource:0}: Error finding container 1069b6c8458f7f2669506073d16142e782246a6a48e7a5729396f56dcca64782: Status 404 returned error can't find the container with id 1069b6c8458f7f2669506073d16142e782246a6a48e7a5729396f56dcca64782 Apr 16 22:25:13.227883 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:13.227793 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" event={"ID":"df00fd8e-6611-4196-bf94-b543d085d8cf","Type":"ContainerStarted","Data":"7a6a6655240b4d9d480d4b55db9f3e6b35cf0335c8936deeb2b5b48086ace803"} Apr 16 22:25:13.227883 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:13.227834 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" event={"ID":"df00fd8e-6611-4196-bf94-b543d085d8cf","Type":"ContainerStarted","Data":"0cb28798f7036fbffa22103c6b402138a8f504fd79650f4ab1bd975df160b9b9"} Apr 16 22:25:13.227883 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:13.227846 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" 
event={"ID":"df00fd8e-6611-4196-bf94-b543d085d8cf","Type":"ContainerStarted","Data":"1069b6c8458f7f2669506073d16142e782246a6a48e7a5729396f56dcca64782"} Apr 16 22:25:13.228186 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:13.227907 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" Apr 16 22:25:13.229310 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:13.229286 2576 generic.go:358] "Generic (PLEG): container finished" podID="8034d9e1-e8f8-4dbd-b1cf-111598e79d7d" containerID="777089d8a3fb42feb634b219be47bbadd7bf078b63cfc0af511c18710c6ede8b" exitCode=2 Apr 16 22:25:13.229397 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:13.229310 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk" event={"ID":"8034d9e1-e8f8-4dbd-b1cf-111598e79d7d","Type":"ContainerDied","Data":"777089d8a3fb42feb634b219be47bbadd7bf078b63cfc0af511c18710c6ede8b"} Apr 16 22:25:13.246101 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:13.246061 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" podStartSLOduration=1.246050091 podStartE2EDuration="1.246050091s" podCreationTimestamp="2026-04-16 22:25:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:25:13.244515419 +0000 UTC m=+708.777856283" watchObservedRunningTime="2026-04-16 22:25:13.246050091 +0000 UTC m=+708.779390944" Apr 16 22:25:14.233525 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:14.233453 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" Apr 16 22:25:14.234667 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:14.234636 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" podUID="df00fd8e-6611-4196-bf94-b543d085d8cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 22:25:14.959628 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:14.959586 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk" podUID="8034d9e1-e8f8-4dbd-b1cf-111598e79d7d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.29:8643/healthz\": dial tcp 10.132.0.29:8643: connect: connection refused" Apr 16 22:25:15.239734 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:15.239636 2576 generic.go:358] "Generic (PLEG): container finished" podID="8034d9e1-e8f8-4dbd-b1cf-111598e79d7d" containerID="5c464710ad353e3abcc8ebe298769b3b11341c42b98aae74c694dc28999ed088" exitCode=0 Apr 16 22:25:15.239734 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:15.239716 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk" event={"ID":"8034d9e1-e8f8-4dbd-b1cf-111598e79d7d","Type":"ContainerDied","Data":"5c464710ad353e3abcc8ebe298769b3b11341c42b98aae74c694dc28999ed088"} Apr 16 22:25:15.240193 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:15.239982 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" podUID="df00fd8e-6611-4196-bf94-b543d085d8cf" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 22:25:15.258325 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:15.258301 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk" Apr 16 22:25:15.379491 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:15.379461 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtkz7\" (UniqueName: \"kubernetes.io/projected/8034d9e1-e8f8-4dbd-b1cf-111598e79d7d-kube-api-access-wtkz7\") pod \"8034d9e1-e8f8-4dbd-b1cf-111598e79d7d\" (UID: \"8034d9e1-e8f8-4dbd-b1cf-111598e79d7d\") " Apr 16 22:25:15.379653 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:15.379536 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-37a6e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8034d9e1-e8f8-4dbd-b1cf-111598e79d7d-error-404-isvc-37a6e-kube-rbac-proxy-sar-config\") pod \"8034d9e1-e8f8-4dbd-b1cf-111598e79d7d\" (UID: \"8034d9e1-e8f8-4dbd-b1cf-111598e79d7d\") " Apr 16 22:25:15.379653 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:15.379604 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8034d9e1-e8f8-4dbd-b1cf-111598e79d7d-proxy-tls\") pod \"8034d9e1-e8f8-4dbd-b1cf-111598e79d7d\" (UID: \"8034d9e1-e8f8-4dbd-b1cf-111598e79d7d\") " Apr 16 22:25:15.380039 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:15.379999 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8034d9e1-e8f8-4dbd-b1cf-111598e79d7d-error-404-isvc-37a6e-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-37a6e-kube-rbac-proxy-sar-config") pod "8034d9e1-e8f8-4dbd-b1cf-111598e79d7d" (UID: "8034d9e1-e8f8-4dbd-b1cf-111598e79d7d"). InnerVolumeSpecName "error-404-isvc-37a6e-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:25:15.381682 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:15.381648 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8034d9e1-e8f8-4dbd-b1cf-111598e79d7d-kube-api-access-wtkz7" (OuterVolumeSpecName: "kube-api-access-wtkz7") pod "8034d9e1-e8f8-4dbd-b1cf-111598e79d7d" (UID: "8034d9e1-e8f8-4dbd-b1cf-111598e79d7d"). InnerVolumeSpecName "kube-api-access-wtkz7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:25:15.381803 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:15.381775 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8034d9e1-e8f8-4dbd-b1cf-111598e79d7d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8034d9e1-e8f8-4dbd-b1cf-111598e79d7d" (UID: "8034d9e1-e8f8-4dbd-b1cf-111598e79d7d"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:25:15.480834 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:15.480793 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8034d9e1-e8f8-4dbd-b1cf-111598e79d7d-proxy-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:25:15.480834 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:15.480831 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wtkz7\" (UniqueName: \"kubernetes.io/projected/8034d9e1-e8f8-4dbd-b1cf-111598e79d7d-kube-api-access-wtkz7\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:25:15.481019 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:15.480852 2576 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-37a6e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8034d9e1-e8f8-4dbd-b1cf-111598e79d7d-error-404-isvc-37a6e-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:25:16.244195 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:16.244169 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk" Apr 16 22:25:16.244554 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:16.244169 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk" event={"ID":"8034d9e1-e8f8-4dbd-b1cf-111598e79d7d","Type":"ContainerDied","Data":"09c952e914d02d4adf4ea68e15d04f5e876d14440089feae183a379360d91987"} Apr 16 22:25:16.244554 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:16.244292 2576 scope.go:117] "RemoveContainer" containerID="777089d8a3fb42feb634b219be47bbadd7bf078b63cfc0af511c18710c6ede8b" Apr 16 22:25:16.253540 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:16.253525 2576 scope.go:117] "RemoveContainer" containerID="5c464710ad353e3abcc8ebe298769b3b11341c42b98aae74c694dc28999ed088" Apr 16 22:25:16.266253 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:16.266229 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk"] Apr 16 22:25:16.267863 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:16.267842 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-37a6e-predictor-7b78d56f57-tn2kk"] Apr 16 22:25:17.020406 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:17.020371 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8034d9e1-e8f8-4dbd-b1cf-111598e79d7d" path="/var/lib/kubelet/pods/8034d9e1-e8f8-4dbd-b1cf-111598e79d7d/volumes" Apr 16 22:25:20.244664 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:20.244633 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" Apr 16 22:25:20.245130 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:20.245077 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" podUID="df00fd8e-6611-4196-bf94-b543d085d8cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 22:25:30.245452 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:30.245411 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" 
podUID="df00fd8e-6611-4196-bf94-b543d085d8cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 22:25:40.245363 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:40.245328 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" podUID="df00fd8e-6611-4196-bf94-b543d085d8cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 22:25:48.264755 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:48.264666 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9"] Apr 16 22:25:48.265337 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:48.265315 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8034d9e1-e8f8-4dbd-b1cf-111598e79d7d" containerName="kube-rbac-proxy" Apr 16 22:25:48.265413 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:48.265340 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8034d9e1-e8f8-4dbd-b1cf-111598e79d7d" containerName="kube-rbac-proxy" Apr 16 22:25:48.265413 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:48.265359 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8034d9e1-e8f8-4dbd-b1cf-111598e79d7d" containerName="kserve-container" Apr 16 22:25:48.265413 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:48.265369 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8034d9e1-e8f8-4dbd-b1cf-111598e79d7d" containerName="kserve-container" Apr 16 22:25:48.265567 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:48.265469 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8034d9e1-e8f8-4dbd-b1cf-111598e79d7d" containerName="kserve-container" Apr 16 22:25:48.265567 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:48.265483 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8034d9e1-e8f8-4dbd-b1cf-111598e79d7d" containerName="kube-rbac-proxy" Apr 16 22:25:48.267511 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:48.267490 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" Apr 16 22:25:48.270015 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:48.269990 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-d7c57-kube-rbac-proxy-sar-config\"" Apr 16 22:25:48.270398 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:48.270377 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-d7c57-predictor-serving-cert\"" Apr 16 22:25:48.271371 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:48.271350 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr"] Apr 16 22:25:48.271638 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:48.271617 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" podUID="bebb89fc-9a91-4f94-bf3f-3891b05410c3" containerName="kserve-container" containerID="cri-o://8d60a8fe8c76bd10477c6670055ff132f0c50c1e4d952780916711b00aa0ffde" gracePeriod=30 Apr 16 22:25:48.271752 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:48.271714 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" podUID="bebb89fc-9a91-4f94-bf3f-3891b05410c3" containerName="kube-rbac-proxy" containerID="cri-o://1566c8bee5d8e9706f0c6b0e7127806a4b1a10f2bcd996a99c03288562e29d7e" gracePeriod=30 Apr 16 22:25:48.277297 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:48.277275 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9"] Apr 16 22:25:48.356365 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:48.356339 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjdst\" (UniqueName: \"kubernetes.io/projected/91926b85-6938-49da-a9ab-eb5c67bad4e9-kube-api-access-hjdst\") pod \"error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9\" (UID: \"91926b85-6938-49da-a9ab-eb5c67bad4e9\") " pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" Apr 16 22:25:48.356480 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:48.356381 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/91926b85-6938-49da-a9ab-eb5c67bad4e9-proxy-tls\") pod \"error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9\" (UID: \"91926b85-6938-49da-a9ab-eb5c67bad4e9\") " pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" Apr 16 22:25:48.356480 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:48.356419 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-d7c57-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/91926b85-6938-49da-a9ab-eb5c67bad4e9-error-404-isvc-d7c57-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9\" (UID: \"91926b85-6938-49da-a9ab-eb5c67bad4e9\") " pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" Apr 16 22:25:48.457007 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:48.456981 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hjdst\" (UniqueName: 
\"kubernetes.io/projected/91926b85-6938-49da-a9ab-eb5c67bad4e9-kube-api-access-hjdst\") pod \"error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9\" (UID: \"91926b85-6938-49da-a9ab-eb5c67bad4e9\") " pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" Apr 16 22:25:48.457137 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:48.457013 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/91926b85-6938-49da-a9ab-eb5c67bad4e9-proxy-tls\") pod \"error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9\" (UID: \"91926b85-6938-49da-a9ab-eb5c67bad4e9\") " pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" Apr 16 22:25:48.457137 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:48.457042 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-d7c57-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/91926b85-6938-49da-a9ab-eb5c67bad4e9-error-404-isvc-d7c57-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9\" (UID: \"91926b85-6938-49da-a9ab-eb5c67bad4e9\") " pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" Apr 16 22:25:48.457137 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:25:48.457123 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-serving-cert: secret "error-404-isvc-d7c57-predictor-serving-cert" not found Apr 16 22:25:48.457264 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:25:48.457196 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91926b85-6938-49da-a9ab-eb5c67bad4e9-proxy-tls podName:91926b85-6938-49da-a9ab-eb5c67bad4e9 nodeName:}" failed. No retries permitted until 2026-04-16 22:25:48.957174511 +0000 UTC m=+744.490515345 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/91926b85-6938-49da-a9ab-eb5c67bad4e9-proxy-tls") pod "error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" (UID: "91926b85-6938-49da-a9ab-eb5c67bad4e9") : secret "error-404-isvc-d7c57-predictor-serving-cert" not found Apr 16 22:25:48.457568 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:48.457551 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-d7c57-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/91926b85-6938-49da-a9ab-eb5c67bad4e9-error-404-isvc-d7c57-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9\" (UID: \"91926b85-6938-49da-a9ab-eb5c67bad4e9\") " pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" Apr 16 22:25:48.465896 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:48.465870 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjdst\" (UniqueName: \"kubernetes.io/projected/91926b85-6938-49da-a9ab-eb5c67bad4e9-kube-api-access-hjdst\") pod \"error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9\" (UID: \"91926b85-6938-49da-a9ab-eb5c67bad4e9\") " pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" Apr 16 22:25:48.961613 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:48.961571 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/91926b85-6938-49da-a9ab-eb5c67bad4e9-proxy-tls\") pod \"error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9\" (UID: \"91926b85-6938-49da-a9ab-eb5c67bad4e9\") " pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" Apr 16 22:25:48.963933 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:48.963902 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/91926b85-6938-49da-a9ab-eb5c67bad4e9-proxy-tls\") pod \"error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9\" (UID: \"91926b85-6938-49da-a9ab-eb5c67bad4e9\") " pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" Apr 16 22:25:49.179751 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:49.179720 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" Apr 16 22:25:49.298687 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:49.298650 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9"] Apr 16 22:25:49.300890 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:25:49.300857 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91926b85_6938_49da_a9ab_eb5c67bad4e9.slice/crio-7945c65b6104df45829ca4e5fac0151565ea74c22df112a98d3fdc20dbdf8fde WatchSource:0}: Error finding container 7945c65b6104df45829ca4e5fac0151565ea74c22df112a98d3fdc20dbdf8fde: Status 404 returned error can't find the container with id 7945c65b6104df45829ca4e5fac0151565ea74c22df112a98d3fdc20dbdf8fde Apr 16 22:25:49.356753 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:49.356727 2576 generic.go:358] "Generic (PLEG): container finished" podID="bebb89fc-9a91-4f94-bf3f-3891b05410c3" containerID="1566c8bee5d8e9706f0c6b0e7127806a4b1a10f2bcd996a99c03288562e29d7e" exitCode=2 Apr 16 22:25:49.356863 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:49.356811 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" event={"ID":"bebb89fc-9a91-4f94-bf3f-3891b05410c3","Type":"ContainerDied","Data":"1566c8bee5d8e9706f0c6b0e7127806a4b1a10f2bcd996a99c03288562e29d7e"} Apr 16 22:25:49.358247 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:49.358223 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" event={"ID":"91926b85-6938-49da-a9ab-eb5c67bad4e9","Type":"ContainerStarted","Data":"7945c65b6104df45829ca4e5fac0151565ea74c22df112a98d3fdc20dbdf8fde"} Apr 16 22:25:50.245238 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:50.245200 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" podUID="df00fd8e-6611-4196-bf94-b543d085d8cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 22:25:50.363198 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:50.363163 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" event={"ID":"91926b85-6938-49da-a9ab-eb5c67bad4e9","Type":"ContainerStarted","Data":"44d346319bd6d43f0d3f273ed94ebeb245ba3a76729fc441899c8c0b9a48c844"} Apr 16 22:25:50.363571 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:50.363203 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" event={"ID":"91926b85-6938-49da-a9ab-eb5c67bad4e9","Type":"ContainerStarted","Data":"11778fde438ca290063e8dd76a89f838e9451c11baadc239defc1fb5afa95594"} Apr 16 22:25:50.363571 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:50.363297 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" Apr 16 22:25:50.382701 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:50.382623 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" podStartSLOduration=2.382608841 podStartE2EDuration="2.382608841s" podCreationTimestamp="2026-04-16 22:25:48 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:25:50.380917924 +0000 UTC m=+745.914258778" watchObservedRunningTime="2026-04-16 22:25:50.382608841 +0000 UTC m=+745.915949695" Apr 16 22:25:51.366423 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:51.366386 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" Apr 16 22:25:51.367645 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:51.367619 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" podUID="91926b85-6938-49da-a9ab-eb5c67bad4e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 22:25:52.369506 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:52.369466 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" podUID="91926b85-6938-49da-a9ab-eb5c67bad4e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 22:25:52.616968 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:52.616934 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" Apr 16 22:25:52.698009 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:52.697938 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bebb89fc-9a91-4f94-bf3f-3891b05410c3-proxy-tls\") pod \"bebb89fc-9a91-4f94-bf3f-3891b05410c3\" (UID: \"bebb89fc-9a91-4f94-bf3f-3891b05410c3\") " Apr 16 22:25:52.698009 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:52.698007 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bebb89fc-9a91-4f94-bf3f-3891b05410c3-kserve-provision-location\") pod \"bebb89fc-9a91-4f94-bf3f-3891b05410c3\" (UID: \"bebb89fc-9a91-4f94-bf3f-3891b05410c3\") " Apr 16 22:25:52.698220 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:52.698052 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw7m4\" (UniqueName: \"kubernetes.io/projected/bebb89fc-9a91-4f94-bf3f-3891b05410c3-kube-api-access-cw7m4\") pod \"bebb89fc-9a91-4f94-bf3f-3891b05410c3\" (UID: \"bebb89fc-9a91-4f94-bf3f-3891b05410c3\") " Apr 16 22:25:52.698220 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:52.698082 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bebb89fc-9a91-4f94-bf3f-3891b05410c3-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"bebb89fc-9a91-4f94-bf3f-3891b05410c3\" (UID: \"bebb89fc-9a91-4f94-bf3f-3891b05410c3\") " Apr 16 22:25:52.698347 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:52.698323 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bebb89fc-9a91-4f94-bf3f-3891b05410c3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bebb89fc-9a91-4f94-bf3f-3891b05410c3" (UID: "bebb89fc-9a91-4f94-bf3f-3891b05410c3"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:25:52.698501 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:52.698477 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bebb89fc-9a91-4f94-bf3f-3891b05410c3-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config") pod "bebb89fc-9a91-4f94-bf3f-3891b05410c3" (UID: "bebb89fc-9a91-4f94-bf3f-3891b05410c3"). InnerVolumeSpecName "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:25:52.700114 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:52.700095 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bebb89fc-9a91-4f94-bf3f-3891b05410c3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bebb89fc-9a91-4f94-bf3f-3891b05410c3" (UID: "bebb89fc-9a91-4f94-bf3f-3891b05410c3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:25:52.700205 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:52.700151 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bebb89fc-9a91-4f94-bf3f-3891b05410c3-kube-api-access-cw7m4" (OuterVolumeSpecName: "kube-api-access-cw7m4") pod "bebb89fc-9a91-4f94-bf3f-3891b05410c3" (UID: "bebb89fc-9a91-4f94-bf3f-3891b05410c3"). InnerVolumeSpecName "kube-api-access-cw7m4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:25:52.799548 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:52.799518 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bebb89fc-9a91-4f94-bf3f-3891b05410c3-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:25:52.799548 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:52.799545 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bebb89fc-9a91-4f94-bf3f-3891b05410c3-proxy-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:25:52.799548 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:52.799556 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bebb89fc-9a91-4f94-bf3f-3891b05410c3-kserve-provision-location\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:25:52.799770 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:52.799565 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cw7m4\" (UniqueName: \"kubernetes.io/projected/bebb89fc-9a91-4f94-bf3f-3891b05410c3-kube-api-access-cw7m4\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:25:53.375233 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:53.375128 2576 generic.go:358] "Generic (PLEG): container finished" podID="bebb89fc-9a91-4f94-bf3f-3891b05410c3" containerID="8d60a8fe8c76bd10477c6670055ff132f0c50c1e4d952780916711b00aa0ffde" exitCode=0 Apr 16 22:25:53.375233 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:53.375216 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" event={"ID":"bebb89fc-9a91-4f94-bf3f-3891b05410c3","Type":"ContainerDied","Data":"8d60a8fe8c76bd10477c6670055ff132f0c50c1e4d952780916711b00aa0ffde"} Apr 16 22:25:53.375728 ip-10-0-130-227 
kubenswrapper[2576]: I0416 22:25:53.375260 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" Apr 16 22:25:53.375728 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:53.375280 2576 scope.go:117] "RemoveContainer" containerID="1566c8bee5d8e9706f0c6b0e7127806a4b1a10f2bcd996a99c03288562e29d7e" Apr 16 22:25:53.375728 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:53.375266 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr" event={"ID":"bebb89fc-9a91-4f94-bf3f-3891b05410c3","Type":"ContainerDied","Data":"379aa82dd4caca805f0079af02899ddafc7d66590c9cd9c579681fdd9cb28db9"} Apr 16 22:25:53.387175 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:53.387158 2576 scope.go:117] "RemoveContainer" containerID="8d60a8fe8c76bd10477c6670055ff132f0c50c1e4d952780916711b00aa0ffde" Apr 16 22:25:53.394527 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:53.394510 2576 scope.go:117] "RemoveContainer" containerID="1d2838cb75dbf849c3134de48372e9c261d4d528f5154d247f87ee57740306fa" Apr 16 22:25:53.396212 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:53.396193 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr"] Apr 16 22:25:53.397805 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:53.397783 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-v56vr"] Apr 16 22:25:53.401870 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:53.401856 2576 scope.go:117] "RemoveContainer" containerID="1566c8bee5d8e9706f0c6b0e7127806a4b1a10f2bcd996a99c03288562e29d7e" Apr 16 22:25:53.402093 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:25:53.402077 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1566c8bee5d8e9706f0c6b0e7127806a4b1a10f2bcd996a99c03288562e29d7e\": container with ID starting with 1566c8bee5d8e9706f0c6b0e7127806a4b1a10f2bcd996a99c03288562e29d7e not found: ID does not exist" containerID="1566c8bee5d8e9706f0c6b0e7127806a4b1a10f2bcd996a99c03288562e29d7e" Apr 16 22:25:53.402141 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:53.402100 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1566c8bee5d8e9706f0c6b0e7127806a4b1a10f2bcd996a99c03288562e29d7e"} err="failed to get container status \"1566c8bee5d8e9706f0c6b0e7127806a4b1a10f2bcd996a99c03288562e29d7e\": rpc error: code = NotFound desc = could not find container \"1566c8bee5d8e9706f0c6b0e7127806a4b1a10f2bcd996a99c03288562e29d7e\": container with ID starting with 1566c8bee5d8e9706f0c6b0e7127806a4b1a10f2bcd996a99c03288562e29d7e not found: ID does not exist" Apr 16 22:25:53.402141 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:53.402121 2576 scope.go:117] "RemoveContainer" containerID="8d60a8fe8c76bd10477c6670055ff132f0c50c1e4d952780916711b00aa0ffde" Apr 16 22:25:53.402354 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:25:53.402335 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d60a8fe8c76bd10477c6670055ff132f0c50c1e4d952780916711b00aa0ffde\": container with ID starting with 8d60a8fe8c76bd10477c6670055ff132f0c50c1e4d952780916711b00aa0ffde not found: ID does not exist" containerID="8d60a8fe8c76bd10477c6670055ff132f0c50c1e4d952780916711b00aa0ffde" Apr 
16 22:25:53.402406 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:53.402359 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d60a8fe8c76bd10477c6670055ff132f0c50c1e4d952780916711b00aa0ffde"} err="failed to get container status \"8d60a8fe8c76bd10477c6670055ff132f0c50c1e4d952780916711b00aa0ffde\": rpc error: code = NotFound desc = could not find container \"8d60a8fe8c76bd10477c6670055ff132f0c50c1e4d952780916711b00aa0ffde\": container with ID starting with 8d60a8fe8c76bd10477c6670055ff132f0c50c1e4d952780916711b00aa0ffde not found: ID does not exist" Apr 16 22:25:53.402406 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:53.402376 2576 scope.go:117] "RemoveContainer" containerID="1d2838cb75dbf849c3134de48372e9c261d4d528f5154d247f87ee57740306fa" Apr 16 22:25:53.402576 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:25:53.402562 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d2838cb75dbf849c3134de48372e9c261d4d528f5154d247f87ee57740306fa\": container with ID starting with 1d2838cb75dbf849c3134de48372e9c261d4d528f5154d247f87ee57740306fa not found: ID does not exist" containerID="1d2838cb75dbf849c3134de48372e9c261d4d528f5154d247f87ee57740306fa" Apr 16 22:25:53.402620 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:53.402580 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d2838cb75dbf849c3134de48372e9c261d4d528f5154d247f87ee57740306fa"} err="failed to get container status \"1d2838cb75dbf849c3134de48372e9c261d4d528f5154d247f87ee57740306fa\": rpc error: code = NotFound desc = could not find container \"1d2838cb75dbf849c3134de48372e9c261d4d528f5154d247f87ee57740306fa\": container with ID starting with 1d2838cb75dbf849c3134de48372e9c261d4d528f5154d247f87ee57740306fa not found: ID does not exist" Apr 16 22:25:55.019718 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:55.019665 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bebb89fc-9a91-4f94-bf3f-3891b05410c3" path="/var/lib/kubelet/pods/bebb89fc-9a91-4f94-bf3f-3891b05410c3/volumes" Apr 16 22:25:57.373772 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:57.373744 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" Apr 16 22:25:57.374358 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:25:57.374331 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" podUID="91926b85-6938-49da-a9ab-eb5c67bad4e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 22:26:00.245858 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:26:00.245822 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" Apr 16 22:26:07.374402 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:26:07.374354 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" podUID="91926b85-6938-49da-a9ab-eb5c67bad4e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 22:26:17.374878 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:26:17.374839 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" podUID="91926b85-6938-49da-a9ab-eb5c67bad4e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 22:26:27.374395 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:26:27.374356 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" podUID="91926b85-6938-49da-a9ab-eb5c67bad4e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 22:26:37.374826 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:26:37.374795 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" Apr 16 22:28:25.010371 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:28:25.010302 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/1.log" Apr 16 22:28:25.012661 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:28:25.012635 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/1.log" Apr 16 22:33:25.033596 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:33:25.033567 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/1.log" Apr 16 22:33:25.037555 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:33:25.037533 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/1.log" Apr 16 22:34:26.946991 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:26.946916 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d"] Apr 16 22:34:26.947474 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:26.947203 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" podUID="df00fd8e-6611-4196-bf94-b543d085d8cf" containerName="kserve-container" containerID="cri-o://0cb28798f7036fbffa22103c6b402138a8f504fd79650f4ab1bd975df160b9b9" gracePeriod=30 Apr 16 22:34:26.947474 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:26.947256 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" podUID="df00fd8e-6611-4196-bf94-b543d085d8cf" containerName="kube-rbac-proxy" containerID="cri-o://7a6a6655240b4d9d480d4b55db9f3e6b35cf0335c8936deeb2b5b48086ace803" gracePeriod=30 Apr 16 22:34:27.033461 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:27.033427 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb"] Apr 16 22:34:27.033816 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:27.033801 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bebb89fc-9a91-4f94-bf3f-3891b05410c3" containerName="storage-initializer" Apr 16 22:34:27.033887 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:27.033818 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bebb89fc-9a91-4f94-bf3f-3891b05410c3" containerName="storage-initializer" Apr 16 22:34:27.033887 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:27.033828 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bebb89fc-9a91-4f94-bf3f-3891b05410c3" containerName="kube-rbac-proxy" Apr 16 22:34:27.033887 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:27.033833 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bebb89fc-9a91-4f94-bf3f-3891b05410c3" containerName="kube-rbac-proxy" Apr 16 22:34:27.033887 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:27.033843 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bebb89fc-9a91-4f94-bf3f-3891b05410c3" containerName="kserve-container" Apr 16 22:34:27.033887 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:27.033849 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bebb89fc-9a91-4f94-bf3f-3891b05410c3" containerName="kserve-container" Apr 16 22:34:27.034032 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:27.033920 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="bebb89fc-9a91-4f94-bf3f-3891b05410c3" containerName="kserve-container" Apr 16 22:34:27.034032 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:27.033930 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="bebb89fc-9a91-4f94-bf3f-3891b05410c3" containerName="kube-rbac-proxy" Apr 16 22:34:27.037015 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:27.037000 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" Apr 16 22:34:27.039161 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:27.039144 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-1148f-predictor-serving-cert\"" Apr 16 22:34:27.039236 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:27.039194 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-1148f-kube-rbac-proxy-sar-config\"" Apr 16 22:34:27.044648 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:27.044622 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb"] Apr 16 22:34:27.145381 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:27.145353 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64br8\" (UniqueName: \"kubernetes.io/projected/ba4f64bc-75af-4bbd-a028-a8cd4c7257ab-kube-api-access-64br8\") pod \"error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb\" (UID: \"ba4f64bc-75af-4bbd-a028-a8cd4c7257ab\") " pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" Apr 16 22:34:27.145554 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:27.145405 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-1148f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ba4f64bc-75af-4bbd-a028-a8cd4c7257ab-error-404-isvc-1148f-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb\" (UID: \"ba4f64bc-75af-4bbd-a028-a8cd4c7257ab\") " pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" Apr 16 22:34:27.145554 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:27.145431 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba4f64bc-75af-4bbd-a028-a8cd4c7257ab-proxy-tls\") pod \"error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb\" (UID: \"ba4f64bc-75af-4bbd-a028-a8cd4c7257ab\") " pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" Apr 16 22:34:27.246681 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:27.246646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba4f64bc-75af-4bbd-a028-a8cd4c7257ab-proxy-tls\") pod \"error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb\" (UID: \"ba4f64bc-75af-4bbd-a028-a8cd4c7257ab\") " pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" Apr 16 22:34:27.246847 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:27.246767 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64br8\" (UniqueName: \"kubernetes.io/projected/ba4f64bc-75af-4bbd-a028-a8cd4c7257ab-kube-api-access-64br8\") pod \"error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb\" (UID: \"ba4f64bc-75af-4bbd-a028-a8cd4c7257ab\") " pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" Apr 16 22:34:27.246847 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:34:27.246791 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-1148f-predictor-serving-cert: secret "error-404-isvc-1148f-predictor-serving-cert" not found Apr 16 22:34:27.246963 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:27.246845 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-1148f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ba4f64bc-75af-4bbd-a028-a8cd4c7257ab-error-404-isvc-1148f-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb\" (UID: \"ba4f64bc-75af-4bbd-a028-a8cd4c7257ab\") " pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" Apr 16 22:34:27.246963 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:34:27.246860 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba4f64bc-75af-4bbd-a028-a8cd4c7257ab-proxy-tls podName:ba4f64bc-75af-4bbd-a028-a8cd4c7257ab nodeName:}" failed. No retries permitted until 2026-04-16 22:34:27.746844165 +0000 UTC m=+1263.280184996 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ba4f64bc-75af-4bbd-a028-a8cd4c7257ab-proxy-tls") pod "error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" (UID: "ba4f64bc-75af-4bbd-a028-a8cd4c7257ab") : secret "error-404-isvc-1148f-predictor-serving-cert" not found Apr 16 22:34:27.247450 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:27.247426 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-1148f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ba4f64bc-75af-4bbd-a028-a8cd4c7257ab-error-404-isvc-1148f-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb\" (UID: \"ba4f64bc-75af-4bbd-a028-a8cd4c7257ab\") " pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" Apr 16 22:34:27.255114 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:27.255092 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64br8\" (UniqueName: \"kubernetes.io/projected/ba4f64bc-75af-4bbd-a028-a8cd4c7257ab-kube-api-access-64br8\") pod \"error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb\" (UID: \"ba4f64bc-75af-4bbd-a028-a8cd4c7257ab\") " pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" Apr 16 22:34:27.750867 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:27.750839 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba4f64bc-75af-4bbd-a028-a8cd4c7257ab-proxy-tls\") pod \"error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb\" (UID: \"ba4f64bc-75af-4bbd-a028-a8cd4c7257ab\") " pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" Apr 16 22:34:27.753137 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:27.753108 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba4f64bc-75af-4bbd-a028-a8cd4c7257ab-proxy-tls\") pod \"error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb\" (UID: \"ba4f64bc-75af-4bbd-a028-a8cd4c7257ab\") " pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" Apr 16 22:34:27.947833 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:27.947791 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" Apr 16 22:34:28.061055 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:28.061022 2576 generic.go:358] "Generic (PLEG): container finished" podID="df00fd8e-6611-4196-bf94-b543d085d8cf" containerID="7a6a6655240b4d9d480d4b55db9f3e6b35cf0335c8936deeb2b5b48086ace803" exitCode=2 Apr 16 22:34:28.061207 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:28.061091 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" event={"ID":"df00fd8e-6611-4196-bf94-b543d085d8cf","Type":"ContainerDied","Data":"7a6a6655240b4d9d480d4b55db9f3e6b35cf0335c8936deeb2b5b48086ace803"} Apr 16 22:34:28.066501 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:28.066477 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb"] Apr 16 22:34:28.070025 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:34:28.069999 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba4f64bc_75af_4bbd_a028_a8cd4c7257ab.slice/crio-0d78eba5cde70728c937dcfced48cf30c8c84bbdb98025a0ebdc11bb6b7fbc17 WatchSource:0}: Error finding container 0d78eba5cde70728c937dcfced48cf30c8c84bbdb98025a0ebdc11bb6b7fbc17: Status 404 returned error can't find the container with id 0d78eba5cde70728c937dcfced48cf30c8c84bbdb98025a0ebdc11bb6b7fbc17 Apr 16 22:34:28.071685 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:28.071658 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:34:29.065865 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:29.065830 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" event={"ID":"ba4f64bc-75af-4bbd-a028-a8cd4c7257ab","Type":"ContainerStarted","Data":"09c9df206841a0e2a5190d1c459e46bf93f34d90dd653c1989fc2d1e05413887"} Apr 16 22:34:29.065865 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:29.065866 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" event={"ID":"ba4f64bc-75af-4bbd-a028-a8cd4c7257ab","Type":"ContainerStarted","Data":"266b1b23077ac326a8fb174615338cefa35a6e1fca2f7f4a3b13f032443bb171"} Apr 16 22:34:29.066376 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:29.065878 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" event={"ID":"ba4f64bc-75af-4bbd-a028-a8cd4c7257ab","Type":"ContainerStarted","Data":"0d78eba5cde70728c937dcfced48cf30c8c84bbdb98025a0ebdc11bb6b7fbc17"} Apr 16 22:34:29.066376 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:29.065964 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" Apr 16 22:34:29.082853 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:29.082808 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" podStartSLOduration=2.082794498 podStartE2EDuration="2.082794498s" podCreationTimestamp="2026-04-16 22:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:34:29.082408959 +0000 UTC m=+1264.615749819" 
watchObservedRunningTime="2026-04-16 22:34:29.082794498 +0000 UTC m=+1264.616135350" Apr 16 22:34:29.790048 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:29.790027 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" Apr 16 22:34:29.868198 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:29.868128 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j84gg\" (UniqueName: \"kubernetes.io/projected/df00fd8e-6611-4196-bf94-b543d085d8cf-kube-api-access-j84gg\") pod \"df00fd8e-6611-4196-bf94-b543d085d8cf\" (UID: \"df00fd8e-6611-4196-bf94-b543d085d8cf\") " Apr 16 22:34:29.868198 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:29.868165 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-73a06-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df00fd8e-6611-4196-bf94-b543d085d8cf-error-404-isvc-73a06-kube-rbac-proxy-sar-config\") pod \"df00fd8e-6611-4196-bf94-b543d085d8cf\" (UID: \"df00fd8e-6611-4196-bf94-b543d085d8cf\") " Apr 16 22:34:29.868399 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:29.868201 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df00fd8e-6611-4196-bf94-b543d085d8cf-proxy-tls\") pod \"df00fd8e-6611-4196-bf94-b543d085d8cf\" (UID: \"df00fd8e-6611-4196-bf94-b543d085d8cf\") " Apr 16 22:34:29.868491 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:29.868467 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df00fd8e-6611-4196-bf94-b543d085d8cf-error-404-isvc-73a06-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-73a06-kube-rbac-proxy-sar-config") pod "df00fd8e-6611-4196-bf94-b543d085d8cf" (UID: "df00fd8e-6611-4196-bf94-b543d085d8cf"). InnerVolumeSpecName "error-404-isvc-73a06-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:34:29.870163 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:29.870136 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df00fd8e-6611-4196-bf94-b543d085d8cf-kube-api-access-j84gg" (OuterVolumeSpecName: "kube-api-access-j84gg") pod "df00fd8e-6611-4196-bf94-b543d085d8cf" (UID: "df00fd8e-6611-4196-bf94-b543d085d8cf"). InnerVolumeSpecName "kube-api-access-j84gg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:34:29.870275 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:29.870214 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df00fd8e-6611-4196-bf94-b543d085d8cf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "df00fd8e-6611-4196-bf94-b543d085d8cf" (UID: "df00fd8e-6611-4196-bf94-b543d085d8cf"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:34:29.969290 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:29.969269 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j84gg\" (UniqueName: \"kubernetes.io/projected/df00fd8e-6611-4196-bf94-b543d085d8cf-kube-api-access-j84gg\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:34:29.969290 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:29.969291 2576 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-73a06-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df00fd8e-6611-4196-bf94-b543d085d8cf-error-404-isvc-73a06-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:34:29.969434 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:29.969302 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df00fd8e-6611-4196-bf94-b543d085d8cf-proxy-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:34:30.070151 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:30.070116 2576 generic.go:358] "Generic (PLEG): container finished" podID="df00fd8e-6611-4196-bf94-b543d085d8cf" containerID="0cb28798f7036fbffa22103c6b402138a8f504fd79650f4ab1bd975df160b9b9" exitCode=0 Apr 16 22:34:30.070572 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:30.070193 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" Apr 16 22:34:30.070572 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:30.070199 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" event={"ID":"df00fd8e-6611-4196-bf94-b543d085d8cf","Type":"ContainerDied","Data":"0cb28798f7036fbffa22103c6b402138a8f504fd79650f4ab1bd975df160b9b9"} Apr 16 22:34:30.070572 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:30.070235 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d" event={"ID":"df00fd8e-6611-4196-bf94-b543d085d8cf","Type":"ContainerDied","Data":"1069b6c8458f7f2669506073d16142e782246a6a48e7a5729396f56dcca64782"} Apr 16 22:34:30.070572 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:30.070251 2576 scope.go:117] "RemoveContainer" containerID="7a6a6655240b4d9d480d4b55db9f3e6b35cf0335c8936deeb2b5b48086ace803" Apr 16 22:34:30.070789 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:30.070619 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" Apr 16 22:34:30.072213 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:30.072181 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" podUID="ba4f64bc-75af-4bbd-a028-a8cd4c7257ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 22:34:30.079230 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:30.079211 2576 scope.go:117] "RemoveContainer" containerID="0cb28798f7036fbffa22103c6b402138a8f504fd79650f4ab1bd975df160b9b9" Apr 16 22:34:30.086298 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:30.086282 2576 scope.go:117] "RemoveContainer" containerID="7a6a6655240b4d9d480d4b55db9f3e6b35cf0335c8936deeb2b5b48086ace803" Apr 16 22:34:30.086537 ip-10-0-130-227 kubenswrapper[2576]: E0416 
22:34:30.086518 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a6a6655240b4d9d480d4b55db9f3e6b35cf0335c8936deeb2b5b48086ace803\": container with ID starting with 7a6a6655240b4d9d480d4b55db9f3e6b35cf0335c8936deeb2b5b48086ace803 not found: ID does not exist" containerID="7a6a6655240b4d9d480d4b55db9f3e6b35cf0335c8936deeb2b5b48086ace803" Apr 16 22:34:30.086601 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:30.086544 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a6a6655240b4d9d480d4b55db9f3e6b35cf0335c8936deeb2b5b48086ace803"} err="failed to get container status \"7a6a6655240b4d9d480d4b55db9f3e6b35cf0335c8936deeb2b5b48086ace803\": rpc error: code = NotFound desc = could not find container \"7a6a6655240b4d9d480d4b55db9f3e6b35cf0335c8936deeb2b5b48086ace803\": container with ID starting with 7a6a6655240b4d9d480d4b55db9f3e6b35cf0335c8936deeb2b5b48086ace803 not found: ID does not exist" Apr 16 22:34:30.086601 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:30.086560 2576 scope.go:117] "RemoveContainer" containerID="0cb28798f7036fbffa22103c6b402138a8f504fd79650f4ab1bd975df160b9b9" Apr 16 22:34:30.086819 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:34:30.086786 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cb28798f7036fbffa22103c6b402138a8f504fd79650f4ab1bd975df160b9b9\": container with ID starting with 0cb28798f7036fbffa22103c6b402138a8f504fd79650f4ab1bd975df160b9b9 not found: ID does not exist" containerID="0cb28798f7036fbffa22103c6b402138a8f504fd79650f4ab1bd975df160b9b9" Apr 16 22:34:30.086864 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:30.086826 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cb28798f7036fbffa22103c6b402138a8f504fd79650f4ab1bd975df160b9b9"} err="failed to get container status \"0cb28798f7036fbffa22103c6b402138a8f504fd79650f4ab1bd975df160b9b9\": rpc error: code = NotFound desc = could not find container \"0cb28798f7036fbffa22103c6b402138a8f504fd79650f4ab1bd975df160b9b9\": container with ID starting with 0cb28798f7036fbffa22103c6b402138a8f504fd79650f4ab1bd975df160b9b9 not found: ID does not exist" Apr 16 22:34:30.092559 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:30.092539 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d"] Apr 16 22:34:30.096504 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:30.096485 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-73a06-predictor-ccd458b55-5zv2d"] Apr 16 22:34:31.019529 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:31.019496 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df00fd8e-6611-4196-bf94-b543d085d8cf" path="/var/lib/kubelet/pods/df00fd8e-6611-4196-bf94-b543d085d8cf/volumes" Apr 16 22:34:31.082811 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:31.082762 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" podUID="ba4f64bc-75af-4bbd-a028-a8cd4c7257ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 22:34:36.080277 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:36.080242 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" Apr 16 22:34:36.080816 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:36.080730 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" podUID="ba4f64bc-75af-4bbd-a028-a8cd4c7257ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 22:34:46.080967 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:46.080923 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" podUID="ba4f64bc-75af-4bbd-a028-a8cd4c7257ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 22:34:56.081013 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:34:56.080975 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" podUID="ba4f64bc-75af-4bbd-a028-a8cd4c7257ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 22:35:02.780282 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:02.780245 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9"] Apr 16 22:35:02.780820 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:02.780553 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" podUID="91926b85-6938-49da-a9ab-eb5c67bad4e9" containerName="kserve-container" containerID="cri-o://44d346319bd6d43f0d3f273ed94ebeb245ba3a76729fc441899c8c0b9a48c844" gracePeriod=30 Apr 16 22:35:02.780820 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:02.780563 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" podUID="91926b85-6938-49da-a9ab-eb5c67bad4e9" containerName="kube-rbac-proxy" containerID="cri-o://11778fde438ca290063e8dd76a89f838e9451c11baadc239defc1fb5afa95594" gracePeriod=30 Apr 16 22:35:02.967746 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:02.967710 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb"] Apr 16 22:35:02.968088 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:02.968077 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df00fd8e-6611-4196-bf94-b543d085d8cf" containerName="kserve-container" Apr 16 22:35:02.968136 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:02.968090 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="df00fd8e-6611-4196-bf94-b543d085d8cf" containerName="kserve-container" Apr 16 22:35:02.968136 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:02.968114 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df00fd8e-6611-4196-bf94-b543d085d8cf" containerName="kube-rbac-proxy" Apr 16 22:35:02.968136 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:02.968119 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="df00fd8e-6611-4196-bf94-b543d085d8cf" containerName="kube-rbac-proxy" Apr 16 22:35:02.968228 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:02.968172 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="df00fd8e-6611-4196-bf94-b543d085d8cf" 
containerName="kube-rbac-proxy" Apr 16 22:35:02.968228 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:02.968181 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="df00fd8e-6611-4196-bf94-b543d085d8cf" containerName="kserve-container" Apr 16 22:35:02.971174 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:02.971158 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" Apr 16 22:35:02.973851 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:02.973830 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-cf933-predictor-serving-cert\"" Apr 16 22:35:02.973851 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:02.973838 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-cf933-kube-rbac-proxy-sar-config\"" Apr 16 22:35:02.980489 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:02.980469 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb"] Apr 16 22:35:03.031964 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:03.031874 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-cf933-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87e88914-62ef-4147-9ba5-3506606068fd-error-404-isvc-cf933-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-cf933-predictor-6c96b8f849-fmlhb\" (UID: \"87e88914-62ef-4147-9ba5-3506606068fd\") " pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" Apr 16 22:35:03.031964 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:03.031922 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th76j\" (UniqueName: \"kubernetes.io/projected/87e88914-62ef-4147-9ba5-3506606068fd-kube-api-access-th76j\") pod \"error-404-isvc-cf933-predictor-6c96b8f849-fmlhb\" (UID: \"87e88914-62ef-4147-9ba5-3506606068fd\") " pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" Apr 16 22:35:03.032146 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:03.031987 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87e88914-62ef-4147-9ba5-3506606068fd-proxy-tls\") pod \"error-404-isvc-cf933-predictor-6c96b8f849-fmlhb\" (UID: \"87e88914-62ef-4147-9ba5-3506606068fd\") " pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" Apr 16 22:35:03.132400 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:03.132368 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87e88914-62ef-4147-9ba5-3506606068fd-proxy-tls\") pod \"error-404-isvc-cf933-predictor-6c96b8f849-fmlhb\" (UID: \"87e88914-62ef-4147-9ba5-3506606068fd\") " pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" Apr 16 22:35:03.132599 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:03.132430 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-cf933-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87e88914-62ef-4147-9ba5-3506606068fd-error-404-isvc-cf933-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-cf933-predictor-6c96b8f849-fmlhb\" (UID: \"87e88914-62ef-4147-9ba5-3506606068fd\") " 
pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" Apr 16 22:35:03.132599 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:35:03.132552 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-cf933-predictor-serving-cert: secret "error-404-isvc-cf933-predictor-serving-cert" not found Apr 16 22:35:03.132756 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:35:03.132623 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87e88914-62ef-4147-9ba5-3506606068fd-proxy-tls podName:87e88914-62ef-4147-9ba5-3506606068fd nodeName:}" failed. No retries permitted until 2026-04-16 22:35:03.632601966 +0000 UTC m=+1299.165942814 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/87e88914-62ef-4147-9ba5-3506606068fd-proxy-tls") pod "error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" (UID: "87e88914-62ef-4147-9ba5-3506606068fd") : secret "error-404-isvc-cf933-predictor-serving-cert" not found Apr 16 22:35:03.132756 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:03.132648 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-th76j\" (UniqueName: \"kubernetes.io/projected/87e88914-62ef-4147-9ba5-3506606068fd-kube-api-access-th76j\") pod \"error-404-isvc-cf933-predictor-6c96b8f849-fmlhb\" (UID: \"87e88914-62ef-4147-9ba5-3506606068fd\") " pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" Apr 16 22:35:03.133107 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:03.133087 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-cf933-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87e88914-62ef-4147-9ba5-3506606068fd-error-404-isvc-cf933-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-cf933-predictor-6c96b8f849-fmlhb\" (UID: \"87e88914-62ef-4147-9ba5-3506606068fd\") " pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" Apr 16 22:35:03.141357 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:03.141334 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-th76j\" (UniqueName: \"kubernetes.io/projected/87e88914-62ef-4147-9ba5-3506606068fd-kube-api-access-th76j\") pod \"error-404-isvc-cf933-predictor-6c96b8f849-fmlhb\" (UID: \"87e88914-62ef-4147-9ba5-3506606068fd\") " pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" Apr 16 22:35:03.190107 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:03.190074 2576 generic.go:358] "Generic (PLEG): container finished" podID="91926b85-6938-49da-a9ab-eb5c67bad4e9" containerID="11778fde438ca290063e8dd76a89f838e9451c11baadc239defc1fb5afa95594" exitCode=2 Apr 16 22:35:03.190274 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:03.190106 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" event={"ID":"91926b85-6938-49da-a9ab-eb5c67bad4e9","Type":"ContainerDied","Data":"11778fde438ca290063e8dd76a89f838e9451c11baadc239defc1fb5afa95594"} Apr 16 22:35:03.636338 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:03.636300 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87e88914-62ef-4147-9ba5-3506606068fd-proxy-tls\") pod \"error-404-isvc-cf933-predictor-6c96b8f849-fmlhb\" (UID: \"87e88914-62ef-4147-9ba5-3506606068fd\") " pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" Apr 16 
22:35:03.638751 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:03.638720 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87e88914-62ef-4147-9ba5-3506606068fd-proxy-tls\") pod \"error-404-isvc-cf933-predictor-6c96b8f849-fmlhb\" (UID: \"87e88914-62ef-4147-9ba5-3506606068fd\") " pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" Apr 16 22:35:03.882305 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:03.882261 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" Apr 16 22:35:04.001058 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:04.001032 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb"] Apr 16 22:35:04.003171 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:35:04.003146 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87e88914_62ef_4147_9ba5_3506606068fd.slice/crio-49d515bf9fb402ae944d73ec8490f0f1f3aed0bfeac8a4bef40b87dfa540274f WatchSource:0}: Error finding container 49d515bf9fb402ae944d73ec8490f0f1f3aed0bfeac8a4bef40b87dfa540274f: Status 404 returned error can't find the container with id 49d515bf9fb402ae944d73ec8490f0f1f3aed0bfeac8a4bef40b87dfa540274f Apr 16 22:35:04.194777 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:04.194667 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" event={"ID":"87e88914-62ef-4147-9ba5-3506606068fd","Type":"ContainerStarted","Data":"9cfd6c99444c4fed052872351692241f04859662e5801d1ab5d7b1c9d7bfc4bf"} Apr 16 22:35:04.194777 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:04.194731 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" event={"ID":"87e88914-62ef-4147-9ba5-3506606068fd","Type":"ContainerStarted","Data":"a649074eb0f2ff256513f7f259dab9b4eb23ae32b77a28714ef439f5a0ac0b20"} Apr 16 22:35:04.194777 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:04.194745 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" event={"ID":"87e88914-62ef-4147-9ba5-3506606068fd","Type":"ContainerStarted","Data":"49d515bf9fb402ae944d73ec8490f0f1f3aed0bfeac8a4bef40b87dfa540274f"} Apr 16 22:35:04.194976 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:04.194824 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" Apr 16 22:35:04.214124 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:04.214081 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" podStartSLOduration=2.214066808 podStartE2EDuration="2.214066808s" podCreationTimestamp="2026-04-16 22:35:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:35:04.2119391 +0000 UTC m=+1299.745279953" watchObservedRunningTime="2026-04-16 22:35:04.214066808 +0000 UTC m=+1299.747407659" Apr 16 22:35:05.198606 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:05.198563 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" Apr 16 22:35:05.199924 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:05.199899 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" podUID="87e88914-62ef-4147-9ba5-3506606068fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 22:35:06.028586 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:06.028563 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" Apr 16 22:35:06.051485 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:06.051454 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-d7c57-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/91926b85-6938-49da-a9ab-eb5c67bad4e9-error-404-isvc-d7c57-kube-rbac-proxy-sar-config\") pod \"91926b85-6938-49da-a9ab-eb5c67bad4e9\" (UID: \"91926b85-6938-49da-a9ab-eb5c67bad4e9\") " Apr 16 22:35:06.051648 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:06.051546 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/91926b85-6938-49da-a9ab-eb5c67bad4e9-proxy-tls\") pod \"91926b85-6938-49da-a9ab-eb5c67bad4e9\" (UID: \"91926b85-6938-49da-a9ab-eb5c67bad4e9\") " Apr 16 22:35:06.051944 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:06.051891 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91926b85-6938-49da-a9ab-eb5c67bad4e9-error-404-isvc-d7c57-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-d7c57-kube-rbac-proxy-sar-config") pod "91926b85-6938-49da-a9ab-eb5c67bad4e9" (UID: "91926b85-6938-49da-a9ab-eb5c67bad4e9"). InnerVolumeSpecName "error-404-isvc-d7c57-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:35:06.054125 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:06.054101 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91926b85-6938-49da-a9ab-eb5c67bad4e9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "91926b85-6938-49da-a9ab-eb5c67bad4e9" (UID: "91926b85-6938-49da-a9ab-eb5c67bad4e9"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:35:06.081615 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:06.081577 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" podUID="ba4f64bc-75af-4bbd-a028-a8cd4c7257ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 22:35:06.152655 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:06.152573 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjdst\" (UniqueName: \"kubernetes.io/projected/91926b85-6938-49da-a9ab-eb5c67bad4e9-kube-api-access-hjdst\") pod \"91926b85-6938-49da-a9ab-eb5c67bad4e9\" (UID: \"91926b85-6938-49da-a9ab-eb5c67bad4e9\") " Apr 16 22:35:06.152834 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:06.152761 2576 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-d7c57-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/91926b85-6938-49da-a9ab-eb5c67bad4e9-error-404-isvc-d7c57-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:35:06.152834 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:06.152774 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/91926b85-6938-49da-a9ab-eb5c67bad4e9-proxy-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:35:06.154586 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:06.154560 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91926b85-6938-49da-a9ab-eb5c67bad4e9-kube-api-access-hjdst" (OuterVolumeSpecName: "kube-api-access-hjdst") pod "91926b85-6938-49da-a9ab-eb5c67bad4e9" (UID: "91926b85-6938-49da-a9ab-eb5c67bad4e9"). InnerVolumeSpecName "kube-api-access-hjdst". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:35:06.203180 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:06.203138 2576 generic.go:358] "Generic (PLEG): container finished" podID="91926b85-6938-49da-a9ab-eb5c67bad4e9" containerID="44d346319bd6d43f0d3f273ed94ebeb245ba3a76729fc441899c8c0b9a48c844" exitCode=0 Apr 16 22:35:06.203180 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:06.203174 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" event={"ID":"91926b85-6938-49da-a9ab-eb5c67bad4e9","Type":"ContainerDied","Data":"44d346319bd6d43f0d3f273ed94ebeb245ba3a76729fc441899c8c0b9a48c844"} Apr 16 22:35:06.203665 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:06.203209 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" event={"ID":"91926b85-6938-49da-a9ab-eb5c67bad4e9","Type":"ContainerDied","Data":"7945c65b6104df45829ca4e5fac0151565ea74c22df112a98d3fdc20dbdf8fde"} Apr 16 22:35:06.203665 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:06.203213 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9" Apr 16 22:35:06.203665 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:06.203224 2576 scope.go:117] "RemoveContainer" containerID="11778fde438ca290063e8dd76a89f838e9451c11baadc239defc1fb5afa95594" Apr 16 22:35:06.203861 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:06.203795 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" podUID="87e88914-62ef-4147-9ba5-3506606068fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 22:35:06.211342 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:06.211319 2576 scope.go:117] "RemoveContainer" containerID="44d346319bd6d43f0d3f273ed94ebeb245ba3a76729fc441899c8c0b9a48c844" Apr 16 22:35:06.220847 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:06.220830 2576 scope.go:117] "RemoveContainer" containerID="11778fde438ca290063e8dd76a89f838e9451c11baadc239defc1fb5afa95594" Apr 16 22:35:06.221079 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:35:06.221062 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11778fde438ca290063e8dd76a89f838e9451c11baadc239defc1fb5afa95594\": container with ID starting with 11778fde438ca290063e8dd76a89f838e9451c11baadc239defc1fb5afa95594 not found: ID does not exist" containerID="11778fde438ca290063e8dd76a89f838e9451c11baadc239defc1fb5afa95594" Apr 16 22:35:06.221126 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:06.221088 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11778fde438ca290063e8dd76a89f838e9451c11baadc239defc1fb5afa95594"} err="failed to get container status \"11778fde438ca290063e8dd76a89f838e9451c11baadc239defc1fb5afa95594\": rpc error: code = NotFound desc = could not find container \"11778fde438ca290063e8dd76a89f838e9451c11baadc239defc1fb5afa95594\": container with ID starting with 11778fde438ca290063e8dd76a89f838e9451c11baadc239defc1fb5afa95594 not found: ID does not exist" Apr 16 22:35:06.221126 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:06.221103 2576 scope.go:117] "RemoveContainer" containerID="44d346319bd6d43f0d3f273ed94ebeb245ba3a76729fc441899c8c0b9a48c844" Apr 16 22:35:06.221365 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:35:06.221318 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44d346319bd6d43f0d3f273ed94ebeb245ba3a76729fc441899c8c0b9a48c844\": container with ID starting with 44d346319bd6d43f0d3f273ed94ebeb245ba3a76729fc441899c8c0b9a48c844 not found: ID does not exist" containerID="44d346319bd6d43f0d3f273ed94ebeb245ba3a76729fc441899c8c0b9a48c844" Apr 16 22:35:06.221365 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:06.221342 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44d346319bd6d43f0d3f273ed94ebeb245ba3a76729fc441899c8c0b9a48c844"} err="failed to get container status \"44d346319bd6d43f0d3f273ed94ebeb245ba3a76729fc441899c8c0b9a48c844\": rpc error: code = NotFound desc = could not find container \"44d346319bd6d43f0d3f273ed94ebeb245ba3a76729fc441899c8c0b9a48c844\": container with ID starting with 44d346319bd6d43f0d3f273ed94ebeb245ba3a76729fc441899c8c0b9a48c844 not found: ID does not exist" Apr 16 22:35:06.224062 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:06.224041 2576 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9"] Apr 16 22:35:06.227375 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:06.227352 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d7c57-predictor-6f7f8c65bf-d66j9"] Apr 16 22:35:06.253403 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:06.253381 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hjdst\" (UniqueName: \"kubernetes.io/projected/91926b85-6938-49da-a9ab-eb5c67bad4e9-kube-api-access-hjdst\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:35:07.020157 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:07.020125 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91926b85-6938-49da-a9ab-eb5c67bad4e9" path="/var/lib/kubelet/pods/91926b85-6938-49da-a9ab-eb5c67bad4e9/volumes" Apr 16 22:35:11.208199 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:11.208173 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" Apr 16 22:35:11.208654 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:11.208619 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" podUID="87e88914-62ef-4147-9ba5-3506606068fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 22:35:16.081856 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:16.081829 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" Apr 16 22:35:21.209606 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:21.209566 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" podUID="87e88914-62ef-4147-9ba5-3506606068fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 22:35:31.208718 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:31.208658 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" podUID="87e88914-62ef-4147-9ba5-3506606068fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 22:35:41.209098 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:41.209059 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" podUID="87e88914-62ef-4147-9ba5-3506606068fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 22:35:47.274009 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:47.273927 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb"] Apr 16 22:35:47.274372 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:47.274295 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" podUID="ba4f64bc-75af-4bbd-a028-a8cd4c7257ab" containerName="kserve-container" containerID="cri-o://266b1b23077ac326a8fb174615338cefa35a6e1fca2f7f4a3b13f032443bb171" gracePeriod=30 Apr 16 22:35:47.274444 
ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:47.274338 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" podUID="ba4f64bc-75af-4bbd-a028-a8cd4c7257ab" containerName="kube-rbac-proxy" containerID="cri-o://09c9df206841a0e2a5190d1c459e46bf93f34d90dd653c1989fc2d1e05413887" gracePeriod=30 Apr 16 22:35:47.327396 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:47.327362 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n"] Apr 16 22:35:47.327759 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:47.327746 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91926b85-6938-49da-a9ab-eb5c67bad4e9" containerName="kube-rbac-proxy" Apr 16 22:35:47.327826 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:47.327761 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="91926b85-6938-49da-a9ab-eb5c67bad4e9" containerName="kube-rbac-proxy" Apr 16 22:35:47.327826 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:47.327774 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91926b85-6938-49da-a9ab-eb5c67bad4e9" containerName="kserve-container" Apr 16 22:35:47.327826 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:47.327779 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="91926b85-6938-49da-a9ab-eb5c67bad4e9" containerName="kserve-container" Apr 16 22:35:47.327935 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:47.327831 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="91926b85-6938-49da-a9ab-eb5c67bad4e9" containerName="kube-rbac-proxy" Apr 16 22:35:47.327935 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:47.327845 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="91926b85-6938-49da-a9ab-eb5c67bad4e9" containerName="kserve-container" Apr 16 22:35:47.332060 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:47.332042 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" Apr 16 22:35:47.334403 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:47.334381 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-ea3bc-kube-rbac-proxy-sar-config\"" Apr 16 22:35:47.334503 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:47.334399 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-ea3bc-predictor-serving-cert\"" Apr 16 22:35:47.340571 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:47.340548 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n"] Apr 16 22:35:47.497544 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:47.497505 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kht4\" (UniqueName: \"kubernetes.io/projected/9a97e08c-acc0-4907-810b-a637f32ea6c6-kube-api-access-6kht4\") pod \"error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n\" (UID: \"9a97e08c-acc0-4907-810b-a637f32ea6c6\") " pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" Apr 16 22:35:47.497732 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:47.497592 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-ea3bc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a97e08c-acc0-4907-810b-a637f32ea6c6-error-404-isvc-ea3bc-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n\" (UID: \"9a97e08c-acc0-4907-810b-a637f32ea6c6\") " pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" Apr 16 22:35:47.497732 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:47.497644 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a97e08c-acc0-4907-810b-a637f32ea6c6-proxy-tls\") pod \"error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n\" (UID: \"9a97e08c-acc0-4907-810b-a637f32ea6c6\") " pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" Apr 16 22:35:47.598214 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:47.598122 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a97e08c-acc0-4907-810b-a637f32ea6c6-proxy-tls\") pod \"error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n\" (UID: \"9a97e08c-acc0-4907-810b-a637f32ea6c6\") " pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" Apr 16 22:35:47.598214 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:47.598212 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kht4\" (UniqueName: \"kubernetes.io/projected/9a97e08c-acc0-4907-810b-a637f32ea6c6-kube-api-access-6kht4\") pod \"error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n\" (UID: \"9a97e08c-acc0-4907-810b-a637f32ea6c6\") " pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" Apr 16 22:35:47.598446 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:47.598242 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-ea3bc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a97e08c-acc0-4907-810b-a637f32ea6c6-error-404-isvc-ea3bc-kube-rbac-proxy-sar-config\") pod 
\"error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n\" (UID: \"9a97e08c-acc0-4907-810b-a637f32ea6c6\") " pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" Apr 16 22:35:47.598446 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:35:47.598260 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-serving-cert: secret "error-404-isvc-ea3bc-predictor-serving-cert" not found Apr 16 22:35:47.598446 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:35:47.598326 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a97e08c-acc0-4907-810b-a637f32ea6c6-proxy-tls podName:9a97e08c-acc0-4907-810b-a637f32ea6c6 nodeName:}" failed. No retries permitted until 2026-04-16 22:35:48.098308941 +0000 UTC m=+1343.631649772 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9a97e08c-acc0-4907-810b-a637f32ea6c6-proxy-tls") pod "error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" (UID: "9a97e08c-acc0-4907-810b-a637f32ea6c6") : secret "error-404-isvc-ea3bc-predictor-serving-cert" not found Apr 16 22:35:47.598878 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:47.598860 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-ea3bc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a97e08c-acc0-4907-810b-a637f32ea6c6-error-404-isvc-ea3bc-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n\" (UID: \"9a97e08c-acc0-4907-810b-a637f32ea6c6\") " pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" Apr 16 22:35:47.609194 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:47.609165 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kht4\" (UniqueName: \"kubernetes.io/projected/9a97e08c-acc0-4907-810b-a637f32ea6c6-kube-api-access-6kht4\") pod \"error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n\" (UID: \"9a97e08c-acc0-4907-810b-a637f32ea6c6\") " pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" Apr 16 22:35:48.102781 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:48.102728 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a97e08c-acc0-4907-810b-a637f32ea6c6-proxy-tls\") pod \"error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n\" (UID: \"9a97e08c-acc0-4907-810b-a637f32ea6c6\") " pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" Apr 16 22:35:48.105105 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:48.105073 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a97e08c-acc0-4907-810b-a637f32ea6c6-proxy-tls\") pod \"error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n\" (UID: \"9a97e08c-acc0-4907-810b-a637f32ea6c6\") " pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" Apr 16 22:35:48.243097 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:48.243059 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" Apr 16 22:35:48.348488 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:48.348455 2576 generic.go:358] "Generic (PLEG): container finished" podID="ba4f64bc-75af-4bbd-a028-a8cd4c7257ab" containerID="09c9df206841a0e2a5190d1c459e46bf93f34d90dd653c1989fc2d1e05413887" exitCode=2 Apr 16 22:35:48.348848 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:48.348527 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" event={"ID":"ba4f64bc-75af-4bbd-a028-a8cd4c7257ab","Type":"ContainerDied","Data":"09c9df206841a0e2a5190d1c459e46bf93f34d90dd653c1989fc2d1e05413887"} Apr 16 22:35:48.361466 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:48.361445 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n"] Apr 16 22:35:48.363767 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:35:48.363735 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a97e08c_acc0_4907_810b_a637f32ea6c6.slice/crio-4578a6f99e2289e6785758f89ce4778a50ffb2603551fa8542bb220a95385253 WatchSource:0}: Error finding container 4578a6f99e2289e6785758f89ce4778a50ffb2603551fa8542bb220a95385253: Status 404 returned error can't find the container with id 4578a6f99e2289e6785758f89ce4778a50ffb2603551fa8542bb220a95385253 Apr 16 22:35:49.354149 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:49.354106 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" event={"ID":"9a97e08c-acc0-4907-810b-a637f32ea6c6","Type":"ContainerStarted","Data":"609b6acaa0dfb4af79d2111313738aa00586e04ac80749f0f2f1b47c1cdd414d"} Apr 16 22:35:49.354509 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:49.354154 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" event={"ID":"9a97e08c-acc0-4907-810b-a637f32ea6c6","Type":"ContainerStarted","Data":"5127134cecd3b37daab93b53b65d3f10c939e80569a51666d277a5c0d462dae5"} Apr 16 22:35:49.354509 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:49.354169 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" event={"ID":"9a97e08c-acc0-4907-810b-a637f32ea6c6","Type":"ContainerStarted","Data":"4578a6f99e2289e6785758f89ce4778a50ffb2603551fa8542bb220a95385253"} Apr 16 22:35:49.354509 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:49.354269 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" Apr 16 22:35:49.371956 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:49.371917 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" podStartSLOduration=2.371902976 podStartE2EDuration="2.371902976s" podCreationTimestamp="2026-04-16 22:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:35:49.370431049 +0000 UTC m=+1344.903771898" watchObservedRunningTime="2026-04-16 22:35:49.371902976 +0000 UTC m=+1344.905243829" Apr 16 22:35:50.385785 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:50.385750 2576 generic.go:358] "Generic 
(PLEG): container finished" podID="ba4f64bc-75af-4bbd-a028-a8cd4c7257ab" containerID="266b1b23077ac326a8fb174615338cefa35a6e1fca2f7f4a3b13f032443bb171" exitCode=0 Apr 16 22:35:50.386236 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:50.385788 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" event={"ID":"ba4f64bc-75af-4bbd-a028-a8cd4c7257ab","Type":"ContainerDied","Data":"266b1b23077ac326a8fb174615338cefa35a6e1fca2f7f4a3b13f032443bb171"} Apr 16 22:35:50.386236 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:50.386105 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" Apr 16 22:35:50.387074 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:50.387046 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" podUID="9a97e08c-acc0-4907-810b-a637f32ea6c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 22:35:50.427135 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:50.427117 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" Apr 16 22:35:50.523435 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:50.523413 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba4f64bc-75af-4bbd-a028-a8cd4c7257ab-proxy-tls\") pod \"ba4f64bc-75af-4bbd-a028-a8cd4c7257ab\" (UID: \"ba4f64bc-75af-4bbd-a028-a8cd4c7257ab\") " Apr 16 22:35:50.523594 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:50.523522 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64br8\" (UniqueName: \"kubernetes.io/projected/ba4f64bc-75af-4bbd-a028-a8cd4c7257ab-kube-api-access-64br8\") pod \"ba4f64bc-75af-4bbd-a028-a8cd4c7257ab\" (UID: \"ba4f64bc-75af-4bbd-a028-a8cd4c7257ab\") " Apr 16 22:35:50.523594 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:50.523552 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-1148f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ba4f64bc-75af-4bbd-a028-a8cd4c7257ab-error-404-isvc-1148f-kube-rbac-proxy-sar-config\") pod \"ba4f64bc-75af-4bbd-a028-a8cd4c7257ab\" (UID: \"ba4f64bc-75af-4bbd-a028-a8cd4c7257ab\") " Apr 16 22:35:50.523979 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:50.523953 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba4f64bc-75af-4bbd-a028-a8cd4c7257ab-error-404-isvc-1148f-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-1148f-kube-rbac-proxy-sar-config") pod "ba4f64bc-75af-4bbd-a028-a8cd4c7257ab" (UID: "ba4f64bc-75af-4bbd-a028-a8cd4c7257ab"). InnerVolumeSpecName "error-404-isvc-1148f-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:35:50.525589 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:50.525566 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba4f64bc-75af-4bbd-a028-a8cd4c7257ab-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ba4f64bc-75af-4bbd-a028-a8cd4c7257ab" (UID: "ba4f64bc-75af-4bbd-a028-a8cd4c7257ab"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:35:50.525749 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:50.525729 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba4f64bc-75af-4bbd-a028-a8cd4c7257ab-kube-api-access-64br8" (OuterVolumeSpecName: "kube-api-access-64br8") pod "ba4f64bc-75af-4bbd-a028-a8cd4c7257ab" (UID: "ba4f64bc-75af-4bbd-a028-a8cd4c7257ab"). InnerVolumeSpecName "kube-api-access-64br8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:35:50.624621 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:50.624586 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-64br8\" (UniqueName: \"kubernetes.io/projected/ba4f64bc-75af-4bbd-a028-a8cd4c7257ab-kube-api-access-64br8\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:35:50.624621 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:50.624618 2576 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-1148f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ba4f64bc-75af-4bbd-a028-a8cd4c7257ab-error-404-isvc-1148f-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:35:50.624820 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:50.624633 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba4f64bc-75af-4bbd-a028-a8cd4c7257ab-proxy-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:35:51.209809 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:51.209782 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" Apr 16 22:35:51.391573 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:51.391492 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" event={"ID":"ba4f64bc-75af-4bbd-a028-a8cd4c7257ab","Type":"ContainerDied","Data":"0d78eba5cde70728c937dcfced48cf30c8c84bbdb98025a0ebdc11bb6b7fbc17"} Apr 16 22:35:51.391573 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:51.391538 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb" Apr 16 22:35:51.391573 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:51.391547 2576 scope.go:117] "RemoveContainer" containerID="09c9df206841a0e2a5190d1c459e46bf93f34d90dd653c1989fc2d1e05413887" Apr 16 22:35:51.392063 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:51.391817 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" podUID="9a97e08c-acc0-4907-810b-a637f32ea6c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 22:35:51.399648 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:51.399632 2576 scope.go:117] "RemoveContainer" containerID="266b1b23077ac326a8fb174615338cefa35a6e1fca2f7f4a3b13f032443bb171" Apr 16 22:35:51.407457 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:51.407439 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb"] Apr 16 22:35:51.411180 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:51.411160 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1148f-predictor-85f8bf7f7c-cxfhb"] Apr 16 22:35:53.020294 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:53.020258 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba4f64bc-75af-4bbd-a028-a8cd4c7257ab" path="/var/lib/kubelet/pods/ba4f64bc-75af-4bbd-a028-a8cd4c7257ab/volumes" Apr 16 22:35:56.398088 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:56.398055 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" Apr 16 22:35:56.398617 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:35:56.398588 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" podUID="9a97e08c-acc0-4907-810b-a637f32ea6c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 22:36:06.398757 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:06.398717 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" podUID="9a97e08c-acc0-4907-810b-a637f32ea6c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 22:36:13.115051 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.115014 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb"] Apr 16 22:36:13.115545 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.115365 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" podUID="87e88914-62ef-4147-9ba5-3506606068fd" containerName="kserve-container" containerID="cri-o://a649074eb0f2ff256513f7f259dab9b4eb23ae32b77a28714ef439f5a0ac0b20" gracePeriod=30 Apr 16 22:36:13.115545 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.115419 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" podUID="87e88914-62ef-4147-9ba5-3506606068fd" containerName="kube-rbac-proxy" 
containerID="cri-o://9cfd6c99444c4fed052872351692241f04859662e5801d1ab5d7b1c9d7bfc4bf" gracePeriod=30 Apr 16 22:36:13.190408 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.190371 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf"] Apr 16 22:36:13.190764 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.190750 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba4f64bc-75af-4bbd-a028-a8cd4c7257ab" containerName="kserve-container" Apr 16 22:36:13.190815 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.190766 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba4f64bc-75af-4bbd-a028-a8cd4c7257ab" containerName="kserve-container" Apr 16 22:36:13.190815 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.190783 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba4f64bc-75af-4bbd-a028-a8cd4c7257ab" containerName="kube-rbac-proxy" Apr 16 22:36:13.190815 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.190789 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba4f64bc-75af-4bbd-a028-a8cd4c7257ab" containerName="kube-rbac-proxy" Apr 16 22:36:13.190908 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.190841 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba4f64bc-75af-4bbd-a028-a8cd4c7257ab" containerName="kserve-container" Apr 16 22:36:13.190908 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.190850 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba4f64bc-75af-4bbd-a028-a8cd4c7257ab" containerName="kube-rbac-proxy" Apr 16 22:36:13.195090 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.195071 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" Apr 16 22:36:13.197628 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.197606 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-c905c-kube-rbac-proxy-sar-config\"" Apr 16 22:36:13.197750 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.197688 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-c905c-predictor-serving-cert\"" Apr 16 22:36:13.199730 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.199709 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-c905c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a61e12f9-eb2b-4e19-91f5-31e99b6ad914-error-404-isvc-c905c-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-c905c-predictor-545b784cf7-84hwf\" (UID: \"a61e12f9-eb2b-4e19-91f5-31e99b6ad914\") " pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" Apr 16 22:36:13.199846 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.199774 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddp7r\" (UniqueName: \"kubernetes.io/projected/a61e12f9-eb2b-4e19-91f5-31e99b6ad914-kube-api-access-ddp7r\") pod \"error-404-isvc-c905c-predictor-545b784cf7-84hwf\" (UID: \"a61e12f9-eb2b-4e19-91f5-31e99b6ad914\") " pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" Apr 16 22:36:13.199846 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.199810 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a61e12f9-eb2b-4e19-91f5-31e99b6ad914-proxy-tls\") pod \"error-404-isvc-c905c-predictor-545b784cf7-84hwf\" (UID: \"a61e12f9-eb2b-4e19-91f5-31e99b6ad914\") " pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" Apr 16 22:36:13.203983 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.203963 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf"] Apr 16 22:36:13.301078 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.301039 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-c905c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a61e12f9-eb2b-4e19-91f5-31e99b6ad914-error-404-isvc-c905c-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-c905c-predictor-545b784cf7-84hwf\" (UID: \"a61e12f9-eb2b-4e19-91f5-31e99b6ad914\") " pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" Apr 16 22:36:13.301253 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.301100 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddp7r\" (UniqueName: \"kubernetes.io/projected/a61e12f9-eb2b-4e19-91f5-31e99b6ad914-kube-api-access-ddp7r\") pod \"error-404-isvc-c905c-predictor-545b784cf7-84hwf\" (UID: \"a61e12f9-eb2b-4e19-91f5-31e99b6ad914\") " pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" Apr 16 22:36:13.301253 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.301133 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a61e12f9-eb2b-4e19-91f5-31e99b6ad914-proxy-tls\") pod 
\"error-404-isvc-c905c-predictor-545b784cf7-84hwf\" (UID: \"a61e12f9-eb2b-4e19-91f5-31e99b6ad914\") " pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" Apr 16 22:36:13.301861 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.301838 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-c905c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a61e12f9-eb2b-4e19-91f5-31e99b6ad914-error-404-isvc-c905c-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-c905c-predictor-545b784cf7-84hwf\" (UID: \"a61e12f9-eb2b-4e19-91f5-31e99b6ad914\") " pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" Apr 16 22:36:13.303529 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.303507 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a61e12f9-eb2b-4e19-91f5-31e99b6ad914-proxy-tls\") pod \"error-404-isvc-c905c-predictor-545b784cf7-84hwf\" (UID: \"a61e12f9-eb2b-4e19-91f5-31e99b6ad914\") " pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" Apr 16 22:36:13.309183 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.309164 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddp7r\" (UniqueName: \"kubernetes.io/projected/a61e12f9-eb2b-4e19-91f5-31e99b6ad914-kube-api-access-ddp7r\") pod \"error-404-isvc-c905c-predictor-545b784cf7-84hwf\" (UID: \"a61e12f9-eb2b-4e19-91f5-31e99b6ad914\") " pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" Apr 16 22:36:13.466407 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.466324 2576 generic.go:358] "Generic (PLEG): container finished" podID="87e88914-62ef-4147-9ba5-3506606068fd" containerID="9cfd6c99444c4fed052872351692241f04859662e5801d1ab5d7b1c9d7bfc4bf" exitCode=2 Apr 16 22:36:13.466407 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.466388 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" event={"ID":"87e88914-62ef-4147-9ba5-3506606068fd","Type":"ContainerDied","Data":"9cfd6c99444c4fed052872351692241f04859662e5801d1ab5d7b1c9d7bfc4bf"} Apr 16 22:36:13.506773 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.506740 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" Apr 16 22:36:13.628043 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:13.628014 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf"] Apr 16 22:36:13.631249 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:36:13.631224 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda61e12f9_eb2b_4e19_91f5_31e99b6ad914.slice/crio-42e5d3a71099009c91d4d86a12fec373d95f2a3c686f7412a4170f1337134e93 WatchSource:0}: Error finding container 42e5d3a71099009c91d4d86a12fec373d95f2a3c686f7412a4170f1337134e93: Status 404 returned error can't find the container with id 42e5d3a71099009c91d4d86a12fec373d95f2a3c686f7412a4170f1337134e93 Apr 16 22:36:14.471336 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:14.471299 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" event={"ID":"a61e12f9-eb2b-4e19-91f5-31e99b6ad914","Type":"ContainerStarted","Data":"a557945d0cde01545c679188ce8179113167c08c6941806a93097856ce170806"} Apr 16 22:36:14.471729 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:14.471342 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" event={"ID":"a61e12f9-eb2b-4e19-91f5-31e99b6ad914","Type":"ContainerStarted","Data":"3a569479eec2a8d9e5512fdc993ae1cc3c1fd809a7752595a4cde806de5c0c12"} Apr 16 22:36:14.471729 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:14.471356 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" event={"ID":"a61e12f9-eb2b-4e19-91f5-31e99b6ad914","Type":"ContainerStarted","Data":"42e5d3a71099009c91d4d86a12fec373d95f2a3c686f7412a4170f1337134e93"} Apr 16 22:36:14.471729 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:14.471450 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" Apr 16 22:36:14.487979 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:14.487933 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" podStartSLOduration=1.487916973 podStartE2EDuration="1.487916973s" podCreationTimestamp="2026-04-16 22:36:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:36:14.48681785 +0000 UTC m=+1370.020158714" watchObservedRunningTime="2026-04-16 22:36:14.487916973 +0000 UTC m=+1370.021257828" Apr 16 22:36:15.475855 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:15.475824 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" Apr 16 22:36:15.477039 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:15.477001 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" podUID="a61e12f9-eb2b-4e19-91f5-31e99b6ad914" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 22:36:16.168309 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:16.168288 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" Apr 16 22:36:16.223717 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:16.223692 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87e88914-62ef-4147-9ba5-3506606068fd-proxy-tls\") pod \"87e88914-62ef-4147-9ba5-3506606068fd\" (UID: \"87e88914-62ef-4147-9ba5-3506606068fd\") " Apr 16 22:36:16.223861 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:16.223815 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th76j\" (UniqueName: \"kubernetes.io/projected/87e88914-62ef-4147-9ba5-3506606068fd-kube-api-access-th76j\") pod \"87e88914-62ef-4147-9ba5-3506606068fd\" (UID: \"87e88914-62ef-4147-9ba5-3506606068fd\") " Apr 16 22:36:16.223861 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:16.223838 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-cf933-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87e88914-62ef-4147-9ba5-3506606068fd-error-404-isvc-cf933-kube-rbac-proxy-sar-config\") pod \"87e88914-62ef-4147-9ba5-3506606068fd\" (UID: \"87e88914-62ef-4147-9ba5-3506606068fd\") " Apr 16 22:36:16.224210 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:16.224185 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87e88914-62ef-4147-9ba5-3506606068fd-error-404-isvc-cf933-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-cf933-kube-rbac-proxy-sar-config") pod "87e88914-62ef-4147-9ba5-3506606068fd" (UID: "87e88914-62ef-4147-9ba5-3506606068fd"). InnerVolumeSpecName "error-404-isvc-cf933-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:36:16.225633 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:16.225610 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87e88914-62ef-4147-9ba5-3506606068fd-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "87e88914-62ef-4147-9ba5-3506606068fd" (UID: "87e88914-62ef-4147-9ba5-3506606068fd"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:36:16.225780 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:16.225763 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87e88914-62ef-4147-9ba5-3506606068fd-kube-api-access-th76j" (OuterVolumeSpecName: "kube-api-access-th76j") pod "87e88914-62ef-4147-9ba5-3506606068fd" (UID: "87e88914-62ef-4147-9ba5-3506606068fd"). InnerVolumeSpecName "kube-api-access-th76j". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:36:16.325321 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:16.325294 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/87e88914-62ef-4147-9ba5-3506606068fd-proxy-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:36:16.325321 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:16.325318 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-th76j\" (UniqueName: \"kubernetes.io/projected/87e88914-62ef-4147-9ba5-3506606068fd-kube-api-access-th76j\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:36:16.325477 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:16.325330 2576 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-cf933-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/87e88914-62ef-4147-9ba5-3506606068fd-error-404-isvc-cf933-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:36:16.398496 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:16.398463 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" podUID="9a97e08c-acc0-4907-810b-a637f32ea6c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 22:36:16.481415 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:16.481329 2576 generic.go:358] "Generic (PLEG): container finished" podID="87e88914-62ef-4147-9ba5-3506606068fd" containerID="a649074eb0f2ff256513f7f259dab9b4eb23ae32b77a28714ef439f5a0ac0b20" exitCode=0 Apr 16 22:36:16.481804 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:16.481422 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" event={"ID":"87e88914-62ef-4147-9ba5-3506606068fd","Type":"ContainerDied","Data":"a649074eb0f2ff256513f7f259dab9b4eb23ae32b77a28714ef439f5a0ac0b20"} Apr 16 22:36:16.481804 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:16.481429 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" Apr 16 22:36:16.481804 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:16.481463 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb" event={"ID":"87e88914-62ef-4147-9ba5-3506606068fd","Type":"ContainerDied","Data":"49d515bf9fb402ae944d73ec8490f0f1f3aed0bfeac8a4bef40b87dfa540274f"} Apr 16 22:36:16.481804 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:16.481483 2576 scope.go:117] "RemoveContainer" containerID="9cfd6c99444c4fed052872351692241f04859662e5801d1ab5d7b1c9d7bfc4bf" Apr 16 22:36:16.482144 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:16.482120 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" podUID="a61e12f9-eb2b-4e19-91f5-31e99b6ad914" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 22:36:16.490949 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:16.490935 2576 scope.go:117] "RemoveContainer" containerID="a649074eb0f2ff256513f7f259dab9b4eb23ae32b77a28714ef439f5a0ac0b20" Apr 16 22:36:16.497819 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:16.497786 2576 scope.go:117] "RemoveContainer" containerID="9cfd6c99444c4fed052872351692241f04859662e5801d1ab5d7b1c9d7bfc4bf" Apr 16 22:36:16.498014 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:36:16.497996 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cfd6c99444c4fed052872351692241f04859662e5801d1ab5d7b1c9d7bfc4bf\": container with ID starting with 9cfd6c99444c4fed052872351692241f04859662e5801d1ab5d7b1c9d7bfc4bf not found: ID does not exist" containerID="9cfd6c99444c4fed052872351692241f04859662e5801d1ab5d7b1c9d7bfc4bf" Apr 16 22:36:16.498058 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:16.498022 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cfd6c99444c4fed052872351692241f04859662e5801d1ab5d7b1c9d7bfc4bf"} err="failed to get container status \"9cfd6c99444c4fed052872351692241f04859662e5801d1ab5d7b1c9d7bfc4bf\": rpc error: code = NotFound desc = could not find container \"9cfd6c99444c4fed052872351692241f04859662e5801d1ab5d7b1c9d7bfc4bf\": container with ID starting with 9cfd6c99444c4fed052872351692241f04859662e5801d1ab5d7b1c9d7bfc4bf not found: ID does not exist" Apr 16 22:36:16.498058 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:16.498038 2576 scope.go:117] "RemoveContainer" containerID="a649074eb0f2ff256513f7f259dab9b4eb23ae32b77a28714ef439f5a0ac0b20" Apr 16 22:36:16.498275 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:36:16.498259 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a649074eb0f2ff256513f7f259dab9b4eb23ae32b77a28714ef439f5a0ac0b20\": container with ID starting with a649074eb0f2ff256513f7f259dab9b4eb23ae32b77a28714ef439f5a0ac0b20 not found: ID does not exist" containerID="a649074eb0f2ff256513f7f259dab9b4eb23ae32b77a28714ef439f5a0ac0b20" Apr 16 22:36:16.498330 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:16.498279 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a649074eb0f2ff256513f7f259dab9b4eb23ae32b77a28714ef439f5a0ac0b20"} err="failed to get container status 
\"a649074eb0f2ff256513f7f259dab9b4eb23ae32b77a28714ef439f5a0ac0b20\": rpc error: code = NotFound desc = could not find container \"a649074eb0f2ff256513f7f259dab9b4eb23ae32b77a28714ef439f5a0ac0b20\": container with ID starting with a649074eb0f2ff256513f7f259dab9b4eb23ae32b77a28714ef439f5a0ac0b20 not found: ID does not exist" Apr 16 22:36:16.502802 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:16.502782 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb"] Apr 16 22:36:16.506004 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:16.505985 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cf933-predictor-6c96b8f849-fmlhb"] Apr 16 22:36:17.019884 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:17.019854 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87e88914-62ef-4147-9ba5-3506606068fd" path="/var/lib/kubelet/pods/87e88914-62ef-4147-9ba5-3506606068fd/volumes" Apr 16 22:36:21.486520 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:21.486493 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" Apr 16 22:36:21.487062 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:21.487038 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" podUID="a61e12f9-eb2b-4e19-91f5-31e99b6ad914" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 22:36:26.399146 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:26.399107 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" podUID="9a97e08c-acc0-4907-810b-a637f32ea6c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 22:36:31.487884 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:31.487845 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" podUID="a61e12f9-eb2b-4e19-91f5-31e99b6ad914" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 22:36:36.399830 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:36.399802 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" Apr 16 22:36:41.487872 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:41.487833 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" podUID="a61e12f9-eb2b-4e19-91f5-31e99b6ad914" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 22:36:51.487043 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:36:51.487002 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" podUID="a61e12f9-eb2b-4e19-91f5-31e99b6ad914" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 22:37:01.488382 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:37:01.488350 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" Apr 16 22:38:25.056717 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:38:25.056585 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/1.log" Apr 16 22:38:25.061636 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:38:25.061617 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/1.log" Apr 16 22:43:25.085521 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:43:25.085417 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/1.log" Apr 16 22:43:25.087804 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:43:25.087781 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/1.log" Apr 16 22:45:02.107542 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.107457 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n"] Apr 16 22:45:02.108224 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.107852 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" podUID="9a97e08c-acc0-4907-810b-a637f32ea6c6" containerName="kserve-container" containerID="cri-o://5127134cecd3b37daab93b53b65d3f10c939e80569a51666d277a5c0d462dae5" gracePeriod=30 Apr 16 22:45:02.108224 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.107867 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" podUID="9a97e08c-acc0-4907-810b-a637f32ea6c6" containerName="kube-rbac-proxy" containerID="cri-o://609b6acaa0dfb4af79d2111313738aa00586e04ac80749f0f2f1b47c1cdd414d" gracePeriod=30 Apr 16 22:45:02.208335 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.208304 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp"] Apr 16 22:45:02.208778 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.208759 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87e88914-62ef-4147-9ba5-3506606068fd" containerName="kserve-container" Apr 16 22:45:02.208888 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.208779 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e88914-62ef-4147-9ba5-3506606068fd" containerName="kserve-container" Apr 16 22:45:02.208888 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.208798 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87e88914-62ef-4147-9ba5-3506606068fd" containerName="kube-rbac-proxy" Apr 16 22:45:02.208888 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.208806 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e88914-62ef-4147-9ba5-3506606068fd" containerName="kube-rbac-proxy" Apr 16 22:45:02.208888 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.208881 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="87e88914-62ef-4147-9ba5-3506606068fd" containerName="kserve-container" Apr 16 22:45:02.209087 ip-10-0-130-227 
kubenswrapper[2576]: I0416 22:45:02.208901 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="87e88914-62ef-4147-9ba5-3506606068fd" containerName="kube-rbac-proxy" Apr 16 22:45:02.212199 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.212174 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" Apr 16 22:45:02.214577 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.214553 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-a5b4b-predictor-serving-cert\"" Apr 16 22:45:02.214734 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.214591 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-a5b4b-kube-rbac-proxy-sar-config\"" Apr 16 22:45:02.222174 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.222152 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp"] Apr 16 22:45:02.364267 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.364180 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-a5b4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/982484e0-9e62-48e0-949c-c1ff24170fd7-error-404-isvc-a5b4b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp\" (UID: \"982484e0-9e62-48e0-949c-c1ff24170fd7\") " pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" Apr 16 22:45:02.364267 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.364227 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmhhc\" (UniqueName: \"kubernetes.io/projected/982484e0-9e62-48e0-949c-c1ff24170fd7-kube-api-access-fmhhc\") pod \"error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp\" (UID: \"982484e0-9e62-48e0-949c-c1ff24170fd7\") " pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" Apr 16 22:45:02.364446 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.364322 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/982484e0-9e62-48e0-949c-c1ff24170fd7-proxy-tls\") pod \"error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp\" (UID: \"982484e0-9e62-48e0-949c-c1ff24170fd7\") " pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" Apr 16 22:45:02.464906 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.464866 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/982484e0-9e62-48e0-949c-c1ff24170fd7-proxy-tls\") pod \"error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp\" (UID: \"982484e0-9e62-48e0-949c-c1ff24170fd7\") " pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" Apr 16 22:45:02.465068 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.464958 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-a5b4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/982484e0-9e62-48e0-949c-c1ff24170fd7-error-404-isvc-a5b4b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp\" (UID: \"982484e0-9e62-48e0-949c-c1ff24170fd7\") " pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" Apr 16 
22:45:02.465068 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.464994 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmhhc\" (UniqueName: \"kubernetes.io/projected/982484e0-9e62-48e0-949c-c1ff24170fd7-kube-api-access-fmhhc\") pod \"error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp\" (UID: \"982484e0-9e62-48e0-949c-c1ff24170fd7\") " pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" Apr 16 22:45:02.465771 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.465740 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-a5b4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/982484e0-9e62-48e0-949c-c1ff24170fd7-error-404-isvc-a5b4b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp\" (UID: \"982484e0-9e62-48e0-949c-c1ff24170fd7\") " pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" Apr 16 22:45:02.467258 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.467236 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/982484e0-9e62-48e0-949c-c1ff24170fd7-proxy-tls\") pod \"error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp\" (UID: \"982484e0-9e62-48e0-949c-c1ff24170fd7\") " pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" Apr 16 22:45:02.473316 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.473290 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmhhc\" (UniqueName: \"kubernetes.io/projected/982484e0-9e62-48e0-949c-c1ff24170fd7-kube-api-access-fmhhc\") pod \"error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp\" (UID: \"982484e0-9e62-48e0-949c-c1ff24170fd7\") " pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" Apr 16 22:45:02.524246 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.524209 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" Apr 16 22:45:02.646762 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.646739 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp"] Apr 16 22:45:02.649342 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:45:02.649317 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod982484e0_9e62_48e0_949c_c1ff24170fd7.slice/crio-ead4660c6c3ce78fbacd862ff9ab518acd5491a0aebf13bf01f68ef2b5251af3 WatchSource:0}: Error finding container ead4660c6c3ce78fbacd862ff9ab518acd5491a0aebf13bf01f68ef2b5251af3: Status 404 returned error can't find the container with id ead4660c6c3ce78fbacd862ff9ab518acd5491a0aebf13bf01f68ef2b5251af3 Apr 16 22:45:02.651086 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:02.651070 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:45:03.197915 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:03.197881 2576 generic.go:358] "Generic (PLEG): container finished" podID="9a97e08c-acc0-4907-810b-a637f32ea6c6" containerID="609b6acaa0dfb4af79d2111313738aa00586e04ac80749f0f2f1b47c1cdd414d" exitCode=2 Apr 16 22:45:03.198358 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:03.197961 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" event={"ID":"9a97e08c-acc0-4907-810b-a637f32ea6c6","Type":"ContainerDied","Data":"609b6acaa0dfb4af79d2111313738aa00586e04ac80749f0f2f1b47c1cdd414d"} Apr 16 22:45:03.199465 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:03.199445 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" event={"ID":"982484e0-9e62-48e0-949c-c1ff24170fd7","Type":"ContainerStarted","Data":"f65baa698e71444a411cd378d2aa0198dcc97f7dc78999b86bfcb028d0de496c"} Apr 16 22:45:03.199551 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:03.199472 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" event={"ID":"982484e0-9e62-48e0-949c-c1ff24170fd7","Type":"ContainerStarted","Data":"9a2e6196d297dfdacffd55c95d22e5cccb780c1a4c8747c612cd43fb936fed0c"} Apr 16 22:45:03.199551 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:03.199484 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" event={"ID":"982484e0-9e62-48e0-949c-c1ff24170fd7","Type":"ContainerStarted","Data":"ead4660c6c3ce78fbacd862ff9ab518acd5491a0aebf13bf01f68ef2b5251af3"} Apr 16 22:45:03.199627 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:03.199572 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" Apr 16 22:45:03.219363 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:03.219319 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" podStartSLOduration=1.219304925 podStartE2EDuration="1.219304925s" podCreationTimestamp="2026-04-16 22:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:45:03.218247503 +0000 UTC m=+1898.751588356" 
watchObservedRunningTime="2026-04-16 22:45:03.219304925 +0000 UTC m=+1898.752645777" Apr 16 22:45:04.202523 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:04.202490 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" Apr 16 22:45:04.203616 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:04.203589 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" podUID="982484e0-9e62-48e0-949c-c1ff24170fd7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 22:45:05.207610 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:05.207578 2576 generic.go:358] "Generic (PLEG): container finished" podID="9a97e08c-acc0-4907-810b-a637f32ea6c6" containerID="5127134cecd3b37daab93b53b65d3f10c939e80569a51666d277a5c0d462dae5" exitCode=0 Apr 16 22:45:05.208070 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:05.207621 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" event={"ID":"9a97e08c-acc0-4907-810b-a637f32ea6c6","Type":"ContainerDied","Data":"5127134cecd3b37daab93b53b65d3f10c939e80569a51666d277a5c0d462dae5"} Apr 16 22:45:05.208070 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:05.207988 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" podUID="982484e0-9e62-48e0-949c-c1ff24170fd7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 22:45:05.250722 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:05.250703 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" Apr 16 22:45:05.390156 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:05.390077 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kht4\" (UniqueName: \"kubernetes.io/projected/9a97e08c-acc0-4907-810b-a637f32ea6c6-kube-api-access-6kht4\") pod \"9a97e08c-acc0-4907-810b-a637f32ea6c6\" (UID: \"9a97e08c-acc0-4907-810b-a637f32ea6c6\") " Apr 16 22:45:05.390294 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:05.390194 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-ea3bc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a97e08c-acc0-4907-810b-a637f32ea6c6-error-404-isvc-ea3bc-kube-rbac-proxy-sar-config\") pod \"9a97e08c-acc0-4907-810b-a637f32ea6c6\" (UID: \"9a97e08c-acc0-4907-810b-a637f32ea6c6\") " Apr 16 22:45:05.390294 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:05.390252 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a97e08c-acc0-4907-810b-a637f32ea6c6-proxy-tls\") pod \"9a97e08c-acc0-4907-810b-a637f32ea6c6\" (UID: \"9a97e08c-acc0-4907-810b-a637f32ea6c6\") " Apr 16 22:45:05.390530 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:05.390504 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a97e08c-acc0-4907-810b-a637f32ea6c6-error-404-isvc-ea3bc-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-ea3bc-kube-rbac-proxy-sar-config") pod "9a97e08c-acc0-4907-810b-a637f32ea6c6" (UID: "9a97e08c-acc0-4907-810b-a637f32ea6c6"). InnerVolumeSpecName "error-404-isvc-ea3bc-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:45:05.392239 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:05.392214 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a97e08c-acc0-4907-810b-a637f32ea6c6-kube-api-access-6kht4" (OuterVolumeSpecName: "kube-api-access-6kht4") pod "9a97e08c-acc0-4907-810b-a637f32ea6c6" (UID: "9a97e08c-acc0-4907-810b-a637f32ea6c6"). InnerVolumeSpecName "kube-api-access-6kht4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:45:05.392350 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:05.392292 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a97e08c-acc0-4907-810b-a637f32ea6c6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9a97e08c-acc0-4907-810b-a637f32ea6c6" (UID: "9a97e08c-acc0-4907-810b-a637f32ea6c6"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:45:05.491163 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:05.491129 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6kht4\" (UniqueName: \"kubernetes.io/projected/9a97e08c-acc0-4907-810b-a637f32ea6c6-kube-api-access-6kht4\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:45:05.491163 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:05.491158 2576 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-ea3bc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a97e08c-acc0-4907-810b-a637f32ea6c6-error-404-isvc-ea3bc-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:45:05.491163 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:05.491169 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a97e08c-acc0-4907-810b-a637f32ea6c6-proxy-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:45:06.212684 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:06.212640 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" event={"ID":"9a97e08c-acc0-4907-810b-a637f32ea6c6","Type":"ContainerDied","Data":"4578a6f99e2289e6785758f89ce4778a50ffb2603551fa8542bb220a95385253"} Apr 16 22:45:06.213103 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:06.212709 2576 scope.go:117] "RemoveContainer" containerID="609b6acaa0dfb4af79d2111313738aa00586e04ac80749f0f2f1b47c1cdd414d" Apr 16 22:45:06.213103 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:06.212707 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n" Apr 16 22:45:06.221210 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:06.221183 2576 scope.go:117] "RemoveContainer" containerID="5127134cecd3b37daab93b53b65d3f10c939e80569a51666d277a5c0d462dae5" Apr 16 22:45:06.239206 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:06.234859 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n"] Apr 16 22:45:06.241985 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:06.241961 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ea3bc-predictor-5cf5f8bf89-xgg5n"] Apr 16 22:45:07.019659 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:07.019624 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a97e08c-acc0-4907-810b-a637f32ea6c6" path="/var/lib/kubelet/pods/9a97e08c-acc0-4907-810b-a637f32ea6c6/volumes" Apr 16 22:45:10.212590 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:10.212561 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" Apr 16 22:45:10.213140 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:10.213112 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" podUID="982484e0-9e62-48e0-949c-c1ff24170fd7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 22:45:20.213877 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:20.213837 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" 
podUID="982484e0-9e62-48e0-949c-c1ff24170fd7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 22:45:27.922366 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:27.922330 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf"] Apr 16 22:45:27.922756 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:27.922612 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" podUID="a61e12f9-eb2b-4e19-91f5-31e99b6ad914" containerName="kserve-container" containerID="cri-o://3a569479eec2a8d9e5512fdc993ae1cc3c1fd809a7752595a4cde806de5c0c12" gracePeriod=30 Apr 16 22:45:27.922756 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:27.922625 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" podUID="a61e12f9-eb2b-4e19-91f5-31e99b6ad914" containerName="kube-rbac-proxy" containerID="cri-o://a557945d0cde01545c679188ce8179113167c08c6941806a93097856ce170806" gracePeriod=30 Apr 16 22:45:28.005568 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:28.005532 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd"] Apr 16 22:45:28.005916 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:28.005904 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a97e08c-acc0-4907-810b-a637f32ea6c6" containerName="kserve-container" Apr 16 22:45:28.005964 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:28.005917 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a97e08c-acc0-4907-810b-a637f32ea6c6" containerName="kserve-container" Apr 16 22:45:28.005964 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:28.005936 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a97e08c-acc0-4907-810b-a637f32ea6c6" containerName="kube-rbac-proxy" Apr 16 22:45:28.005964 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:28.005941 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a97e08c-acc0-4907-810b-a637f32ea6c6" containerName="kube-rbac-proxy" Apr 16 22:45:28.006060 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:28.006000 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a97e08c-acc0-4907-810b-a637f32ea6c6" containerName="kube-rbac-proxy" Apr 16 22:45:28.006060 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:28.006008 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a97e08c-acc0-4907-810b-a637f32ea6c6" containerName="kserve-container" Apr 16 22:45:28.010079 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:28.010062 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" Apr 16 22:45:28.012567 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:28.012542 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-a9d3d-kube-rbac-proxy-sar-config\"" Apr 16 22:45:28.012731 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:28.012606 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-a9d3d-predictor-serving-cert\"" Apr 16 22:45:28.019404 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:28.019381 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd"] Apr 16 22:45:28.185529 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:28.185439 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5wr4\" (UniqueName: \"kubernetes.io/projected/962e73ad-4e33-4700-87d0-5ea46d65e8a9-kube-api-access-n5wr4\") pod \"error-404-isvc-a9d3d-predictor-59878dc55-5kshd\" (UID: \"962e73ad-4e33-4700-87d0-5ea46d65e8a9\") " pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" Apr 16 22:45:28.185529 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:28.185482 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/962e73ad-4e33-4700-87d0-5ea46d65e8a9-proxy-tls\") pod \"error-404-isvc-a9d3d-predictor-59878dc55-5kshd\" (UID: \"962e73ad-4e33-4700-87d0-5ea46d65e8a9\") " pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" Apr 16 22:45:28.185810 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:28.185587 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-a9d3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/962e73ad-4e33-4700-87d0-5ea46d65e8a9-error-404-isvc-a9d3d-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a9d3d-predictor-59878dc55-5kshd\" (UID: \"962e73ad-4e33-4700-87d0-5ea46d65e8a9\") " pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" Apr 16 22:45:28.285478 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:28.285447 2576 generic.go:358] "Generic (PLEG): container finished" podID="a61e12f9-eb2b-4e19-91f5-31e99b6ad914" containerID="a557945d0cde01545c679188ce8179113167c08c6941806a93097856ce170806" exitCode=2 Apr 16 22:45:28.285643 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:28.285505 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" event={"ID":"a61e12f9-eb2b-4e19-91f5-31e99b6ad914","Type":"ContainerDied","Data":"a557945d0cde01545c679188ce8179113167c08c6941806a93097856ce170806"} Apr 16 22:45:28.286870 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:28.286850 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-a9d3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/962e73ad-4e33-4700-87d0-5ea46d65e8a9-error-404-isvc-a9d3d-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a9d3d-predictor-59878dc55-5kshd\" (UID: \"962e73ad-4e33-4700-87d0-5ea46d65e8a9\") " pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" Apr 16 22:45:28.286946 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:28.286899 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-n5wr4\" (UniqueName: \"kubernetes.io/projected/962e73ad-4e33-4700-87d0-5ea46d65e8a9-kube-api-access-n5wr4\") pod \"error-404-isvc-a9d3d-predictor-59878dc55-5kshd\" (UID: \"962e73ad-4e33-4700-87d0-5ea46d65e8a9\") " pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" Apr 16 22:45:28.286946 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:28.286919 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/962e73ad-4e33-4700-87d0-5ea46d65e8a9-proxy-tls\") pod \"error-404-isvc-a9d3d-predictor-59878dc55-5kshd\" (UID: \"962e73ad-4e33-4700-87d0-5ea46d65e8a9\") " pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" Apr 16 22:45:28.287047 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:45:28.287038 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-serving-cert: secret "error-404-isvc-a9d3d-predictor-serving-cert" not found Apr 16 22:45:28.287112 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:45:28.287100 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/962e73ad-4e33-4700-87d0-5ea46d65e8a9-proxy-tls podName:962e73ad-4e33-4700-87d0-5ea46d65e8a9 nodeName:}" failed. No retries permitted until 2026-04-16 22:45:28.787080642 +0000 UTC m=+1924.320421473 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/962e73ad-4e33-4700-87d0-5ea46d65e8a9-proxy-tls") pod "error-404-isvc-a9d3d-predictor-59878dc55-5kshd" (UID: "962e73ad-4e33-4700-87d0-5ea46d65e8a9") : secret "error-404-isvc-a9d3d-predictor-serving-cert" not found Apr 16 22:45:28.287507 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:28.287485 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-a9d3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/962e73ad-4e33-4700-87d0-5ea46d65e8a9-error-404-isvc-a9d3d-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a9d3d-predictor-59878dc55-5kshd\" (UID: \"962e73ad-4e33-4700-87d0-5ea46d65e8a9\") " pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" Apr 16 22:45:28.294889 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:28.294867 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5wr4\" (UniqueName: \"kubernetes.io/projected/962e73ad-4e33-4700-87d0-5ea46d65e8a9-kube-api-access-n5wr4\") pod \"error-404-isvc-a9d3d-predictor-59878dc55-5kshd\" (UID: \"962e73ad-4e33-4700-87d0-5ea46d65e8a9\") " pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" Apr 16 22:45:28.791344 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:28.791305 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/962e73ad-4e33-4700-87d0-5ea46d65e8a9-proxy-tls\") pod \"error-404-isvc-a9d3d-predictor-59878dc55-5kshd\" (UID: \"962e73ad-4e33-4700-87d0-5ea46d65e8a9\") " pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" Apr 16 22:45:28.793714 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:28.793686 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/962e73ad-4e33-4700-87d0-5ea46d65e8a9-proxy-tls\") pod \"error-404-isvc-a9d3d-predictor-59878dc55-5kshd\" (UID: \"962e73ad-4e33-4700-87d0-5ea46d65e8a9\") " 
pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" Apr 16 22:45:28.921161 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:28.921123 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" Apr 16 22:45:29.045622 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:29.045596 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd"] Apr 16 22:45:29.048264 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:45:29.048239 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod962e73ad_4e33_4700_87d0_5ea46d65e8a9.slice/crio-55b85e6f790a236f9883e5dc25e7eff0433423f3c48ad56ada64e922f5710aea WatchSource:0}: Error finding container 55b85e6f790a236f9883e5dc25e7eff0433423f3c48ad56ada64e922f5710aea: Status 404 returned error can't find the container with id 55b85e6f790a236f9883e5dc25e7eff0433423f3c48ad56ada64e922f5710aea Apr 16 22:45:29.290977 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:29.290938 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" event={"ID":"962e73ad-4e33-4700-87d0-5ea46d65e8a9","Type":"ContainerStarted","Data":"5b461956ff41248b2171bff5cc9dead2b3e7d96fb1f569bce08cbf7a64c9cce0"} Apr 16 22:45:29.290977 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:29.290975 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" event={"ID":"962e73ad-4e33-4700-87d0-5ea46d65e8a9","Type":"ContainerStarted","Data":"cf13f7231af71d32fd8233f88ee3472b44a7c138dcd158c4d1c56f80dbb6a7a2"} Apr 16 22:45:29.291219 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:29.290991 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" Apr 16 22:45:29.291219 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:29.291001 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" event={"ID":"962e73ad-4e33-4700-87d0-5ea46d65e8a9","Type":"ContainerStarted","Data":"55b85e6f790a236f9883e5dc25e7eff0433423f3c48ad56ada64e922f5710aea"} Apr 16 22:45:29.291219 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:29.291103 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" Apr 16 22:45:29.292489 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:29.292464 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" podUID="962e73ad-4e33-4700-87d0-5ea46d65e8a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 22:45:29.307591 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:29.307501 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" podStartSLOduration=2.307486908 podStartE2EDuration="2.307486908s" podCreationTimestamp="2026-04-16 22:45:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:45:29.306302864 +0000 UTC m=+1924.839643717" 
watchObservedRunningTime="2026-04-16 22:45:29.307486908 +0000 UTC m=+1924.840827757" Apr 16 22:45:30.213845 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:30.213809 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" podUID="982484e0-9e62-48e0-949c-c1ff24170fd7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 22:45:30.295022 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:30.294983 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" podUID="962e73ad-4e33-4700-87d0-5ea46d65e8a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 22:45:31.279975 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:31.279953 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" Apr 16 22:45:31.299632 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:31.299599 2576 generic.go:358] "Generic (PLEG): container finished" podID="a61e12f9-eb2b-4e19-91f5-31e99b6ad914" containerID="3a569479eec2a8d9e5512fdc993ae1cc3c1fd809a7752595a4cde806de5c0c12" exitCode=0 Apr 16 22:45:31.299774 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:31.299700 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" Apr 16 22:45:31.299774 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:31.299732 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" event={"ID":"a61e12f9-eb2b-4e19-91f5-31e99b6ad914","Type":"ContainerDied","Data":"3a569479eec2a8d9e5512fdc993ae1cc3c1fd809a7752595a4cde806de5c0c12"} Apr 16 22:45:31.299881 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:31.299778 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf" event={"ID":"a61e12f9-eb2b-4e19-91f5-31e99b6ad914","Type":"ContainerDied","Data":"42e5d3a71099009c91d4d86a12fec373d95f2a3c686f7412a4170f1337134e93"} Apr 16 22:45:31.299881 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:31.299801 2576 scope.go:117] "RemoveContainer" containerID="a557945d0cde01545c679188ce8179113167c08c6941806a93097856ce170806" Apr 16 22:45:31.308588 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:31.308572 2576 scope.go:117] "RemoveContainer" containerID="3a569479eec2a8d9e5512fdc993ae1cc3c1fd809a7752595a4cde806de5c0c12" Apr 16 22:45:31.316369 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:31.316345 2576 scope.go:117] "RemoveContainer" containerID="a557945d0cde01545c679188ce8179113167c08c6941806a93097856ce170806" Apr 16 22:45:31.316633 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:45:31.316617 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a557945d0cde01545c679188ce8179113167c08c6941806a93097856ce170806\": container with ID starting with a557945d0cde01545c679188ce8179113167c08c6941806a93097856ce170806 not found: ID does not exist" containerID="a557945d0cde01545c679188ce8179113167c08c6941806a93097856ce170806" Apr 16 22:45:31.316702 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:31.316643 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a557945d0cde01545c679188ce8179113167c08c6941806a93097856ce170806"} err="failed to get container status \"a557945d0cde01545c679188ce8179113167c08c6941806a93097856ce170806\": rpc error: code = NotFound desc = could not find container \"a557945d0cde01545c679188ce8179113167c08c6941806a93097856ce170806\": container with ID starting with a557945d0cde01545c679188ce8179113167c08c6941806a93097856ce170806 not found: ID does not exist" Apr 16 22:45:31.316702 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:31.316660 2576 scope.go:117] "RemoveContainer" containerID="3a569479eec2a8d9e5512fdc993ae1cc3c1fd809a7752595a4cde806de5c0c12" Apr 16 22:45:31.316966 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:45:31.316929 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a569479eec2a8d9e5512fdc993ae1cc3c1fd809a7752595a4cde806de5c0c12\": container with ID starting with 3a569479eec2a8d9e5512fdc993ae1cc3c1fd809a7752595a4cde806de5c0c12 not found: ID does not exist" containerID="3a569479eec2a8d9e5512fdc993ae1cc3c1fd809a7752595a4cde806de5c0c12" Apr 16 22:45:31.317012 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:31.316977 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a569479eec2a8d9e5512fdc993ae1cc3c1fd809a7752595a4cde806de5c0c12"} err="failed to get container status \"3a569479eec2a8d9e5512fdc993ae1cc3c1fd809a7752595a4cde806de5c0c12\": rpc error: code = NotFound desc = could not find container \"3a569479eec2a8d9e5512fdc993ae1cc3c1fd809a7752595a4cde806de5c0c12\": container with ID starting with 3a569479eec2a8d9e5512fdc993ae1cc3c1fd809a7752595a4cde806de5c0c12 not found: ID does not exist" Apr 16 22:45:31.413793 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:31.413709 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddp7r\" (UniqueName: \"kubernetes.io/projected/a61e12f9-eb2b-4e19-91f5-31e99b6ad914-kube-api-access-ddp7r\") pod \"a61e12f9-eb2b-4e19-91f5-31e99b6ad914\" (UID: \"a61e12f9-eb2b-4e19-91f5-31e99b6ad914\") " Apr 16 22:45:31.413793 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:31.413764 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-c905c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a61e12f9-eb2b-4e19-91f5-31e99b6ad914-error-404-isvc-c905c-kube-rbac-proxy-sar-config\") pod \"a61e12f9-eb2b-4e19-91f5-31e99b6ad914\" (UID: \"a61e12f9-eb2b-4e19-91f5-31e99b6ad914\") " Apr 16 22:45:31.413991 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:31.413803 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a61e12f9-eb2b-4e19-91f5-31e99b6ad914-proxy-tls\") pod \"a61e12f9-eb2b-4e19-91f5-31e99b6ad914\" (UID: \"a61e12f9-eb2b-4e19-91f5-31e99b6ad914\") " Apr 16 22:45:31.414170 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:31.414144 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a61e12f9-eb2b-4e19-91f5-31e99b6ad914-error-404-isvc-c905c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-c905c-kube-rbac-proxy-sar-config") pod "a61e12f9-eb2b-4e19-91f5-31e99b6ad914" (UID: "a61e12f9-eb2b-4e19-91f5-31e99b6ad914"). InnerVolumeSpecName "error-404-isvc-c905c-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:45:31.415928 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:31.415900 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a61e12f9-eb2b-4e19-91f5-31e99b6ad914-kube-api-access-ddp7r" (OuterVolumeSpecName: "kube-api-access-ddp7r") pod "a61e12f9-eb2b-4e19-91f5-31e99b6ad914" (UID: "a61e12f9-eb2b-4e19-91f5-31e99b6ad914"). InnerVolumeSpecName "kube-api-access-ddp7r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:45:31.416028 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:31.415939 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a61e12f9-eb2b-4e19-91f5-31e99b6ad914-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a61e12f9-eb2b-4e19-91f5-31e99b6ad914" (UID: "a61e12f9-eb2b-4e19-91f5-31e99b6ad914"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:45:31.514859 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:31.514825 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ddp7r\" (UniqueName: \"kubernetes.io/projected/a61e12f9-eb2b-4e19-91f5-31e99b6ad914-kube-api-access-ddp7r\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:45:31.514859 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:31.514854 2576 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-c905c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a61e12f9-eb2b-4e19-91f5-31e99b6ad914-error-404-isvc-c905c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:45:31.514859 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:31.514864 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a61e12f9-eb2b-4e19-91f5-31e99b6ad914-proxy-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:45:31.621190 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:31.621159 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf"] Apr 16 22:45:31.626549 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:31.626522 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c905c-predictor-545b784cf7-84hwf"] Apr 16 22:45:33.020343 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:33.020308 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a61e12f9-eb2b-4e19-91f5-31e99b6ad914" path="/var/lib/kubelet/pods/a61e12f9-eb2b-4e19-91f5-31e99b6ad914/volumes" Apr 16 22:45:35.299622 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:35.299596 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" Apr 16 22:45:35.300160 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:35.300136 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" podUID="962e73ad-4e33-4700-87d0-5ea46d65e8a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 22:45:40.213041 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:40.212986 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" podUID="982484e0-9e62-48e0-949c-c1ff24170fd7" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 22:45:45.300467 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:45.300432 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" podUID="962e73ad-4e33-4700-87d0-5ea46d65e8a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 22:45:50.213743 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:50.213714 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" Apr 16 22:45:55.300540 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:45:55.300498 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" podUID="962e73ad-4e33-4700-87d0-5ea46d65e8a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 22:46:05.300235 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:05.300196 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" podUID="962e73ad-4e33-4700-87d0-5ea46d65e8a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 22:46:12.373052 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:12.373022 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp"] Apr 16 22:46:12.373478 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:12.373285 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" podUID="982484e0-9e62-48e0-949c-c1ff24170fd7" containerName="kserve-container" containerID="cri-o://9a2e6196d297dfdacffd55c95d22e5cccb780c1a4c8747c612cd43fb936fed0c" gracePeriod=30 Apr 16 22:46:12.373478 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:12.373353 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" podUID="982484e0-9e62-48e0-949c-c1ff24170fd7" containerName="kube-rbac-proxy" containerID="cri-o://f65baa698e71444a411cd378d2aa0198dcc97f7dc78999b86bfcb028d0de496c" gracePeriod=30 Apr 16 22:46:12.459863 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:12.459831 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4"] Apr 16 22:46:12.460246 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:12.460229 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a61e12f9-eb2b-4e19-91f5-31e99b6ad914" containerName="kserve-container" Apr 16 22:46:12.460334 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:12.460248 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a61e12f9-eb2b-4e19-91f5-31e99b6ad914" containerName="kserve-container" Apr 16 22:46:12.460334 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:12.460259 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a61e12f9-eb2b-4e19-91f5-31e99b6ad914" containerName="kube-rbac-proxy" Apr 16 22:46:12.460334 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:12.460267 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a61e12f9-eb2b-4e19-91f5-31e99b6ad914" containerName="kube-rbac-proxy" Apr 16 22:46:12.460485 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:12.460352 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a61e12f9-eb2b-4e19-91f5-31e99b6ad914" containerName="kube-rbac-proxy" Apr 16 22:46:12.460485 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:12.460365 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a61e12f9-eb2b-4e19-91f5-31e99b6ad914" containerName="kserve-container" Apr 16 22:46:12.463736 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:12.463715 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" Apr 16 22:46:12.465983 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:12.465960 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-3d312-predictor-serving-cert\"" Apr 16 22:46:12.466093 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:12.465963 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-3d312-kube-rbac-proxy-sar-config\"" Apr 16 22:46:12.472868 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:12.472818 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4"] Apr 16 22:46:12.546158 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:12.546127 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-3d312-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b1633da0-f07f-456d-a670-765543a5ebfd-error-404-isvc-3d312-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-3d312-predictor-86dbd8855f-w5rc4\" (UID: \"b1633da0-f07f-456d-a670-765543a5ebfd\") " pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" Apr 16 22:46:12.546273 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:12.546162 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1633da0-f07f-456d-a670-765543a5ebfd-proxy-tls\") pod \"error-404-isvc-3d312-predictor-86dbd8855f-w5rc4\" (UID: \"b1633da0-f07f-456d-a670-765543a5ebfd\") " pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" Apr 16 22:46:12.546273 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:12.546240 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwll5\" (UniqueName: \"kubernetes.io/projected/b1633da0-f07f-456d-a670-765543a5ebfd-kube-api-access-mwll5\") pod \"error-404-isvc-3d312-predictor-86dbd8855f-w5rc4\" (UID: \"b1633da0-f07f-456d-a670-765543a5ebfd\") " pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" Apr 16 22:46:12.647725 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:12.647617 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-3d312-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b1633da0-f07f-456d-a670-765543a5ebfd-error-404-isvc-3d312-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-3d312-predictor-86dbd8855f-w5rc4\" (UID: \"b1633da0-f07f-456d-a670-765543a5ebfd\") " pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" Apr 16 22:46:12.647725 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:12.647666 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1633da0-f07f-456d-a670-765543a5ebfd-proxy-tls\") pod \"error-404-isvc-3d312-predictor-86dbd8855f-w5rc4\" (UID: \"b1633da0-f07f-456d-a670-765543a5ebfd\") " pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" Apr 16 22:46:12.647930 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:12.647740 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mwll5\" (UniqueName: \"kubernetes.io/projected/b1633da0-f07f-456d-a670-765543a5ebfd-kube-api-access-mwll5\") pod \"error-404-isvc-3d312-predictor-86dbd8855f-w5rc4\" (UID: \"b1633da0-f07f-456d-a670-765543a5ebfd\") " pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" Apr 16 22:46:12.647930 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:46:12.647846 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-3d312-predictor-serving-cert: secret "error-404-isvc-3d312-predictor-serving-cert" not found Apr 16 22:46:12.647930 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:46:12.647903 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1633da0-f07f-456d-a670-765543a5ebfd-proxy-tls podName:b1633da0-f07f-456d-a670-765543a5ebfd nodeName:}" failed. No retries permitted until 2026-04-16 22:46:13.147886811 +0000 UTC m=+1968.681227642 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b1633da0-f07f-456d-a670-765543a5ebfd-proxy-tls") pod "error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" (UID: "b1633da0-f07f-456d-a670-765543a5ebfd") : secret "error-404-isvc-3d312-predictor-serving-cert" not found Apr 16 22:46:12.648312 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:12.648294 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-3d312-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b1633da0-f07f-456d-a670-765543a5ebfd-error-404-isvc-3d312-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-3d312-predictor-86dbd8855f-w5rc4\" (UID: \"b1633da0-f07f-456d-a670-765543a5ebfd\") " pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" Apr 16 22:46:12.656037 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:12.656016 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwll5\" (UniqueName: \"kubernetes.io/projected/b1633da0-f07f-456d-a670-765543a5ebfd-kube-api-access-mwll5\") pod \"error-404-isvc-3d312-predictor-86dbd8855f-w5rc4\" (UID: \"b1633da0-f07f-456d-a670-765543a5ebfd\") " pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" Apr 16 22:46:13.150851 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:13.150803 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1633da0-f07f-456d-a670-765543a5ebfd-proxy-tls\") pod \"error-404-isvc-3d312-predictor-86dbd8855f-w5rc4\" (UID: \"b1633da0-f07f-456d-a670-765543a5ebfd\") " pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" Apr 16 22:46:13.153184 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:13.153163 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1633da0-f07f-456d-a670-765543a5ebfd-proxy-tls\") pod \"error-404-isvc-3d312-predictor-86dbd8855f-w5rc4\" (UID: \"b1633da0-f07f-456d-a670-765543a5ebfd\") " 
pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" Apr 16 22:46:13.374746 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:13.374708 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" Apr 16 22:46:13.442442 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:13.442406 2576 generic.go:358] "Generic (PLEG): container finished" podID="982484e0-9e62-48e0-949c-c1ff24170fd7" containerID="f65baa698e71444a411cd378d2aa0198dcc97f7dc78999b86bfcb028d0de496c" exitCode=2 Apr 16 22:46:13.442615 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:13.442443 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" event={"ID":"982484e0-9e62-48e0-949c-c1ff24170fd7","Type":"ContainerDied","Data":"f65baa698e71444a411cd378d2aa0198dcc97f7dc78999b86bfcb028d0de496c"} Apr 16 22:46:13.492935 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:13.492877 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4"] Apr 16 22:46:13.495120 ip-10-0-130-227 kubenswrapper[2576]: W0416 22:46:13.495095 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1633da0_f07f_456d_a670_765543a5ebfd.slice/crio-8b9941ae9344ada8351008be49e9c1860a724a9039d2a226da85ae76dbc3a708 WatchSource:0}: Error finding container 8b9941ae9344ada8351008be49e9c1860a724a9039d2a226da85ae76dbc3a708: Status 404 returned error can't find the container with id 8b9941ae9344ada8351008be49e9c1860a724a9039d2a226da85ae76dbc3a708 Apr 16 22:46:14.447751 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:14.447713 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" event={"ID":"b1633da0-f07f-456d-a670-765543a5ebfd","Type":"ContainerStarted","Data":"2578659cad7a24621c678402e4c5cef7449680b3c22a0d3e0c8cd4723c10321a"} Apr 16 22:46:14.447751 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:14.447754 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" event={"ID":"b1633da0-f07f-456d-a670-765543a5ebfd","Type":"ContainerStarted","Data":"e2b924fc0fc81d4418813c6f486133921420074c832d1eaef3d897696858fbb9"} Apr 16 22:46:14.448167 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:14.447765 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" event={"ID":"b1633da0-f07f-456d-a670-765543a5ebfd","Type":"ContainerStarted","Data":"8b9941ae9344ada8351008be49e9c1860a724a9039d2a226da85ae76dbc3a708"} Apr 16 22:46:14.448167 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:14.447794 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" Apr 16 22:46:14.465409 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:14.465370 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" podStartSLOduration=2.465356325 podStartE2EDuration="2.465356325s" podCreationTimestamp="2026-04-16 22:46:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:46:14.463563939 +0000 UTC 
m=+1969.996904790" watchObservedRunningTime="2026-04-16 22:46:14.465356325 +0000 UTC m=+1969.998697176" Apr 16 22:46:15.208273 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:15.208230 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" podUID="982484e0-9e62-48e0-949c-c1ff24170fd7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.37:8643/healthz\": dial tcp 10.132.0.37:8643: connect: connection refused" Apr 16 22:46:15.301292 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:15.301266 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" Apr 16 22:46:15.453150 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:15.453121 2576 generic.go:358] "Generic (PLEG): container finished" podID="982484e0-9e62-48e0-949c-c1ff24170fd7" containerID="9a2e6196d297dfdacffd55c95d22e5cccb780c1a4c8747c612cd43fb936fed0c" exitCode=0 Apr 16 22:46:15.453516 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:15.453194 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" event={"ID":"982484e0-9e62-48e0-949c-c1ff24170fd7","Type":"ContainerDied","Data":"9a2e6196d297dfdacffd55c95d22e5cccb780c1a4c8747c612cd43fb936fed0c"} Apr 16 22:46:15.453516 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:15.453439 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" Apr 16 22:46:15.454539 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:15.454516 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" podUID="b1633da0-f07f-456d-a670-765543a5ebfd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 22:46:15.506493 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:15.506468 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" Apr 16 22:46:15.673613 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:15.673568 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-a5b4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/982484e0-9e62-48e0-949c-c1ff24170fd7-error-404-isvc-a5b4b-kube-rbac-proxy-sar-config\") pod \"982484e0-9e62-48e0-949c-c1ff24170fd7\" (UID: \"982484e0-9e62-48e0-949c-c1ff24170fd7\") " Apr 16 22:46:15.673835 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:15.673648 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmhhc\" (UniqueName: \"kubernetes.io/projected/982484e0-9e62-48e0-949c-c1ff24170fd7-kube-api-access-fmhhc\") pod \"982484e0-9e62-48e0-949c-c1ff24170fd7\" (UID: \"982484e0-9e62-48e0-949c-c1ff24170fd7\") " Apr 16 22:46:15.673835 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:15.673793 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/982484e0-9e62-48e0-949c-c1ff24170fd7-proxy-tls\") pod \"982484e0-9e62-48e0-949c-c1ff24170fd7\" (UID: \"982484e0-9e62-48e0-949c-c1ff24170fd7\") " Apr 16 22:46:15.673966 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:15.673947 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/982484e0-9e62-48e0-949c-c1ff24170fd7-error-404-isvc-a5b4b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-a5b4b-kube-rbac-proxy-sar-config") pod "982484e0-9e62-48e0-949c-c1ff24170fd7" (UID: "982484e0-9e62-48e0-949c-c1ff24170fd7"). InnerVolumeSpecName "error-404-isvc-a5b4b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:46:15.674051 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:15.674034 2576 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-a5b4b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/982484e0-9e62-48e0-949c-c1ff24170fd7-error-404-isvc-a5b4b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:46:15.675831 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:15.675807 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/982484e0-9e62-48e0-949c-c1ff24170fd7-kube-api-access-fmhhc" (OuterVolumeSpecName: "kube-api-access-fmhhc") pod "982484e0-9e62-48e0-949c-c1ff24170fd7" (UID: "982484e0-9e62-48e0-949c-c1ff24170fd7"). InnerVolumeSpecName "kube-api-access-fmhhc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:46:15.675920 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:15.675899 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/982484e0-9e62-48e0-949c-c1ff24170fd7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "982484e0-9e62-48e0-949c-c1ff24170fd7" (UID: "982484e0-9e62-48e0-949c-c1ff24170fd7"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:46:15.775356 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:15.775324 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/982484e0-9e62-48e0-949c-c1ff24170fd7-proxy-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:46:15.775356 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:15.775355 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fmhhc\" (UniqueName: \"kubernetes.io/projected/982484e0-9e62-48e0-949c-c1ff24170fd7-kube-api-access-fmhhc\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 22:46:16.457808 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:16.457776 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" Apr 16 22:46:16.457808 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:16.457788 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp" event={"ID":"982484e0-9e62-48e0-949c-c1ff24170fd7","Type":"ContainerDied","Data":"ead4660c6c3ce78fbacd862ff9ab518acd5491a0aebf13bf01f68ef2b5251af3"} Apr 16 22:46:16.458332 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:16.457833 2576 scope.go:117] "RemoveContainer" containerID="f65baa698e71444a411cd378d2aa0198dcc97f7dc78999b86bfcb028d0de496c" Apr 16 22:46:16.458332 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:16.458214 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" podUID="b1633da0-f07f-456d-a670-765543a5ebfd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 22:46:16.470379 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:16.470362 2576 scope.go:117] "RemoveContainer" containerID="9a2e6196d297dfdacffd55c95d22e5cccb780c1a4c8747c612cd43fb936fed0c" Apr 16 22:46:16.481497 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:16.481473 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp"] Apr 16 22:46:16.486706 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:16.486687 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a5b4b-predictor-b8bdc6c9c-26xmp"] Apr 16 22:46:17.020403 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:17.020370 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="982484e0-9e62-48e0-949c-c1ff24170fd7" path="/var/lib/kubelet/pods/982484e0-9e62-48e0-949c-c1ff24170fd7/volumes" Apr 16 22:46:21.461661 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:21.461634 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" Apr 16 22:46:21.462083 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:21.462057 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" podUID="b1633da0-f07f-456d-a670-765543a5ebfd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 22:46:31.462760 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:31.462720 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" 
podUID="b1633da0-f07f-456d-a670-765543a5ebfd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 22:46:41.462527 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:41.462485 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" podUID="b1633da0-f07f-456d-a670-765543a5ebfd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 22:46:51.462584 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:46:51.462547 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" podUID="b1633da0-f07f-456d-a670-765543a5ebfd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 22:47:01.462739 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:47:01.462712 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" Apr 16 22:48:25.111870 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:48:25.111762 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/1.log" Apr 16 22:48:25.114591 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:48:25.114570 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/1.log" Apr 16 22:53:25.134924 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:53:25.134814 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/1.log" Apr 16 22:53:25.138394 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:53:25.138376 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/1.log" Apr 16 22:55:27.179916 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:27.179842 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4"] Apr 16 22:55:27.180366 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:27.180108 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" podUID="b1633da0-f07f-456d-a670-765543a5ebfd" containerName="kserve-container" containerID="cri-o://e2b924fc0fc81d4418813c6f486133921420074c832d1eaef3d897696858fbb9" gracePeriod=30 Apr 16 22:55:27.180366 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:27.180197 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" podUID="b1633da0-f07f-456d-a670-765543a5ebfd" containerName="kube-rbac-proxy" containerID="cri-o://2578659cad7a24621c678402e4c5cef7449680b3c22a0d3e0c8cd4723c10321a" gracePeriod=30 Apr 16 22:55:28.278249 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:28.278208 2576 generic.go:358] "Generic (PLEG): container finished" podID="b1633da0-f07f-456d-a670-765543a5ebfd" containerID="2578659cad7a24621c678402e4c5cef7449680b3c22a0d3e0c8cd4723c10321a" exitCode=2 Apr 16 22:55:28.278249 
Apr 16 22:55:28.278249 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:28.278247 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" event={"ID":"b1633da0-f07f-456d-a670-765543a5ebfd","Type":"ContainerDied","Data":"2578659cad7a24621c678402e4c5cef7449680b3c22a0d3e0c8cd4723c10321a"}
Apr 16 22:55:29.926095 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:29.926073 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4"
Apr 16 22:55:30.052868 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:30.052781 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1633da0-f07f-456d-a670-765543a5ebfd-proxy-tls\") pod \"b1633da0-f07f-456d-a670-765543a5ebfd\" (UID: \"b1633da0-f07f-456d-a670-765543a5ebfd\") "
Apr 16 22:55:30.052868 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:30.052822 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwll5\" (UniqueName: \"kubernetes.io/projected/b1633da0-f07f-456d-a670-765543a5ebfd-kube-api-access-mwll5\") pod \"b1633da0-f07f-456d-a670-765543a5ebfd\" (UID: \"b1633da0-f07f-456d-a670-765543a5ebfd\") "
Apr 16 22:55:30.052868 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:30.052846 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-3d312-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b1633da0-f07f-456d-a670-765543a5ebfd-error-404-isvc-3d312-kube-rbac-proxy-sar-config\") pod \"b1633da0-f07f-456d-a670-765543a5ebfd\" (UID: \"b1633da0-f07f-456d-a670-765543a5ebfd\") "
Apr 16 22:55:30.053291 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:30.053259 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1633da0-f07f-456d-a670-765543a5ebfd-error-404-isvc-3d312-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-3d312-kube-rbac-proxy-sar-config") pod "b1633da0-f07f-456d-a670-765543a5ebfd" (UID: "b1633da0-f07f-456d-a670-765543a5ebfd"). InnerVolumeSpecName "error-404-isvc-3d312-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:55:30.054959 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:30.054934 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1633da0-f07f-456d-a670-765543a5ebfd-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b1633da0-f07f-456d-a670-765543a5ebfd" (UID: "b1633da0-f07f-456d-a670-765543a5ebfd"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:55:30.055118 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:30.055096 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1633da0-f07f-456d-a670-765543a5ebfd-kube-api-access-mwll5" (OuterVolumeSpecName: "kube-api-access-mwll5") pod "b1633da0-f07f-456d-a670-765543a5ebfd" (UID: "b1633da0-f07f-456d-a670-765543a5ebfd"). InnerVolumeSpecName "kube-api-access-mwll5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:55:30.153661 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:30.153631 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1633da0-f07f-456d-a670-765543a5ebfd-proxy-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\""
Apr 16 22:55:30.153661 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:30.153656 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mwll5\" (UniqueName: \"kubernetes.io/projected/b1633da0-f07f-456d-a670-765543a5ebfd-kube-api-access-mwll5\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\""
Apr 16 22:55:30.153865 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:30.153667 2576 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-3d312-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b1633da0-f07f-456d-a670-765543a5ebfd-error-404-isvc-3d312-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\""
Apr 16 22:55:30.286027 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:30.285995 2576 generic.go:358] "Generic (PLEG): container finished" podID="b1633da0-f07f-456d-a670-765543a5ebfd" containerID="e2b924fc0fc81d4418813c6f486133921420074c832d1eaef3d897696858fbb9" exitCode=0
Apr 16 22:55:30.286196 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:30.286065 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4"
Apr 16 22:55:30.286196 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:30.286078 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" event={"ID":"b1633da0-f07f-456d-a670-765543a5ebfd","Type":"ContainerDied","Data":"e2b924fc0fc81d4418813c6f486133921420074c832d1eaef3d897696858fbb9"}
Apr 16 22:55:30.286196 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:30.286119 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4" event={"ID":"b1633da0-f07f-456d-a670-765543a5ebfd","Type":"ContainerDied","Data":"8b9941ae9344ada8351008be49e9c1860a724a9039d2a226da85ae76dbc3a708"}
Apr 16 22:55:30.286196 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:30.286135 2576 scope.go:117] "RemoveContainer" containerID="2578659cad7a24621c678402e4c5cef7449680b3c22a0d3e0c8cd4723c10321a"
Apr 16 22:55:30.294053 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:30.294024 2576 scope.go:117] "RemoveContainer" containerID="e2b924fc0fc81d4418813c6f486133921420074c832d1eaef3d897696858fbb9"
Apr 16 22:55:30.301512 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:30.301498 2576 scope.go:117] "RemoveContainer" containerID="2578659cad7a24621c678402e4c5cef7449680b3c22a0d3e0c8cd4723c10321a"
Apr 16 22:55:30.301747 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:55:30.301729 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2578659cad7a24621c678402e4c5cef7449680b3c22a0d3e0c8cd4723c10321a\": container with ID starting with 2578659cad7a24621c678402e4c5cef7449680b3c22a0d3e0c8cd4723c10321a not found: ID does not exist" containerID="2578659cad7a24621c678402e4c5cef7449680b3c22a0d3e0c8cd4723c10321a"
Apr 16 22:55:30.301844 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:30.301751 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2578659cad7a24621c678402e4c5cef7449680b3c22a0d3e0c8cd4723c10321a"} err="failed to get container status \"2578659cad7a24621c678402e4c5cef7449680b3c22a0d3e0c8cd4723c10321a\": rpc error: code = NotFound desc = could not find container \"2578659cad7a24621c678402e4c5cef7449680b3c22a0d3e0c8cd4723c10321a\": container with ID starting with 2578659cad7a24621c678402e4c5cef7449680b3c22a0d3e0c8cd4723c10321a not found: ID does not exist"
Apr 16 22:55:30.301844 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:30.301768 2576 scope.go:117] "RemoveContainer" containerID="e2b924fc0fc81d4418813c6f486133921420074c832d1eaef3d897696858fbb9"
Apr 16 22:55:30.301973 ip-10-0-130-227 kubenswrapper[2576]: E0416 22:55:30.301960 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2b924fc0fc81d4418813c6f486133921420074c832d1eaef3d897696858fbb9\": container with ID starting with e2b924fc0fc81d4418813c6f486133921420074c832d1eaef3d897696858fbb9 not found: ID does not exist" containerID="e2b924fc0fc81d4418813c6f486133921420074c832d1eaef3d897696858fbb9"
Apr 16 22:55:30.302009 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:30.301978 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2b924fc0fc81d4418813c6f486133921420074c832d1eaef3d897696858fbb9"} err="failed to get container status \"e2b924fc0fc81d4418813c6f486133921420074c832d1eaef3d897696858fbb9\": rpc error: code = NotFound desc = could not find container \"e2b924fc0fc81d4418813c6f486133921420074c832d1eaef3d897696858fbb9\": container with ID starting with e2b924fc0fc81d4418813c6f486133921420074c832d1eaef3d897696858fbb9 not found: ID does not exist"
Apr 16 22:55:30.307110 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:30.307059 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4"]
Apr 16 22:55:30.310496 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:30.310477 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3d312-predictor-86dbd8855f-w5rc4"]
Apr 16 22:55:31.019383 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:55:31.019349 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1633da0-f07f-456d-a670-765543a5ebfd" path="/var/lib/kubelet/pods/b1633da0-f07f-456d-a670-765543a5ebfd/volumes"
Apr 16 22:58:25.157731 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:58:25.157606 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/1.log"
Apr 16 22:58:25.162519 ip-10-0-130-227 kubenswrapper[2576]: I0416 22:58:25.162501 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/1.log"
Apr 16 23:02:47.418364 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:47.418291 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd"]
containerID="cri-o://cf13f7231af71d32fd8233f88ee3472b44a7c138dcd158c4d1c56f80dbb6a7a2" gracePeriod=30 Apr 16 23:02:47.418886 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:47.418718 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" podUID="962e73ad-4e33-4700-87d0-5ea46d65e8a9" containerName="kube-rbac-proxy" containerID="cri-o://5b461956ff41248b2171bff5cc9dead2b3e7d96fb1f569bce08cbf7a64c9cce0" gracePeriod=30 Apr 16 23:02:47.723873 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:47.723786 2576 generic.go:358] "Generic (PLEG): container finished" podID="962e73ad-4e33-4700-87d0-5ea46d65e8a9" containerID="5b461956ff41248b2171bff5cc9dead2b3e7d96fb1f569bce08cbf7a64c9cce0" exitCode=2 Apr 16 23:02:47.723873 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:47.723852 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" event={"ID":"962e73ad-4e33-4700-87d0-5ea46d65e8a9","Type":"ContainerDied","Data":"5b461956ff41248b2171bff5cc9dead2b3e7d96fb1f569bce08cbf7a64c9cce0"} Apr 16 23:02:48.430987 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.430953 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bcjnq/must-gather-dswv5"] Apr 16 23:02:48.431384 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.431303 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1633da0-f07f-456d-a670-765543a5ebfd" containerName="kserve-container" Apr 16 23:02:48.431384 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.431314 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1633da0-f07f-456d-a670-765543a5ebfd" containerName="kserve-container" Apr 16 23:02:48.431384 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.431326 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1633da0-f07f-456d-a670-765543a5ebfd" containerName="kube-rbac-proxy" Apr 16 23:02:48.431384 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.431331 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1633da0-f07f-456d-a670-765543a5ebfd" containerName="kube-rbac-proxy" Apr 16 23:02:48.431384 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.431338 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="982484e0-9e62-48e0-949c-c1ff24170fd7" containerName="kube-rbac-proxy" Apr 16 23:02:48.431384 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.431344 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="982484e0-9e62-48e0-949c-c1ff24170fd7" containerName="kube-rbac-proxy" Apr 16 23:02:48.431384 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.431356 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="982484e0-9e62-48e0-949c-c1ff24170fd7" containerName="kserve-container" Apr 16 23:02:48.431384 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.431361 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="982484e0-9e62-48e0-949c-c1ff24170fd7" containerName="kserve-container" Apr 16 23:02:48.431700 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.431416 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="982484e0-9e62-48e0-949c-c1ff24170fd7" containerName="kube-rbac-proxy" Apr 16 23:02:48.431700 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.431426 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="982484e0-9e62-48e0-949c-c1ff24170fd7" 
containerName="kserve-container" Apr 16 23:02:48.431700 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.431432 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b1633da0-f07f-456d-a670-765543a5ebfd" containerName="kube-rbac-proxy" Apr 16 23:02:48.431700 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.431441 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b1633da0-f07f-456d-a670-765543a5ebfd" containerName="kserve-container" Apr 16 23:02:48.434346 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.434330 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bcjnq/must-gather-dswv5" Apr 16 23:02:48.437743 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.437717 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-bcjnq\"/\"openshift-service-ca.crt\"" Apr 16 23:02:48.438860 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.438839 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-bcjnq\"/\"kube-root-ca.crt\"" Apr 16 23:02:48.438958 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.438914 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-bcjnq\"/\"default-dockercfg-5cdqg\"" Apr 16 23:02:48.448198 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.448175 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bcjnq/must-gather-dswv5"] Apr 16 23:02:48.495975 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.495950 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/92fb86f6-7356-4eb4-8cde-5c25dd565e56-must-gather-output\") pod \"must-gather-dswv5\" (UID: \"92fb86f6-7356-4eb4-8cde-5c25dd565e56\") " pod="openshift-must-gather-bcjnq/must-gather-dswv5" Apr 16 23:02:48.496108 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.496010 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4plww\" (UniqueName: \"kubernetes.io/projected/92fb86f6-7356-4eb4-8cde-5c25dd565e56-kube-api-access-4plww\") pod \"must-gather-dswv5\" (UID: \"92fb86f6-7356-4eb4-8cde-5c25dd565e56\") " pod="openshift-must-gather-bcjnq/must-gather-dswv5" Apr 16 23:02:48.596385 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.596355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/92fb86f6-7356-4eb4-8cde-5c25dd565e56-must-gather-output\") pod \"must-gather-dswv5\" (UID: \"92fb86f6-7356-4eb4-8cde-5c25dd565e56\") " pod="openshift-must-gather-bcjnq/must-gather-dswv5" Apr 16 23:02:48.596533 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.596419 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4plww\" (UniqueName: \"kubernetes.io/projected/92fb86f6-7356-4eb4-8cde-5c25dd565e56-kube-api-access-4plww\") pod \"must-gather-dswv5\" (UID: \"92fb86f6-7356-4eb4-8cde-5c25dd565e56\") " pod="openshift-must-gather-bcjnq/must-gather-dswv5" Apr 16 23:02:48.596716 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.596695 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/92fb86f6-7356-4eb4-8cde-5c25dd565e56-must-gather-output\") pod \"must-gather-dswv5\" (UID: 
\"92fb86f6-7356-4eb4-8cde-5c25dd565e56\") " pod="openshift-must-gather-bcjnq/must-gather-dswv5" Apr 16 23:02:48.603960 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.603938 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4plww\" (UniqueName: \"kubernetes.io/projected/92fb86f6-7356-4eb4-8cde-5c25dd565e56-kube-api-access-4plww\") pod \"must-gather-dswv5\" (UID: \"92fb86f6-7356-4eb4-8cde-5c25dd565e56\") " pod="openshift-must-gather-bcjnq/must-gather-dswv5" Apr 16 23:02:48.760521 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.760493 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bcjnq/must-gather-dswv5" Apr 16 23:02:48.879908 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.879882 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bcjnq/must-gather-dswv5"] Apr 16 23:02:48.881821 ip-10-0-130-227 kubenswrapper[2576]: W0416 23:02:48.881793 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92fb86f6_7356_4eb4_8cde_5c25dd565e56.slice/crio-4f9ef03df68ec09fa202715140c4f7afe698d3f7772277c8eef3b4914f750be0 WatchSource:0}: Error finding container 4f9ef03df68ec09fa202715140c4f7afe698d3f7772277c8eef3b4914f750be0: Status 404 returned error can't find the container with id 4f9ef03df68ec09fa202715140c4f7afe698d3f7772277c8eef3b4914f750be0 Apr 16 23:02:48.883416 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:48.883399 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 23:02:49.731877 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:49.731839 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bcjnq/must-gather-dswv5" event={"ID":"92fb86f6-7356-4eb4-8cde-5c25dd565e56","Type":"ContainerStarted","Data":"4f9ef03df68ec09fa202715140c4f7afe698d3f7772277c8eef3b4914f750be0"} Apr 16 23:02:50.295497 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:50.295449 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" podUID="962e73ad-4e33-4700-87d0-5ea46d65e8a9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.38:8643/healthz\": dial tcp 10.132.0.38:8643: connect: connection refused" Apr 16 23:02:50.685822 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:50.685796 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" Apr 16 23:02:50.715812 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:50.715784 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/962e73ad-4e33-4700-87d0-5ea46d65e8a9-proxy-tls\") pod \"962e73ad-4e33-4700-87d0-5ea46d65e8a9\" (UID: \"962e73ad-4e33-4700-87d0-5ea46d65e8a9\") " Apr 16 23:02:50.715981 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:50.715836 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5wr4\" (UniqueName: \"kubernetes.io/projected/962e73ad-4e33-4700-87d0-5ea46d65e8a9-kube-api-access-n5wr4\") pod \"962e73ad-4e33-4700-87d0-5ea46d65e8a9\" (UID: \"962e73ad-4e33-4700-87d0-5ea46d65e8a9\") " Apr 16 23:02:50.716054 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:50.716000 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-a9d3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/962e73ad-4e33-4700-87d0-5ea46d65e8a9-error-404-isvc-a9d3d-kube-rbac-proxy-sar-config\") pod \"962e73ad-4e33-4700-87d0-5ea46d65e8a9\" (UID: \"962e73ad-4e33-4700-87d0-5ea46d65e8a9\") " Apr 16 23:02:50.716327 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:50.716299 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/962e73ad-4e33-4700-87d0-5ea46d65e8a9-error-404-isvc-a9d3d-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-a9d3d-kube-rbac-proxy-sar-config") pod "962e73ad-4e33-4700-87d0-5ea46d65e8a9" (UID: "962e73ad-4e33-4700-87d0-5ea46d65e8a9"). InnerVolumeSpecName "error-404-isvc-a9d3d-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:02:50.717884 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:50.717858 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/962e73ad-4e33-4700-87d0-5ea46d65e8a9-kube-api-access-n5wr4" (OuterVolumeSpecName: "kube-api-access-n5wr4") pod "962e73ad-4e33-4700-87d0-5ea46d65e8a9" (UID: "962e73ad-4e33-4700-87d0-5ea46d65e8a9"). InnerVolumeSpecName "kube-api-access-n5wr4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:02:50.717991 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:50.717928 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962e73ad-4e33-4700-87d0-5ea46d65e8a9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "962e73ad-4e33-4700-87d0-5ea46d65e8a9" (UID: "962e73ad-4e33-4700-87d0-5ea46d65e8a9"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:02:50.736554 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:50.736521 2576 generic.go:358] "Generic (PLEG): container finished" podID="962e73ad-4e33-4700-87d0-5ea46d65e8a9" containerID="cf13f7231af71d32fd8233f88ee3472b44a7c138dcd158c4d1c56f80dbb6a7a2" exitCode=0 Apr 16 23:02:50.736926 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:50.736589 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" event={"ID":"962e73ad-4e33-4700-87d0-5ea46d65e8a9","Type":"ContainerDied","Data":"cf13f7231af71d32fd8233f88ee3472b44a7c138dcd158c4d1c56f80dbb6a7a2"} Apr 16 23:02:50.736926 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:50.736605 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" Apr 16 23:02:50.736926 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:50.736625 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd" event={"ID":"962e73ad-4e33-4700-87d0-5ea46d65e8a9","Type":"ContainerDied","Data":"55b85e6f790a236f9883e5dc25e7eff0433423f3c48ad56ada64e922f5710aea"} Apr 16 23:02:50.736926 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:50.736645 2576 scope.go:117] "RemoveContainer" containerID="5b461956ff41248b2171bff5cc9dead2b3e7d96fb1f569bce08cbf7a64c9cce0" Apr 16 23:02:50.761189 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:50.761139 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd"] Apr 16 23:02:50.762500 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:50.762459 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a9d3d-predictor-59878dc55-5kshd"] Apr 16 23:02:50.817392 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:50.817357 2576 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-a9d3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/962e73ad-4e33-4700-87d0-5ea46d65e8a9-error-404-isvc-a9d3d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 23:02:50.817392 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:50.817385 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/962e73ad-4e33-4700-87d0-5ea46d65e8a9-proxy-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 23:02:50.817392 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:50.817396 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n5wr4\" (UniqueName: \"kubernetes.io/projected/962e73ad-4e33-4700-87d0-5ea46d65e8a9-kube-api-access-n5wr4\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 23:02:51.022663 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:51.022632 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="962e73ad-4e33-4700-87d0-5ea46d65e8a9" path="/var/lib/kubelet/pods/962e73ad-4e33-4700-87d0-5ea46d65e8a9/volumes" Apr 16 23:02:52.068532 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:52.068379 2576 scope.go:117] "RemoveContainer" containerID="cf13f7231af71d32fd8233f88ee3472b44a7c138dcd158c4d1c56f80dbb6a7a2" Apr 16 23:02:52.968421 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:52.968386 2576 scope.go:117] "RemoveContainer" containerID="5b461956ff41248b2171bff5cc9dead2b3e7d96fb1f569bce08cbf7a64c9cce0" Apr 16 23:02:52.968731 ip-10-0-130-227 kubenswrapper[2576]: E0416 23:02:52.968704 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b461956ff41248b2171bff5cc9dead2b3e7d96fb1f569bce08cbf7a64c9cce0\": container with ID starting with 5b461956ff41248b2171bff5cc9dead2b3e7d96fb1f569bce08cbf7a64c9cce0 not found: ID does not exist" containerID="5b461956ff41248b2171bff5cc9dead2b3e7d96fb1f569bce08cbf7a64c9cce0" Apr 16 23:02:52.968818 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:52.968741 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b461956ff41248b2171bff5cc9dead2b3e7d96fb1f569bce08cbf7a64c9cce0"} err="failed to get container status 
\"5b461956ff41248b2171bff5cc9dead2b3e7d96fb1f569bce08cbf7a64c9cce0\": rpc error: code = NotFound desc = could not find container \"5b461956ff41248b2171bff5cc9dead2b3e7d96fb1f569bce08cbf7a64c9cce0\": container with ID starting with 5b461956ff41248b2171bff5cc9dead2b3e7d96fb1f569bce08cbf7a64c9cce0 not found: ID does not exist" Apr 16 23:02:52.968818 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:52.968762 2576 scope.go:117] "RemoveContainer" containerID="cf13f7231af71d32fd8233f88ee3472b44a7c138dcd158c4d1c56f80dbb6a7a2" Apr 16 23:02:52.969066 ip-10-0-130-227 kubenswrapper[2576]: E0416 23:02:52.969047 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf13f7231af71d32fd8233f88ee3472b44a7c138dcd158c4d1c56f80dbb6a7a2\": container with ID starting with cf13f7231af71d32fd8233f88ee3472b44a7c138dcd158c4d1c56f80dbb6a7a2 not found: ID does not exist" containerID="cf13f7231af71d32fd8233f88ee3472b44a7c138dcd158c4d1c56f80dbb6a7a2" Apr 16 23:02:52.969107 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:52.969074 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf13f7231af71d32fd8233f88ee3472b44a7c138dcd158c4d1c56f80dbb6a7a2"} err="failed to get container status \"cf13f7231af71d32fd8233f88ee3472b44a7c138dcd158c4d1c56f80dbb6a7a2\": rpc error: code = NotFound desc = could not find container \"cf13f7231af71d32fd8233f88ee3472b44a7c138dcd158c4d1c56f80dbb6a7a2\": container with ID starting with cf13f7231af71d32fd8233f88ee3472b44a7c138dcd158c4d1c56f80dbb6a7a2 not found: ID does not exist" Apr 16 23:02:53.756707 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:53.756643 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bcjnq/must-gather-dswv5" event={"ID":"92fb86f6-7356-4eb4-8cde-5c25dd565e56","Type":"ContainerStarted","Data":"43e6d903bd0524529fa336ba6e5adf3490ba17f23a2ec80afa452f4e74eab259"} Apr 16 23:02:53.756707 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:53.756713 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bcjnq/must-gather-dswv5" event={"ID":"92fb86f6-7356-4eb4-8cde-5c25dd565e56","Type":"ContainerStarted","Data":"f2a8e738199cabd7704e31213a37fa24a6d847e4d26b0f749fe8b64d74bea908"} Apr 16 23:02:53.772269 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:02:53.772212 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bcjnq/must-gather-dswv5" podStartSLOduration=1.582217317 podStartE2EDuration="5.772195467s" podCreationTimestamp="2026-04-16 23:02:48 +0000 UTC" firstStartedPulling="2026-04-16 23:02:48.883522956 +0000 UTC m=+2964.416863787" lastFinishedPulling="2026-04-16 23:02:53.073501088 +0000 UTC m=+2968.606841937" observedRunningTime="2026-04-16 23:02:53.770727182 +0000 UTC m=+2969.304068035" watchObservedRunningTime="2026-04-16 23:02:53.772195467 +0000 UTC m=+2969.305536322" Apr 16 23:03:11.822097 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:11.822060 2576 generic.go:358] "Generic (PLEG): container finished" podID="92fb86f6-7356-4eb4-8cde-5c25dd565e56" containerID="f2a8e738199cabd7704e31213a37fa24a6d847e4d26b0f749fe8b64d74bea908" exitCode=0 Apr 16 23:03:11.822713 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:11.822137 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bcjnq/must-gather-dswv5" 
event={"ID":"92fb86f6-7356-4eb4-8cde-5c25dd565e56","Type":"ContainerDied","Data":"f2a8e738199cabd7704e31213a37fa24a6d847e4d26b0f749fe8b64d74bea908"} Apr 16 23:03:11.822713 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:11.822447 2576 scope.go:117] "RemoveContainer" containerID="f2a8e738199cabd7704e31213a37fa24a6d847e4d26b0f749fe8b64d74bea908" Apr 16 23:03:12.048069 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:12.048035 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bcjnq_must-gather-dswv5_92fb86f6-7356-4eb4-8cde-5c25dd565e56/gather/0.log" Apr 16 23:03:12.599728 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:12.599693 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vzm9p/must-gather-nf89m"] Apr 16 23:03:12.600045 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:12.600033 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="962e73ad-4e33-4700-87d0-5ea46d65e8a9" containerName="kserve-container" Apr 16 23:03:12.600092 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:12.600047 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="962e73ad-4e33-4700-87d0-5ea46d65e8a9" containerName="kserve-container" Apr 16 23:03:12.600092 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:12.600063 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="962e73ad-4e33-4700-87d0-5ea46d65e8a9" containerName="kube-rbac-proxy" Apr 16 23:03:12.600092 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:12.600069 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="962e73ad-4e33-4700-87d0-5ea46d65e8a9" containerName="kube-rbac-proxy" Apr 16 23:03:12.600186 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:12.600131 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="962e73ad-4e33-4700-87d0-5ea46d65e8a9" containerName="kserve-container" Apr 16 23:03:12.600186 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:12.600140 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="962e73ad-4e33-4700-87d0-5ea46d65e8a9" containerName="kube-rbac-proxy" Apr 16 23:03:12.602639 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:12.602622 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vzm9p/must-gather-nf89m" Apr 16 23:03:12.605142 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:12.605119 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vzm9p\"/\"kube-root-ca.crt\"" Apr 16 23:03:12.605302 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:12.605194 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vzm9p\"/\"openshift-service-ca.crt\"" Apr 16 23:03:12.605302 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:12.605215 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vzm9p\"/\"default-dockercfg-bgrbr\"" Apr 16 23:03:12.609735 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:12.609518 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vzm9p/must-gather-nf89m"] Apr 16 23:03:12.712918 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:12.712887 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3083eade-56de-43b8-a02c-3d70cdead9d1-must-gather-output\") pod \"must-gather-nf89m\" (UID: \"3083eade-56de-43b8-a02c-3d70cdead9d1\") " pod="openshift-must-gather-vzm9p/must-gather-nf89m" Apr 16 23:03:12.713056 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:12.712932 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kj9w\" (UniqueName: \"kubernetes.io/projected/3083eade-56de-43b8-a02c-3d70cdead9d1-kube-api-access-4kj9w\") pod \"must-gather-nf89m\" (UID: \"3083eade-56de-43b8-a02c-3d70cdead9d1\") " pod="openshift-must-gather-vzm9p/must-gather-nf89m" Apr 16 23:03:12.813367 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:12.813332 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3083eade-56de-43b8-a02c-3d70cdead9d1-must-gather-output\") pod \"must-gather-nf89m\" (UID: \"3083eade-56de-43b8-a02c-3d70cdead9d1\") " pod="openshift-must-gather-vzm9p/must-gather-nf89m" Apr 16 23:03:12.813524 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:12.813394 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4kj9w\" (UniqueName: \"kubernetes.io/projected/3083eade-56de-43b8-a02c-3d70cdead9d1-kube-api-access-4kj9w\") pod \"must-gather-nf89m\" (UID: \"3083eade-56de-43b8-a02c-3d70cdead9d1\") " pod="openshift-must-gather-vzm9p/must-gather-nf89m" Apr 16 23:03:12.813711 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:12.813690 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3083eade-56de-43b8-a02c-3d70cdead9d1-must-gather-output\") pod \"must-gather-nf89m\" (UID: \"3083eade-56de-43b8-a02c-3d70cdead9d1\") " pod="openshift-must-gather-vzm9p/must-gather-nf89m" Apr 16 23:03:12.821485 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:12.821462 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kj9w\" (UniqueName: \"kubernetes.io/projected/3083eade-56de-43b8-a02c-3d70cdead9d1-kube-api-access-4kj9w\") pod \"must-gather-nf89m\" (UID: \"3083eade-56de-43b8-a02c-3d70cdead9d1\") " pod="openshift-must-gather-vzm9p/must-gather-nf89m" Apr 16 23:03:12.912565 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:12.912498 2576 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-must-gather-vzm9p/must-gather-nf89m" Apr 16 23:03:13.031307 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:13.031272 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vzm9p/must-gather-nf89m"] Apr 16 23:03:13.034003 ip-10-0-130-227 kubenswrapper[2576]: W0416 23:03:13.033975 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3083eade_56de_43b8_a02c_3d70cdead9d1.slice/crio-9a0a2762e14590976e0f449a1c0058ca52b3c941fc8e315db84c502f01e86ca3 WatchSource:0}: Error finding container 9a0a2762e14590976e0f449a1c0058ca52b3c941fc8e315db84c502f01e86ca3: Status 404 returned error can't find the container with id 9a0a2762e14590976e0f449a1c0058ca52b3c941fc8e315db84c502f01e86ca3 Apr 16 23:03:13.830695 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:13.830636 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzm9p/must-gather-nf89m" event={"ID":"3083eade-56de-43b8-a02c-3d70cdead9d1","Type":"ContainerStarted","Data":"9a0a2762e14590976e0f449a1c0058ca52b3c941fc8e315db84c502f01e86ca3"} Apr 16 23:03:14.836945 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:14.836906 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzm9p/must-gather-nf89m" event={"ID":"3083eade-56de-43b8-a02c-3d70cdead9d1","Type":"ContainerStarted","Data":"cef87920f3c760c91ef7c64f9bee5fa0846e85a19f342876d77bf7757f274daf"} Apr 16 23:03:14.836945 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:14.836946 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzm9p/must-gather-nf89m" event={"ID":"3083eade-56de-43b8-a02c-3d70cdead9d1","Type":"ContainerStarted","Data":"4d8c8fb7a7e3902cb4e6b736fa029e18a2dd43f5365b8247107278f520f6f46d"} Apr 16 23:03:14.852702 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:14.852627 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vzm9p/must-gather-nf89m" podStartSLOduration=2.012243767 podStartE2EDuration="2.852608105s" podCreationTimestamp="2026-04-16 23:03:12 +0000 UTC" firstStartedPulling="2026-04-16 23:03:13.035568678 +0000 UTC m=+2988.568909509" lastFinishedPulling="2026-04-16 23:03:13.875933013 +0000 UTC m=+2989.409273847" observedRunningTime="2026-04-16 23:03:14.851267574 +0000 UTC m=+2990.384608428" watchObservedRunningTime="2026-04-16 23:03:14.852608105 +0000 UTC m=+2990.385948960" Apr 16 23:03:15.142416 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:15.142323 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-76zkl_4074edbf-bd19-4b41-b90e-a27b5cc066e7/global-pull-secret-syncer/0.log" Apr 16 23:03:15.275882 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:15.275848 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-6lpzp_d3ae4fa7-c3ae-4639-be2a-f39b4666fae0/konnectivity-agent/0.log" Apr 16 23:03:15.361562 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:15.361529 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-227.ec2.internal_1e12d5376c1eb684628dd9cf17dac4a9/haproxy/0.log" Apr 16 23:03:17.448145 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:17.448106 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bcjnq/must-gather-dswv5"] Apr 16 23:03:17.448828 ip-10-0-130-227 kubenswrapper[2576]: I0416 
23:03:17.448379 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-bcjnq/must-gather-dswv5" podUID="92fb86f6-7356-4eb4-8cde-5c25dd565e56" containerName="copy" containerID="cri-o://43e6d903bd0524529fa336ba6e5adf3490ba17f23a2ec80afa452f4e74eab259" gracePeriod=2 Apr 16 23:03:17.452453 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:17.452426 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bcjnq/must-gather-dswv5"] Apr 16 23:03:17.453648 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:17.452930 2576 status_manager.go:895] "Failed to get status for pod" podUID="92fb86f6-7356-4eb4-8cde-5c25dd565e56" pod="openshift-must-gather-bcjnq/must-gather-dswv5" err="pods \"must-gather-dswv5\" is forbidden: User \"system:node:ip-10-0-130-227.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-bcjnq\": no relationship found between node 'ip-10-0-130-227.ec2.internal' and this object" Apr 16 23:03:17.846694 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:17.843388 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bcjnq_must-gather-dswv5_92fb86f6-7356-4eb4-8cde-5c25dd565e56/copy/0.log" Apr 16 23:03:17.846694 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:17.843891 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bcjnq/must-gather-dswv5" Apr 16 23:03:17.856289 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:17.847057 2576 status_manager.go:895] "Failed to get status for pod" podUID="92fb86f6-7356-4eb4-8cde-5c25dd565e56" pod="openshift-must-gather-bcjnq/must-gather-dswv5" err="pods \"must-gather-dswv5\" is forbidden: User \"system:node:ip-10-0-130-227.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-bcjnq\": no relationship found between node 'ip-10-0-130-227.ec2.internal' and this object" Apr 16 23:03:17.856289 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:17.851391 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bcjnq_must-gather-dswv5_92fb86f6-7356-4eb4-8cde-5c25dd565e56/copy/0.log" Apr 16 23:03:17.856289 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:17.851768 2576 generic.go:358] "Generic (PLEG): container finished" podID="92fb86f6-7356-4eb4-8cde-5c25dd565e56" containerID="43e6d903bd0524529fa336ba6e5adf3490ba17f23a2ec80afa452f4e74eab259" exitCode=143 Apr 16 23:03:17.856289 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:17.851884 2576 scope.go:117] "RemoveContainer" containerID="43e6d903bd0524529fa336ba6e5adf3490ba17f23a2ec80afa452f4e74eab259" Apr 16 23:03:17.856289 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:17.852003 2576 util.go:48] "No ready sandbox for pod can be found. 
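
Note the exit codes in this teardown: the copy container finishes with exitCode=143 after the gracePeriod=2 kill above, which is the conventional 128+signal encoding for a process terminated by SIGTERM (128 + 15 = 143), whereas the gather container earlier exited 0 on its own. A small hedged demo of where 143 comes from — ordinary Unix process handling, not kubelet code:

    package main

    import (
        "fmt"
        "os/exec"
        "syscall"
        "time"
    )

    func main() {
        cmd := exec.Command("sleep", "60")
        if err := cmd.Start(); err != nil {
            panic(err)
        }
        time.Sleep(100 * time.Millisecond)
        // A grace-period kill delivers SIGTERM first; SIGKILL follows only
        // if the process outlives the grace period.
        _ = cmd.Process.Signal(syscall.SIGTERM)
        _ = cmd.Wait() // returns an error because the process was signaled
        ws := cmd.ProcessState.Sys().(syscall.WaitStatus)
        fmt.Println("signaled:", ws.Signaled())                  // true
        fmt.Println("reported exit code:", 128+int(ws.Signal())) // 143 for SIGTERM
    }
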
Need to start a new one" pod="openshift-must-gather-bcjnq/must-gather-dswv5" Apr 16 23:03:17.856289 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:17.854848 2576 status_manager.go:895] "Failed to get status for pod" podUID="92fb86f6-7356-4eb4-8cde-5c25dd565e56" pod="openshift-must-gather-bcjnq/must-gather-dswv5" err="pods \"must-gather-dswv5\" is forbidden: User \"system:node:ip-10-0-130-227.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-bcjnq\": no relationship found between node 'ip-10-0-130-227.ec2.internal' and this object" Apr 16 23:03:17.878932 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:17.876519 2576 scope.go:117] "RemoveContainer" containerID="f2a8e738199cabd7704e31213a37fa24a6d847e4d26b0f749fe8b64d74bea908" Apr 16 23:03:17.907717 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:17.903658 2576 scope.go:117] "RemoveContainer" containerID="43e6d903bd0524529fa336ba6e5adf3490ba17f23a2ec80afa452f4e74eab259" Apr 16 23:03:17.907887 ip-10-0-130-227 kubenswrapper[2576]: E0416 23:03:17.907842 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43e6d903bd0524529fa336ba6e5adf3490ba17f23a2ec80afa452f4e74eab259\": container with ID starting with 43e6d903bd0524529fa336ba6e5adf3490ba17f23a2ec80afa452f4e74eab259 not found: ID does not exist" containerID="43e6d903bd0524529fa336ba6e5adf3490ba17f23a2ec80afa452f4e74eab259" Apr 16 23:03:17.907961 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:17.907884 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43e6d903bd0524529fa336ba6e5adf3490ba17f23a2ec80afa452f4e74eab259"} err="failed to get container status \"43e6d903bd0524529fa336ba6e5adf3490ba17f23a2ec80afa452f4e74eab259\": rpc error: code = NotFound desc = could not find container \"43e6d903bd0524529fa336ba6e5adf3490ba17f23a2ec80afa452f4e74eab259\": container with ID starting with 43e6d903bd0524529fa336ba6e5adf3490ba17f23a2ec80afa452f4e74eab259 not found: ID does not exist" Apr 16 23:03:17.907961 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:17.907914 2576 scope.go:117] "RemoveContainer" containerID="f2a8e738199cabd7704e31213a37fa24a6d847e4d26b0f749fe8b64d74bea908" Apr 16 23:03:17.911567 ip-10-0-130-227 kubenswrapper[2576]: E0416 23:03:17.908224 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2a8e738199cabd7704e31213a37fa24a6d847e4d26b0f749fe8b64d74bea908\": container with ID starting with f2a8e738199cabd7704e31213a37fa24a6d847e4d26b0f749fe8b64d74bea908 not found: ID does not exist" containerID="f2a8e738199cabd7704e31213a37fa24a6d847e4d26b0f749fe8b64d74bea908" Apr 16 23:03:17.911567 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:17.908269 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2a8e738199cabd7704e31213a37fa24a6d847e4d26b0f749fe8b64d74bea908"} err="failed to get container status \"f2a8e738199cabd7704e31213a37fa24a6d847e4d26b0f749fe8b64d74bea908\": rpc error: code = NotFound desc = could not find container \"f2a8e738199cabd7704e31213a37fa24a6d847e4d26b0f749fe8b64d74bea908\": container with ID starting with f2a8e738199cabd7704e31213a37fa24a6d847e4d26b0f749fe8b64d74bea908 not found: ID does not exist" Apr 16 23:03:17.974771 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:17.974733 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/92fb86f6-7356-4eb4-8cde-5c25dd565e56-must-gather-output\") pod \"92fb86f6-7356-4eb4-8cde-5c25dd565e56\" (UID: \"92fb86f6-7356-4eb4-8cde-5c25dd565e56\") " Apr 16 23:03:17.974961 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:17.974849 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4plww\" (UniqueName: \"kubernetes.io/projected/92fb86f6-7356-4eb4-8cde-5c25dd565e56-kube-api-access-4plww\") pod \"92fb86f6-7356-4eb4-8cde-5c25dd565e56\" (UID: \"92fb86f6-7356-4eb4-8cde-5c25dd565e56\") " Apr 16 23:03:17.980692 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:17.978203 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92fb86f6-7356-4eb4-8cde-5c25dd565e56-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "92fb86f6-7356-4eb4-8cde-5c25dd565e56" (UID: "92fb86f6-7356-4eb4-8cde-5c25dd565e56"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:03:17.980692 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:17.979539 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92fb86f6-7356-4eb4-8cde-5c25dd565e56-kube-api-access-4plww" (OuterVolumeSpecName: "kube-api-access-4plww") pod "92fb86f6-7356-4eb4-8cde-5c25dd565e56" (UID: "92fb86f6-7356-4eb4-8cde-5c25dd565e56"). InnerVolumeSpecName "kube-api-access-4plww". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:03:18.076317 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:18.076213 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4plww\" (UniqueName: \"kubernetes.io/projected/92fb86f6-7356-4eb4-8cde-5c25dd565e56-kube-api-access-4plww\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 23:03:18.076317 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:18.076264 2576 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/92fb86f6-7356-4eb4-8cde-5c25dd565e56-must-gather-output\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 16 23:03:18.166049 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:18.166000 2576 status_manager.go:895] "Failed to get status for pod" podUID="92fb86f6-7356-4eb4-8cde-5c25dd565e56" pod="openshift-must-gather-bcjnq/must-gather-dswv5" err="pods \"must-gather-dswv5\" is forbidden: User \"system:node:ip-10-0-130-227.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-bcjnq\": no relationship found between node 'ip-10-0-130-227.ec2.internal' and this object" Apr 16 23:03:19.022796 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:19.022757 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92fb86f6-7356-4eb4-8cde-5c25dd565e56" path="/var/lib/kubelet/pods/92fb86f6-7356-4eb4-8cde-5c25dd565e56/volumes" Apr 16 23:03:19.327056 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:19.326926 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mrwb7_36a7c67c-1021-4738-8021-e4430bef3530/node-exporter/0.log" Apr 16 23:03:19.347519 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:19.347493 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mrwb7_36a7c67c-1021-4738-8021-e4430bef3530/kube-rbac-proxy/0.log" Apr 16 23:03:19.368739 ip-10-0-130-227 kubenswrapper[2576]: I0416 
23:03:19.368703 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mrwb7_36a7c67c-1021-4738-8021-e4430bef3530/init-textfile/0.log" Apr 16 23:03:19.542829 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:19.542792 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-r5jqt_9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7/kube-rbac-proxy-main/0.log" Apr 16 23:03:19.562864 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:19.562837 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-r5jqt_9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7/kube-rbac-proxy-self/0.log" Apr 16 23:03:19.586511 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:19.586431 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-r5jqt_9a6dac50-06b7-4f4a-80b3-ef0ce5e3d8f7/openshift-state-metrics/0.log" Apr 16 23:03:19.630797 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:19.630766 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_56ec1120-485a-41be-b08a-da982885fb24/prometheus/0.log" Apr 16 23:03:19.650217 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:19.650180 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_56ec1120-485a-41be-b08a-da982885fb24/config-reloader/0.log" Apr 16 23:03:19.671694 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:19.671645 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_56ec1120-485a-41be-b08a-da982885fb24/thanos-sidecar/0.log" Apr 16 23:03:19.695761 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:19.695731 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_56ec1120-485a-41be-b08a-da982885fb24/kube-rbac-proxy-web/0.log" Apr 16 23:03:19.717228 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:19.717197 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_56ec1120-485a-41be-b08a-da982885fb24/kube-rbac-proxy/0.log" Apr 16 23:03:19.739409 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:19.739383 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_56ec1120-485a-41be-b08a-da982885fb24/kube-rbac-proxy-thanos/0.log" Apr 16 23:03:19.759541 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:19.759508 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_56ec1120-485a-41be-b08a-da982885fb24/init-config-reloader/0.log" Apr 16 23:03:19.840616 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:19.840537 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-t2x6j_f8c9f944-9faf-4679-8737-bf5a9333ec82/prometheus-operator-admission-webhook/0.log" Apr 16 23:03:21.185854 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:21.185813 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-fcxmn_f8ba909f-a641-4832-b7a1-a11849ea7211/networking-console-plugin/0.log" Apr 16 23:03:21.577181 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:21.577154 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/1.log" Apr 16 23:03:21.586407 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:21.586380 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/2.log" Apr 16 23:03:21.935784 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:21.935700 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69bdffc497-pfzk8_be0d1b5c-ef47-43d8-8f95-2bf7ab155349/console/0.log" Apr 16 23:03:21.974217 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:21.974187 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-4x9sz_20d3b92f-a244-4735-991f-c5e63025f301/download-server/0.log" Apr 16 23:03:22.342480 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.342442 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-l94dt_79d26f89-119e-4c1e-a0a0-4d0e4e546efe/volume-data-source-validator/0.log" Apr 16 23:03:22.552907 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.552874 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh"] Apr 16 23:03:22.553277 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.553259 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92fb86f6-7356-4eb4-8cde-5c25dd565e56" containerName="copy" Apr 16 23:03:22.553364 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.553280 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="92fb86f6-7356-4eb4-8cde-5c25dd565e56" containerName="copy" Apr 16 23:03:22.553364 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.553292 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92fb86f6-7356-4eb4-8cde-5c25dd565e56" containerName="gather" Apr 16 23:03:22.553364 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.553300 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="92fb86f6-7356-4eb4-8cde-5c25dd565e56" containerName="gather" Apr 16 23:03:22.553515 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.553402 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="92fb86f6-7356-4eb4-8cde-5c25dd565e56" containerName="copy" Apr 16 23:03:22.553515 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.553414 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="92fb86f6-7356-4eb4-8cde-5c25dd565e56" containerName="gather" Apr 16 23:03:22.557826 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.557805 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh" Apr 16 23:03:22.567035 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.567006 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh"] Apr 16 23:03:22.717079 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.716988 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fe047e2c-688b-4dc3-98fb-660582cf0d46-proc\") pod \"perf-node-gather-daemonset-v89jh\" (UID: \"fe047e2c-688b-4dc3-98fb-660582cf0d46\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh" Apr 16 23:03:22.717079 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.717050 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fe047e2c-688b-4dc3-98fb-660582cf0d46-sys\") pod \"perf-node-gather-daemonset-v89jh\" (UID: \"fe047e2c-688b-4dc3-98fb-660582cf0d46\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh" Apr 16 23:03:22.717314 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.717108 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fe047e2c-688b-4dc3-98fb-660582cf0d46-podres\") pod \"perf-node-gather-daemonset-v89jh\" (UID: \"fe047e2c-688b-4dc3-98fb-660582cf0d46\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh" Apr 16 23:03:22.717314 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.717140 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fe047e2c-688b-4dc3-98fb-660582cf0d46-lib-modules\") pod \"perf-node-gather-daemonset-v89jh\" (UID: \"fe047e2c-688b-4dc3-98fb-660582cf0d46\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh" Apr 16 23:03:22.717314 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.717232 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx9h7\" (UniqueName: \"kubernetes.io/projected/fe047e2c-688b-4dc3-98fb-660582cf0d46-kube-api-access-qx9h7\") pod \"perf-node-gather-daemonset-v89jh\" (UID: \"fe047e2c-688b-4dc3-98fb-660582cf0d46\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh" Apr 16 23:03:22.817664 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.817629 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qx9h7\" (UniqueName: \"kubernetes.io/projected/fe047e2c-688b-4dc3-98fb-660582cf0d46-kube-api-access-qx9h7\") pod \"perf-node-gather-daemonset-v89jh\" (UID: \"fe047e2c-688b-4dc3-98fb-660582cf0d46\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh" Apr 16 23:03:22.817874 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.817698 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fe047e2c-688b-4dc3-98fb-660582cf0d46-proc\") pod \"perf-node-gather-daemonset-v89jh\" (UID: \"fe047e2c-688b-4dc3-98fb-660582cf0d46\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh" Apr 16 23:03:22.817874 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.817731 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/fe047e2c-688b-4dc3-98fb-660582cf0d46-sys\") pod \"perf-node-gather-daemonset-v89jh\" (UID: \"fe047e2c-688b-4dc3-98fb-660582cf0d46\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh" Apr 16 23:03:22.817874 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.817779 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fe047e2c-688b-4dc3-98fb-660582cf0d46-podres\") pod \"perf-node-gather-daemonset-v89jh\" (UID: \"fe047e2c-688b-4dc3-98fb-660582cf0d46\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh" Apr 16 23:03:22.817874 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.817810 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fe047e2c-688b-4dc3-98fb-660582cf0d46-lib-modules\") pod \"perf-node-gather-daemonset-v89jh\" (UID: \"fe047e2c-688b-4dc3-98fb-660582cf0d46\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh" Apr 16 23:03:22.817874 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.817816 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fe047e2c-688b-4dc3-98fb-660582cf0d46-proc\") pod \"perf-node-gather-daemonset-v89jh\" (UID: \"fe047e2c-688b-4dc3-98fb-660582cf0d46\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh" Apr 16 23:03:22.818170 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.817957 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fe047e2c-688b-4dc3-98fb-660582cf0d46-podres\") pod \"perf-node-gather-daemonset-v89jh\" (UID: \"fe047e2c-688b-4dc3-98fb-660582cf0d46\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh" Apr 16 23:03:22.818170 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.817970 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fe047e2c-688b-4dc3-98fb-660582cf0d46-sys\") pod \"perf-node-gather-daemonset-v89jh\" (UID: \"fe047e2c-688b-4dc3-98fb-660582cf0d46\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh" Apr 16 23:03:22.818170 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.818093 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fe047e2c-688b-4dc3-98fb-660582cf0d46-lib-modules\") pod \"perf-node-gather-daemonset-v89jh\" (UID: \"fe047e2c-688b-4dc3-98fb-660582cf0d46\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh" Apr 16 23:03:22.825701 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.825660 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx9h7\" (UniqueName: \"kubernetes.io/projected/fe047e2c-688b-4dc3-98fb-660582cf0d46-kube-api-access-qx9h7\") pod \"perf-node-gather-daemonset-v89jh\" (UID: \"fe047e2c-688b-4dc3-98fb-660582cf0d46\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh" Apr 16 23:03:22.873363 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:22.871987 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh" Apr 16 23:03:23.013575 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:23.013524 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh"] Apr 16 23:03:23.016288 ip-10-0-130-227 kubenswrapper[2576]: W0416 23:03:23.016257 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podfe047e2c_688b_4dc3_98fb_660582cf0d46.slice/crio-3c380b2c7e10465b5e455dd7ef69e5abbfaea8e5e84ebc15879b2da9b6f1afde WatchSource:0}: Error finding container 3c380b2c7e10465b5e455dd7ef69e5abbfaea8e5e84ebc15879b2da9b6f1afde: Status 404 returned error can't find the container with id 3c380b2c7e10465b5e455dd7ef69e5abbfaea8e5e84ebc15879b2da9b6f1afde Apr 16 23:03:23.045538 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:23.045511 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fbgc2_a290a4ed-5ccc-47be-bd46-836ba21fea56/dns/0.log" Apr 16 23:03:23.069800 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:23.069782 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fbgc2_a290a4ed-5ccc-47be-bd46-836ba21fea56/kube-rbac-proxy/0.log" Apr 16 23:03:23.134307 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:23.134280 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9vfz7_97863583-437e-48ec-8c04-9cb36c6d5a89/dns-node-resolver/0.log" Apr 16 23:03:23.637573 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:23.637540 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xr5zn_07d69a7c-22b7-44a4-8aea-024e47f7912b/node-ca/0.log" Apr 16 23:03:23.878699 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:23.878643 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh" event={"ID":"fe047e2c-688b-4dc3-98fb-660582cf0d46","Type":"ContainerStarted","Data":"ad50f495db6b7f013c06693a9636c885b57e32a35f44f5d5ca1e5ce811cf948b"} Apr 16 23:03:23.878699 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:23.878692 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh" event={"ID":"fe047e2c-688b-4dc3-98fb-660582cf0d46","Type":"ContainerStarted","Data":"3c380b2c7e10465b5e455dd7ef69e5abbfaea8e5e84ebc15879b2da9b6f1afde"} Apr 16 23:03:23.878908 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:23.878782 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh" Apr 16 23:03:23.896655 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:23.896564 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh" podStartSLOduration=1.896551014 podStartE2EDuration="1.896551014s" podCreationTimestamp="2026-04-16 23:03:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:03:23.895941498 +0000 UTC m=+2999.429282363" watchObservedRunningTime="2026-04-16 23:03:23.896551014 +0000 UTC m=+2999.429891865" Apr 16 23:03:24.683197 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:24.683165 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-s8bss_49f46636-9c8e-44e1-88ec-d5c207868f31/serve-healthcheck-canary/0.log" Apr 16 23:03:25.031002 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:25.030972 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-rpsvs_ed1e6036-1beb-445f-8b61-65e736181605/insights-operator/0.log" Apr 16 23:03:25.031568 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:25.031542 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-rpsvs_ed1e6036-1beb-445f-8b61-65e736181605/insights-operator/1.log" Apr 16 23:03:25.050795 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:25.050772 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-54956_1a9ba876-91ba-49b9-87b0-90be2057ee0a/kube-rbac-proxy/0.log" Apr 16 23:03:25.070709 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:25.070667 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-54956_1a9ba876-91ba-49b9-87b0-90be2057ee0a/exporter/0.log" Apr 16 23:03:25.090600 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:25.090570 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-54956_1a9ba876-91ba-49b9-87b0-90be2057ee0a/extractor/0.log" Apr 16 23:03:25.186816 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:25.186716 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/1.log" Apr 16 23:03:25.205042 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:25.190961 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7r6sf_ba403d08-6963-4274-8dcb-309016f31037/console-operator/1.log" Apr 16 23:03:27.280552 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:27.280516 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-59r9p_1aa1eaad-9f70-457a-8176-c63289f84a68/server/0.log" Apr 16 23:03:27.770072 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:27.770040 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-qgtjz_7d876dfd-77c0-49a2-9c5a-eee24f383f55/manager/0.log" Apr 16 23:03:27.788101 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:27.788077 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-zdfks_7a9f6297-f5f9-4dff-9ef5-05f34355699d/s3-init/0.log" Apr 16 23:03:27.814498 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:27.814472 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-s8g9n_845f0fa7-49c2-4f4b-a017-9f376b6c1499/seaweedfs/0.log" Apr 16 23:03:29.892377 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:29.892337 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-v89jh" Apr 16 23:03:31.366095 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:31.366064 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-fvvh4_3b80a9f7-2b18-4471-a955-6e15ca0536b3/migrator/0.log" Apr 16 23:03:31.390803 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:31.390775 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-fvvh4_3b80a9f7-2b18-4471-a955-6e15ca0536b3/graceful-termination/0.log" Apr 16 23:03:31.742171 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:31.742138 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-749dh_cbfb5b76-e002-4f88-a22a-9d55fd6b9348/kube-storage-version-migrator-operator/1.log" Apr 16 23:03:31.743369 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:31.743345 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-749dh_cbfb5b76-e002-4f88-a22a-9d55fd6b9348/kube-storage-version-migrator-operator/0.log" Apr 16 23:03:32.989144 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:32.989098 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xn6mm_fab36813-c9b9-4c2c-aa71-55346680c966/kube-multus-additional-cni-plugins/0.log" Apr 16 23:03:33.016836 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:33.016801 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xn6mm_fab36813-c9b9-4c2c-aa71-55346680c966/egress-router-binary-copy/0.log" Apr 16 23:03:33.038717 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:33.038662 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xn6mm_fab36813-c9b9-4c2c-aa71-55346680c966/cni-plugins/0.log" Apr 16 23:03:33.062822 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:33.062794 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xn6mm_fab36813-c9b9-4c2c-aa71-55346680c966/bond-cni-plugin/0.log" Apr 16 23:03:33.085738 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:33.085703 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xn6mm_fab36813-c9b9-4c2c-aa71-55346680c966/routeoverride-cni/0.log" Apr 16 23:03:33.110195 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:33.110162 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xn6mm_fab36813-c9b9-4c2c-aa71-55346680c966/whereabouts-cni-bincopy/0.log" Apr 16 23:03:33.132045 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:33.132022 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xn6mm_fab36813-c9b9-4c2c-aa71-55346680c966/whereabouts-cni/0.log" Apr 16 23:03:33.171348 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:33.171302 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hqm5z_77967758-2942-4cb4-aa90-9b16761c46b3/kube-multus/0.log" Apr 16 23:03:33.310476 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:33.310444 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lpwn6_ddd2c4f1-e24c-431c-a59a-9936d01e4667/network-metrics-daemon/0.log" Apr 16 23:03:33.341296 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:33.341270 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lpwn6_ddd2c4f1-e24c-431c-a59a-9936d01e4667/kube-rbac-proxy/0.log" Apr 16 23:03:34.103170 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:34.103143 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k24wz_47402cab-7cba-42f8-be72-8d67c31c2e7c/ovn-controller/0.log" Apr 16 23:03:34.148297 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:34.148273 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k24wz_47402cab-7cba-42f8-be72-8d67c31c2e7c/ovn-acl-logging/0.log" Apr 16 23:03:34.170215 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:34.170153 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k24wz_47402cab-7cba-42f8-be72-8d67c31c2e7c/kube-rbac-proxy-node/0.log" Apr 16 23:03:34.190419 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:34.190397 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k24wz_47402cab-7cba-42f8-be72-8d67c31c2e7c/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 23:03:34.206300 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:34.206278 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k24wz_47402cab-7cba-42f8-be72-8d67c31c2e7c/northd/0.log" Apr 16 23:03:34.225383 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:34.225364 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k24wz_47402cab-7cba-42f8-be72-8d67c31c2e7c/nbdb/0.log" Apr 16 23:03:34.244854 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:34.244837 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k24wz_47402cab-7cba-42f8-be72-8d67c31c2e7c/sbdb/0.log" Apr 16 23:03:34.423961 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:34.423864 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k24wz_47402cab-7cba-42f8-be72-8d67c31c2e7c/ovnkube-controller/0.log" Apr 16 23:03:35.879186 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:35.879138 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-vvx4n_a7f138c6-266e-41cd-9d3e-1ad1d55c0770/check-endpoints/0.log" Apr 16 23:03:35.942539 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:35.942507 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-4kjg5_42680cd4-0a5c-4123-aae1-963237fa5b60/network-check-target-container/0.log" Apr 16 23:03:36.755050 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:36.755022 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-7mhr4_365e92c8-bcc3-46c2-9512-0fa95057726c/iptables-alerter/0.log" Apr 16 23:03:37.411689 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:37.411647 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-jdbf4_a72c732d-41ba-4c13-bf18-c21f9bf94968/tuned/0.log" Apr 16 23:03:39.011695 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:39.011652 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-k5vkd_1335948c-8695-4844-a3a5-6819848109ce/cluster-samples-operator/0.log" Apr 16 23:03:39.034041 ip-10-0-130-227 kubenswrapper[2576]: I0416 23:03:39.034017 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-k5vkd_1335948c-8695-4844-a3a5-6819848109ce/cluster-samples-operator-watch/0.log"