Apr 16 19:51:15.543214 ip-10-0-128-201 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 19:51:15.543224 ip-10-0-128-201 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 19:51:15.543231 ip-10-0-128-201 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 19:51:15.543372 ip-10-0-128-201 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 19:51:25.781874 ip-10-0-128-201 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 19:51:25.781895 ip-10-0-128-201 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot daf6983b861543b7851e1f2197d86084 --
Apr 16 19:53:42.154049 ip-10-0-128-201 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 19:53:42.629422 ip-10-0-128-201 kubenswrapper[2561]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:42.629422 ip-10-0-128-201 kubenswrapper[2561]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 19:53:42.629422 ip-10-0-128-201 kubenswrapper[2561]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:42.629422 ip-10-0-128-201 kubenswrapper[2561]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 19:53:42.630018 ip-10-0-128-201 kubenswrapper[2561]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
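The failed boot above shows two separate problems: the unit's environment file (and with it the 'start-pre' step) is missing, and the automatic restart cannot even be scheduled because crio.service does not exist on the node; systemd reports both under the generic result 'resources'. A minimal way to confirm what the unit expects, assuming ordinary systemd tooling on the host (illustrative commands, not part of the log):

    $ systemctl cat kubelet.service               # shows the EnvironmentFile= and ExecStartPre= paths the unit references
    $ systemctl list-unit-files 'crio*'           # checks whether a CRI-O unit is installed at all
    $ journalctl -b -u kubelet.service --no-pager # full kubelet unit log for the current boot

In the boot that follows the marker, both conditions are evidently resolved and the kubelet proceeds to start.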
Apr 16 19:53:42.631477 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.631396 2561 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 19:53:42.634956 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.634942 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:42.634956 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.634955 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:42.635021 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.634959 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:42.635021 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.634963 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:42.635021 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.634966 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:42.635021 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.634968 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:42.635021 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.634971 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:42.635021 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.634974 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:42.635021 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.634978 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:42.635021 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.634981 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:42.635021 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.634984 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:42.635021 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.634986 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:42.635021 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.634989 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:42.635021 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.634992 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:42.635021 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.634995 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:42.635021 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.634998 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:42.635021 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635001 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:42.635021 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635003 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:42.635021 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635006 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:42.635021 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635008 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:42.635021 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635011 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:42.635021 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635019 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:42.635484 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635021 2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:42.635484 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635025 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:42.635484 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635030 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:42.635484 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635034 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:42.635484 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635036 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:42.635484 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635039 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:42.635484 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635043 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:42.635484 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635047 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:42.635484 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635050 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:42.635484 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635053 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:42.635484 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635056 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:42.635484 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635058 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:42.635484 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635061 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:42.635484 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635064 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:42.635484 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635066 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:42.635484 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635069 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:42.635484 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635071 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:42.635484 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635074 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:42.635484 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635076 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:42.635966 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635079 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:42.635966 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635082 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:42.635966 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635085 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:42.635966 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635088 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:42.635966 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635090 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:42.635966 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635093 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:42.635966 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635095 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:42.635966 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635098 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:42.635966 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635100 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:42.635966 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635103 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:42.635966 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635105 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:42.635966 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635108 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:42.635966 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635110 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:42.635966 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635113 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:42.635966 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635116 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:42.635966 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635119 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:42.635966 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635121 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:42.635966 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635124 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:42.635966 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635127 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:42.636426 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635129 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:42.636426 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635132 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:42.636426 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635134 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:42.636426 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635137 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:42.636426 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635139 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:42.636426 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635142 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:42.636426 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635144 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:42.636426 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635147 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:42.636426 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635149 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:42.636426 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635152 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:42.636426 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635155 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:42.636426 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635157 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:42.636426 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635160 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:42.636426 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635162 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:42.636426 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635166 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:42.636426 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635168 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:42.636426 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635171 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:42.636426 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635174 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:42.636426 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635177 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:42.636426 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635179 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:42.636928 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635182 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:42.636928 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635184 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:42.636928 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635187 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:42.636928 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635189 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:42.636928 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635192 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:42.636928 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.635194 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
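(The same warning run repeats twice more later in this boot as the kubelet re-parses its gate map; the repeats are omitted here.) Every gate named above is an OpenShift cluster-level feature gate that the kubelet's own feature_gate parser does not know; the kubelet logs each at warning level and appears to otherwise ignore it, while the two gates it does recognize, KMSv1 and ServiceAccountTokenNodeBinding, are accepted with deprecation/GA notices. For reference, kubelet feature gates normally live in the featureGates map of the KubeletConfiguration file that --config points at; a minimal sketch (field names from the kubelet.config.k8s.io/v1beta1 API, gate values taken from the log):

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    featureGates:
      KMSv1: true                           # logged above as a deprecated gate being set
      ServiceAccountTokenNodeBinding: true  # logged above as a GA gate being set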
Apr 16 19:53:42.638875 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636427 2561 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 19:53:42.638875 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636441 2561 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 19:53:42.638875 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636449 2561 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 19:53:42.638875 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636453 2561 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 19:53:42.638875 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636458 2561 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 19:53:42.638875 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636461 2561 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 19:53:42.638875 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636465 2561 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 19:53:42.638875 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636470 2561 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636474 2561 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636477 2561 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636480 2561 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636484 2561 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636487 2561 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636490 2561 flags.go:64] FLAG: --cgroup-root=""
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636493 2561 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636496 2561 flags.go:64] FLAG: --client-ca-file=""
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636499 2561 flags.go:64] FLAG: --cloud-config=""
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636501 2561 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636504 2561 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636509 2561 flags.go:64] FLAG: --cluster-domain=""
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636512 2561 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636515 2561 flags.go:64] FLAG: --config-dir=""
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636518 2561 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636522 2561 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636526 2561 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636528 2561 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636532 2561 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636535 2561 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636538 2561 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636541 2561 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636544 2561 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636547 2561 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 19:53:42.639370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636551 2561 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636555 2561 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636558 2561 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636561 2561 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636564 2561 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636567 2561 flags.go:64] FLAG: --enable-server="true"
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636569 2561 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636574 2561 flags.go:64] FLAG: --event-burst="100"
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636577 2561 flags.go:64] FLAG: --event-qps="50"
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636580 2561 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636583 2561 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636586 2561 flags.go:64] FLAG: --eviction-hard=""
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636590 2561 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636593 2561 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636596 2561 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636599 2561 flags.go:64] FLAG: --eviction-soft=""
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636602 2561 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636605 2561 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636608 2561 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636611 2561 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636613 2561 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636616 2561 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636619 2561 flags.go:64] FLAG: --feature-gates=""
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636623 2561 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636626 2561 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 19:53:42.640004 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636629 2561 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 19:53:42.640595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636632 2561 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 19:53:42.640595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636636 2561 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 19:53:42.640595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636639 2561 flags.go:64] FLAG: --help="false"
Apr 16 19:53:42.640595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636642 2561 flags.go:64] FLAG: --hostname-override="ip-10-0-128-201.ec2.internal"
Apr 16 19:53:42.640595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636645 2561 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 19:53:42.640595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636648 2561 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 19:53:42.640595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636651 2561 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 19:53:42.640595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636655 2561 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 19:53:42.640595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636659 2561 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 19:53:42.640595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636662 2561 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 19:53:42.640595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636665 2561 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 19:53:42.640595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636667 2561 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 19:53:42.640595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636670 2561 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 19:53:42.640595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636673 2561 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 19:53:42.640595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636676 2561 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 19:53:42.640595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636679 2561 flags.go:64] FLAG: --kube-reserved=""
Apr 16 19:53:42.640595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636682 2561 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 19:53:42.640595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636685 2561 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 19:53:42.640595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636688 2561 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 19:53:42.640595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636691 2561 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 19:53:42.640595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636694 2561 flags.go:64] FLAG: --lock-file=""
Apr 16 19:53:42.640595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636696 2561 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 19:53:42.640595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636699 2561 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 19:53:42.640595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636703 2561 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 19:53:42.641224 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636707 2561 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 19:53:42.641224 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636710 2561 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 19:53:42.641224 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636713 2561 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 19:53:42.641224 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636716 2561 flags.go:64] FLAG: --logging-format="text"
Apr 16 19:53:42.641224 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636719 2561 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 19:53:42.641224 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636722 2561 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 19:53:42.641224 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636725 2561 flags.go:64] FLAG: --manifest-url=""
Apr 16 19:53:42.641224 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636727 2561 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 19:53:42.641224 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636732 2561 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 19:53:42.641224 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636737 2561 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 19:53:42.641224 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636741 2561 flags.go:64] FLAG: --max-pods="110"
Apr 16 19:53:42.641224 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636744 2561 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 19:53:42.641224 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636747 2561 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 19:53:42.641224 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636749 2561 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 19:53:42.641224 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636754 2561 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 19:53:42.641224 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636757 2561 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 19:53:42.641224 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636760 2561 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 19:53:42.641224 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636764 2561 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 19:53:42.641224 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636771 2561 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 19:53:42.641224 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636774 2561 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 19:53:42.641224 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636777 2561 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 19:53:42.641224 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636780 2561 flags.go:64] FLAG: --pod-cidr=""
Apr 16 19:53:42.641224 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636782 2561 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636802 2561 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636805 2561 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636809 2561 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636811 2561 flags.go:64] FLAG: --port="10250"
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636814 2561 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636817 2561 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-06214e2d4ae63e271"
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636820 2561 flags.go:64] FLAG: --qos-reserved=""
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636823 2561 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636826 2561 flags.go:64] FLAG: --register-node="true"
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636829 2561 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636832 2561 flags.go:64] FLAG: --register-with-taints=""
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636835 2561 flags.go:64] FLAG: --registry-burst="10"
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636838 2561 flags.go:64] FLAG: --registry-qps="5"
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636841 2561 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636844 2561 flags.go:64] FLAG: --reserved-memory=""
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636847 2561 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636850 2561 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636853 2561 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636858 2561 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636861 2561 flags.go:64] FLAG: --runonce="false"
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636863 2561 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636866 2561 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636870 2561 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636873 2561 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636877 2561 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 19:53:42.641761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636880 2561 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 19:53:42.642437 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636884 2561 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 19:53:42.642437 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636887 2561 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 19:53:42.642437 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636890 2561 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 19:53:42.642437 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636893 2561 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 19:53:42.642437 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636895 2561 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 19:53:42.642437 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636898 2561 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 19:53:42.642437 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636901 2561 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 19:53:42.642437 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636904 2561 flags.go:64] FLAG: --system-cgroups=""
Apr 16 19:53:42.642437 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636907 2561 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 19:53:42.642437 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636913 2561 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 19:53:42.642437 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636915 2561 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 19:53:42.642437 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636918 2561 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 19:53:42.642437 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636923 2561 flags.go:64] FLAG: --tls-min-version=""
Apr 16 19:53:42.642437 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636925 2561 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 19:53:42.642437 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636928 2561 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 19:53:42.642437 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636931 2561 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 19:53:42.642437 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636934 2561 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 19:53:42.642437 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636937 2561 flags.go:64] FLAG: --v="2"
Apr 16 19:53:42.642437 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636942 2561 flags.go:64] FLAG: --version="false"
Apr 16 19:53:42.642437 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636946 2561 flags.go:64] FLAG: --vmodule=""
Apr 16 19:53:42.642437 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636950 2561 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 19:53:42.642437 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.636953 2561 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
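The FLAG dump records only the command-line values; what actually takes effect is the merge of these flags with /etc/kubernetes/kubelet.conf, with flags winning for any field set in both places. One way to inspect the merged result is the kubelet's configz endpoint, assuming kubectl access with permission to proxy to the node (illustrative commands, not part of the log):

    $ NODE=ip-10-0-128-201.ec2.internal
    $ kubectl get --raw "/api/v1/nodes/${NODE}/proxy/configz" | python3 -m json.tool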
16 19:53:42.644072 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.637214 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 19:53:42.644575 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.637216 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:53:42.644575 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.637219 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:53:42.644575 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.637221 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:53:42.644575 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.637225 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 19:53:42.644575 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.637227 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 19:53:42.644575 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.637230 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:53:42.644575 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.637233 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 19:53:42.644575 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.637235 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:53:42.644575 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.637238 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 19:53:42.644575 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.637242 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 19:53:42.644575 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.637245 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:53:42.644575 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.637248 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 19:53:42.644575 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.637250 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 19:53:42.644575 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.637253 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 19:53:42.644575 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.637255 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 19:53:42.644575 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.637260 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 19:53:42.644575 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.637263 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 19:53:42.644575 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.637266 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 19:53:42.644575 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.637269 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:53:42.645069 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.637272 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:53:42.645069 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.637274 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:53:42.645069 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.637277 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 19:53:42.645069 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:42.637279 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 19:53:42.645069 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.637891 2561 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 19:53:42.645069 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.644029 2561 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 19:53:42.645069 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.644043 2561 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 19:53:42.649455 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.645355 2561 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 19:53:42.649455 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.647235 2561 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 19:53:42.649862 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.648408 2561 server.go:1019] "Starting client certificate rotation" Apr 16 19:53:42.649862 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.648508 2561 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 19:53:42.649862
ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.648541 2561 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 19:53:42.674675 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.674657 2561 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 19:53:42.677465 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.677442 2561 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 19:53:42.696739 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.696634 2561 log.go:25] "Validated CRI v1 runtime API" Apr 16 19:53:42.702926 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.702911 2561 log.go:25] "Validated CRI v1 image API" Apr 16 19:53:42.704197 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.704173 2561 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 19:53:42.707351 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.707335 2561 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 19:53:42.708866 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.708843 2561 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 945ff89e-d34e-4f56-915f-af15b9c6a6d3:/dev/nvme0n1p4 f1446f9d-5b96-48e3-b273-a3505e0ee0f3:/dev/nvme0n1p3] Apr 16 19:53:42.708925 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.708866 2561 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 19:53:42.714517 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.714421 2561 manager.go:217] Machine: {Timestamp:2026-04-16 19:53:42.712445552 +0000 UTC m=+0.435491451 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098117 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2fabe3fa23e2a59c711db362135ff5 SystemUUID:ec2fabe3-fa23-e2a5-9c71-1db362135ff5 BootID:daf6983b-8615-43b7-851e-1f2197d86084 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:8f:42:95:69:7b Speed:0 Mtu:9001} {Name:ens5 
MacAddress:02:8f:42:95:69:7b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:72:20:e7:f5:4f:81 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 19:53:42.714517 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.714511 2561 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 16 19:53:42.714627 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.714576 2561 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 19:53:42.715881 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.715859 2561 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 19:53:42.716014 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.715884 2561 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-128-201.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 19:53:42.716064 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.716023 2561 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 19:53:42.716064 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.716031 2561 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 19:53:42.716064 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.716044 2561 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 19:53:42.716921 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.716911 2561 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 19:53:42.717784 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.717775 2561 state_mem.go:36] "Initialized new in-memory state store" Apr 16 19:53:42.717898 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.717889 2561 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 19:53:42.720814 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.720805 2561 kubelet.go:491] "Attempting to sync node with API server" Apr 16 19:53:42.720854 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.720818 2561 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 19:53:42.720854 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.720829 2561 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 19:53:42.720854 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.720838 2561 kubelet.go:397] "Adding apiserver pod source" Apr 16 19:53:42.720854 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.720846 2561 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 19:53:42.722048 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.722033 2561 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 
19:53:42.722092 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.722060 2561 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:53:42.725204 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.725182 2561 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 19:53:42.726557 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.726543 2561 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 19:53:42.728381 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.728362 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 19:53:42.728448 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.728402 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 19:53:42.728448 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.728416 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 19:53:42.728448 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.728427 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 19:53:42.728448 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.728440 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 19:53:42.728562 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.728452 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 19:53:42.728562 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.728482 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 19:53:42.728562 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.728494 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 19:53:42.728562 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.728509 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 19:53:42.728562 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.728522 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 19:53:42.728562 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.728550 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 19:53:42.728725 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.728569 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 19:53:42.729484 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.729465 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 19:53:42.729484 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.729486 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 19:53:42.733076 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.733062 2561 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 19:53:42.733147 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.733097 2561 server.go:1295] "Started kubelet" Apr 16 19:53:42.737383 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.737310 2561 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 19:53:42.737468 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.737411 2561 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 19:53:42.737777 
ip-10-0-128-201 systemd[1]: Started Kubernetes Kubelet. Apr 16 19:53:42.737914 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:42.737890 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 19:53:42.738182 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.738155 2561 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 19:53:42.738460 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.738444 2561 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 19:53:42.738504 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:42.738463 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-201.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 19:53:42.738669 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.733184 2561 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-201.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 19:53:42.741011 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.740996 2561 server.go:317] "Adding debug handlers to kubelet server" Apr 16 19:53:42.744116 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.744098 2561 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 19:53:42.744210 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.744120 2561 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 19:53:42.744731 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.744715 2561 factory.go:55] Registering systemd factory Apr 16 19:53:42.744731 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.744734 2561 factory.go:223] Registration of the systemd container factory successfully Apr 16 19:53:42.744877 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.744827 2561 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 19:53:42.744877 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.744829 2561 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 19:53:42.744877 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.744852 2561 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 19:53:42.745003 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:42.744922 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-201.ec2.internal\" not found" Apr 16 19:53:42.745003 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.744959 2561 reconstruct.go:97] "Volume reconstruction finished" Apr 16 19:53:42.745003 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.744966 2561 reconciler.go:26] "Reconciler: start to sync state" Apr 16 19:53:42.745003 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.744993 2561 factory.go:153] Registering CRI-O factory Apr 16 19:53:42.745003 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.745003 2561 factory.go:223] Registration of the crio container factory successfully Apr 
16 19:53:42.745188 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.745065 2561 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 19:53:42.745188 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.745091 2561 factory.go:103] Registering Raw factory
Apr 16 19:53:42.745188 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.745108 2561 manager.go:1196] Started watching for new ooms in manager
Apr 16 19:53:42.745188 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:42.744090 2561 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-201.ec2.internal.18a6ee657ef10e85 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-201.ec2.internal,UID:ip-10-0-128-201.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-201.ec2.internal,},FirstTimestamp:2026-04-16 19:53:42.733074053 +0000 UTC m=+0.456119949,LastTimestamp:2026-04-16 19:53:42.733074053 +0000 UTC m=+0.456119949,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-201.ec2.internal,}"
Apr 16 19:53:42.745445 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.745433 2561 manager.go:319] Starting recovery of all containers
Apr 16 19:53:42.746053 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:42.746028 2561 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 19:53:42.751612 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:42.751583 2561 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-201.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 19:53:42.751718 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:42.751683 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 19:53:42.754694 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.754679 2561 manager.go:324] Recovery completed
Apr 16 19:53:42.756135 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.756115 2561 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-b68vb"
Apr 16 19:53:42.758910 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.758897 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:53:42.761258 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.761243 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-201.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:53:42.761319 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.761272 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-201.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:53:42.761319 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.761285 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-201.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:53:42.762234 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.762219 2561 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 19:53:42.762234 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.762231 2561 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 19:53:42.762328 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.762245 2561 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 19:53:42.764414 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.764400 2561 policy_none.go:49] "None policy: Start"
Apr 16 19:53:42.764488 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.764419 2561 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 19:53:42.764488 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.764430 2561 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 19:53:42.764818 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.764804 2561 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-b68vb"
Apr 16 19:53:42.804405 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.804384 2561 manager.go:341] "Starting Device Plugin manager"
Apr 16 19:53:42.810209 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:42.804422 2561 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 19:53:42.810209 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.804433 2561 server.go:85] "Starting device plugin registration server"
Apr 16 19:53:42.810209 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.804631 2561 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 19:53:42.810209 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.804644 2561 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 19:53:42.810209 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.804717 2561 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 19:53:42.810209 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.804812 2561 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 19:53:42.810209 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.804822 2561 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 19:53:42.810209 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:42.805330 2561 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 19:53:42.810209 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:42.805366 2561 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-201.ec2.internal\" not found"
Apr 16 19:53:42.833606 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.833584 2561 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 19:53:42.834740 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.834713 2561 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 19:53:42.834839 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.834745 2561 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 19:53:42.834839 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.834762 2561 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 19:53:42.834839 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.834768 2561 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 19:53:42.834839 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:42.834811 2561 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 19:53:42.837417 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.837397 2561 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:42.905534 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.905488 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:53:42.906228 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.906203 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-201.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:53:42.906307 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.906234 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-201.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:53:42.906307 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.906249 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-201.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:53:42.906307 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.906276 2561 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-201.ec2.internal"
Apr 16 19:53:42.916606 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.916591 2561 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-201.ec2.internal"
Apr 16 19:53:42.916669 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:42.916609 2561 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-201.ec2.internal\": node \"ip-10-0-128-201.ec2.internal\" not found"
Apr 16 19:53:42.928647 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:42.928629 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-201.ec2.internal\" not found"
Apr 16 19:53:42.935684 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.935666 2561 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-201.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-201.ec2.internal"]
Apr 16 19:53:42.935738 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.935718 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:53:42.936331 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.936319 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-201.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:53:42.936382 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.936342 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-201.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:53:42.936382 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.936352 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-201.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:53:42.937519 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.937508 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:53:42.937663 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.937648 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-201.ec2.internal"
Apr 16 19:53:42.937710 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.937675 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:53:42.938113 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.938101 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-201.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:53:42.938113 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.938108 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-201.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:53:42.938202 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.938122 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-201.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:53:42.938202 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.938128 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-201.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:53:42.938202 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.938136 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-201.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:53:42.938202 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.938143 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-201.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:53:42.939308 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.939290 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-201.ec2.internal"
Apr 16 19:53:42.939384 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.939322 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:53:42.939938 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.939921 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-201.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:53:42.940020 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.939945 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-201.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:53:42.940020 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.939954 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-201.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:53:42.945976 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.945961 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d9f1723847b6ccf58b5c375746506d34-config\") pod \"kube-apiserver-proxy-ip-10-0-128-201.ec2.internal\" (UID: \"d9f1723847b6ccf58b5c375746506d34\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-201.ec2.internal"
Apr 16 19:53:42.946036 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.945984 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f65a2c043ff418743b8e4d4f642d65c0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-201.ec2.internal\" (UID: \"f65a2c043ff418743b8e4d4f642d65c0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-201.ec2.internal"
Apr 16 19:53:42.946036 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:42.946002 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f65a2c043ff418743b8e4d4f642d65c0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-201.ec2.internal\" (UID: \"f65a2c043ff418743b8e4d4f642d65c0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-201.ec2.internal"
Apr 16 19:53:42.954297 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:42.954283 2561 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-201.ec2.internal\" not found" node="ip-10-0-128-201.ec2.internal"
Apr 16 19:53:42.957855 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:42.957842 2561 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-201.ec2.internal\" not found" node="ip-10-0-128-201.ec2.internal"
Apr 16 19:53:43.028775 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:43.028757 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-201.ec2.internal\" not found"
Apr 16 19:53:43.046563 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.046544 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d9f1723847b6ccf58b5c375746506d34-config\") pod \"kube-apiserver-proxy-ip-10-0-128-201.ec2.internal\" (UID: \"d9f1723847b6ccf58b5c375746506d34\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-201.ec2.internal"
Apr 16 19:53:43.046608 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.046568 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f65a2c043ff418743b8e4d4f642d65c0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-201.ec2.internal\" (UID: \"f65a2c043ff418743b8e4d4f642d65c0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-201.ec2.internal"
Apr 16 19:53:43.046608 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.046586 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f65a2c043ff418743b8e4d4f642d65c0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-201.ec2.internal\" (UID: \"f65a2c043ff418743b8e4d4f642d65c0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-201.ec2.internal"
Apr 16 19:53:43.046670 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.046614 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f65a2c043ff418743b8e4d4f642d65c0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-201.ec2.internal\" (UID: \"f65a2c043ff418743b8e4d4f642d65c0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-201.ec2.internal"
Apr 16 19:53:43.046670 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.046627 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d9f1723847b6ccf58b5c375746506d34-config\") pod \"kube-apiserver-proxy-ip-10-0-128-201.ec2.internal\" (UID: \"d9f1723847b6ccf58b5c375746506d34\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-201.ec2.internal"
Apr 16 19:53:43.046670 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.046643 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f65a2c043ff418743b8e4d4f642d65c0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-201.ec2.internal\" (UID: \"f65a2c043ff418743b8e4d4f642d65c0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-201.ec2.internal"
Apr 16 19:53:43.129689 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:43.129670 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-201.ec2.internal\" not found"
Apr 16 19:53:43.230393 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:43.230355 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-201.ec2.internal\" not found"
Apr 16 19:53:43.255844 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.255822 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-201.ec2.internal"
Apr 16 19:53:43.260342 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.260327 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-201.ec2.internal"
Apr 16 19:53:43.330618 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:43.330596 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-201.ec2.internal\" not found"
Apr 16 19:53:43.431146 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:43.431126 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-201.ec2.internal\" not found"
Apr 16 19:53:43.531608 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:43.531560 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-201.ec2.internal\" not found"
Apr 16 19:53:43.632254 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:43.632228 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-201.ec2.internal\" not found"
Apr 16 19:53:43.647640 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.647618 2561 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 19:53:43.647799 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.647773 2561 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 19:53:43.706604 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:43.706574 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9f1723847b6ccf58b5c375746506d34.slice/crio-93fd89677a4d705a5c4ffcd7620f00e07e1a61a92fa832fd1105251eecd3c379 WatchSource:0}: Error finding container 93fd89677a4d705a5c4ffcd7620f00e07e1a61a92fa832fd1105251eecd3c379: Status 404 returned error can't find the container with id 93fd89677a4d705a5c4ffcd7620f00e07e1a61a92fa832fd1105251eecd3c379
Apr 16 19:53:43.707117 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:43.707097 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf65a2c043ff418743b8e4d4f642d65c0.slice/crio-82008cd6920001b50f363b4018ed5c728bf015e7a8e3d5dce01e276b4cb2146a WatchSource:0}: Error finding container 82008cd6920001b50f363b4018ed5c728bf015e7a8e3d5dce01e276b4cb2146a: Status 404 returned error can't find the container with id 82008cd6920001b50f363b4018ed5c728bf015e7a8e3d5dce01e276b4cb2146a
Apr 16 19:53:43.711194 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.711179 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:53:43.732901 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:43.732882 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-201.ec2.internal\" not found"
Apr 16 19:53:43.734406 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.734388 2561 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:43.744285 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.744268 2561 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 19:53:43.744380 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.744366 2561 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-201.ec2.internal"
Apr 16 19:53:43.747499 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.747485 2561 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:43.753334 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.753320 2561 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 19:53:43.754164 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.754154 2561 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-201.ec2.internal"
Apr 16 19:53:43.758179 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.758163 2561 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 19:53:43.765688 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.765676 2561 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 19:53:43.767835 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.767815 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 19:48:42 +0000 UTC" deadline="2027-09-25 14:02:05.671309969 +0000 UTC"
Apr 16 19:53:43.767878 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.767835 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12642h8m21.903477336s"
Apr 16 19:53:43.781044 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.781026 2561 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-cxngt"
Apr 16 19:53:43.788621 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.788586 2561 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-cxngt"
Apr 16 19:53:43.837609 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.837568 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-201.ec2.internal" event={"ID":"d9f1723847b6ccf58b5c375746506d34","Type":"ContainerStarted","Data":"93fd89677a4d705a5c4ffcd7620f00e07e1a61a92fa832fd1105251eecd3c379"}
Apr 16 19:53:43.838418 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:43.838399 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-201.ec2.internal" event={"ID":"f65a2c043ff418743b8e4d4f642d65c0","Type":"ContainerStarted","Data":"82008cd6920001b50f363b4018ed5c728bf015e7a8e3d5dce01e276b4cb2146a"}
Apr 16 19:53:44.014755 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.014598 2561 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:44.722498 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.722466 2561 apiserver.go:52] "Watching apiserver"
Apr 16 19:53:44.731853 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.731829 2561 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 19:53:44.732184 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.732160 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-wm2x4","kube-system/kube-apiserver-proxy-ip-10-0-128-201.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f","openshift-cluster-node-tuning-operator/tuned-rlrsk","openshift-multus/multus-additional-cni-plugins-88wb2","openshift-multus/network-metrics-daemon-nx45q","openshift-network-diagnostics/network-check-target-qnt55","openshift-image-registry/node-ca-ddx6x","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-201.ec2.internal","openshift-multus/multus-2wctn","openshift-network-operator/iptables-alerter-84k6z","openshift-ovn-kubernetes/ovnkube-node-k4p2g"]
Apr 16 19:53:44.734031 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.734008 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wm2x4"
Apr 16 19:53:44.735061 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.735042 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f"
Apr 16 19:53:44.736095 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.736077 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rlrsk"
Apr 16 19:53:44.736685 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.736666 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 19:53:44.736768 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.736751 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-thpgj\""
Apr 16 19:53:44.736768 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.736758 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 19:53:44.737338 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.737319 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-88wb2"
Apr 16 19:53:44.737939 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.737914 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 19:53:44.738038 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.737919 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 19:53:44.738038 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.737950 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 19:53:44.738038 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.737971 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-pzhfx\""
Apr 16 19:53:44.738411 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.738394 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:53:44.738496 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.738418 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-94xj9\""
Apr 16 19:53:44.738556 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.738536 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nx45q"
Apr 16 19:53:44.738666 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:44.738642 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nx45q" podUID="e0b9420a-1c3e-47b5-b187-827cb7f39aea"
Apr 16 19:53:44.738963 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.738944 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 19:53:44.739766 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.739745 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qnt55"
Apr 16 19:53:44.739875 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:44.739850 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qnt55" podUID="58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f"
Apr 16 19:53:44.740071 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.740055 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 19:53:44.740148 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.740120 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 19:53:44.740255 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.740240 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-m46wp\""
Apr 16 19:53:44.740338 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.740321 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 19:53:44.740517 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.740497 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 19:53:44.740592 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.740540 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 19:53:44.741345 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.741325 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ddx6x"
Apr 16 19:53:44.743338 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.743321 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2wctn"
Apr 16 19:53:44.743831 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.743771 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 19:53:44.743929 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.743837 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 19:53:44.743929 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.743841 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 19:53:44.743929 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.743907 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-q6cfv\""
Apr 16 19:53:44.744601 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.744586 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-84k6z"
Apr 16 19:53:44.745880 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.745861 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 19:53:44.745981 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.745954 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-x25sx\""
Apr 16 19:53:44.746042 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.746024 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g"
Apr 16 19:53:44.746899 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.746883 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 19:53:44.746984 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.746949 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:53:44.747072 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.747054 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-8sc85\""
Apr 16 19:53:44.747311 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.747295 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 19:53:44.748303 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.748287 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 19:53:44.751947 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.749274 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 19:53:44.751947 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.749400 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 19:53:44.751947 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.749489 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hdjfc\""
Apr 16 19:53:44.751947 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.749727 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 19:53:44.751947 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.750149 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 19:53:44.751947 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.750469 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 19:53:44.754460 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.754431 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-sys\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk"
Apr 16 19:53:44.754548 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.754461 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/24e2ccb6-cefd-4e6b-baff-95b016092cf8-iptables-alerter-script\") pod \"iptables-alerter-84k6z\" (UID: \"24e2ccb6-cefd-4e6b-baff-95b016092cf8\") " pod="openshift-network-operator/iptables-alerter-84k6z"
Apr 16 19:53:44.754548 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.754478 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/344966ff-23f6-4f65-ae57-5f820201e8b8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rbn2f\" (UID: \"344966ff-23f6-4f65-ae57-5f820201e8b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f"
Apr 16 19:53:44.754548 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.754494 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/344966ff-23f6-4f65-ae57-5f820201e8b8-sys-fs\") pod \"aws-ebs-csi-driver-node-rbn2f\" (UID: \"344966ff-23f6-4f65-ae57-5f820201e8b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f"
Apr 16 19:53:44.754548 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.754520 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-run-ovn\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g"
Apr 16 19:53:44.754694 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.754559 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-var-lib-kubelet\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk"
Apr 16 19:53:44.754694 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.754600 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-host\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk"
Apr 16 19:53:44.754694 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.754644 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6f07541a-6ad1-43d0-9a04-540a16f67cec-cni-binary-copy\") pod \"multus-additional-cni-plugins-88wb2\" (UID: \"6f07541a-6ad1-43d0-9a04-540a16f67cec\") " pod="openshift-multus/multus-additional-cni-plugins-88wb2"
Apr 16 19:53:44.754694 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.754676 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-os-release\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn"
Apr 16 19:53:44.754840 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.754703 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6f07541a-6ad1-43d0-9a04-540a16f67cec-tuning-conf-dir\") pod \"multus-additional-cni-plugins-88wb2\" (UID: \"6f07541a-6ad1-43d0-9a04-540a16f67cec\") " pod="openshift-multus/multus-additional-cni-plugins-88wb2"
Apr 16 19:53:44.754840 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.754739 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/54ec5af6-8d3b-4667-ae54-83fef36ee26c-konnectivity-ca\") pod \"konnectivity-agent-wm2x4\" (UID: \"54ec5af6-8d3b-4667-ae54-83fef36ee26c\") " pod="kube-system/konnectivity-agent-wm2x4"
Apr 16 19:53:44.754840 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.754767 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-cni-binary-copy\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn"
Apr 16 19:53:44.754840 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.754803 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-etc-sysctl-conf\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk"
Apr 16 19:53:44.754996 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.754845 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6f07541a-6ad1-43d0-9a04-540a16f67cec-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-88wb2\" (UID: \"6f07541a-6ad1-43d0-9a04-540a16f67cec\") " pod="openshift-multus/multus-additional-cni-plugins-88wb2"
Apr 16 19:53:44.754996 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.754873 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/54ec5af6-8d3b-4667-ae54-83fef36ee26c-agent-certs\") pod \"konnectivity-agent-wm2x4\" (UID: \"54ec5af6-8d3b-4667-ae54-83fef36ee26c\") " pod="kube-system/konnectivity-agent-wm2x4"
Apr 16 19:53:44.754996 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.754897 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-system-cni-dir\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn"
Apr 16 19:53:44.754996 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.754942 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-multus-cni-dir\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn"
Apr 16 19:53:44.754996 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.754993 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgdd5\" (UniqueName: \"kubernetes.io/projected/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-kube-api-access-kgdd5\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn"
Apr 16 19:53:44.755174 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755039 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-run-systemd\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g"
Apr 16 19:53:44.755174 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755068 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl5d9\" (UniqueName: \"kubernetes.io/projected/24e2ccb6-cefd-4e6b-baff-95b016092cf8-kube-api-access-cl5d9\") pod \"iptables-alerter-84k6z\" (UID: \"24e2ccb6-cefd-4e6b-baff-95b016092cf8\") " pod="openshift-network-operator/iptables-alerter-84k6z"
Apr 16 19:53:44.755174 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755095 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-etc-sysconfig\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk"
Apr 16 19:53:44.755174 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755124 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-etc-kubernetes\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn"
Apr 16 19:53:44.755174 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755154 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-host-cni-bin\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g"
Apr 16 19:53:44.755347 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755177 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkskm\" (UniqueName: \"kubernetes.io/projected/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-kube-api-access-hkskm\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g"
Apr 16 19:53:44.755347 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755202 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6f07541a-6ad1-43d0-9a04-540a16f67cec-cnibin\") pod \"multus-additional-cni-plugins-88wb2\" (UID: \"6f07541a-6ad1-43d0-9a04-540a16f67cec\") " pod="openshift-multus/multus-additional-cni-plugins-88wb2"
Apr 16 19:53:44.755347 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755227 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlrbv\" (UniqueName: \"kubernetes.io/projected/6f07541a-6ad1-43d0-9a04-540a16f67cec-kube-api-access-hlrbv\") pod \"multus-additional-cni-plugins-88wb2\" (UID: \"6f07541a-6ad1-43d0-9a04-540a16f67cec\") " pod="openshift-multus/multus-additional-cni-plugins-88wb2"
Apr 16 19:53:44.755347 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755250 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-multus-daemon-config\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn"
Apr 16 19:53:44.755347 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755271 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-etc-openvswitch\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g"
Apr 16 19:53:44.755347 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755291 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-lib-modules\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk"
Apr 16 19:53:44.755347 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755312 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vhq5\" (UniqueName: \"kubernetes.io/projected/09d46eec-98b3-409a-adf0-e27e7e7fa496-kube-api-access-6vhq5\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk"
Apr 16 19:53:44.755347 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755334 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c3753347-dfcc-47be-a251-65c3470b8045-serviceca\") pod \"node-ca-ddx6x\" (UID: \"c3753347-dfcc-47be-a251-65c3470b8045\") " pod="openshift-image-registry/node-ca-ddx6x"
Apr 16 19:53:44.755619 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755368 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/344966ff-23f6-4f65-ae57-5f820201e8b8-socket-dir\") pod \"aws-ebs-csi-driver-node-rbn2f\" (UID: \"344966ff-23f6-4f65-ae57-5f820201e8b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f"
Apr 16 19:53:44.755619 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755394 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-ovn-node-metrics-cert\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g"
Apr 16 19:53:44.755619 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755417 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/09d46eec-98b3-409a-adf0-e27e7e7fa496-etc-tuned\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk"
Apr 16 19:53:44.755619 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755439 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/09d46eec-98b3-409a-adf0-e27e7e7fa496-tmp\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk"
Apr 16 19:53:44.755619 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755478 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6f07541a-6ad1-43d0-9a04-540a16f67cec-os-release\") pod \"multus-additional-cni-plugins-88wb2\" (UID: \"6f07541a-6ad1-43d0-9a04-540a16f67cec\") " pod="openshift-multus/multus-additional-cni-plugins-88wb2"
Apr 16 19:53:44.755619 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755509 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/344966ff-23f6-4f65-ae57-5f820201e8b8-etc-selinux\") pod \"aws-ebs-csi-driver-node-rbn2f\" (UID: \"344966ff-23f6-4f65-ae57-5f820201e8b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f"
Apr 16 19:53:44.755619 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755530 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-cnibin\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn"
Apr 16 19:53:44.755619 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755543 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-host-cni-netd\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g"
Apr 16 19:53:44.755619 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755556 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-etc-sysctl-d\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk"
Apr 16 19:53:44.755619 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755573 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-host-var-lib-cni-bin\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn"
Apr 16 19:53:44.755619 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755607 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-host-var-lib-kubelet\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn"
Apr 16 19:53:44.756045 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755636 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-host-slash\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g"
Apr 16 19:53:44.756045 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755668 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-multus-socket-dir-parent\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn"
Apr 16 19:53:44.756045 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755693 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-host-run-netns\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn"
Apr 16 19:53:44.756045 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755731 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-hostroot\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn"
Apr 16 19:53:44.756045 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755752 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-run-openvswitch\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g"
Apr 16 19:53:44.756045 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755767 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-etc-kubernetes\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk"
Apr 16 19:53:44.756045 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755781 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-ovnkube-script-lib\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g"
Apr 16 19:53:44.756045 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755818 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-host-run-netns\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g"
Apr 16 19:53:44.756045 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755837 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g"
Apr 16 19:53:44.756045 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755852 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-env-overrides\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g"
Apr 16 19:53:44.756045 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755866 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-etc-systemd\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk"
Apr 16 19:53:44.756045 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755883 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-multus-conf-dir\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn"
Apr 16 19:53:44.756045 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755908 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-var-lib-openvswitch\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g"
Apr 16 19:53:44.756045 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755928 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-log-socket\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g"
Apr 16 19:53:44.756045 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755949 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f07541a-6ad1-43d0-9a04-540a16f67cec-system-cni-dir\") pod \"multus-additional-cni-plugins-88wb2\" (UID: \"6f07541a-6ad1-43d0-9a04-540a16f67cec\") " pod="openshift-multus/multus-additional-cni-plugins-88wb2"
Apr 16 19:53:44.756045 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.755991 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/24e2ccb6-cefd-4e6b-baff-95b016092cf8-host-slash\") pod \"iptables-alerter-84k6z\" (UID: \"24e2ccb6-cefd-4e6b-baff-95b016092cf8\") " pod="openshift-network-operator/iptables-alerter-84k6z"
Apr 16 19:53:44.756572 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.756024 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz95h\" (UniqueName: \"kubernetes.io/projected/344966ff-23f6-4f65-ae57-5f820201e8b8-kube-api-access-lz95h\") pod \"aws-ebs-csi-driver-node-rbn2f\" (UID: \"344966ff-23f6-4f65-ae57-5f820201e8b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f"
Apr 16 19:53:44.756572 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.756044 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-host-run-multus-certs\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn"
Apr 16 19:53:44.756572 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.756058 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-systemd-units\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g"
Apr 16 19:53:44.756572 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.756076 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-etc-modprobe-d\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk"
Apr 16 19:53:44.756572 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.756100 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c3753347-dfcc-47be-a251-65c3470b8045-host\") pod \"node-ca-ddx6x\" (UID: \"c3753347-dfcc-47be-a251-65c3470b8045\") " pod="openshift-image-registry/node-ca-ddx6x"
Apr 16 19:53:44.756572 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.756128 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/344966ff-23f6-4f65-ae57-5f820201e8b8-registration-dir\") pod \"aws-ebs-csi-driver-node-rbn2f\" (UID: \"344966ff-23f6-4f65-ae57-5f820201e8b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f"
Apr 16 19:53:44.756572 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.756141 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/344966ff-23f6-4f65-ae57-5f820201e8b8-device-dir\") pod \"aws-ebs-csi-driver-node-rbn2f\" (UID: \"344966ff-23f6-4f65-ae57-5f820201e8b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f"
Apr 16 19:53:44.756572 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.756162 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcbfx\" (UniqueName: \"kubernetes.io/projected/58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f-kube-api-access-kcbfx\") pod \"network-check-target-qnt55\" (UID: \"58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f\") " pod="openshift-network-diagnostics/network-check-target-qnt55"
Apr 16 19:53:44.756572 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.756179 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-host-run-k8s-cni-cncf-io\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn"
Apr 16 19:53:44.756572 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.756194 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-host-var-lib-cni-multus\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn"
Apr 16 19:53:44.756572 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.756241 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-host-kubelet\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g"
Apr 16 19:53:44.756572 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.756283 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6f07541a-6ad1-43d0-9a04-540a16f67cec-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-88wb2\" (UID: \"6f07541a-6ad1-43d0-9a04-540a16f67cec\") " pod="openshift-multus/multus-additional-cni-plugins-88wb2"
Apr 16 19:53:44.756572 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.756314 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j54xv\" (UniqueName: \"kubernetes.io/projected/c3753347-dfcc-47be-a251-65c3470b8045-kube-api-access-j54xv\") pod \"node-ca-ddx6x\" (UID: \"c3753347-dfcc-47be-a251-65c3470b8045\") " pod="openshift-image-registry/node-ca-ddx6x"
Apr 16 19:53:44.756572 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.756342 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0b9420a-1c3e-47b5-b187-827cb7f39aea-metrics-certs\") pod \"network-metrics-daemon-nx45q\" (UID: \"e0b9420a-1c3e-47b5-b187-827cb7f39aea\") " pod="openshift-multus/network-metrics-daemon-nx45q"
Apr 16 19:53:44.756572 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.756393 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxkwf\" (UniqueName: \"kubernetes.io/projected/e0b9420a-1c3e-47b5-b187-827cb7f39aea-kube-api-access-wxkwf\") pod \"network-metrics-daemon-nx45q\" (UID: \"e0b9420a-1c3e-47b5-b187-827cb7f39aea\") " pod="openshift-multus/network-metrics-daemon-nx45q"
Apr 16 19:53:44.756572 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.756423 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-node-log\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g"
Apr 16 19:53:44.757150 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.756451 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-host-run-ovn-kubernetes\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g"
Apr 16 19:53:44.757150 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.756475 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-ovnkube-config\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g"
Apr 16 19:53:44.757150 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.756497 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-run\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk"
Apr 16 19:53:44.789414 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.789390 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving"
expiration="2028-04-15 19:48:43 +0000 UTC" deadline="2028-01-03 06:24:44.637416124 +0000 UTC" Apr 16 19:53:44.789414 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.789413 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15034h30m59.848005587s" Apr 16 19:53:44.846603 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.846581 2561 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 19:53:44.856864 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.856840 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-multus-daemon-config\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.856970 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.856874 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-etc-openvswitch\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.856970 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.856895 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-lib-modules\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.856970 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.856918 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vhq5\" (UniqueName: \"kubernetes.io/projected/09d46eec-98b3-409a-adf0-e27e7e7fa496-kube-api-access-6vhq5\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.857106 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.856992 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-etc-openvswitch\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.857106 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857041 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-lib-modules\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.857106 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857092 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c3753347-dfcc-47be-a251-65c3470b8045-serviceca\") pod \"node-ca-ddx6x\" (UID: \"c3753347-dfcc-47be-a251-65c3470b8045\") " pod="openshift-image-registry/node-ca-ddx6x" Apr 16 19:53:44.857242 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857125 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/344966ff-23f6-4f65-ae57-5f820201e8b8-socket-dir\") pod 
\"aws-ebs-csi-driver-node-rbn2f\" (UID: \"344966ff-23f6-4f65-ae57-5f820201e8b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f" Apr 16 19:53:44.857242 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857149 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-ovn-node-metrics-cert\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.857242 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857192 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/09d46eec-98b3-409a-adf0-e27e7e7fa496-etc-tuned\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.857242 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857214 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/09d46eec-98b3-409a-adf0-e27e7e7fa496-tmp\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.857381 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857247 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6f07541a-6ad1-43d0-9a04-540a16f67cec-os-release\") pod \"multus-additional-cni-plugins-88wb2\" (UID: \"6f07541a-6ad1-43d0-9a04-540a16f67cec\") " pod="openshift-multus/multus-additional-cni-plugins-88wb2" Apr 16 19:53:44.857381 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857263 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/344966ff-23f6-4f65-ae57-5f820201e8b8-socket-dir\") pod \"aws-ebs-csi-driver-node-rbn2f\" (UID: \"344966ff-23f6-4f65-ae57-5f820201e8b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f" Apr 16 19:53:44.857381 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857273 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/344966ff-23f6-4f65-ae57-5f820201e8b8-etc-selinux\") pod \"aws-ebs-csi-driver-node-rbn2f\" (UID: \"344966ff-23f6-4f65-ae57-5f820201e8b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f" Apr 16 19:53:44.857381 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857296 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-cnibin\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.857381 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857319 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-host-cni-netd\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.857381 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857344 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" 
(UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-etc-sysctl-d\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.857381 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857366 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-host-var-lib-cni-bin\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.857594 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857389 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-host-var-lib-kubelet\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.857594 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857411 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-host-slash\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.857594 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857436 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-multus-socket-dir-parent\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.857594 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857459 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-host-run-netns\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.857594 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857467 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c3753347-dfcc-47be-a251-65c3470b8045-serviceca\") pod \"node-ca-ddx6x\" (UID: \"c3753347-dfcc-47be-a251-65c3470b8045\") " pod="openshift-image-registry/node-ca-ddx6x" Apr 16 19:53:44.857594 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857482 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-hostroot\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.857594 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857509 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-run-openvswitch\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.857594 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857530 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-multus-daemon-config\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.857594 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857540 2561 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 19:53:44.857594 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857571 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/344966ff-23f6-4f65-ae57-5f820201e8b8-etc-selinux\") pod \"aws-ebs-csi-driver-node-rbn2f\" (UID: \"344966ff-23f6-4f65-ae57-5f820201e8b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f" Apr 16 19:53:44.857594 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857586 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-host-var-lib-cni-bin\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.857992 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857602 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-multus-socket-dir-parent\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.857992 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857609 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-etc-sysctl-d\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.857992 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857624 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-host-var-lib-kubelet\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.857992 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857644 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-etc-kubernetes\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.857992 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857653 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-host-cni-netd\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.857992 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857668 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-cnibin\") pod \"multus-2wctn\" (UID: 
\"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.857992 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857678 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-host-run-netns\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.857992 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857536 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-etc-kubernetes\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.857992 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857714 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-ovnkube-script-lib\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.857992 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857714 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-run-openvswitch\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.857992 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857739 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-host-run-netns\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.857992 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857768 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-host-run-netns\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.857992 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857771 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.857992 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857826 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.857992 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857684 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-hostroot\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.857992 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857849 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-env-overrides\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.857992 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857875 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-etc-systemd\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.857992 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857898 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-multus-conf-dir\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.858860 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857917 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6f07541a-6ad1-43d0-9a04-540a16f67cec-os-release\") pod \"multus-additional-cni-plugins-88wb2\" (UID: \"6f07541a-6ad1-43d0-9a04-540a16f67cec\") " pod="openshift-multus/multus-additional-cni-plugins-88wb2" Apr 16 19:53:44.858860 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857876 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-host-slash\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.858860 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857941 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-var-lib-openvswitch\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.858860 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857965 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-log-socket\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.858860 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857979 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-etc-systemd\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.858860 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.857988 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/6f07541a-6ad1-43d0-9a04-540a16f67cec-system-cni-dir\") pod \"multus-additional-cni-plugins-88wb2\" (UID: \"6f07541a-6ad1-43d0-9a04-540a16f67cec\") " pod="openshift-multus/multus-additional-cni-plugins-88wb2" Apr 16 19:53:44.858860 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858010 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/24e2ccb6-cefd-4e6b-baff-95b016092cf8-host-slash\") pod \"iptables-alerter-84k6z\" (UID: \"24e2ccb6-cefd-4e6b-baff-95b016092cf8\") " pod="openshift-network-operator/iptables-alerter-84k6z" Apr 16 19:53:44.858860 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858034 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lz95h\" (UniqueName: \"kubernetes.io/projected/344966ff-23f6-4f65-ae57-5f820201e8b8-kube-api-access-lz95h\") pod \"aws-ebs-csi-driver-node-rbn2f\" (UID: \"344966ff-23f6-4f65-ae57-5f820201e8b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f" Apr 16 19:53:44.858860 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858077 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-host-run-multus-certs\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.858860 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858098 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-systemd-units\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.858860 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858127 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-etc-modprobe-d\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.858860 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858153 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c3753347-dfcc-47be-a251-65c3470b8045-host\") pod \"node-ca-ddx6x\" (UID: \"c3753347-dfcc-47be-a251-65c3470b8045\") " pod="openshift-image-registry/node-ca-ddx6x" Apr 16 19:53:44.858860 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858179 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/344966ff-23f6-4f65-ae57-5f820201e8b8-registration-dir\") pod \"aws-ebs-csi-driver-node-rbn2f\" (UID: \"344966ff-23f6-4f65-ae57-5f820201e8b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f" Apr 16 19:53:44.858860 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858189 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f07541a-6ad1-43d0-9a04-540a16f67cec-system-cni-dir\") pod \"multus-additional-cni-plugins-88wb2\" (UID: \"6f07541a-6ad1-43d0-9a04-540a16f67cec\") " pod="openshift-multus/multus-additional-cni-plugins-88wb2" Apr 16 19:53:44.858860 
ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858204 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/344966ff-23f6-4f65-ae57-5f820201e8b8-device-dir\") pod \"aws-ebs-csi-driver-node-rbn2f\" (UID: \"344966ff-23f6-4f65-ae57-5f820201e8b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f" Apr 16 19:53:44.858860 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858232 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcbfx\" (UniqueName: \"kubernetes.io/projected/58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f-kube-api-access-kcbfx\") pod \"network-check-target-qnt55\" (UID: \"58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f\") " pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:53:44.858860 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858235 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-multus-conf-dir\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.859696 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858257 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-host-run-k8s-cni-cncf-io\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.859696 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858270 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-env-overrides\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.859696 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858275 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-log-socket\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.859696 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858280 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-host-var-lib-cni-multus\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.859696 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858282 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-var-lib-openvswitch\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.859696 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858317 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-host-kubelet\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.859696 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858324 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/24e2ccb6-cefd-4e6b-baff-95b016092cf8-host-slash\") pod \"iptables-alerter-84k6z\" (UID: \"24e2ccb6-cefd-4e6b-baff-95b016092cf8\") " pod="openshift-network-operator/iptables-alerter-84k6z" Apr 16 19:53:44.859696 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858335 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c3753347-dfcc-47be-a251-65c3470b8045-host\") pod \"node-ca-ddx6x\" (UID: \"c3753347-dfcc-47be-a251-65c3470b8045\") " pod="openshift-image-registry/node-ca-ddx6x" Apr 16 19:53:44.859696 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858352 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/344966ff-23f6-4f65-ae57-5f820201e8b8-registration-dir\") pod \"aws-ebs-csi-driver-node-rbn2f\" (UID: \"344966ff-23f6-4f65-ae57-5f820201e8b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f" Apr 16 19:53:44.859696 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858358 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-host-kubelet\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.859696 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858356 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6f07541a-6ad1-43d0-9a04-540a16f67cec-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-88wb2\" (UID: \"6f07541a-6ad1-43d0-9a04-540a16f67cec\") " pod="openshift-multus/multus-additional-cni-plugins-88wb2" Apr 16 19:53:44.859696 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858390 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-host-var-lib-cni-multus\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.859696 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858398 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j54xv\" (UniqueName: \"kubernetes.io/projected/c3753347-dfcc-47be-a251-65c3470b8045-kube-api-access-j54xv\") pod \"node-ca-ddx6x\" (UID: \"c3753347-dfcc-47be-a251-65c3470b8045\") " pod="openshift-image-registry/node-ca-ddx6x" Apr 16 19:53:44.859696 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858407 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/344966ff-23f6-4f65-ae57-5f820201e8b8-device-dir\") pod \"aws-ebs-csi-driver-node-rbn2f\" (UID: \"344966ff-23f6-4f65-ae57-5f820201e8b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f" Apr 16 19:53:44.859696 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858425 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/e0b9420a-1c3e-47b5-b187-827cb7f39aea-metrics-certs\") pod \"network-metrics-daemon-nx45q\" (UID: \"e0b9420a-1c3e-47b5-b187-827cb7f39aea\") " pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:53:44.859696 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858451 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxkwf\" (UniqueName: \"kubernetes.io/projected/e0b9420a-1c3e-47b5-b187-827cb7f39aea-kube-api-access-wxkwf\") pod \"network-metrics-daemon-nx45q\" (UID: \"e0b9420a-1c3e-47b5-b187-827cb7f39aea\") " pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:53:44.859696 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858463 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-host-run-k8s-cni-cncf-io\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.860520 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858474 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-node-log\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.860520 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858499 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-systemd-units\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.860520 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858498 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-host-run-ovn-kubernetes\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.860520 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858527 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-ovnkube-config\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.860520 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858554 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-run\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.860520 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858587 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-sys\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.860520 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858613 2561 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/24e2ccb6-cefd-4e6b-baff-95b016092cf8-iptables-alerter-script\") pod \"iptables-alerter-84k6z\" (UID: \"24e2ccb6-cefd-4e6b-baff-95b016092cf8\") " pod="openshift-network-operator/iptables-alerter-84k6z" Apr 16 19:53:44.860520 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858639 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/344966ff-23f6-4f65-ae57-5f820201e8b8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rbn2f\" (UID: \"344966ff-23f6-4f65-ae57-5f820201e8b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f" Apr 16 19:53:44.860520 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858665 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/344966ff-23f6-4f65-ae57-5f820201e8b8-sys-fs\") pod \"aws-ebs-csi-driver-node-rbn2f\" (UID: \"344966ff-23f6-4f65-ae57-5f820201e8b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f" Apr 16 19:53:44.860520 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858691 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-run-ovn\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.860520 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858713 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-var-lib-kubelet\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.860520 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858737 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-host\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.860520 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858777 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6f07541a-6ad1-43d0-9a04-540a16f67cec-cni-binary-copy\") pod \"multus-additional-cni-plugins-88wb2\" (UID: \"6f07541a-6ad1-43d0-9a04-540a16f67cec\") " pod="openshift-multus/multus-additional-cni-plugins-88wb2" Apr 16 19:53:44.860520 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858820 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-os-release\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.860520 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858860 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-node-log\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.860520 ip-10-0-128-201 kubenswrapper[2561]: 
I0416 19:53:44.858885 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-os-release\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.860520 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858899 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6f07541a-6ad1-43d0-9a04-540a16f67cec-tuning-conf-dir\") pod \"multus-additional-cni-plugins-88wb2\" (UID: \"6f07541a-6ad1-43d0-9a04-540a16f67cec\") " pod="openshift-multus/multus-additional-cni-plugins-88wb2" Apr 16 19:53:44.861239 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858930 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/54ec5af6-8d3b-4667-ae54-83fef36ee26c-konnectivity-ca\") pod \"konnectivity-agent-wm2x4\" (UID: \"54ec5af6-8d3b-4667-ae54-83fef36ee26c\") " pod="kube-system/konnectivity-agent-wm2x4" Apr 16 19:53:44.861239 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858936 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-host-run-multus-certs\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.861239 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858946 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-ovnkube-script-lib\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.861239 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858960 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6f07541a-6ad1-43d0-9a04-540a16f67cec-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-88wb2\" (UID: \"6f07541a-6ad1-43d0-9a04-540a16f67cec\") " pod="openshift-multus/multus-additional-cni-plugins-88wb2" Apr 16 19:53:44.861239 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858530 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-host-run-ovn-kubernetes\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.861239 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:44.858613 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:44.861239 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859018 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-run\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.861239 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:44.859040 2561 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e0b9420a-1c3e-47b5-b187-827cb7f39aea-metrics-certs podName:e0b9420a-1c3e-47b5-b187-827cb7f39aea nodeName:}" failed. No retries permitted until 2026-04-16 19:53:45.35901924 +0000 UTC m=+3.082065144 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0b9420a-1c3e-47b5-b187-827cb7f39aea-metrics-certs") pod "network-metrics-daemon-nx45q" (UID: "e0b9420a-1c3e-47b5-b187-827cb7f39aea") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:44.861239 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859046 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6f07541a-6ad1-43d0-9a04-540a16f67cec-tuning-conf-dir\") pod \"multus-additional-cni-plugins-88wb2\" (UID: \"6f07541a-6ad1-43d0-9a04-540a16f67cec\") " pod="openshift-multus/multus-additional-cni-plugins-88wb2" Apr 16 19:53:44.861239 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.858963 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-cni-binary-copy\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.861239 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859082 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-sys\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.861239 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859089 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-etc-sysctl-conf\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.861239 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859115 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6f07541a-6ad1-43d0-9a04-540a16f67cec-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-88wb2\" (UID: \"6f07541a-6ad1-43d0-9a04-540a16f67cec\") " pod="openshift-multus/multus-additional-cni-plugins-88wb2" Apr 16 19:53:44.861239 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859138 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/54ec5af6-8d3b-4667-ae54-83fef36ee26c-agent-certs\") pod \"konnectivity-agent-wm2x4\" (UID: \"54ec5af6-8d3b-4667-ae54-83fef36ee26c\") " pod="kube-system/konnectivity-agent-wm2x4" Apr 16 19:53:44.861239 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859182 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-system-cni-dir\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.861239 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859198 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-etc-modprobe-d\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.861239 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859206 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-var-lib-kubelet\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.861919 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859208 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-multus-cni-dir\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.861919 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859234 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgdd5\" (UniqueName: \"kubernetes.io/projected/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-kube-api-access-kgdd5\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.861919 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859258 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/344966ff-23f6-4f65-ae57-5f820201e8b8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rbn2f\" (UID: \"344966ff-23f6-4f65-ae57-5f820201e8b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f" Apr 16 19:53:44.861919 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859260 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-run-systemd\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.861919 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859293 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cl5d9\" (UniqueName: \"kubernetes.io/projected/24e2ccb6-cefd-4e6b-baff-95b016092cf8-kube-api-access-cl5d9\") pod \"iptables-alerter-84k6z\" (UID: \"24e2ccb6-cefd-4e6b-baff-95b016092cf8\") " pod="openshift-network-operator/iptables-alerter-84k6z" Apr 16 19:53:44.861919 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859300 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-run-systemd\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.861919 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859320 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-etc-sysconfig\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.861919 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859344 2561 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-etc-kubernetes\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.861919 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859344 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-cni-binary-copy\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.861919 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859370 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-host-cni-bin\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.861919 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859394 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkskm\" (UniqueName: \"kubernetes.io/projected/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-kube-api-access-hkskm\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.861919 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859420 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6f07541a-6ad1-43d0-9a04-540a16f67cec-cnibin\") pod \"multus-additional-cni-plugins-88wb2\" (UID: \"6f07541a-6ad1-43d0-9a04-540a16f67cec\") " pod="openshift-multus/multus-additional-cni-plugins-88wb2" Apr 16 19:53:44.861919 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859446 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlrbv\" (UniqueName: \"kubernetes.io/projected/6f07541a-6ad1-43d0-9a04-540a16f67cec-kube-api-access-hlrbv\") pod \"multus-additional-cni-plugins-88wb2\" (UID: \"6f07541a-6ad1-43d0-9a04-540a16f67cec\") " pod="openshift-multus/multus-additional-cni-plugins-88wb2" Apr 16 19:53:44.861919 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859627 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-host\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.861919 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859669 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-run-ovn\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.861919 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859630 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/344966ff-23f6-4f65-ae57-5f820201e8b8-sys-fs\") pod \"aws-ebs-csi-driver-node-rbn2f\" (UID: \"344966ff-23f6-4f65-ae57-5f820201e8b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f" Apr 16 19:53:44.861919 ip-10-0-128-201 kubenswrapper[2561]: I0416 
19:53:44.859756 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/24e2ccb6-cefd-4e6b-baff-95b016092cf8-iptables-alerter-script\") pod \"iptables-alerter-84k6z\" (UID: \"24e2ccb6-cefd-4e6b-baff-95b016092cf8\") " pod="openshift-network-operator/iptables-alerter-84k6z" Apr 16 19:53:44.862563 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859836 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-system-cni-dir\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.862563 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859845 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6f07541a-6ad1-43d0-9a04-540a16f67cec-cnibin\") pod \"multus-additional-cni-plugins-88wb2\" (UID: \"6f07541a-6ad1-43d0-9a04-540a16f67cec\") " pod="openshift-multus/multus-additional-cni-plugins-88wb2" Apr 16 19:53:44.862563 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859885 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-host-cni-bin\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.862563 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859927 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-etc-sysconfig\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.862563 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.859977 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-multus-cni-dir\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.862563 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.860008 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-etc-kubernetes\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.862563 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.860033 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/09d46eec-98b3-409a-adf0-e27e7e7fa496-etc-sysctl-conf\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.862563 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.860242 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6f07541a-6ad1-43d0-9a04-540a16f67cec-cni-binary-copy\") pod \"multus-additional-cni-plugins-88wb2\" (UID: \"6f07541a-6ad1-43d0-9a04-540a16f67cec\") " pod="openshift-multus/multus-additional-cni-plugins-88wb2" Apr 16 19:53:44.862563 ip-10-0-128-201 kubenswrapper[2561]: 
I0416 19:53:44.860393 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/54ec5af6-8d3b-4667-ae54-83fef36ee26c-konnectivity-ca\") pod \"konnectivity-agent-wm2x4\" (UID: \"54ec5af6-8d3b-4667-ae54-83fef36ee26c\") " pod="kube-system/konnectivity-agent-wm2x4" Apr 16 19:53:44.862563 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.860531 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6f07541a-6ad1-43d0-9a04-540a16f67cec-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-88wb2\" (UID: \"6f07541a-6ad1-43d0-9a04-540a16f67cec\") " pod="openshift-multus/multus-additional-cni-plugins-88wb2" Apr 16 19:53:44.862563 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.861105 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-ovnkube-config\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.862563 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.861156 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-ovn-node-metrics-cert\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.862563 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.862195 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/09d46eec-98b3-409a-adf0-e27e7e7fa496-etc-tuned\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.863064 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.862639 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/09d46eec-98b3-409a-adf0-e27e7e7fa496-tmp\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.863171 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.863150 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/54ec5af6-8d3b-4667-ae54-83fef36ee26c-agent-certs\") pod \"konnectivity-agent-wm2x4\" (UID: \"54ec5af6-8d3b-4667-ae54-83fef36ee26c\") " pod="kube-system/konnectivity-agent-wm2x4" Apr 16 19:53:44.867116 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.867050 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vhq5\" (UniqueName: \"kubernetes.io/projected/09d46eec-98b3-409a-adf0-e27e7e7fa496-kube-api-access-6vhq5\") pod \"tuned-rlrsk\" (UID: \"09d46eec-98b3-409a-adf0-e27e7e7fa496\") " pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:44.871495 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:44.871417 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:53:44.871495 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:44.871440 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:53:44.871495 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:44.871449 2561 projected.go:194] Error preparing data for projected volume kube-api-access-kcbfx for pod openshift-network-diagnostics/network-check-target-qnt55: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:44.871692 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:44.871503 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f-kube-api-access-kcbfx podName:58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f nodeName:}" failed. No retries permitted until 2026-04-16 19:53:45.371485548 +0000 UTC m=+3.094531435 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-kcbfx" (UniqueName: "kubernetes.io/projected/58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f-kube-api-access-kcbfx") pod "network-check-target-qnt55" (UID: "58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:44.873695 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.873672 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlrbv\" (UniqueName: \"kubernetes.io/projected/6f07541a-6ad1-43d0-9a04-540a16f67cec-kube-api-access-hlrbv\") pod \"multus-additional-cni-plugins-88wb2\" (UID: \"6f07541a-6ad1-43d0-9a04-540a16f67cec\") " pod="openshift-multus/multus-additional-cni-plugins-88wb2" Apr 16 19:53:44.874779 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.874378 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkskm\" (UniqueName: \"kubernetes.io/projected/356a3ae0-1448-42b5-a8eb-eb35ac7b6f96-kube-api-access-hkskm\") pod \"ovnkube-node-k4p2g\" (UID: \"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96\") " pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:44.874779 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.874714 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz95h\" (UniqueName: \"kubernetes.io/projected/344966ff-23f6-4f65-ae57-5f820201e8b8-kube-api-access-lz95h\") pod \"aws-ebs-csi-driver-node-rbn2f\" (UID: \"344966ff-23f6-4f65-ae57-5f820201e8b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f" Apr 16 19:53:44.875449 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.875426 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j54xv\" (UniqueName: \"kubernetes.io/projected/c3753347-dfcc-47be-a251-65c3470b8045-kube-api-access-j54xv\") pod \"node-ca-ddx6x\" (UID: \"c3753347-dfcc-47be-a251-65c3470b8045\") " pod="openshift-image-registry/node-ca-ddx6x" Apr 16 19:53:44.875558 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.875539 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgdd5\" (UniqueName: \"kubernetes.io/projected/8012fac3-113c-4958-9ea3-7cdbc5e5c6e9-kube-api-access-kgdd5\") pod \"multus-2wctn\" (UID: \"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9\") " pod="openshift-multus/multus-2wctn" Apr 16 19:53:44.875632 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.875614 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl5d9\" (UniqueName: 
\"kubernetes.io/projected/24e2ccb6-cefd-4e6b-baff-95b016092cf8-kube-api-access-cl5d9\") pod \"iptables-alerter-84k6z\" (UID: \"24e2ccb6-cefd-4e6b-baff-95b016092cf8\") " pod="openshift-network-operator/iptables-alerter-84k6z" Apr 16 19:53:44.876277 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:44.876258 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxkwf\" (UniqueName: \"kubernetes.io/projected/e0b9420a-1c3e-47b5-b187-827cb7f39aea-kube-api-access-wxkwf\") pod \"network-metrics-daemon-nx45q\" (UID: \"e0b9420a-1c3e-47b5-b187-827cb7f39aea\") " pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:53:45.046654 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:45.046549 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wm2x4" Apr 16 19:53:45.053439 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:45.053418 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f" Apr 16 19:53:45.061102 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:45.061083 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" Apr 16 19:53:45.065638 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:45.065619 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-88wb2" Apr 16 19:53:45.073167 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:45.073151 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ddx6x" Apr 16 19:53:45.080698 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:45.080680 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2wctn" Apr 16 19:53:45.087864 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:45.087845 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-84k6z" Apr 16 19:53:45.087957 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:45.087900 2561 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:53:45.093351 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:45.093334 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:53:45.310485 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:45.310405 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f07541a_6ad1_43d0_9a04_540a16f67cec.slice/crio-3b50db2e47ce9756716b6f9d8b0cfdf3beae066aa305853051b76b33a875c6ff WatchSource:0}: Error finding container 3b50db2e47ce9756716b6f9d8b0cfdf3beae066aa305853051b76b33a875c6ff: Status 404 returned error can't find the container with id 3b50db2e47ce9756716b6f9d8b0cfdf3beae066aa305853051b76b33a875c6ff Apr 16 19:53:45.311877 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:45.311823 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod344966ff_23f6_4f65_ae57_5f820201e8b8.slice/crio-db90f91b45d65c2c96e7d963c8512e41d9407c48d032d8ad42f88a0bce0e19e8 WatchSource:0}: Error finding container db90f91b45d65c2c96e7d963c8512e41d9407c48d032d8ad42f88a0bce0e19e8: Status 404 returned error can't find the container with id db90f91b45d65c2c96e7d963c8512e41d9407c48d032d8ad42f88a0bce0e19e8 Apr 16 19:53:45.312800 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:45.312756 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8012fac3_113c_4958_9ea3_7cdbc5e5c6e9.slice/crio-5ee6c2a017f21fed275c8439db354f34accc352ca491e55c5f54d2f989c14b98 WatchSource:0}: Error finding container 5ee6c2a017f21fed275c8439db354f34accc352ca491e55c5f54d2f989c14b98: Status 404 returned error can't find the container with id 5ee6c2a017f21fed275c8439db354f34accc352ca491e55c5f54d2f989c14b98 Apr 16 19:53:45.318579 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:45.318424 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3753347_dfcc_47be_a251_65c3470b8045.slice/crio-06a92bdb48a2d936535a4aad13997a00fb0936bbb8db9d084e457adc4212575b WatchSource:0}: Error finding container 06a92bdb48a2d936535a4aad13997a00fb0936bbb8db9d084e457adc4212575b: Status 404 returned error can't find the container with id 06a92bdb48a2d936535a4aad13997a00fb0936bbb8db9d084e457adc4212575b Apr 16 19:53:45.319839 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:45.319815 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24e2ccb6_cefd_4e6b_baff_95b016092cf8.slice/crio-7f2c107541348f32a445149a8011c88754e90317294336cc3b5797966283b1f2 WatchSource:0}: Error finding container 7f2c107541348f32a445149a8011c88754e90317294336cc3b5797966283b1f2: Status 404 returned error can't find the container with id 7f2c107541348f32a445149a8011c88754e90317294336cc3b5797966283b1f2 Apr 16 19:53:45.321003 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:45.320610 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod356a3ae0_1448_42b5_a8eb_eb35ac7b6f96.slice/crio-0cab298ce97a008d34e2fa5c0c0d84bf5b79f71c45b81424856f9325278b0adc WatchSource:0}: Error finding container 0cab298ce97a008d34e2fa5c0c0d84bf5b79f71c45b81424856f9325278b0adc: Status 404 returned error can't find the container with id 0cab298ce97a008d34e2fa5c0c0d84bf5b79f71c45b81424856f9325278b0adc Apr 16 19:53:45.321556 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:53:45.321533 2561 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09d46eec_98b3_409a_adf0_e27e7e7fa496.slice/crio-cc7b8df117141ed3e22853d30fefd1ec1c3f044db4144fad4172f47272540c98 WatchSource:0}: Error finding container cc7b8df117141ed3e22853d30fefd1ec1c3f044db4144fad4172f47272540c98: Status 404 returned error can't find the container with id cc7b8df117141ed3e22853d30fefd1ec1c3f044db4144fad4172f47272540c98 Apr 16 19:53:45.363317 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:45.363295 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0b9420a-1c3e-47b5-b187-827cb7f39aea-metrics-certs\") pod \"network-metrics-daemon-nx45q\" (UID: \"e0b9420a-1c3e-47b5-b187-827cb7f39aea\") " pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:53:45.363431 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:45.363418 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:45.363472 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:45.363463 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0b9420a-1c3e-47b5-b187-827cb7f39aea-metrics-certs podName:e0b9420a-1c3e-47b5-b187-827cb7f39aea nodeName:}" failed. No retries permitted until 2026-04-16 19:53:46.363450522 +0000 UTC m=+4.086496407 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0b9420a-1c3e-47b5-b187-827cb7f39aea-metrics-certs") pod "network-metrics-daemon-nx45q" (UID: "e0b9420a-1c3e-47b5-b187-827cb7f39aea") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:45.463689 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:45.463666 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcbfx\" (UniqueName: \"kubernetes.io/projected/58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f-kube-api-access-kcbfx\") pod \"network-check-target-qnt55\" (UID: \"58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f\") " pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:53:45.463855 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:45.463836 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:53:45.463943 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:45.463861 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:53:45.463943 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:45.463875 2561 projected.go:194] Error preparing data for projected volume kube-api-access-kcbfx for pod openshift-network-diagnostics/network-check-target-qnt55: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:45.463943 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:45.463928 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f-kube-api-access-kcbfx podName:58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f nodeName:}" failed. No retries permitted until 2026-04-16 19:53:46.463910575 +0000 UTC m=+4.186956461 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kcbfx" (UniqueName: "kubernetes.io/projected/58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f-kube-api-access-kcbfx") pod "network-check-target-qnt55" (UID: "58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:45.790046 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:45.789870 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:48:43 +0000 UTC" deadline="2027-09-30 01:55:52.604363225 +0000 UTC" Apr 16 19:53:45.790046 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:45.789931 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12750h2m6.814437079s" Apr 16 19:53:45.847429 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:45.846832 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-201.ec2.internal" event={"ID":"d9f1723847b6ccf58b5c375746506d34","Type":"ContainerStarted","Data":"42bc6d405cdaefe7c45de3ee62f369cf98ca02665c74c8e17b95044680d9ca08"} Apr 16 19:53:45.850803 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:45.850681 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" event={"ID":"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96","Type":"ContainerStarted","Data":"0cab298ce97a008d34e2fa5c0c0d84bf5b79f71c45b81424856f9325278b0adc"} Apr 16 19:53:45.858367 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:45.858332 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ddx6x" event={"ID":"c3753347-dfcc-47be-a251-65c3470b8045","Type":"ContainerStarted","Data":"06a92bdb48a2d936535a4aad13997a00fb0936bbb8db9d084e457adc4212575b"} Apr 16 19:53:45.862893 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:45.862841 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f" event={"ID":"344966ff-23f6-4f65-ae57-5f820201e8b8","Type":"ContainerStarted","Data":"db90f91b45d65c2c96e7d963c8512e41d9407c48d032d8ad42f88a0bce0e19e8"} Apr 16 19:53:45.867963 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:45.867926 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" event={"ID":"09d46eec-98b3-409a-adf0-e27e7e7fa496","Type":"ContainerStarted","Data":"cc7b8df117141ed3e22853d30fefd1ec1c3f044db4144fad4172f47272540c98"} Apr 16 19:53:45.874108 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:45.874016 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-84k6z" event={"ID":"24e2ccb6-cefd-4e6b-baff-95b016092cf8","Type":"ContainerStarted","Data":"7f2c107541348f32a445149a8011c88754e90317294336cc3b5797966283b1f2"} Apr 16 19:53:45.879837 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:45.878811 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wm2x4" event={"ID":"54ec5af6-8d3b-4667-ae54-83fef36ee26c","Type":"ContainerStarted","Data":"cc32e0b6d0fb69d0cc44956b7cba82ab26d6043c75c851641661e522ae8cbd52"} Apr 16 19:53:45.883101 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:45.883077 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2wctn" 
event={"ID":"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9","Type":"ContainerStarted","Data":"5ee6c2a017f21fed275c8439db354f34accc352ca491e55c5f54d2f989c14b98"} Apr 16 19:53:45.887263 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:45.887241 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-88wb2" event={"ID":"6f07541a-6ad1-43d0-9a04-540a16f67cec","Type":"ContainerStarted","Data":"3b50db2e47ce9756716b6f9d8b0cfdf3beae066aa305853051b76b33a875c6ff"} Apr 16 19:53:46.372033 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:46.371938 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0b9420a-1c3e-47b5-b187-827cb7f39aea-metrics-certs\") pod \"network-metrics-daemon-nx45q\" (UID: \"e0b9420a-1c3e-47b5-b187-827cb7f39aea\") " pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:53:46.372169 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:46.372092 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:46.372169 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:46.372151 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0b9420a-1c3e-47b5-b187-827cb7f39aea-metrics-certs podName:e0b9420a-1c3e-47b5-b187-827cb7f39aea nodeName:}" failed. No retries permitted until 2026-04-16 19:53:48.372134019 +0000 UTC m=+6.095179908 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0b9420a-1c3e-47b5-b187-827cb7f39aea-metrics-certs") pod "network-metrics-daemon-nx45q" (UID: "e0b9420a-1c3e-47b5-b187-827cb7f39aea") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:46.472997 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:46.472964 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcbfx\" (UniqueName: \"kubernetes.io/projected/58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f-kube-api-access-kcbfx\") pod \"network-check-target-qnt55\" (UID: \"58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f\") " pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:53:46.473153 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:46.473136 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:53:46.473208 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:46.473155 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:53:46.473208 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:46.473167 2561 projected.go:194] Error preparing data for projected volume kube-api-access-kcbfx for pod openshift-network-diagnostics/network-check-target-qnt55: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:46.473305 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:46.473227 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f-kube-api-access-kcbfx podName:58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f nodeName:}" failed. 
No retries permitted until 2026-04-16 19:53:48.473207898 +0000 UTC m=+6.196253783 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-kcbfx" (UniqueName: "kubernetes.io/projected/58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f-kube-api-access-kcbfx") pod "network-check-target-qnt55" (UID: "58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:46.835380 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:46.835305 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:53:46.835810 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:46.835445 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nx45q" podUID="e0b9420a-1c3e-47b5-b187-827cb7f39aea" Apr 16 19:53:46.835975 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:46.835955 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:53:46.836090 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:46.836054 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qnt55" podUID="58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f" Apr 16 19:53:46.897364 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:46.897320 2561 generic.go:358] "Generic (PLEG): container finished" podID="f65a2c043ff418743b8e4d4f642d65c0" containerID="ee7ef3fc3ee2c9be17ffe8832b089de352d8160ff06d5065a5265eaca8b8f316" exitCode=0 Apr 16 19:53:46.898289 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:46.898260 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-201.ec2.internal" event={"ID":"f65a2c043ff418743b8e4d4f642d65c0","Type":"ContainerDied","Data":"ee7ef3fc3ee2c9be17ffe8832b089de352d8160ff06d5065a5265eaca8b8f316"} Apr 16 19:53:46.914387 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:46.913979 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-201.ec2.internal" podStartSLOduration=3.913961887 podStartE2EDuration="3.913961887s" podCreationTimestamp="2026-04-16 19:53:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:53:45.862703813 +0000 UTC m=+3.585749721" watchObservedRunningTime="2026-04-16 19:53:46.913961887 +0000 UTC m=+4.637007795" Apr 16 19:53:47.902721 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:47.902679 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-201.ec2.internal" event={"ID":"f65a2c043ff418743b8e4d4f642d65c0","Type":"ContainerStarted","Data":"fd91359530d2355a42874d7e02454c254f486fe2ebf1962b06bdc5b9d4ee16f3"} Apr 16 19:53:48.388999 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:48.388920 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0b9420a-1c3e-47b5-b187-827cb7f39aea-metrics-certs\") pod \"network-metrics-daemon-nx45q\" (UID: \"e0b9420a-1c3e-47b5-b187-827cb7f39aea\") " pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:53:48.389295 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:48.389084 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:48.389295 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:48.389160 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0b9420a-1c3e-47b5-b187-827cb7f39aea-metrics-certs podName:e0b9420a-1c3e-47b5-b187-827cb7f39aea nodeName:}" failed. No retries permitted until 2026-04-16 19:53:52.389136658 +0000 UTC m=+10.112182544 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0b9420a-1c3e-47b5-b187-827cb7f39aea-metrics-certs") pod "network-metrics-daemon-nx45q" (UID: "e0b9420a-1c3e-47b5-b187-827cb7f39aea") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:48.489999 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:48.489923 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcbfx\" (UniqueName: \"kubernetes.io/projected/58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f-kube-api-access-kcbfx\") pod \"network-check-target-qnt55\" (UID: \"58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f\") " pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:53:48.490163 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:48.490113 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:53:48.490163 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:48.490132 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:53:48.490163 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:48.490145 2561 projected.go:194] Error preparing data for projected volume kube-api-access-kcbfx for pod openshift-network-diagnostics/network-check-target-qnt55: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:48.490311 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:48.490195 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f-kube-api-access-kcbfx podName:58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f nodeName:}" failed. No retries permitted until 2026-04-16 19:53:52.49017845 +0000 UTC m=+10.213224340 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-kcbfx" (UniqueName: "kubernetes.io/projected/58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f-kube-api-access-kcbfx") pod "network-check-target-qnt55" (UID: "58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:48.836043 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:48.835509 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:53:48.836043 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:48.835647 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nx45q" podUID="e0b9420a-1c3e-47b5-b187-827cb7f39aea" Apr 16 19:53:48.837566 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:48.837427 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:53:48.837566 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:48.837523 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qnt55" podUID="58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f" Apr 16 19:53:50.835919 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:50.835273 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:53:50.835919 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:50.835387 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qnt55" podUID="58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f" Apr 16 19:53:50.835919 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:50.835464 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:53:50.835919 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:50.835547 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nx45q" podUID="e0b9420a-1c3e-47b5-b187-827cb7f39aea" Apr 16 19:53:52.428106 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:52.428042 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0b9420a-1c3e-47b5-b187-827cb7f39aea-metrics-certs\") pod \"network-metrics-daemon-nx45q\" (UID: \"e0b9420a-1c3e-47b5-b187-827cb7f39aea\") " pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:53:52.428598 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:52.428167 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:52.428598 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:52.428227 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0b9420a-1c3e-47b5-b187-827cb7f39aea-metrics-certs podName:e0b9420a-1c3e-47b5-b187-827cb7f39aea nodeName:}" failed. No retries permitted until 2026-04-16 19:54:00.428210314 +0000 UTC m=+18.151256216 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0b9420a-1c3e-47b5-b187-827cb7f39aea-metrics-certs") pod "network-metrics-daemon-nx45q" (UID: "e0b9420a-1c3e-47b5-b187-827cb7f39aea") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:52.529046 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:52.529009 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcbfx\" (UniqueName: \"kubernetes.io/projected/58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f-kube-api-access-kcbfx\") pod \"network-check-target-qnt55\" (UID: \"58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f\") " pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:53:52.529242 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:52.529203 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:53:52.529242 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:52.529225 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:53:52.529242 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:52.529239 2561 projected.go:194] Error preparing data for projected volume kube-api-access-kcbfx for pod openshift-network-diagnostics/network-check-target-qnt55: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:52.529399 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:52.529302 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f-kube-api-access-kcbfx podName:58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f nodeName:}" failed. No retries permitted until 2026-04-16 19:54:00.529285099 +0000 UTC m=+18.252330996 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-kcbfx" (UniqueName: "kubernetes.io/projected/58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f-kube-api-access-kcbfx") pod "network-check-target-qnt55" (UID: "58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:52.836966 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:52.836692 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:53:52.837106 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:52.837015 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nx45q" podUID="e0b9420a-1c3e-47b5-b187-827cb7f39aea" Apr 16 19:53:52.837370 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:52.837354 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:53:52.837460 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:52.837441 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qnt55" podUID="58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f" Apr 16 19:53:54.835798 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:54.835754 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:53:54.836234 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:54.835755 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:53:54.836234 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:54.835871 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qnt55" podUID="58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f" Apr 16 19:53:54.836234 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:54.835986 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nx45q" podUID="e0b9420a-1c3e-47b5-b187-827cb7f39aea" Apr 16 19:53:56.835944 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:56.835909 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:53:56.836379 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:56.835909 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:53:56.836379 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:56.836033 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nx45q" podUID="e0b9420a-1c3e-47b5-b187-827cb7f39aea" Apr 16 19:53:56.836379 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:56.836125 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qnt55" podUID="58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f" Apr 16 19:53:58.834989 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:58.834953 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:53:58.835450 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:58.835076 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qnt55" podUID="58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f" Apr 16 19:53:58.835450 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:58.835139 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:53:58.835450 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:58.835257 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nx45q" podUID="e0b9420a-1c3e-47b5-b187-827cb7f39aea" Apr 16 19:53:59.486349 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:59.486291 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-201.ec2.internal" podStartSLOduration=16.486275274 podStartE2EDuration="16.486275274s" podCreationTimestamp="2026-04-16 19:53:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:53:47.917882158 +0000 UTC m=+5.640928065" watchObservedRunningTime="2026-04-16 19:53:59.486275274 +0000 UTC m=+17.209321183" Apr 16 19:53:59.486823 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:59.486803 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-wfp62"] Apr 16 19:53:59.500083 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:59.500057 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wfp62" Apr 16 19:53:59.500226 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:59.500156 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wfp62" podUID="8593da00-be3a-459a-9bf6-ee2f4988af66" Apr 16 19:53:59.579889 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:59.579857 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8593da00-be3a-459a-9bf6-ee2f4988af66-original-pull-secret\") pod \"global-pull-secret-syncer-wfp62\" (UID: \"8593da00-be3a-459a-9bf6-ee2f4988af66\") " pod="kube-system/global-pull-secret-syncer-wfp62" Apr 16 19:53:59.580062 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:59.579946 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8593da00-be3a-459a-9bf6-ee2f4988af66-kubelet-config\") pod \"global-pull-secret-syncer-wfp62\" (UID: \"8593da00-be3a-459a-9bf6-ee2f4988af66\") " pod="kube-system/global-pull-secret-syncer-wfp62" Apr 16 19:53:59.580062 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:59.579998 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8593da00-be3a-459a-9bf6-ee2f4988af66-dbus\") pod \"global-pull-secret-syncer-wfp62\" (UID: \"8593da00-be3a-459a-9bf6-ee2f4988af66\") " pod="kube-system/global-pull-secret-syncer-wfp62" Apr 16 19:53:59.680354 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:59.680312 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8593da00-be3a-459a-9bf6-ee2f4988af66-kubelet-config\") pod \"global-pull-secret-syncer-wfp62\" (UID: \"8593da00-be3a-459a-9bf6-ee2f4988af66\") " pod="kube-system/global-pull-secret-syncer-wfp62" Apr 16 19:53:59.680536 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:59.680365 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8593da00-be3a-459a-9bf6-ee2f4988af66-dbus\") pod \"global-pull-secret-syncer-wfp62\" (UID: \"8593da00-be3a-459a-9bf6-ee2f4988af66\") " pod="kube-system/global-pull-secret-syncer-wfp62" Apr 16 19:53:59.680536 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:59.680423 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8593da00-be3a-459a-9bf6-ee2f4988af66-original-pull-secret\") pod \"global-pull-secret-syncer-wfp62\" (UID: \"8593da00-be3a-459a-9bf6-ee2f4988af66\") " pod="kube-system/global-pull-secret-syncer-wfp62" Apr 16 19:53:59.680536 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:59.680432 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8593da00-be3a-459a-9bf6-ee2f4988af66-kubelet-config\") pod \"global-pull-secret-syncer-wfp62\" (UID: \"8593da00-be3a-459a-9bf6-ee2f4988af66\") " pod="kube-system/global-pull-secret-syncer-wfp62" Apr 16 19:53:59.680711 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:59.680563 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:59.680711 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:53:59.680575 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8593da00-be3a-459a-9bf6-ee2f4988af66-dbus\") pod \"global-pull-secret-syncer-wfp62\" (UID: 
\"8593da00-be3a-459a-9bf6-ee2f4988af66\") " pod="kube-system/global-pull-secret-syncer-wfp62" Apr 16 19:53:59.680711 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:53:59.680630 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8593da00-be3a-459a-9bf6-ee2f4988af66-original-pull-secret podName:8593da00-be3a-459a-9bf6-ee2f4988af66 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:00.18061164 +0000 UTC m=+17.903657570 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8593da00-be3a-459a-9bf6-ee2f4988af66-original-pull-secret") pod "global-pull-secret-syncer-wfp62" (UID: "8593da00-be3a-459a-9bf6-ee2f4988af66") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:00.184752 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:00.184715 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8593da00-be3a-459a-9bf6-ee2f4988af66-original-pull-secret\") pod \"global-pull-secret-syncer-wfp62\" (UID: \"8593da00-be3a-459a-9bf6-ee2f4988af66\") " pod="kube-system/global-pull-secret-syncer-wfp62" Apr 16 19:54:00.185147 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:00.184864 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:00.185147 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:00.184944 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8593da00-be3a-459a-9bf6-ee2f4988af66-original-pull-secret podName:8593da00-be3a-459a-9bf6-ee2f4988af66 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:01.18492108 +0000 UTC m=+18.907966970 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8593da00-be3a-459a-9bf6-ee2f4988af66-original-pull-secret") pod "global-pull-secret-syncer-wfp62" (UID: "8593da00-be3a-459a-9bf6-ee2f4988af66") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:00.488095 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:00.488013 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0b9420a-1c3e-47b5-b187-827cb7f39aea-metrics-certs\") pod \"network-metrics-daemon-nx45q\" (UID: \"e0b9420a-1c3e-47b5-b187-827cb7f39aea\") " pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:54:00.488242 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:00.488180 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:00.488304 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:00.488254 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0b9420a-1c3e-47b5-b187-827cb7f39aea-metrics-certs podName:e0b9420a-1c3e-47b5-b187-827cb7f39aea nodeName:}" failed. No retries permitted until 2026-04-16 19:54:16.488233434 +0000 UTC m=+34.211279320 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0b9420a-1c3e-47b5-b187-827cb7f39aea-metrics-certs") pod "network-metrics-daemon-nx45q" (UID: "e0b9420a-1c3e-47b5-b187-827cb7f39aea") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:00.589127 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:00.589084 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcbfx\" (UniqueName: \"kubernetes.io/projected/58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f-kube-api-access-kcbfx\") pod \"network-check-target-qnt55\" (UID: \"58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f\") " pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:54:00.589295 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:00.589248 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:00.589295 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:00.589270 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:00.589295 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:00.589280 2561 projected.go:194] Error preparing data for projected volume kube-api-access-kcbfx for pod openshift-network-diagnostics/network-check-target-qnt55: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:00.589404 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:00.589328 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f-kube-api-access-kcbfx podName:58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f nodeName:}" failed. No retries permitted until 2026-04-16 19:54:16.589315576 +0000 UTC m=+34.312361463 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-kcbfx" (UniqueName: "kubernetes.io/projected/58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f-kube-api-access-kcbfx") pod "network-check-target-qnt55" (UID: "58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:00.835979 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:00.835896 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:54:00.836138 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:00.835896 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:54:00.836138 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:00.836027 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qnt55" podUID="58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f" Apr 16 19:54:00.836231 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:00.836130 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nx45q" podUID="e0b9420a-1c3e-47b5-b187-827cb7f39aea" Apr 16 19:54:01.195110 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:01.195025 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8593da00-be3a-459a-9bf6-ee2f4988af66-original-pull-secret\") pod \"global-pull-secret-syncer-wfp62\" (UID: \"8593da00-be3a-459a-9bf6-ee2f4988af66\") " pod="kube-system/global-pull-secret-syncer-wfp62" Apr 16 19:54:01.195528 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:01.195187 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:01.195528 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:01.195267 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8593da00-be3a-459a-9bf6-ee2f4988af66-original-pull-secret podName:8593da00-be3a-459a-9bf6-ee2f4988af66 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:03.195246286 +0000 UTC m=+20.918292190 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8593da00-be3a-459a-9bf6-ee2f4988af66-original-pull-secret") pod "global-pull-secret-syncer-wfp62" (UID: "8593da00-be3a-459a-9bf6-ee2f4988af66") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:01.835160 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:01.835135 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wfp62" Apr 16 19:54:01.835263 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:01.835237 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wfp62" podUID="8593da00-be3a-459a-9bf6-ee2f4988af66" Apr 16 19:54:02.836645 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:02.836429 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:54:02.837421 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:02.836502 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:54:02.837421 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:02.836739 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nx45q" podUID="e0b9420a-1c3e-47b5-b187-827cb7f39aea" Apr 16 19:54:02.837421 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:02.836824 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qnt55" podUID="58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f" Apr 16 19:54:02.929165 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:02.929079 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" event={"ID":"09d46eec-98b3-409a-adf0-e27e7e7fa496","Type":"ContainerStarted","Data":"704f89298b9549ac7dae462ab7d4cbd987eca165a3ef88d667026e2282b425ca"} Apr 16 19:54:02.930384 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:02.930357 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wm2x4" event={"ID":"54ec5af6-8d3b-4667-ae54-83fef36ee26c","Type":"ContainerStarted","Data":"63fd91826296cf596f56bf17f294303f5aac4e154bc6e4eace3d4b1243039c5a"} Apr 16 19:54:02.931676 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:02.931655 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2wctn" event={"ID":"8012fac3-113c-4958-9ea3-7cdbc5e5c6e9","Type":"ContainerStarted","Data":"05c7e63cc0da866706b949e27c41f2fecee9e366a0ac3e5bdfb59f83555d6f44"} Apr 16 19:54:02.933126 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:02.933104 2561 generic.go:358] "Generic (PLEG): container finished" podID="6f07541a-6ad1-43d0-9a04-540a16f67cec" containerID="5f7d7558d42561b08b6cb45ab0887e69f8508746b9e4598bb9164194e2997ef0" exitCode=0 Apr 16 19:54:02.933203 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:02.933187 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-88wb2" event={"ID":"6f07541a-6ad1-43d0-9a04-540a16f67cec","Type":"ContainerDied","Data":"5f7d7558d42561b08b6cb45ab0887e69f8508746b9e4598bb9164194e2997ef0"} Apr 16 19:54:02.935670 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:02.935652 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-acl-logging/0.log" Apr 16 19:54:02.935995 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:02.935974 2561 generic.go:358] "Generic (PLEG): container finished" podID="356a3ae0-1448-42b5-a8eb-eb35ac7b6f96" containerID="7ff8eb40ce3e830194bfdd31424299fe2fdecafd250084e3fc3ec06952606d0c" exitCode=1 Apr 16 19:54:02.936088 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:02.936030 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" event={"ID":"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96","Type":"ContainerStarted","Data":"8d2affbae688bfea92c2623dba740a34f1a0500f1f605debe5c20c7b4cf8d334"} Apr 16 19:54:02.936088 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:02.936049 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" event={"ID":"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96","Type":"ContainerStarted","Data":"18922c098933ada717fd1fed27bf6aae3a690315907543a1052ec73aa3b61d25"} Apr 16 19:54:02.936088 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:02.936059 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" event={"ID":"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96","Type":"ContainerStarted","Data":"1ff630d6105348fc13fd0908dcc792960389d780b358bfe7348299da8192b696"} Apr 16 19:54:02.936088 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:02.936067 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" event={"ID":"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96","Type":"ContainerStarted","Data":"dace0bc447c5b48beb6aa5bbbdbd78475597f0c532b68f76c5ddc2fbb5f5e7a9"} Apr 16 19:54:02.936088 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:02.936075 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" event={"ID":"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96","Type":"ContainerDied","Data":"7ff8eb40ce3e830194bfdd31424299fe2fdecafd250084e3fc3ec06952606d0c"} Apr 16 19:54:02.936088 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:02.936084 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" event={"ID":"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96","Type":"ContainerStarted","Data":"9a70d33cd9b282db880d0c2c125798746044b74416c45573314b07aa3eac3322"} Apr 16 19:54:02.937378 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:02.937358 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ddx6x" event={"ID":"c3753347-dfcc-47be-a251-65c3470b8045","Type":"ContainerStarted","Data":"9a4db7a2852b09dc7315821a530d3d70ff005bb39e92740f8da5bf5da2df3ff0"} Apr 16 19:54:02.938508 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:02.938489 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f" event={"ID":"344966ff-23f6-4f65-ae57-5f820201e8b8","Type":"ContainerStarted","Data":"76e5f765093983c94b5234cd6938a470f0fd054d0daad7540c006427c844f4e0"} Apr 16 19:54:02.945821 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:02.945766 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-rlrsk" podStartSLOduration=3.130714192 podStartE2EDuration="19.94575422s" podCreationTimestamp="2026-04-16 19:53:43 +0000 UTC" firstStartedPulling="2026-04-16 19:53:45.323326253 +0000 UTC m=+3.046372150" lastFinishedPulling="2026-04-16 19:54:02.138366292 +0000 UTC m=+19.861412178" observedRunningTime="2026-04-16 19:54:02.94509383 +0000 UTC m=+20.668139736" watchObservedRunningTime="2026-04-16 19:54:02.94575422 +0000 UTC m=+20.668800126" Apr 16 19:54:02.961406 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:02.961364 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-wm2x4" podStartSLOduration=4.141141946 podStartE2EDuration="20.961353455s" podCreationTimestamp="2026-04-16 19:53:42 +0000 UTC" firstStartedPulling="2026-04-16 19:53:45.318158303 +0000 UTC m=+3.041204188" lastFinishedPulling="2026-04-16 19:54:02.138369812 +0000 UTC m=+19.861415697" observedRunningTime="2026-04-16 19:54:02.960959435 +0000 UTC m=+20.684005338" watchObservedRunningTime="2026-04-16 19:54:02.961353455 +0000 UTC m=+20.684399361" Apr 16 19:54:03.008815 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:03.008160 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ddx6x" podStartSLOduration=3.519282951 podStartE2EDuration="20.008143681s" podCreationTimestamp="2026-04-16 19:53:43 +0000 UTC" firstStartedPulling="2026-04-16 
19:53:45.320262314 +0000 UTC m=+3.043308207" lastFinishedPulling="2026-04-16 19:54:01.809123034 +0000 UTC m=+19.532168937" observedRunningTime="2026-04-16 19:54:02.980209127 +0000 UTC m=+20.703255033" watchObservedRunningTime="2026-04-16 19:54:03.008143681 +0000 UTC m=+20.731189589" Apr 16 19:54:03.029732 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:03.029681 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2wctn" podStartSLOduration=3.110636659 podStartE2EDuration="20.029666993s" podCreationTimestamp="2026-04-16 19:53:43 +0000 UTC" firstStartedPulling="2026-04-16 19:53:45.315292475 +0000 UTC m=+3.038338366" lastFinishedPulling="2026-04-16 19:54:02.234322799 +0000 UTC m=+19.957368700" observedRunningTime="2026-04-16 19:54:03.008269843 +0000 UTC m=+20.731315750" watchObservedRunningTime="2026-04-16 19:54:03.029666993 +0000 UTC m=+20.752712899" Apr 16 19:54:03.210181 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:03.209975 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8593da00-be3a-459a-9bf6-ee2f4988af66-original-pull-secret\") pod \"global-pull-secret-syncer-wfp62\" (UID: \"8593da00-be3a-459a-9bf6-ee2f4988af66\") " pod="kube-system/global-pull-secret-syncer-wfp62" Apr 16 19:54:03.210298 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:03.210122 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:03.210375 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:03.210316 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8593da00-be3a-459a-9bf6-ee2f4988af66-original-pull-secret podName:8593da00-be3a-459a-9bf6-ee2f4988af66 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:07.210299101 +0000 UTC m=+24.933344987 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8593da00-be3a-459a-9bf6-ee2f4988af66-original-pull-secret") pod "global-pull-secret-syncer-wfp62" (UID: "8593da00-be3a-459a-9bf6-ee2f4988af66") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:03.261406 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:03.261380 2561 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 19:54:03.815244 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:03.815067 2561 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T19:54:03.261403837Z","UUID":"04e0937a-dc83-47f4-9a62-2b063879bd56","Handler":null,"Name":"","Endpoint":""} Apr 16 19:54:03.818761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:03.818735 2561 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 19:54:03.818761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:03.818769 2561 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 19:54:03.835000 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:03.834973 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-wfp62" Apr 16 19:54:03.835140 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:03.835118 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wfp62" podUID="8593da00-be3a-459a-9bf6-ee2f4988af66" Apr 16 19:54:03.942501 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:03.942465 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f" event={"ID":"344966ff-23f6-4f65-ae57-5f820201e8b8","Type":"ContainerStarted","Data":"b556d7ffbfe2eaf48e1b35ba153ce7fc4acc985a7f060b4c9c8979a0570234b2"} Apr 16 19:54:03.944259 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:03.944232 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-84k6z" event={"ID":"24e2ccb6-cefd-4e6b-baff-95b016092cf8","Type":"ContainerStarted","Data":"43471136e627079c220bd964ff4e56be8e463929409dda6071ae57d859a8994f"} Apr 16 19:54:03.957486 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:03.957436 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-84k6z" podStartSLOduration=4.140446021 podStartE2EDuration="20.957419743s" podCreationTimestamp="2026-04-16 19:53:43 +0000 UTC" firstStartedPulling="2026-04-16 19:53:45.321390611 +0000 UTC m=+3.044436508" lastFinishedPulling="2026-04-16 19:54:02.138364345 +0000 UTC m=+19.861410230" observedRunningTime="2026-04-16 19:54:03.95655272 +0000 UTC m=+21.679598626" watchObservedRunningTime="2026-04-16 19:54:03.957419743 +0000 UTC m=+21.680465651" Apr 16 19:54:04.835164 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:04.835128 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:54:04.835357 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:04.835264 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nx45q" podUID="e0b9420a-1c3e-47b5-b187-827cb7f39aea" Apr 16 19:54:04.835468 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:04.835135 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:54:04.835593 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:04.835550 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qnt55" podUID="58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f" Apr 16 19:54:04.949598 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:04.949567 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-acl-logging/0.log" Apr 16 19:54:04.950142 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:04.949979 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" event={"ID":"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96","Type":"ContainerStarted","Data":"7abb4356afb026ba701eab19dc0ca3068d987906095dd71f6fa8327e41f1c4e3"} Apr 16 19:54:04.952001 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:04.951969 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f" event={"ID":"344966ff-23f6-4f65-ae57-5f820201e8b8","Type":"ContainerStarted","Data":"be7f0817883841ba9156d357520913a31f17638c33519f296b51f2377130b690"} Apr 16 19:54:04.986322 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:04.986236 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbn2f" podStartSLOduration=4.237979511 podStartE2EDuration="22.986223264s" podCreationTimestamp="2026-04-16 19:53:42 +0000 UTC" firstStartedPulling="2026-04-16 19:53:45.313851978 +0000 UTC m=+3.036897866" lastFinishedPulling="2026-04-16 19:54:04.062095736 +0000 UTC m=+21.785141619" observedRunningTime="2026-04-16 19:54:04.986128528 +0000 UTC m=+22.709174435" watchObservedRunningTime="2026-04-16 19:54:04.986223264 +0000 UTC m=+22.709269169" Apr 16 19:54:05.835005 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:05.834973 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wfp62" Apr 16 19:54:05.835190 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:05.835086 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wfp62" podUID="8593da00-be3a-459a-9bf6-ee2f4988af66" Apr 16 19:54:06.102970 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:06.102893 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-wm2x4" Apr 16 19:54:06.103631 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:06.103548 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-wm2x4" Apr 16 19:54:06.835265 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:06.835242 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:54:06.835353 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:06.835287 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:54:06.835411 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:06.835389 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nx45q" podUID="e0b9420a-1c3e-47b5-b187-827cb7f39aea" Apr 16 19:54:06.835485 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:06.835468 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qnt55" podUID="58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f" Apr 16 19:54:06.961178 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:06.960860 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-acl-logging/0.log" Apr 16 19:54:06.963168 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:06.961890 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" event={"ID":"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96","Type":"ContainerStarted","Data":"1d93047a614a5792f6b1fd7394df85c4a7490d25393cb91cbade2115501002c7"} Apr 16 19:54:06.963168 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:06.962255 2561 scope.go:117] "RemoveContainer" containerID="7ff8eb40ce3e830194bfdd31424299fe2fdecafd250084e3fc3ec06952606d0c" Apr 16 19:54:06.963168 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:06.962337 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:54:06.963168 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:06.962361 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:54:06.963168 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:06.962370 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:54:06.963168 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:06.962380 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-wm2x4" Apr 16 19:54:06.963168 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:06.963126 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-wm2x4" Apr 16 19:54:06.980373 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:06.980339 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:54:06.981419 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:06.981368 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:54:07.240014 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:07.239984 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/8593da00-be3a-459a-9bf6-ee2f4988af66-original-pull-secret\") pod \"global-pull-secret-syncer-wfp62\" (UID: \"8593da00-be3a-459a-9bf6-ee2f4988af66\") " pod="kube-system/global-pull-secret-syncer-wfp62" Apr 16 19:54:07.240496 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:07.240115 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:07.240496 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:07.240167 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8593da00-be3a-459a-9bf6-ee2f4988af66-original-pull-secret podName:8593da00-be3a-459a-9bf6-ee2f4988af66 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:15.240153616 +0000 UTC m=+32.963199500 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8593da00-be3a-459a-9bf6-ee2f4988af66-original-pull-secret") pod "global-pull-secret-syncer-wfp62" (UID: "8593da00-be3a-459a-9bf6-ee2f4988af66") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:07.835422 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:07.835391 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wfp62" Apr 16 19:54:07.835577 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:07.835487 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wfp62" podUID="8593da00-be3a-459a-9bf6-ee2f4988af66" Apr 16 19:54:07.965364 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:07.965329 2561 generic.go:358] "Generic (PLEG): container finished" podID="6f07541a-6ad1-43d0-9a04-540a16f67cec" containerID="2855661f285c5bcf1d630b92c8cdeb7298dc0bd1b1bfc0a743c754d89b64bbd3" exitCode=0 Apr 16 19:54:07.965542 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:07.965419 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-88wb2" event={"ID":"6f07541a-6ad1-43d0-9a04-540a16f67cec","Type":"ContainerDied","Data":"2855661f285c5bcf1d630b92c8cdeb7298dc0bd1b1bfc0a743c754d89b64bbd3"} Apr 16 19:54:07.968652 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:07.968637 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-acl-logging/0.log" Apr 16 19:54:07.969036 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:07.969007 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" event={"ID":"356a3ae0-1448-42b5-a8eb-eb35ac7b6f96","Type":"ContainerStarted","Data":"e0b1729b950a538905f545e7e30cd7bc172481f0362ea12e67169b15ed3aad80"} Apr 16 19:54:08.022672 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:08.022631 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" podStartSLOduration=8.13379225 podStartE2EDuration="25.022619388s" podCreationTimestamp="2026-04-16 19:53:43 +0000 UTC" firstStartedPulling="2026-04-16 19:53:45.322919942 +0000 UTC m=+3.045965840" lastFinishedPulling="2026-04-16 19:54:02.211747088 +0000 UTC m=+19.934792978" observedRunningTime="2026-04-16 
19:54:08.020899067 +0000 UTC m=+25.743944973" watchObservedRunningTime="2026-04-16 19:54:08.022619388 +0000 UTC m=+25.745665294" Apr 16 19:54:08.821419 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:08.821262 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nx45q"] Apr 16 19:54:08.821720 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:08.821508 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:54:08.821720 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:08.821593 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nx45q" podUID="e0b9420a-1c3e-47b5-b187-827cb7f39aea" Apr 16 19:54:08.821810 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:08.821719 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wfp62"] Apr 16 19:54:08.821810 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:08.821784 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wfp62" Apr 16 19:54:08.821886 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:08.821868 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wfp62" podUID="8593da00-be3a-459a-9bf6-ee2f4988af66" Apr 16 19:54:08.831565 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:08.831545 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qnt55"] Apr 16 19:54:08.831656 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:08.831625 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:54:08.831707 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:08.831692 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qnt55" podUID="58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f" Apr 16 19:54:08.972537 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:08.972511 2561 generic.go:358] "Generic (PLEG): container finished" podID="6f07541a-6ad1-43d0-9a04-540a16f67cec" containerID="0f43fbf54fdadf02a6fd818fcff115a6542d1a7a64e971a0dc661121b15d42bb" exitCode=0 Apr 16 19:54:08.972654 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:08.972576 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-88wb2" event={"ID":"6f07541a-6ad1-43d0-9a04-540a16f67cec","Type":"ContainerDied","Data":"0f43fbf54fdadf02a6fd818fcff115a6542d1a7a64e971a0dc661121b15d42bb"} Apr 16 19:54:09.976551 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:09.976522 2561 generic.go:358] "Generic (PLEG): container finished" podID="6f07541a-6ad1-43d0-9a04-540a16f67cec" containerID="e4d98c3c4ac54e5ca5a00de7f3843dfe3ca299ee298cbf00ae231c0a863d3af0" exitCode=0 Apr 16 19:54:09.976924 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:09.976573 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-88wb2" event={"ID":"6f07541a-6ad1-43d0-9a04-540a16f67cec","Type":"ContainerDied","Data":"e4d98c3c4ac54e5ca5a00de7f3843dfe3ca299ee298cbf00ae231c0a863d3af0"} Apr 16 19:54:10.835390 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:10.835360 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:54:10.835597 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:10.835396 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wfp62" Apr 16 19:54:10.835597 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:10.835443 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:54:10.835597 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:10.835546 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nx45q" podUID="e0b9420a-1c3e-47b5-b187-827cb7f39aea" Apr 16 19:54:10.835743 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:10.835676 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wfp62" podUID="8593da00-be3a-459a-9bf6-ee2f4988af66" Apr 16 19:54:10.835823 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:10.835780 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qnt55" podUID="58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f" Apr 16 19:54:12.836258 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:12.836225 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:54:12.836986 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:12.836323 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qnt55" podUID="58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f" Apr 16 19:54:12.836986 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:12.836418 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:54:12.836986 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:12.836545 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nx45q" podUID="e0b9420a-1c3e-47b5-b187-827cb7f39aea" Apr 16 19:54:12.836986 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:12.836607 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wfp62" Apr 16 19:54:12.836986 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:12.836713 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wfp62" podUID="8593da00-be3a-459a-9bf6-ee2f4988af66" Apr 16 19:54:14.081154 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.081115 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-201.ec2.internal" event="NodeReady" Apr 16 19:54:14.081593 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.081271 2561 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 19:54:14.127663 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.127641 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-c684876c-h8496"] Apr 16 19:54:14.155452 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.155417 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ln2rf"] Apr 16 19:54:14.155589 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.155508 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:14.161674 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.161447 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 19:54:14.161674 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.161564 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 19:54:14.162413 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.162323 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 19:54:14.162561 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.162447 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jx8h5\"" Apr 16 19:54:14.173438 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.173410 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-mwqfp"] Apr 16 19:54:14.173595 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.173566 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf" Apr 16 19:54:14.176937 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.176916 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 19:54:14.178926 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.178907 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 19:54:14.179056 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.179039 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 19:54:14.179132 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.179085 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-xmjnh\"" Apr 16 19:54:14.191045 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.191025 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 19:54:14.196860 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.196840 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:54:14.197747 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.197726 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bjppc"] Apr 16 19:54:14.197944 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.197920 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-mwqfp" Apr 16 19:54:14.221410 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.221378 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-98598b8fb-wkrmq"] Apr 16 19:54:14.221561 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.221539 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bjppc" Apr 16 19:54:14.224894 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.224734 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 19:54:14.224894 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.224742 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 19:54:14.225073 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.225057 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 19:54:14.225127 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.225080 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-qgkcf\"" Apr 16 19:54:14.225176 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.225058 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 19:54:14.226041 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.226023 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 19:54:14.226175 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.226135 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 19:54:14.226371 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.226347 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-gmntr\"" Apr 16 19:54:14.226636 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.226595 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:54:14.236254 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.236237 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 19:54:14.248080 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.248059 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gjqjl"] Apr 16 19:54:14.248256 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.248239 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-98598b8fb-wkrmq" Apr 16 19:54:14.252123 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.252106 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-p7c6q\"" Apr 16 19:54:14.252215 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.252106 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 19:54:14.252284 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.252107 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 19:54:14.252439 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.252421 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 19:54:14.252675 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.252659 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 19:54:14.253285 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.253163 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 19:54:14.253285 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.253163 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 19:54:14.272057 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.272021 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 19:54:14.273359 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.273338 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-tpdvb"] Apr 16 19:54:14.273513 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.273492 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gjqjl" Apr 16 19:54:14.276542 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.276523 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:54:14.276727 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.276529 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 19:54:14.283905 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.283888 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-s8l76\"" Apr 16 19:54:14.293812 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.293778 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ln2rf"] Apr 16 19:54:14.293910 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.293816 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bjppc"] Apr 16 19:54:14.293910 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.293829 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gjqjl"] Apr 16 19:54:14.293910 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.293840 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-c684876c-h8496"] Apr 16 19:54:14.293910 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.293853 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-mwqfp"] Apr 16 19:54:14.293910 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.293866 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-94k49"] Apr 16 19:54:14.294143 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.293973 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpdvb" Apr 16 19:54:14.298807 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.298656 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 19:54:14.298807 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.298738 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e54086bc-258c-4204-8105-7a5e491494fa-image-registry-private-configuration\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:14.298807 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.298768 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-dwpjk\"" Apr 16 19:54:14.299068 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.298770 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh8tt\" (UniqueName: \"kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-kube-api-access-gh8tt\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:14.299068 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.298924 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e54086bc-258c-4204-8105-7a5e491494fa-ca-trust-extracted\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:14.299068 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.298941 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 19:54:14.299068 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.298951 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be232f65-8167-4e83-83a8-d40670fbf702-trusted-ca\") pod \"console-operator-9d4b6777b-ln2rf\" (UID: \"be232f65-8167-4e83-83a8-d40670fbf702\") " pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf" Apr 16 19:54:14.299068 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.298980 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b33484f0-9ae5-4b31-8baa-d4219e39ddd9-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-mwqfp\" (UID: \"b33484f0-9ae5-4b31-8baa-d4219e39ddd9\") " pod="openshift-insights/insights-operator-585dfdc468-mwqfp" Apr 16 19:54:14.299068 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.299011 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e54086bc-258c-4204-8105-7a5e491494fa-registry-certificates\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:14.299068 
ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.299033 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b33484f0-9ae5-4b31-8baa-d4219e39ddd9-serving-cert\") pod \"insights-operator-585dfdc468-mwqfp\" (UID: \"b33484f0-9ae5-4b31-8baa-d4219e39ddd9\") " pod="openshift-insights/insights-operator-585dfdc468-mwqfp"
Apr 16 19:54:14.299068 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.299072 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 19:54:14.299495 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.299082 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18721546-1063-46a1-8715-a40872933b22-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bjppc\" (UID: \"18721546-1063-46a1-8715-a40872933b22\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bjppc"
Apr 16 19:54:14.299495 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.299108 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mnpr\" (UniqueName: \"kubernetes.io/projected/18721546-1063-46a1-8715-a40872933b22-kube-api-access-9mnpr\") pod \"cluster-samples-operator-6dc5bdb6b4-bjppc\" (UID: \"18721546-1063-46a1-8715-a40872933b22\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bjppc"
Apr 16 19:54:14.299495 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.299246 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e54086bc-258c-4204-8105-7a5e491494fa-installation-pull-secrets\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496"
Apr 16 19:54:14.299495 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.299282 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e54086bc-258c-4204-8105-7a5e491494fa-trusted-ca\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496"
Apr 16 19:54:14.299495 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.299401 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jgb2\" (UniqueName: \"kubernetes.io/projected/b33484f0-9ae5-4b31-8baa-d4219e39ddd9-kube-api-access-9jgb2\") pod \"insights-operator-585dfdc468-mwqfp\" (UID: \"b33484f0-9ae5-4b31-8baa-d4219e39ddd9\") " pod="openshift-insights/insights-operator-585dfdc468-mwqfp"
Apr 16 19:54:14.299495 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.299438 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496"
Apr 16 19:54:14.299495 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.299461 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be232f65-8167-4e83-83a8-d40670fbf702-config\") pod \"console-operator-9d4b6777b-ln2rf\" (UID: \"be232f65-8167-4e83-83a8-d40670fbf702\") " pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf"
Apr 16 19:54:14.299495 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.299481 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be232f65-8167-4e83-83a8-d40670fbf702-serving-cert\") pod \"console-operator-9d4b6777b-ln2rf\" (UID: \"be232f65-8167-4e83-83a8-d40670fbf702\") " pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf"
Apr 16 19:54:14.299880 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.299501 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dh47\" (UniqueName: \"kubernetes.io/projected/be232f65-8167-4e83-83a8-d40670fbf702-kube-api-access-9dh47\") pod \"console-operator-9d4b6777b-ln2rf\" (UID: \"be232f65-8167-4e83-83a8-d40670fbf702\") " pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf"
Apr 16 19:54:14.299880 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.299527 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b33484f0-9ae5-4b31-8baa-d4219e39ddd9-tmp\") pod \"insights-operator-585dfdc468-mwqfp\" (UID: \"b33484f0-9ae5-4b31-8baa-d4219e39ddd9\") " pod="openshift-insights/insights-operator-585dfdc468-mwqfp"
Apr 16 19:54:14.299880 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.299566 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b33484f0-9ae5-4b31-8baa-d4219e39ddd9-snapshots\") pod \"insights-operator-585dfdc468-mwqfp\" (UID: \"b33484f0-9ae5-4b31-8baa-d4219e39ddd9\") " pod="openshift-insights/insights-operator-585dfdc468-mwqfp"
Apr 16 19:54:14.299880 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.299660 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-bound-sa-token\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496"
Apr 16 19:54:14.299880 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.299727 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b33484f0-9ae5-4b31-8baa-d4219e39ddd9-service-ca-bundle\") pod \"insights-operator-585dfdc468-mwqfp\" (UID: \"b33484f0-9ae5-4b31-8baa-d4219e39ddd9\") " pod="openshift-insights/insights-operator-585dfdc468-mwqfp"
Apr 16 19:54:14.300333 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.300278 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 16 19:54:14.313876 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.313856 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-tpdvb"]
Apr 16 19:54:14.313979 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.313883 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-sqd4v"]
Apr 16 19:54:14.314032 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.313998 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-94k49"
Apr 16 19:54:14.318434 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.318347 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-bwqwm\""
Apr 16 19:54:14.318434 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.318372 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 16 19:54:14.318434 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.318384 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:54:14.318680 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.318504 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 16 19:54:14.320173 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.320141 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 16 19:54:14.343019 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.342935 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-spvh5"]
Apr 16 19:54:14.343147 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.343031 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-sqd4v"
Apr 16 19:54:14.346690 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.346636 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-2dssl\""
Apr 16 19:54:14.346690 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.346679 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 19:54:14.346872 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.346749 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 19:54:14.358435 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.358415 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-xbnwr"]
Apr 16 19:54:14.358582 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.358561 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-spvh5"
Apr 16 19:54:14.364107 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.364088 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 16 19:54:14.364193 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.364142 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 16 19:54:14.364193 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.364156 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-bkl87\""
Apr 16 19:54:14.364316 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.364192 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:54:14.364316 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.364231 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 16 19:54:14.376234 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.376208 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-94k49"]
Apr 16 19:54:14.376234 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.376231 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-xbnwr"]
Apr 16 19:54:14.376234 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.376240 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-sqd4v"]
Apr 16 19:54:14.376438 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.376249 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-98598b8fb-wkrmq"]
Apr 16 19:54:14.376438 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.376258 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-spvh5"]
Apr 16 19:54:14.376438 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.376268 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-k2dcx"]
Apr 16 19:54:14.376438 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.376337 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-xbnwr"
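
Everything in this stretch is the kubelet's normal cold-start choreography rather than an error: the "SyncLoop ADD"/"SyncLoop UPDATE" entries record pods arriving from the API server source, the reflector.go "Caches populated" entries record the per-namespace secret/configmap informers warming up, and reconciler_common.go begins VerifyControllerAttachedVolume for each pod volume. "No sandbox for pod can be found. Need to start a new one" means only that, after the reboot, the container runtime (CRI-O here, per the crio.service unit earlier in the journal) holds no pod sandbox yet, so the kubelet will ask the CRI to create one. A quick cross-check from the node and the API side (a sketch; assumes crictl is installed on the node and oc has cluster access):

    # does the runtime have a sandbox for the pod yet?
    crictl pods --name ingress-canary-k2dcx
    # and what does the API server report for the same pod?
    oc -n openshift-ingress-canary get pod ingress-canary-k2dcx -o wide
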
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-xbnwr" Apr 16 19:54:14.380647 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.380598 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-z96j5\"" Apr 16 19:54:14.380757 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.380652 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 19:54:14.380757 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.380669 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 19:54:14.400538 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.400508 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k2dcx"] Apr 16 19:54:14.400633 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.400557 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-cdc8w"] Apr 16 19:54:14.400696 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.400649 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhgst\" (UniqueName: \"kubernetes.io/projected/fe409e83-4bf3-40c8-b46d-61a088fdae77-kube-api-access-xhgst\") pod \"volume-data-source-validator-7c6cbb6c87-gjqjl\" (UID: \"fe409e83-4bf3-40c8-b46d-61a088fdae77\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gjqjl" Apr 16 19:54:14.400761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.400692 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e54086bc-258c-4204-8105-7a5e491494fa-registry-certificates\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:14.400761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.400656 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k2dcx" Apr 16 19:54:14.400761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.400724 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b33484f0-9ae5-4b31-8baa-d4219e39ddd9-serving-cert\") pod \"insights-operator-585dfdc468-mwqfp\" (UID: \"b33484f0-9ae5-4b31-8baa-d4219e39ddd9\") " pod="openshift-insights/insights-operator-585dfdc468-mwqfp" Apr 16 19:54:14.400965 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.400823 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18721546-1063-46a1-8715-a40872933b22-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bjppc\" (UID: \"18721546-1063-46a1-8715-a40872933b22\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bjppc" Apr 16 19:54:14.400965 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.400861 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b33484f0-9ae5-4b31-8baa-d4219e39ddd9-snapshots\") pod \"insights-operator-585dfdc468-mwqfp\" (UID: \"b33484f0-9ae5-4b31-8baa-d4219e39ddd9\") " pod="openshift-insights/insights-operator-585dfdc468-mwqfp" Apr 16 19:54:14.400965 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.400905 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jgb2\" (UniqueName: \"kubernetes.io/projected/b33484f0-9ae5-4b31-8baa-d4219e39ddd9-kube-api-access-9jgb2\") pod \"insights-operator-585dfdc468-mwqfp\" (UID: \"b33484f0-9ae5-4b31-8baa-d4219e39ddd9\") " pod="openshift-insights/insights-operator-585dfdc468-mwqfp" Apr 16 19:54:14.400965 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.400932 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e23a4d9-ff98-49a4-a888-9d26648f61cf-config\") pod \"service-ca-operator-d6fc45fc5-94k49\" (UID: \"3e23a4d9-ff98-49a4-a888-9d26648f61cf\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-94k49" Apr 16 19:54:14.400965 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.400959 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be232f65-8167-4e83-83a8-d40670fbf702-config\") pod \"console-operator-9d4b6777b-ln2rf\" (UID: \"be232f65-8167-4e83-83a8-d40670fbf702\") " pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf" Apr 16 19:54:14.401233 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.400996 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be232f65-8167-4e83-83a8-d40670fbf702-serving-cert\") pod \"console-operator-9d4b6777b-ln2rf\" (UID: \"be232f65-8167-4e83-83a8-d40670fbf702\") " pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf" Apr 16 19:54:14.401233 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.401025 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b33484f0-9ae5-4b31-8baa-d4219e39ddd9-tmp\") pod \"insights-operator-585dfdc468-mwqfp\" (UID: \"b33484f0-9ae5-4b31-8baa-d4219e39ddd9\") " pod="openshift-insights/insights-operator-585dfdc468-mwqfp" Apr 16 19:54:14.401233 
ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.401051 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-bound-sa-token\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:14.401233 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.401088 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/0f95dfef-42c3-454a-9807-3c895a970729-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-tpdvb\" (UID: \"0f95dfef-42c3-454a-9807-3c895a970729\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpdvb" Apr 16 19:54:14.401233 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.401123 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e23a4d9-ff98-49a4-a888-9d26648f61cf-serving-cert\") pod \"service-ca-operator-d6fc45fc5-94k49\" (UID: \"3e23a4d9-ff98-49a4-a888-9d26648f61cf\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-94k49" Apr 16 19:54:14.401233 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.401149 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6lw8\" (UniqueName: \"kubernetes.io/projected/3e23a4d9-ff98-49a4-a888-9d26648f61cf-kube-api-access-g6lw8\") pod \"service-ca-operator-d6fc45fc5-94k49\" (UID: \"3e23a4d9-ff98-49a4-a888-9d26648f61cf\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-94k49" Apr 16 19:54:14.401233 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.401175 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e54086bc-258c-4204-8105-7a5e491494fa-image-registry-private-configuration\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:14.401233 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.401202 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gh8tt\" (UniqueName: \"kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-kube-api-access-gh8tt\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:14.401233 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.401229 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps2fd\" (UniqueName: \"kubernetes.io/projected/b3bac591-c06b-4b40-b42a-f85548b297f0-kube-api-access-ps2fd\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq" Apr 16 19:54:14.401884 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.401258 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e54086bc-258c-4204-8105-7a5e491494fa-ca-trust-extracted\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " 
pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:14.401884 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.401284 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be232f65-8167-4e83-83a8-d40670fbf702-trusted-ca\") pod \"console-operator-9d4b6777b-ln2rf\" (UID: \"be232f65-8167-4e83-83a8-d40670fbf702\") " pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf" Apr 16 19:54:14.401884 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.401313 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3bac591-c06b-4b40-b42a-f85548b297f0-service-ca-bundle\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq" Apr 16 19:54:14.401884 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.401338 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnjvr\" (UniqueName: \"kubernetes.io/projected/0f95dfef-42c3-454a-9807-3c895a970729-kube-api-access-tnjvr\") pod \"cluster-monitoring-operator-75587bd455-tpdvb\" (UID: \"0f95dfef-42c3-454a-9807-3c895a970729\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpdvb" Apr 16 19:54:14.401884 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.401417 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mnpr\" (UniqueName: \"kubernetes.io/projected/18721546-1063-46a1-8715-a40872933b22-kube-api-access-9mnpr\") pod \"cluster-samples-operator-6dc5bdb6b4-bjppc\" (UID: \"18721546-1063-46a1-8715-a40872933b22\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bjppc" Apr 16 19:54:14.401884 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.401469 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e54086bc-258c-4204-8105-7a5e491494fa-trusted-ca\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:14.401884 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.401501 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dh47\" (UniqueName: \"kubernetes.io/projected/be232f65-8167-4e83-83a8-d40670fbf702-kube-api-access-9dh47\") pod \"console-operator-9d4b6777b-ln2rf\" (UID: \"be232f65-8167-4e83-83a8-d40670fbf702\") " pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf" Apr 16 19:54:14.401884 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.401534 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e54086bc-258c-4204-8105-7a5e491494fa-installation-pull-secrets\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:14.401884 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.401565 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") 
" pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:14.401884 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.401593 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-default-certificate\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq" Apr 16 19:54:14.401884 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.401651 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0f95dfef-42c3-454a-9807-3c895a970729-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tpdvb\" (UID: \"0f95dfef-42c3-454a-9807-3c895a970729\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpdvb" Apr 16 19:54:14.401884 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.401688 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b33484f0-9ae5-4b31-8baa-d4219e39ddd9-service-ca-bundle\") pod \"insights-operator-585dfdc468-mwqfp\" (UID: \"b33484f0-9ae5-4b31-8baa-d4219e39ddd9\") " pod="openshift-insights/insights-operator-585dfdc468-mwqfp" Apr 16 19:54:14.401884 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.401722 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-stats-auth\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq" Apr 16 19:54:14.401884 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.401767 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-metrics-certs\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq" Apr 16 19:54:14.401884 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.401817 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b33484f0-9ae5-4b31-8baa-d4219e39ddd9-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-mwqfp\" (UID: \"b33484f0-9ae5-4b31-8baa-d4219e39ddd9\") " pod="openshift-insights/insights-operator-585dfdc468-mwqfp" Apr 16 19:54:14.402464 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.401843 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e54086bc-258c-4204-8105-7a5e491494fa-registry-certificates\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:14.402464 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.402085 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be232f65-8167-4e83-83a8-d40670fbf702-config\") pod \"console-operator-9d4b6777b-ln2rf\" (UID: \"be232f65-8167-4e83-83a8-d40670fbf702\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf" Apr 16 19:54:14.403138 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.402674 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be232f65-8167-4e83-83a8-d40670fbf702-trusted-ca\") pod \"console-operator-9d4b6777b-ln2rf\" (UID: \"be232f65-8167-4e83-83a8-d40670fbf702\") " pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf" Apr 16 19:54:14.403138 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.402705 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b33484f0-9ae5-4b31-8baa-d4219e39ddd9-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-mwqfp\" (UID: \"b33484f0-9ae5-4b31-8baa-d4219e39ddd9\") " pod="openshift-insights/insights-operator-585dfdc468-mwqfp" Apr 16 19:54:14.403138 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:14.402803 2561 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 19:54:14.403138 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:14.402892 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18721546-1063-46a1-8715-a40872933b22-samples-operator-tls podName:18721546-1063-46a1-8715-a40872933b22 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:14.902872194 +0000 UTC m=+32.625918082 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/18721546-1063-46a1-8715-a40872933b22-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bjppc" (UID: "18721546-1063-46a1-8715-a40872933b22") : secret "samples-operator-tls" not found Apr 16 19:54:14.403138 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.402983 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b33484f0-9ae5-4b31-8baa-d4219e39ddd9-tmp\") pod \"insights-operator-585dfdc468-mwqfp\" (UID: \"b33484f0-9ae5-4b31-8baa-d4219e39ddd9\") " pod="openshift-insights/insights-operator-585dfdc468-mwqfp" Apr 16 19:54:14.404152 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:14.403273 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:14.404152 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:14.403287 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c684876c-h8496: secret "image-registry-tls" not found Apr 16 19:54:14.404152 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:14.403331 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls podName:e54086bc-258c-4204-8105-7a5e491494fa nodeName:}" failed. No retries permitted until 2026-04-16 19:54:14.903317041 +0000 UTC m=+32.626362928 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls") pod "image-registry-c684876c-h8496" (UID: "e54086bc-258c-4204-8105-7a5e491494fa") : secret "image-registry-tls" not found Apr 16 19:54:14.404152 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.403453 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e54086bc-258c-4204-8105-7a5e491494fa-ca-trust-extracted\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:14.404152 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.403478 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b33484f0-9ae5-4b31-8baa-d4219e39ddd9-service-ca-bundle\") pod \"insights-operator-585dfdc468-mwqfp\" (UID: \"b33484f0-9ae5-4b31-8baa-d4219e39ddd9\") " pod="openshift-insights/insights-operator-585dfdc468-mwqfp" Apr 16 19:54:14.404152 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.403813 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b33484f0-9ae5-4b31-8baa-d4219e39ddd9-snapshots\") pod \"insights-operator-585dfdc468-mwqfp\" (UID: \"b33484f0-9ae5-4b31-8baa-d4219e39ddd9\") " pod="openshift-insights/insights-operator-585dfdc468-mwqfp" Apr 16 19:54:14.404464 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.404170 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 19:54:14.404464 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.404225 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 19:54:14.404464 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.404240 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e54086bc-258c-4204-8105-7a5e491494fa-trusted-ca\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:14.404613 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.404554 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mmrlb\"" Apr 16 19:54:14.404946 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.404776 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 19:54:14.406534 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.406358 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b33484f0-9ae5-4b31-8baa-d4219e39ddd9-serving-cert\") pod \"insights-operator-585dfdc468-mwqfp\" (UID: \"b33484f0-9ae5-4b31-8baa-d4219e39ddd9\") " pod="openshift-insights/insights-operator-585dfdc468-mwqfp" Apr 16 19:54:14.406642 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.406397 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e54086bc-258c-4204-8105-7a5e491494fa-installation-pull-secrets\") pod \"image-registry-c684876c-h8496\" (UID: 
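
First real failures above, for samples-operator-tls and image-registry-tls, and the pattern is identical in both: secret.go/projected.go cannot find a secret the pod spec references, MountVolume.SetUp fails, and nestedpendingoperations.go schedules a retry (durationBeforeRetry 500ms; the delay grows on repeated failures). At this point in bootstrap that is usually transient, since these are serving certificates an operator materializes moments later. If FailedMount persists, confirm whether the secrets ever appeared (a sketch):

    oc -n openshift-cluster-samples-operator get secret samples-operator-tls
    oc -n openshift-image-registry get secret image-registry-tls
    # FailedMount events clear once a retry succeeds
    oc -n openshift-image-registry get events --field-selector reason=FailedMount
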
\"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:14.406642 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.406404 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e54086bc-258c-4204-8105-7a5e491494fa-image-registry-private-configuration\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:14.413267 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.410303 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-bound-sa-token\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:14.416092 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.415475 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh8tt\" (UniqueName: \"kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-kube-api-access-gh8tt\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:14.417477 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.417455 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jgb2\" (UniqueName: \"kubernetes.io/projected/b33484f0-9ae5-4b31-8baa-d4219e39ddd9-kube-api-access-9jgb2\") pod \"insights-operator-585dfdc468-mwqfp\" (UID: \"b33484f0-9ae5-4b31-8baa-d4219e39ddd9\") " pod="openshift-insights/insights-operator-585dfdc468-mwqfp" Apr 16 19:54:14.418391 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.418367 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mnpr\" (UniqueName: \"kubernetes.io/projected/18721546-1063-46a1-8715-a40872933b22-kube-api-access-9mnpr\") pod \"cluster-samples-operator-6dc5bdb6b4-bjppc\" (UID: \"18721546-1063-46a1-8715-a40872933b22\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bjppc" Apr 16 19:54:14.419385 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.419366 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be232f65-8167-4e83-83a8-d40670fbf702-serving-cert\") pod \"console-operator-9d4b6777b-ln2rf\" (UID: \"be232f65-8167-4e83-83a8-d40670fbf702\") " pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf" Apr 16 19:54:14.426174 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.426155 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cdc8w"] Apr 16 19:54:14.426306 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.426265 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-cdc8w" Apr 16 19:54:14.426553 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.426511 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dh47\" (UniqueName: \"kubernetes.io/projected/be232f65-8167-4e83-83a8-d40670fbf702-kube-api-access-9dh47\") pod \"console-operator-9d4b6777b-ln2rf\" (UID: \"be232f65-8167-4e83-83a8-d40670fbf702\") " pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf" Apr 16 19:54:14.430478 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.430456 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 19:54:14.430633 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.430587 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 19:54:14.430633 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.430606 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 19:54:14.430818 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.430672 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 19:54:14.430941 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.430923 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dknqq\"" Apr 16 19:54:14.485358 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.485329 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf" Apr 16 19:54:14.502695 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.502651 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e7e6010-fad1-4881-8816-b024c8853151-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-spvh5\" (UID: \"3e7e6010-fad1-4881-8816-b024c8853151\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-spvh5" Apr 16 19:54:14.502860 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.502741 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5brb2\" (UniqueName: \"kubernetes.io/projected/165d7242-2ef1-481d-992a-09e3364e0626-kube-api-access-5brb2\") pod \"ingress-canary-k2dcx\" (UID: \"165d7242-2ef1-481d-992a-09e3364e0626\") " pod="openshift-ingress-canary/ingress-canary-k2dcx" Apr 16 19:54:14.502860 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.502780 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-default-certificate\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq" Apr 16 19:54:14.502860 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.502832 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqjw4\" (UniqueName: \"kubernetes.io/projected/da823131-b1c7-41ae-a0e6-3fa763f3d110-kube-api-access-wqjw4\") pod \"network-check-source-8894fc9bd-sqd4v\" (UID: \"da823131-b1c7-41ae-a0e6-3fa763f3d110\") " 
pod="openshift-network-diagnostics/network-check-source-8894fc9bd-sqd4v" Apr 16 19:54:14.503035 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.502861 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0f95dfef-42c3-454a-9807-3c895a970729-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tpdvb\" (UID: \"0f95dfef-42c3-454a-9807-3c895a970729\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpdvb" Apr 16 19:54:14.503035 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.502891 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-stats-auth\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq" Apr 16 19:54:14.503035 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.502941 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-metrics-certs\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq" Apr 16 19:54:14.503035 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:14.502994 2561 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:14.503232 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:14.503060 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f95dfef-42c3-454a-9807-3c895a970729-cluster-monitoring-operator-tls podName:0f95dfef-42c3-454a-9807-3c895a970729 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:15.00304606 +0000 UTC m=+32.726091948 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/0f95dfef-42c3-454a-9807-3c895a970729-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tpdvb" (UID: "0f95dfef-42c3-454a-9807-3c895a970729") : secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:14.503232 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:14.503062 2561 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 19:54:14.503232 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.502990 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m27m2\" (UniqueName: \"kubernetes.io/projected/3e7e6010-fad1-4881-8816-b024c8853151-kube-api-access-m27m2\") pod \"kube-storage-version-migrator-operator-6769c5d45-spvh5\" (UID: \"3e7e6010-fad1-4881-8816-b024c8853151\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-spvh5" Apr 16 19:54:14.503232 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:14.503120 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-metrics-certs podName:b3bac591-c06b-4b40-b42a-f85548b297f0 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:15.00310443 +0000 UTC m=+32.726150320 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-metrics-certs") pod "router-default-98598b8fb-wkrmq" (UID: "b3bac591-c06b-4b40-b42a-f85548b297f0") : secret "router-metrics-certs-default" not found Apr 16 19:54:14.503232 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.503138 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhgst\" (UniqueName: \"kubernetes.io/projected/fe409e83-4bf3-40c8-b46d-61a088fdae77-kube-api-access-xhgst\") pod \"volume-data-source-validator-7c6cbb6c87-gjqjl\" (UID: \"fe409e83-4bf3-40c8-b46d-61a088fdae77\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gjqjl" Apr 16 19:54:14.503232 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.503168 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e7e6010-fad1-4881-8816-b024c8853151-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-spvh5\" (UID: \"3e7e6010-fad1-4881-8816-b024c8853151\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-spvh5" Apr 16 19:54:14.503534 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.503293 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8814ad85-70d4-48f0-8e96-6cc0a48c07eb-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-xbnwr\" (UID: \"8814ad85-70d4-48f0-8e96-6cc0a48c07eb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xbnwr" Apr 16 19:54:14.503534 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.503378 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e23a4d9-ff98-49a4-a888-9d26648f61cf-config\") pod \"service-ca-operator-d6fc45fc5-94k49\" (UID: \"3e23a4d9-ff98-49a4-a888-9d26648f61cf\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-94k49" Apr 16 19:54:14.503534 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.503408 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8814ad85-70d4-48f0-8e96-6cc0a48c07eb-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xbnwr\" (UID: \"8814ad85-70d4-48f0-8e96-6cc0a48c07eb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xbnwr" Apr 16 19:54:14.503534 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.503452 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/0f95dfef-42c3-454a-9807-3c895a970729-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-tpdvb\" (UID: \"0f95dfef-42c3-454a-9807-3c895a970729\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpdvb" Apr 16 19:54:14.503534 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.503488 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e23a4d9-ff98-49a4-a888-9d26648f61cf-serving-cert\") pod \"service-ca-operator-d6fc45fc5-94k49\" (UID: \"3e23a4d9-ff98-49a4-a888-9d26648f61cf\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-94k49" Apr 16 
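
Two more secrets in the same state: openshift-monitoring/cluster-monitoring-operator-tls and openshift-ingress/router-metrics-certs-default. Both are serving-cert secrets that OpenShift's service-ca controller generates from a service's service.beta.openshift.io/serving-cert-secret-name annotation, so they cannot exist before service-ca has re-processed those services after the reboot. If they stay missing, inspect the annotation and the operator (a sketch; router-internal-default is typically the service that owns router-metrics-certs-default, but verify in your cluster):

    oc -n openshift-ingress get svc router-internal-default -o yaml | grep serving-cert-secret-name
    oc get clusteroperator service-ca
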
Apr 16 19:54:14.503534 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.503515 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6lw8\" (UniqueName: \"kubernetes.io/projected/3e23a4d9-ff98-49a4-a888-9d26648f61cf-kube-api-access-g6lw8\") pod \"service-ca-operator-d6fc45fc5-94k49\" (UID: \"3e23a4d9-ff98-49a4-a888-9d26648f61cf\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-94k49"
Apr 16 19:54:14.503830 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.503543 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ps2fd\" (UniqueName: \"kubernetes.io/projected/b3bac591-c06b-4b40-b42a-f85548b297f0-kube-api-access-ps2fd\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq"
Apr 16 19:54:14.503830 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.503569 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/165d7242-2ef1-481d-992a-09e3364e0626-cert\") pod \"ingress-canary-k2dcx\" (UID: \"165d7242-2ef1-481d-992a-09e3364e0626\") " pod="openshift-ingress-canary/ingress-canary-k2dcx"
Apr 16 19:54:14.503830 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.503746 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3bac591-c06b-4b40-b42a-f85548b297f0-service-ca-bundle\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq"
Apr 16 19:54:14.503830 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.503778 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnjvr\" (UniqueName: \"kubernetes.io/projected/0f95dfef-42c3-454a-9807-3c895a970729-kube-api-access-tnjvr\") pod \"cluster-monitoring-operator-75587bd455-tpdvb\" (UID: \"0f95dfef-42c3-454a-9807-3c895a970729\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpdvb"
Apr 16 19:54:14.504020 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:14.503918 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b3bac591-c06b-4b40-b42a-f85548b297f0-service-ca-bundle podName:b3bac591-c06b-4b40-b42a-f85548b297f0 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:15.003903764 +0000 UTC m=+32.726949655 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b3bac591-c06b-4b40-b42a-f85548b297f0-service-ca-bundle") pod "router-default-98598b8fb-wkrmq" (UID: "b3bac591-c06b-4b40-b42a-f85548b297f0") : configmap references non-existent config key: service-ca.crt
Apr 16 19:54:14.504020 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.503975 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e23a4d9-ff98-49a4-a888-9d26648f61cf-config\") pod \"service-ca-operator-d6fc45fc5-94k49\" (UID: \"3e23a4d9-ff98-49a4-a888-9d26648f61cf\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-94k49"
Apr 16 19:54:14.504303 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.504259 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/0f95dfef-42c3-454a-9807-3c895a970729-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-tpdvb\" (UID: \"0f95dfef-42c3-454a-9807-3c895a970729\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpdvb"
Apr 16 19:54:14.505905 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.505865 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-stats-auth\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq"
Apr 16 19:54:14.506126 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.506107 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-default-certificate\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq"
Apr 16 19:54:14.506209 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.506142 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e23a4d9-ff98-49a4-a888-9d26648f61cf-serving-cert\") pod \"service-ca-operator-d6fc45fc5-94k49\" (UID: \"3e23a4d9-ff98-49a4-a888-9d26648f61cf\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-94k49"
Apr 16 19:54:14.507428 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.507409 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-mwqfp"
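
The service-ca-bundle failure is a different mode worth distinguishing: the configmap exists, but it "references non-existent config key: service-ca.crt". The router pod mounts that specific key, and the service-ca controller only injects it after observing the configmap's service.beta.openshift.io/inject-cabundle annotation, so the mount fails until injection happens. Normally this also clears within a few retries; if not, look at the configmap's data directly (a sketch):

    # the data section should contain a service-ca.crt key
    oc -n openshift-ingress get configmap service-ca-bundle -o yaml
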
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-mwqfp" Apr 16 19:54:14.514838 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.514813 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6lw8\" (UniqueName: \"kubernetes.io/projected/3e23a4d9-ff98-49a4-a888-9d26648f61cf-kube-api-access-g6lw8\") pod \"service-ca-operator-d6fc45fc5-94k49\" (UID: \"3e23a4d9-ff98-49a4-a888-9d26648f61cf\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-94k49" Apr 16 19:54:14.515754 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.515712 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps2fd\" (UniqueName: \"kubernetes.io/projected/b3bac591-c06b-4b40-b42a-f85548b297f0-kube-api-access-ps2fd\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq" Apr 16 19:54:14.516111 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.516067 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhgst\" (UniqueName: \"kubernetes.io/projected/fe409e83-4bf3-40c8-b46d-61a088fdae77-kube-api-access-xhgst\") pod \"volume-data-source-validator-7c6cbb6c87-gjqjl\" (UID: \"fe409e83-4bf3-40c8-b46d-61a088fdae77\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gjqjl" Apr 16 19:54:14.516733 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.516705 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnjvr\" (UniqueName: \"kubernetes.io/projected/0f95dfef-42c3-454a-9807-3c895a970729-kube-api-access-tnjvr\") pod \"cluster-monitoring-operator-75587bd455-tpdvb\" (UID: \"0f95dfef-42c3-454a-9807-3c895a970729\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpdvb" Apr 16 19:54:14.582410 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.582376 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gjqjl" Apr 16 19:54:14.604457 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.604423 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e50b93c-a156-438b-a41f-2b4bac946727-tmp-dir\") pod \"dns-default-cdc8w\" (UID: \"8e50b93c-a156-438b-a41f-2b4bac946727\") " pod="openshift-dns/dns-default-cdc8w" Apr 16 19:54:14.604611 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.604474 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/165d7242-2ef1-481d-992a-09e3364e0626-cert\") pod \"ingress-canary-k2dcx\" (UID: \"165d7242-2ef1-481d-992a-09e3364e0626\") " pod="openshift-ingress-canary/ingress-canary-k2dcx" Apr 16 19:54:14.604611 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.604515 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e7e6010-fad1-4881-8816-b024c8853151-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-spvh5\" (UID: \"3e7e6010-fad1-4881-8816-b024c8853151\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-spvh5" Apr 16 19:54:14.604611 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.604578 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5brb2\" (UniqueName: \"kubernetes.io/projected/165d7242-2ef1-481d-992a-09e3364e0626-kube-api-access-5brb2\") pod \"ingress-canary-k2dcx\" (UID: \"165d7242-2ef1-481d-992a-09e3364e0626\") " pod="openshift-ingress-canary/ingress-canary-k2dcx" Apr 16 19:54:14.604611 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.604607 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e50b93c-a156-438b-a41f-2b4bac946727-metrics-tls\") pod \"dns-default-cdc8w\" (UID: \"8e50b93c-a156-438b-a41f-2b4bac946727\") " pod="openshift-dns/dns-default-cdc8w" Apr 16 19:54:14.604836 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.604650 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqjw4\" (UniqueName: \"kubernetes.io/projected/da823131-b1c7-41ae-a0e6-3fa763f3d110-kube-api-access-wqjw4\") pod \"network-check-source-8894fc9bd-sqd4v\" (UID: \"da823131-b1c7-41ae-a0e6-3fa763f3d110\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-sqd4v" Apr 16 19:54:14.604836 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:14.604708 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:14.604836 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:14.604804 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/165d7242-2ef1-481d-992a-09e3364e0626-cert podName:165d7242-2ef1-481d-992a-09e3364e0626 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:15.104767398 +0000 UTC m=+32.827813284 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/165d7242-2ef1-481d-992a-09e3364e0626-cert") pod "ingress-canary-k2dcx" (UID: "165d7242-2ef1-481d-992a-09e3364e0626") : secret "canary-serving-cert" not found Apr 16 19:54:14.604963 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.604708 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m27m2\" (UniqueName: \"kubernetes.io/projected/3e7e6010-fad1-4881-8816-b024c8853151-kube-api-access-m27m2\") pod \"kube-storage-version-migrator-operator-6769c5d45-spvh5\" (UID: \"3e7e6010-fad1-4881-8816-b024c8853151\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-spvh5" Apr 16 19:54:14.604963 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.604946 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e7e6010-fad1-4881-8816-b024c8853151-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-spvh5\" (UID: \"3e7e6010-fad1-4881-8816-b024c8853151\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-spvh5" Apr 16 19:54:14.605067 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.604981 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkh8m\" (UniqueName: \"kubernetes.io/projected/8e50b93c-a156-438b-a41f-2b4bac946727-kube-api-access-lkh8m\") pod \"dns-default-cdc8w\" (UID: \"8e50b93c-a156-438b-a41f-2b4bac946727\") " pod="openshift-dns/dns-default-cdc8w" Apr 16 19:54:14.605067 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.605015 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8814ad85-70d4-48f0-8e96-6cc0a48c07eb-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-xbnwr\" (UID: \"8814ad85-70d4-48f0-8e96-6cc0a48c07eb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xbnwr" Apr 16 19:54:14.605067 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.605045 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e50b93c-a156-438b-a41f-2b4bac946727-config-volume\") pod \"dns-default-cdc8w\" (UID: \"8e50b93c-a156-438b-a41f-2b4bac946727\") " pod="openshift-dns/dns-default-cdc8w" Apr 16 19:54:14.605209 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.605143 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e7e6010-fad1-4881-8816-b024c8853151-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-spvh5\" (UID: \"3e7e6010-fad1-4881-8816-b024c8853151\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-spvh5" Apr 16 19:54:14.605263 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.605207 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8814ad85-70d4-48f0-8e96-6cc0a48c07eb-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xbnwr\" (UID: \"8814ad85-70d4-48f0-8e96-6cc0a48c07eb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xbnwr" Apr 16 19:54:14.605352 ip-10-0-128-201 kubenswrapper[2561]: E0416 
19:54:14.605317 2561 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:54:14.605407 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:14.605384 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8814ad85-70d4-48f0-8e96-6cc0a48c07eb-networking-console-plugin-cert podName:8814ad85-70d4-48f0-8e96-6cc0a48c07eb nodeName:}" failed. No retries permitted until 2026-04-16 19:54:15.105372399 +0000 UTC m=+32.828418284 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8814ad85-70d4-48f0-8e96-6cc0a48c07eb-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-xbnwr" (UID: "8814ad85-70d4-48f0-8e96-6cc0a48c07eb") : secret "networking-console-plugin-cert" not found Apr 16 19:54:14.605685 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.605667 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8814ad85-70d4-48f0-8e96-6cc0a48c07eb-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-xbnwr\" (UID: \"8814ad85-70d4-48f0-8e96-6cc0a48c07eb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xbnwr" Apr 16 19:54:14.607308 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.607288 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e7e6010-fad1-4881-8816-b024c8853151-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-spvh5\" (UID: \"3e7e6010-fad1-4881-8816-b024c8853151\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-spvh5" Apr 16 19:54:14.616190 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.616159 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m27m2\" (UniqueName: \"kubernetes.io/projected/3e7e6010-fad1-4881-8816-b024c8853151-kube-api-access-m27m2\") pod \"kube-storage-version-migrator-operator-6769c5d45-spvh5\" (UID: \"3e7e6010-fad1-4881-8816-b024c8853151\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-spvh5" Apr 16 19:54:14.616347 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.616328 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqjw4\" (UniqueName: \"kubernetes.io/projected/da823131-b1c7-41ae-a0e6-3fa763f3d110-kube-api-access-wqjw4\") pod \"network-check-source-8894fc9bd-sqd4v\" (UID: \"da823131-b1c7-41ae-a0e6-3fa763f3d110\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-sqd4v" Apr 16 19:54:14.616568 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.616549 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5brb2\" (UniqueName: \"kubernetes.io/projected/165d7242-2ef1-481d-992a-09e3364e0626-kube-api-access-5brb2\") pod \"ingress-canary-k2dcx\" (UID: \"165d7242-2ef1-481d-992a-09e3364e0626\") " pod="openshift-ingress-canary/ingress-canary-k2dcx" Apr 16 19:54:14.624409 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.624392 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-94k49" Apr 16 19:54:14.653046 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.652987 2561 util.go:30] "No sandbox for pod can be found. 
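
canary-serving-cert and networking-console-plugin-cert repeat the missing-serving-cert pattern. The retry bookkeeping in each nestedpendingoperations.go line is exact: "No retries permitted until" gives the wall-clock deadline, and the m=+32.8... suffix is Go's monotonic clock reading, i.e. seconds since the kubelet process started. To see how long a given volume stayed in backoff, grep the journal for its failures (a sketch):

    journalctl -u kubelet.service --no-pager | grep 'MountVolume.SetUp failed' | grep canary-serving-cert
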
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-sqd4v" Apr 16 19:54:14.668680 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.668650 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-spvh5" Apr 16 19:54:14.690121 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.690097 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-hnzkl"] Apr 16 19:54:14.706115 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.706090 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e50b93c-a156-438b-a41f-2b4bac946727-metrics-tls\") pod \"dns-default-cdc8w\" (UID: \"8e50b93c-a156-438b-a41f-2b4bac946727\") " pod="openshift-dns/dns-default-cdc8w" Apr 16 19:54:14.706229 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.706175 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lkh8m\" (UniqueName: \"kubernetes.io/projected/8e50b93c-a156-438b-a41f-2b4bac946727-kube-api-access-lkh8m\") pod \"dns-default-cdc8w\" (UID: \"8e50b93c-a156-438b-a41f-2b4bac946727\") " pod="openshift-dns/dns-default-cdc8w" Apr 16 19:54:14.706229 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.706204 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e50b93c-a156-438b-a41f-2b4bac946727-config-volume\") pod \"dns-default-cdc8w\" (UID: \"8e50b93c-a156-438b-a41f-2b4bac946727\") " pod="openshift-dns/dns-default-cdc8w" Apr 16 19:54:14.706333 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:14.706244 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:14.706333 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:14.706311 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e50b93c-a156-438b-a41f-2b4bac946727-metrics-tls podName:8e50b93c-a156-438b-a41f-2b4bac946727 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:15.206288554 +0000 UTC m=+32.929334461 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8e50b93c-a156-438b-a41f-2b4bac946727-metrics-tls") pod "dns-default-cdc8w" (UID: "8e50b93c-a156-438b-a41f-2b4bac946727") : secret "dns-default-metrics-tls" not found Apr 16 19:54:14.706483 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.706424 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e50b93c-a156-438b-a41f-2b4bac946727-tmp-dir\") pod \"dns-default-cdc8w\" (UID: \"8e50b93c-a156-438b-a41f-2b4bac946727\") " pod="openshift-dns/dns-default-cdc8w" Apr 16 19:54:14.706761 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.706738 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e50b93c-a156-438b-a41f-2b4bac946727-tmp-dir\") pod \"dns-default-cdc8w\" (UID: \"8e50b93c-a156-438b-a41f-2b4bac946727\") " pod="openshift-dns/dns-default-cdc8w" Apr 16 19:54:14.715758 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.715739 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkh8m\" (UniqueName: \"kubernetes.io/projected/8e50b93c-a156-438b-a41f-2b4bac946727-kube-api-access-lkh8m\") pod \"dns-default-cdc8w\" (UID: \"8e50b93c-a156-438b-a41f-2b4bac946727\") " pod="openshift-dns/dns-default-cdc8w" Apr 16 19:54:14.716169 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.716153 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e50b93c-a156-438b-a41f-2b4bac946727-config-volume\") pod \"dns-default-cdc8w\" (UID: \"8e50b93c-a156-438b-a41f-2b4bac946727\") " pod="openshift-dns/dns-default-cdc8w" Apr 16 19:54:14.723306 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.723288 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hnzkl" Apr 16 19:54:14.726008 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.725991 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wgp9m\"" Apr 16 19:54:14.807207 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.807174 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4219a57d-3ca0-4b3e-9d49-d7d1178e2c5e-tmp-dir\") pod \"node-resolver-hnzkl\" (UID: \"4219a57d-3ca0-4b3e-9d49-d7d1178e2c5e\") " pod="openshift-dns/node-resolver-hnzkl" Apr 16 19:54:14.807375 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.807296 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4219a57d-3ca0-4b3e-9d49-d7d1178e2c5e-hosts-file\") pod \"node-resolver-hnzkl\" (UID: \"4219a57d-3ca0-4b3e-9d49-d7d1178e2c5e\") " pod="openshift-dns/node-resolver-hnzkl" Apr 16 19:54:14.807375 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.807360 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfrsq\" (UniqueName: \"kubernetes.io/projected/4219a57d-3ca0-4b3e-9d49-d7d1178e2c5e-kube-api-access-wfrsq\") pod \"node-resolver-hnzkl\" (UID: \"4219a57d-3ca0-4b3e-9d49-d7d1178e2c5e\") " pod="openshift-dns/node-resolver-hnzkl" Apr 16 19:54:14.835108 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.835080 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:54:14.835244 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.835080 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wfp62" Apr 16 19:54:14.835302 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.835080 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:54:14.838725 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.838581 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 19:54:14.838725 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.838599 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-t9ktw\"" Apr 16 19:54:14.838725 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.838645 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 19:54:14.838725 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.838677 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kmsjn\"" Apr 16 19:54:14.907872 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.907807 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18721546-1063-46a1-8715-a40872933b22-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bjppc\" (UID: \"18721546-1063-46a1-8715-a40872933b22\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bjppc" Apr 16 19:54:14.907982 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.907876 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4219a57d-3ca0-4b3e-9d49-d7d1178e2c5e-hosts-file\") pod \"node-resolver-hnzkl\" (UID: \"4219a57d-3ca0-4b3e-9d49-d7d1178e2c5e\") " pod="openshift-dns/node-resolver-hnzkl" Apr 16 19:54:14.907982 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.907935 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wfrsq\" (UniqueName: \"kubernetes.io/projected/4219a57d-3ca0-4b3e-9d49-d7d1178e2c5e-kube-api-access-wfrsq\") pod \"node-resolver-hnzkl\" (UID: \"4219a57d-3ca0-4b3e-9d49-d7d1178e2c5e\") " pod="openshift-dns/node-resolver-hnzkl" Apr 16 19:54:14.907982 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:14.907959 2561 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 19:54:14.908145 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.908017 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:14.908145 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.908054 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4219a57d-3ca0-4b3e-9d49-d7d1178e2c5e-tmp-dir\") pod \"node-resolver-hnzkl\" 
(UID: \"4219a57d-3ca0-4b3e-9d49-d7d1178e2c5e\") " pod="openshift-dns/node-resolver-hnzkl" Apr 16 19:54:14.908145 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:14.908099 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18721546-1063-46a1-8715-a40872933b22-samples-operator-tls podName:18721546-1063-46a1-8715-a40872933b22 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:15.908077141 +0000 UTC m=+33.631123029 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/18721546-1063-46a1-8715-a40872933b22-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bjppc" (UID: "18721546-1063-46a1-8715-a40872933b22") : secret "samples-operator-tls" not found Apr 16 19:54:14.908145 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.908016 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4219a57d-3ca0-4b3e-9d49-d7d1178e2c5e-hosts-file\") pod \"node-resolver-hnzkl\" (UID: \"4219a57d-3ca0-4b3e-9d49-d7d1178e2c5e\") " pod="openshift-dns/node-resolver-hnzkl" Apr 16 19:54:14.908145 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:14.908112 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:14.908145 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:14.908126 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c684876c-h8496: secret "image-registry-tls" not found Apr 16 19:54:14.908430 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:14.908168 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls podName:e54086bc-258c-4204-8105-7a5e491494fa nodeName:}" failed. No retries permitted until 2026-04-16 19:54:15.908156586 +0000 UTC m=+33.631202492 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls") pod "image-registry-c684876c-h8496" (UID: "e54086bc-258c-4204-8105-7a5e491494fa") : secret "image-registry-tls" not found Apr 16 19:54:14.908430 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.908335 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4219a57d-3ca0-4b3e-9d49-d7d1178e2c5e-tmp-dir\") pod \"node-resolver-hnzkl\" (UID: \"4219a57d-3ca0-4b3e-9d49-d7d1178e2c5e\") " pod="openshift-dns/node-resolver-hnzkl" Apr 16 19:54:14.917883 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:14.917850 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfrsq\" (UniqueName: \"kubernetes.io/projected/4219a57d-3ca0-4b3e-9d49-d7d1178e2c5e-kube-api-access-wfrsq\") pod \"node-resolver-hnzkl\" (UID: \"4219a57d-3ca0-4b3e-9d49-d7d1178e2c5e\") " pod="openshift-dns/node-resolver-hnzkl" Apr 16 19:54:15.009519 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:15.009490 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3bac591-c06b-4b40-b42a-f85548b297f0-service-ca-bundle\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq" Apr 16 19:54:15.009688 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:15.009662 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b3bac591-c06b-4b40-b42a-f85548b297f0-service-ca-bundle podName:b3bac591-c06b-4b40-b42a-f85548b297f0 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:16.009642255 +0000 UTC m=+33.732688155 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b3bac591-c06b-4b40-b42a-f85548b297f0-service-ca-bundle") pod "router-default-98598b8fb-wkrmq" (UID: "b3bac591-c06b-4b40-b42a-f85548b297f0") : configmap references non-existent config key: service-ca.crt Apr 16 19:54:15.009767 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:15.009730 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0f95dfef-42c3-454a-9807-3c895a970729-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tpdvb\" (UID: \"0f95dfef-42c3-454a-9807-3c895a970729\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpdvb" Apr 16 19:54:15.009856 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:15.009782 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-metrics-certs\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq" Apr 16 19:54:15.009908 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:15.009879 2561 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:15.009948 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:15.009931 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f95dfef-42c3-454a-9807-3c895a970729-cluster-monitoring-operator-tls podName:0f95dfef-42c3-454a-9807-3c895a970729 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:16.009916412 +0000 UTC m=+33.732962301 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/0f95dfef-42c3-454a-9807-3c895a970729-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tpdvb" (UID: "0f95dfef-42c3-454a-9807-3c895a970729") : secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:15.009989 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:15.009959 2561 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 19:54:15.010028 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:15.010014 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-metrics-certs podName:b3bac591-c06b-4b40-b42a-f85548b297f0 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:16.009998692 +0000 UTC m=+33.733044595 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-metrics-certs") pod "router-default-98598b8fb-wkrmq" (UID: "b3bac591-c06b-4b40-b42a-f85548b297f0") : secret "router-metrics-certs-default" not found Apr 16 19:54:15.032998 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:15.032971 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hnzkl" Apr 16 19:54:15.110957 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:15.110916 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8814ad85-70d4-48f0-8e96-6cc0a48c07eb-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xbnwr\" (UID: \"8814ad85-70d4-48f0-8e96-6cc0a48c07eb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xbnwr" Apr 16 19:54:15.111471 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:15.110984 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/165d7242-2ef1-481d-992a-09e3364e0626-cert\") pod \"ingress-canary-k2dcx\" (UID: \"165d7242-2ef1-481d-992a-09e3364e0626\") " pod="openshift-ingress-canary/ingress-canary-k2dcx" Apr 16 19:54:15.111471 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:15.111066 2561 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:54:15.111471 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:15.111125 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8814ad85-70d4-48f0-8e96-6cc0a48c07eb-networking-console-plugin-cert podName:8814ad85-70d4-48f0-8e96-6cc0a48c07eb nodeName:}" failed. No retries permitted until 2026-04-16 19:54:16.111111464 +0000 UTC m=+33.834157354 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8814ad85-70d4-48f0-8e96-6cc0a48c07eb-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-xbnwr" (UID: "8814ad85-70d4-48f0-8e96-6cc0a48c07eb") : secret "networking-console-plugin-cert" not found Apr 16 19:54:15.111471 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:15.111164 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:15.111471 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:15.111224 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/165d7242-2ef1-481d-992a-09e3364e0626-cert podName:165d7242-2ef1-481d-992a-09e3364e0626 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:16.111207742 +0000 UTC m=+33.834253645 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/165d7242-2ef1-481d-992a-09e3364e0626-cert") pod "ingress-canary-k2dcx" (UID: "165d7242-2ef1-481d-992a-09e3364e0626") : secret "canary-serving-cert" not found Apr 16 19:54:15.212109 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:15.212028 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e50b93c-a156-438b-a41f-2b4bac946727-metrics-tls\") pod \"dns-default-cdc8w\" (UID: \"8e50b93c-a156-438b-a41f-2b4bac946727\") " pod="openshift-dns/dns-default-cdc8w" Apr 16 19:54:15.212250 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:15.212167 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:15.212250 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:15.212228 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e50b93c-a156-438b-a41f-2b4bac946727-metrics-tls podName:8e50b93c-a156-438b-a41f-2b4bac946727 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:16.212213375 +0000 UTC m=+33.935259259 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8e50b93c-a156-438b-a41f-2b4bac946727-metrics-tls") pod "dns-default-cdc8w" (UID: "8e50b93c-a156-438b-a41f-2b4bac946727") : secret "dns-default-metrics-tls" not found Apr 16 19:54:15.312984 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:15.312950 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8593da00-be3a-459a-9bf6-ee2f4988af66-original-pull-secret\") pod \"global-pull-secret-syncer-wfp62\" (UID: \"8593da00-be3a-459a-9bf6-ee2f4988af66\") " pod="kube-system/global-pull-secret-syncer-wfp62" Apr 16 19:54:15.326068 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:15.326034 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8593da00-be3a-459a-9bf6-ee2f4988af66-original-pull-secret\") pod \"global-pull-secret-syncer-wfp62\" (UID: \"8593da00-be3a-459a-9bf6-ee2f4988af66\") " pod="kube-system/global-pull-secret-syncer-wfp62" Apr 16 19:54:15.455472 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:15.455437 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wfp62" Apr 16 19:54:15.918651 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:15.918607 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18721546-1063-46a1-8715-a40872933b22-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bjppc\" (UID: \"18721546-1063-46a1-8715-a40872933b22\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bjppc" Apr 16 19:54:15.918872 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:15.918779 2561 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 19:54:15.918872 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:15.918863 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18721546-1063-46a1-8715-a40872933b22-samples-operator-tls podName:18721546-1063-46a1-8715-a40872933b22 nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:17.918843312 +0000 UTC m=+35.641889202 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/18721546-1063-46a1-8715-a40872933b22-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bjppc" (UID: "18721546-1063-46a1-8715-a40872933b22") : secret "samples-operator-tls" not found Apr 16 19:54:15.918996 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:15.918880 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:15.918996 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:15.918907 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c684876c-h8496: secret "image-registry-tls" not found Apr 16 19:54:15.918996 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:15.918957 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls podName:e54086bc-258c-4204-8105-7a5e491494fa nodeName:}" failed. No retries permitted until 2026-04-16 19:54:17.918942248 +0000 UTC m=+35.641988147 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls") pod "image-registry-c684876c-h8496" (UID: "e54086bc-258c-4204-8105-7a5e491494fa") : secret "image-registry-tls" not found Apr 16 19:54:15.918996 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:15.918778 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:16.020310 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:16.020279 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3bac591-c06b-4b40-b42a-f85548b297f0-service-ca-bundle\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq" Apr 16 19:54:16.020481 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:16.020346 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0f95dfef-42c3-454a-9807-3c895a970729-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tpdvb\" (UID: \"0f95dfef-42c3-454a-9807-3c895a970729\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpdvb" Apr 16 19:54:16.020481 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:16.020423 2561 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:16.020481 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:16.020473 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b3bac591-c06b-4b40-b42a-f85548b297f0-service-ca-bundle podName:b3bac591-c06b-4b40-b42a-f85548b297f0 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:18.020452256 +0000 UTC m=+35.743498140 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b3bac591-c06b-4b40-b42a-f85548b297f0-service-ca-bundle") pod "router-default-98598b8fb-wkrmq" (UID: "b3bac591-c06b-4b40-b42a-f85548b297f0") : configmap references non-existent config key: service-ca.crt Apr 16 19:54:16.020651 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:16.020499 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f95dfef-42c3-454a-9807-3c895a970729-cluster-monitoring-operator-tls podName:0f95dfef-42c3-454a-9807-3c895a970729 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:18.020489493 +0000 UTC m=+35.743535377 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/0f95dfef-42c3-454a-9807-3c895a970729-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tpdvb" (UID: "0f95dfef-42c3-454a-9807-3c895a970729") : secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:16.020651 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:16.020523 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-metrics-certs\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq" Apr 16 19:54:16.020738 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:16.020675 2561 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 19:54:16.020738 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:16.020728 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-metrics-certs podName:b3bac591-c06b-4b40-b42a-f85548b297f0 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:18.020716062 +0000 UTC m=+35.743761946 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-metrics-certs") pod "router-default-98598b8fb-wkrmq" (UID: "b3bac591-c06b-4b40-b42a-f85548b297f0") : secret "router-metrics-certs-default" not found Apr 16 19:54:16.121831 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:16.121393 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8814ad85-70d4-48f0-8e96-6cc0a48c07eb-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xbnwr\" (UID: \"8814ad85-70d4-48f0-8e96-6cc0a48c07eb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xbnwr" Apr 16 19:54:16.121831 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:16.121470 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/165d7242-2ef1-481d-992a-09e3364e0626-cert\") pod \"ingress-canary-k2dcx\" (UID: \"165d7242-2ef1-481d-992a-09e3364e0626\") " pod="openshift-ingress-canary/ingress-canary-k2dcx" Apr 16 19:54:16.121831 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:16.121597 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:16.121831 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:16.121659 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/165d7242-2ef1-481d-992a-09e3364e0626-cert podName:165d7242-2ef1-481d-992a-09e3364e0626 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:18.121640473 +0000 UTC m=+35.844686362 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/165d7242-2ef1-481d-992a-09e3364e0626-cert") pod "ingress-canary-k2dcx" (UID: "165d7242-2ef1-481d-992a-09e3364e0626") : secret "canary-serving-cert" not found Apr 16 19:54:16.122436 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:16.122129 2561 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:54:16.122436 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:16.122185 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8814ad85-70d4-48f0-8e96-6cc0a48c07eb-networking-console-plugin-cert podName:8814ad85-70d4-48f0-8e96-6cc0a48c07eb nodeName:}" failed. No retries permitted until 2026-04-16 19:54:18.122168261 +0000 UTC m=+35.845214148 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8814ad85-70d4-48f0-8e96-6cc0a48c07eb-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-xbnwr" (UID: "8814ad85-70d4-48f0-8e96-6cc0a48c07eb") : secret "networking-console-plugin-cert" not found Apr 16 19:54:16.225831 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:16.225772 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e50b93c-a156-438b-a41f-2b4bac946727-metrics-tls\") pod \"dns-default-cdc8w\" (UID: \"8e50b93c-a156-438b-a41f-2b4bac946727\") " pod="openshift-dns/dns-default-cdc8w" Apr 16 19:54:16.225968 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:16.225947 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:16.226059 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:16.226027 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e50b93c-a156-438b-a41f-2b4bac946727-metrics-tls podName:8e50b93c-a156-438b-a41f-2b4bac946727 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:18.226005 +0000 UTC m=+35.949050887 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8e50b93c-a156-438b-a41f-2b4bac946727-metrics-tls") pod "dns-default-cdc8w" (UID: "8e50b93c-a156-438b-a41f-2b4bac946727") : secret "dns-default-metrics-tls" not found Apr 16 19:54:16.337461 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:16.337417 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-94k49"] Apr 16 19:54:16.341289 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:16.341241 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wfp62"] Apr 16 19:54:16.344402 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:16.344380 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gjqjl"] Apr 16 19:54:16.349637 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:16.349615 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-sqd4v"] Apr 16 19:54:16.359809 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:16.359612 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-spvh5"] Apr 16 19:54:16.365903 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:16.365881 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-mwqfp"] Apr 16 19:54:16.371604 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:16.371582 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ln2rf"] Apr 16 19:54:16.378555 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:54:16.378528 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e23a4d9_ff98_49a4_a888_9d26648f61cf.slice/crio-20fb22edaf704a49f4d28b183927ada5906811b38632ec22479c25bc2c0f8071 WatchSource:0}: Error finding container 20fb22edaf704a49f4d28b183927ada5906811b38632ec22479c25bc2c0f8071: Status 404 returned error can't find the container with id 
20fb22edaf704a49f4d28b183927ada5906811b38632ec22479c25bc2c0f8071 Apr 16 19:54:16.379105 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:54:16.379083 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8593da00_be3a_459a_9bf6_ee2f4988af66.slice/crio-024766918c1afdb23370ae12e60310086a2699861bc61071b095a670a02dc84c WatchSource:0}: Error finding container 024766918c1afdb23370ae12e60310086a2699861bc61071b095a670a02dc84c: Status 404 returned error can't find the container with id 024766918c1afdb23370ae12e60310086a2699861bc61071b095a670a02dc84c Apr 16 19:54:16.380149 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:54:16.380105 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe409e83_4bf3_40c8_b46d_61a088fdae77.slice/crio-1da5fa8ef7144a1197d6ad2f40c6705e6ce2f9feb14a997c793f01b392973b6c WatchSource:0}: Error finding container 1da5fa8ef7144a1197d6ad2f40c6705e6ce2f9feb14a997c793f01b392973b6c: Status 404 returned error can't find the container with id 1da5fa8ef7144a1197d6ad2f40c6705e6ce2f9feb14a997c793f01b392973b6c Apr 16 19:54:16.381648 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:54:16.381613 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda823131_b1c7_41ae_a0e6_3fa763f3d110.slice/crio-ddca70f8fce6e71ba341ddcea0a891d9e168d176c9327d67b69a363bf5e28d5b WatchSource:0}: Error finding container ddca70f8fce6e71ba341ddcea0a891d9e168d176c9327d67b69a363bf5e28d5b: Status 404 returned error can't find the container with id ddca70f8fce6e71ba341ddcea0a891d9e168d176c9327d67b69a363bf5e28d5b Apr 16 19:54:16.382134 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:54:16.381910 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e7e6010_fad1_4881_8816_b024c8853151.slice/crio-a94ccf263d79e3af0d55fa0abe2b6a9e483fc3590366a24b83a6e8638b16415b WatchSource:0}: Error finding container a94ccf263d79e3af0d55fa0abe2b6a9e483fc3590366a24b83a6e8638b16415b: Status 404 returned error can't find the container with id a94ccf263d79e3af0d55fa0abe2b6a9e483fc3590366a24b83a6e8638b16415b Apr 16 19:54:16.383099 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:54:16.383020 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb33484f0_9ae5_4b31_8baa_d4219e39ddd9.slice/crio-8083eef7211faa85fb1c11351d5c71ffb31f05eb42544009af2a999ea1e3a831 WatchSource:0}: Error finding container 8083eef7211faa85fb1c11351d5c71ffb31f05eb42544009af2a999ea1e3a831: Status 404 returned error can't find the container with id 8083eef7211faa85fb1c11351d5c71ffb31f05eb42544009af2a999ea1e3a831 Apr 16 19:54:16.384228 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:54:16.384204 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe232f65_8167_4e83_83a8_d40670fbf702.slice/crio-7478768df6436d361f557b293a571a0b4b5395cacd08361853a31b0d90dc1e0a WatchSource:0}: Error finding container 7478768df6436d361f557b293a571a0b4b5395cacd08361853a31b0d90dc1e0a: Status 404 returned error can't find the container with id 7478768df6436d361f557b293a571a0b4b5395cacd08361853a31b0d90dc1e0a Apr 16 19:54:16.528681 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:16.528656 2561 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0b9420a-1c3e-47b5-b187-827cb7f39aea-metrics-certs\") pod \"network-metrics-daemon-nx45q\" (UID: \"e0b9420a-1c3e-47b5-b187-827cb7f39aea\") " pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:54:16.528877 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:16.528855 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 19:54:16.528951 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:16.528939 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0b9420a-1c3e-47b5-b187-827cb7f39aea-metrics-certs podName:e0b9420a-1c3e-47b5-b187-827cb7f39aea nodeName:}" failed. No retries permitted until 2026-04-16 19:54:48.528917149 +0000 UTC m=+66.251963056 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0b9420a-1c3e-47b5-b187-827cb7f39aea-metrics-certs") pod "network-metrics-daemon-nx45q" (UID: "e0b9420a-1c3e-47b5-b187-827cb7f39aea") : secret "metrics-daemon-secret" not found Apr 16 19:54:16.629942 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:16.629907 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcbfx\" (UniqueName: \"kubernetes.io/projected/58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f-kube-api-access-kcbfx\") pod \"network-check-target-qnt55\" (UID: \"58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f\") " pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:54:16.634110 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:16.634084 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcbfx\" (UniqueName: \"kubernetes.io/projected/58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f-kube-api-access-kcbfx\") pod \"network-check-target-qnt55\" (UID: \"58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f\") " pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:54:16.661166 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:16.661133 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:54:16.789868 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:16.789778 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qnt55"] Apr 16 19:54:16.792774 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:54:16.792738 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58737d84_3f2c_46d7_b1e4_6b7f5cda8b4f.slice/crio-41d8082bb31e67e3c3c7ab4716ddc632568cb99a1f076e1d4931171b0f51aaa4 WatchSource:0}: Error finding container 41d8082bb31e67e3c3c7ab4716ddc632568cb99a1f076e1d4931171b0f51aaa4: Status 404 returned error can't find the container with id 41d8082bb31e67e3c3c7ab4716ddc632568cb99a1f076e1d4931171b0f51aaa4 Apr 16 19:54:16.995460 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:16.995378 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qnt55" event={"ID":"58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f","Type":"ContainerStarted","Data":"41d8082bb31e67e3c3c7ab4716ddc632568cb99a1f076e1d4931171b0f51aaa4"} Apr 16 19:54:16.996917 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:16.996891 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf" event={"ID":"be232f65-8167-4e83-83a8-d40670fbf702","Type":"ContainerStarted","Data":"7478768df6436d361f557b293a571a0b4b5395cacd08361853a31b0d90dc1e0a"} Apr 16 19:54:16.998359 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:16.998333 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wfp62" event={"ID":"8593da00-be3a-459a-9bf6-ee2f4988af66","Type":"ContainerStarted","Data":"024766918c1afdb23370ae12e60310086a2699861bc61071b095a670a02dc84c"} Apr 16 19:54:16.999637 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:16.999592 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gjqjl" event={"ID":"fe409e83-4bf3-40c8-b46d-61a088fdae77","Type":"ContainerStarted","Data":"1da5fa8ef7144a1197d6ad2f40c6705e6ce2f9feb14a997c793f01b392973b6c"} Apr 16 19:54:17.001044 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:17.001022 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-94k49" event={"ID":"3e23a4d9-ff98-49a4-a888-9d26648f61cf","Type":"ContainerStarted","Data":"20fb22edaf704a49f4d28b183927ada5906811b38632ec22479c25bc2c0f8071"} Apr 16 19:54:17.002594 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:17.002571 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-sqd4v" event={"ID":"da823131-b1c7-41ae-a0e6-3fa763f3d110","Type":"ContainerStarted","Data":"ddca70f8fce6e71ba341ddcea0a891d9e168d176c9327d67b69a363bf5e28d5b"} Apr 16 19:54:17.017925 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:17.016737 2561 generic.go:358] "Generic (PLEG): container finished" podID="6f07541a-6ad1-43d0-9a04-540a16f67cec" containerID="6ff35719c6dbc110d9ea2efde50bbfe28e4696c29c1d95c3691f26cea1f21ed5" exitCode=0 Apr 16 19:54:17.017925 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:17.016821 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-88wb2" 
event={"ID":"6f07541a-6ad1-43d0-9a04-540a16f67cec","Type":"ContainerDied","Data":"6ff35719c6dbc110d9ea2efde50bbfe28e4696c29c1d95c3691f26cea1f21ed5"} Apr 16 19:54:17.020584 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:17.020127 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hnzkl" event={"ID":"4219a57d-3ca0-4b3e-9d49-d7d1178e2c5e","Type":"ContainerStarted","Data":"6ab5b3a5c5d151e30b69a4648d8a0e8d7bdc8967d9a16d00aa17abb8d8cc048a"} Apr 16 19:54:17.020584 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:17.020165 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hnzkl" event={"ID":"4219a57d-3ca0-4b3e-9d49-d7d1178e2c5e","Type":"ContainerStarted","Data":"211b4d038dd5cf5c9a65d10cc9764db41e53059669e2fe363418bae1ff8bd223"} Apr 16 19:54:17.023544 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:17.023459 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-spvh5" event={"ID":"3e7e6010-fad1-4881-8816-b024c8853151","Type":"ContainerStarted","Data":"a94ccf263d79e3af0d55fa0abe2b6a9e483fc3590366a24b83a6e8638b16415b"} Apr 16 19:54:17.025090 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:17.025029 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-mwqfp" event={"ID":"b33484f0-9ae5-4b31-8baa-d4219e39ddd9","Type":"ContainerStarted","Data":"8083eef7211faa85fb1c11351d5c71ffb31f05eb42544009af2a999ea1e3a831"} Apr 16 19:54:17.945026 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:17.944043 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:17.945026 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:17.944366 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18721546-1063-46a1-8715-a40872933b22-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bjppc\" (UID: \"18721546-1063-46a1-8715-a40872933b22\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bjppc" Apr 16 19:54:17.945026 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:17.944539 2561 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 19:54:17.945026 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:17.944601 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18721546-1063-46a1-8715-a40872933b22-samples-operator-tls podName:18721546-1063-46a1-8715-a40872933b22 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:21.944582722 +0000 UTC m=+39.667628611 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/18721546-1063-46a1-8715-a40872933b22-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bjppc" (UID: "18721546-1063-46a1-8715-a40872933b22") : secret "samples-operator-tls" not found Apr 16 19:54:17.945026 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:17.944933 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:17.945026 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:17.944952 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c684876c-h8496: secret "image-registry-tls" not found Apr 16 19:54:17.945026 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:17.945010 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls podName:e54086bc-258c-4204-8105-7a5e491494fa nodeName:}" failed. No retries permitted until 2026-04-16 19:54:21.944983674 +0000 UTC m=+39.668029577 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls") pod "image-registry-c684876c-h8496" (UID: "e54086bc-258c-4204-8105-7a5e491494fa") : secret "image-registry-tls" not found Apr 16 19:54:18.037995 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:18.037960 2561 generic.go:358] "Generic (PLEG): container finished" podID="6f07541a-6ad1-43d0-9a04-540a16f67cec" containerID="adc6d9f6ed3bf19d3c40fcbb9c0a83988d7e1612b847852382cdb7dd0027025a" exitCode=0 Apr 16 19:54:18.038148 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:18.038041 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-88wb2" event={"ID":"6f07541a-6ad1-43d0-9a04-540a16f67cec","Type":"ContainerDied","Data":"adc6d9f6ed3bf19d3c40fcbb9c0a83988d7e1612b847852382cdb7dd0027025a"} Apr 16 19:54:18.045544 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:18.045515 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0f95dfef-42c3-454a-9807-3c895a970729-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tpdvb\" (UID: \"0f95dfef-42c3-454a-9807-3c895a970729\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpdvb" Apr 16 19:54:18.045664 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:18.045566 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-metrics-certs\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq" Apr 16 19:54:18.045740 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:18.045683 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3bac591-c06b-4b40-b42a-f85548b297f0-service-ca-bundle\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq" Apr 16 19:54:18.045894 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:18.045857 2561 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/b3bac591-c06b-4b40-b42a-f85548b297f0-service-ca-bundle podName:b3bac591-c06b-4b40-b42a-f85548b297f0 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:22.045839293 +0000 UTC m=+39.768885182 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b3bac591-c06b-4b40-b42a-f85548b297f0-service-ca-bundle") pod "router-default-98598b8fb-wkrmq" (UID: "b3bac591-c06b-4b40-b42a-f85548b297f0") : configmap references non-existent config key: service-ca.crt Apr 16 19:54:18.047665 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:18.046184 2561 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:18.047665 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:18.046240 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f95dfef-42c3-454a-9807-3c895a970729-cluster-monitoring-operator-tls podName:0f95dfef-42c3-454a-9807-3c895a970729 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:22.046222439 +0000 UTC m=+39.769268337 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/0f95dfef-42c3-454a-9807-3c895a970729-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tpdvb" (UID: "0f95dfef-42c3-454a-9807-3c895a970729") : secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:18.047665 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:18.046298 2561 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 19:54:18.047665 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:18.046338 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-metrics-certs podName:b3bac591-c06b-4b40-b42a-f85548b297f0 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:22.046325166 +0000 UTC m=+39.769371056 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-metrics-certs") pod "router-default-98598b8fb-wkrmq" (UID: "b3bac591-c06b-4b40-b42a-f85548b297f0") : secret "router-metrics-certs-default" not found Apr 16 19:54:18.067373 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:18.067304 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hnzkl" podStartSLOduration=4.067288796 podStartE2EDuration="4.067288796s" podCreationTimestamp="2026-04-16 19:54:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:17.090903139 +0000 UTC m=+34.813949046" watchObservedRunningTime="2026-04-16 19:54:18.067288796 +0000 UTC m=+35.790334702" Apr 16 19:54:18.147495 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:18.147460 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8814ad85-70d4-48f0-8e96-6cc0a48c07eb-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xbnwr\" (UID: \"8814ad85-70d4-48f0-8e96-6cc0a48c07eb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xbnwr" Apr 16 19:54:18.147639 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:18.147559 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/165d7242-2ef1-481d-992a-09e3364e0626-cert\") pod \"ingress-canary-k2dcx\" (UID: \"165d7242-2ef1-481d-992a-09e3364e0626\") " pod="openshift-ingress-canary/ingress-canary-k2dcx" Apr 16 19:54:18.147725 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:18.147709 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:18.147863 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:18.147770 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/165d7242-2ef1-481d-992a-09e3364e0626-cert podName:165d7242-2ef1-481d-992a-09e3364e0626 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:22.147752873 +0000 UTC m=+39.870798761 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/165d7242-2ef1-481d-992a-09e3364e0626-cert") pod "ingress-canary-k2dcx" (UID: "165d7242-2ef1-481d-992a-09e3364e0626") : secret "canary-serving-cert" not found Apr 16 19:54:18.148551 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:18.148517 2561 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:54:18.148638 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:18.148576 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8814ad85-70d4-48f0-8e96-6cc0a48c07eb-networking-console-plugin-cert podName:8814ad85-70d4-48f0-8e96-6cc0a48c07eb nodeName:}" failed. No retries permitted until 2026-04-16 19:54:22.148561399 +0000 UTC m=+39.871607285 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8814ad85-70d4-48f0-8e96-6cc0a48c07eb-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-xbnwr" (UID: "8814ad85-70d4-48f0-8e96-6cc0a48c07eb") : secret "networking-console-plugin-cert" not found Apr 16 19:54:18.248089 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:18.248013 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e50b93c-a156-438b-a41f-2b4bac946727-metrics-tls\") pod \"dns-default-cdc8w\" (UID: \"8e50b93c-a156-438b-a41f-2b4bac946727\") " pod="openshift-dns/dns-default-cdc8w" Apr 16 19:54:18.248328 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:18.248310 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:18.248399 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:18.248384 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e50b93c-a156-438b-a41f-2b4bac946727-metrics-tls podName:8e50b93c-a156-438b-a41f-2b4bac946727 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:22.248365276 +0000 UTC m=+39.971411181 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8e50b93c-a156-438b-a41f-2b4bac946727-metrics-tls") pod "dns-default-cdc8w" (UID: "8e50b93c-a156-438b-a41f-2b4bac946727") : secret "dns-default-metrics-tls" not found Apr 16 19:54:21.984623 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:21.984591 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18721546-1063-46a1-8715-a40872933b22-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bjppc\" (UID: \"18721546-1063-46a1-8715-a40872933b22\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bjppc" Apr 16 19:54:21.985075 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:21.984740 2561 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 19:54:21.985075 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:21.984819 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18721546-1063-46a1-8715-a40872933b22-samples-operator-tls podName:18721546-1063-46a1-8715-a40872933b22 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.984804897 +0000 UTC m=+47.707850780 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/18721546-1063-46a1-8715-a40872933b22-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bjppc" (UID: "18721546-1063-46a1-8715-a40872933b22") : secret "samples-operator-tls" not found Apr 16 19:54:21.985075 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:21.984848 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:21.985075 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:21.984921 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:21.985075 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:21.984929 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c684876c-h8496: secret "image-registry-tls" not found Apr 16 19:54:21.985075 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:21.984951 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls podName:e54086bc-258c-4204-8105-7a5e491494fa nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.984944707 +0000 UTC m=+47.707990592 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls") pod "image-registry-c684876c-h8496" (UID: "e54086bc-258c-4204-8105-7a5e491494fa") : secret "image-registry-tls" not found Apr 16 19:54:22.085692 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:22.085661 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3bac591-c06b-4b40-b42a-f85548b297f0-service-ca-bundle\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq" Apr 16 19:54:22.085847 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:22.085746 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0f95dfef-42c3-454a-9807-3c895a970729-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tpdvb\" (UID: \"0f95dfef-42c3-454a-9807-3c895a970729\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpdvb" Apr 16 19:54:22.085847 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:22.085780 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-metrics-certs\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq" Apr 16 19:54:22.085847 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:22.085818 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b3bac591-c06b-4b40-b42a-f85548b297f0-service-ca-bundle podName:b3bac591-c06b-4b40-b42a-f85548b297f0 nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:30.085784826 +0000 UTC m=+47.808830710 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b3bac591-c06b-4b40-b42a-f85548b297f0-service-ca-bundle") pod "router-default-98598b8fb-wkrmq" (UID: "b3bac591-c06b-4b40-b42a-f85548b297f0") : configmap references non-existent config key: service-ca.crt Apr 16 19:54:22.086002 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:22.085872 2561 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:22.086002 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:22.085878 2561 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 19:54:22.086002 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:22.085916 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f95dfef-42c3-454a-9807-3c895a970729-cluster-monitoring-operator-tls podName:0f95dfef-42c3-454a-9807-3c895a970729 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:30.085905713 +0000 UTC m=+47.808951596 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/0f95dfef-42c3-454a-9807-3c895a970729-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tpdvb" (UID: "0f95dfef-42c3-454a-9807-3c895a970729") : secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:22.086002 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:22.085929 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-metrics-certs podName:b3bac591-c06b-4b40-b42a-f85548b297f0 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:30.085922814 +0000 UTC m=+47.808968697 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-metrics-certs") pod "router-default-98598b8fb-wkrmq" (UID: "b3bac591-c06b-4b40-b42a-f85548b297f0") : secret "router-metrics-certs-default" not found Apr 16 19:54:22.187102 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:22.187072 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/165d7242-2ef1-481d-992a-09e3364e0626-cert\") pod \"ingress-canary-k2dcx\" (UID: \"165d7242-2ef1-481d-992a-09e3364e0626\") " pod="openshift-ingress-canary/ingress-canary-k2dcx" Apr 16 19:54:22.187245 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:22.187227 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:22.187307 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:22.187278 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/165d7242-2ef1-481d-992a-09e3364e0626-cert podName:165d7242-2ef1-481d-992a-09e3364e0626 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:30.1872646 +0000 UTC m=+47.910310484 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/165d7242-2ef1-481d-992a-09e3364e0626-cert") pod "ingress-canary-k2dcx" (UID: "165d7242-2ef1-481d-992a-09e3364e0626") : secret "canary-serving-cert" not found Apr 16 19:54:22.187307 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:22.187227 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8814ad85-70d4-48f0-8e96-6cc0a48c07eb-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xbnwr\" (UID: \"8814ad85-70d4-48f0-8e96-6cc0a48c07eb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xbnwr" Apr 16 19:54:22.187410 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:22.187303 2561 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:54:22.187410 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:22.187361 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8814ad85-70d4-48f0-8e96-6cc0a48c07eb-networking-console-plugin-cert podName:8814ad85-70d4-48f0-8e96-6cc0a48c07eb nodeName:}" failed. No retries permitted until 2026-04-16 19:54:30.187345239 +0000 UTC m=+47.910391131 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8814ad85-70d4-48f0-8e96-6cc0a48c07eb-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-xbnwr" (UID: "8814ad85-70d4-48f0-8e96-6cc0a48c07eb") : secret "networking-console-plugin-cert" not found Apr 16 19:54:22.288671 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:22.288635 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e50b93c-a156-438b-a41f-2b4bac946727-metrics-tls\") pod \"dns-default-cdc8w\" (UID: \"8e50b93c-a156-438b-a41f-2b4bac946727\") " pod="openshift-dns/dns-default-cdc8w" Apr 16 19:54:22.288827 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:22.288771 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:22.288880 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:22.288843 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e50b93c-a156-438b-a41f-2b4bac946727-metrics-tls podName:8e50b93c-a156-438b-a41f-2b4bac946727 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:30.288829215 +0000 UTC m=+48.011875104 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8e50b93c-a156-438b-a41f-2b4bac946727-metrics-tls") pod "dns-default-cdc8w" (UID: "8e50b93c-a156-438b-a41f-2b4bac946727") : secret "dns-default-metrics-tls" not found Apr 16 19:54:26.064293 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.064256 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wfp62" event={"ID":"8593da00-be3a-459a-9bf6-ee2f4988af66","Type":"ContainerStarted","Data":"ecacf02f4dc65599e1369f9abe5153a5600df83bc16b4ac1ac428fceb9f62c3b"} Apr 16 19:54:26.065930 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.065904 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gjqjl" event={"ID":"fe409e83-4bf3-40c8-b46d-61a088fdae77","Type":"ContainerStarted","Data":"8d34212e101c5156aca8a57008d29d023a5ff037ecec38dc17a59a66d3f8522a"} Apr 16 19:54:26.067294 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.067268 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-94k49" event={"ID":"3e23a4d9-ff98-49a4-a888-9d26648f61cf","Type":"ContainerStarted","Data":"5cef1d6ba75b88775ff27aaafb0ca8e90c5314adc98542ef8d896f0852129b02"} Apr 16 19:54:26.068673 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.068650 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-sqd4v" event={"ID":"da823131-b1c7-41ae-a0e6-3fa763f3d110","Type":"ContainerStarted","Data":"dab0b21b90a20da35efbd6f9003e59ba6c4d348407721b31ef90b12805cfd47f"} Apr 16 19:54:26.072584 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.072560 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-88wb2" event={"ID":"6f07541a-6ad1-43d0-9a04-540a16f67cec","Type":"ContainerStarted","Data":"ca9359969b36983eb44e13b05eae05d6c005578fd3efc54856b09d6601f5225b"} Apr 16 19:54:26.074142 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.074120 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-spvh5" event={"ID":"3e7e6010-fad1-4881-8816-b024c8853151","Type":"ContainerStarted","Data":"8276f610bd9e2af177600a1040aafd963d7f772c7449b79c39970a9fb0bf1f71"} Apr 16 19:54:26.075540 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.075519 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-mwqfp" event={"ID":"b33484f0-9ae5-4b31-8baa-d4219e39ddd9","Type":"ContainerStarted","Data":"db25546ffd1112c644c80578a1b34789eaf59441cf5c91f4ffd1f1e302c2bd0c"} Apr 16 19:54:26.077027 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.077007 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qnt55" event={"ID":"58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f","Type":"ContainerStarted","Data":"2c64038a542d337a7666d416f9de7197c6c0cb2a4a3b650a3dad6d1ad3b5cac8"} Apr 16 19:54:26.077191 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.077168 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:54:26.078403 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.078385 2561 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/0.log" Apr 16 19:54:26.078489 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.078421 2561 generic.go:358] "Generic (PLEG): container finished" podID="be232f65-8167-4e83-83a8-d40670fbf702" containerID="78ffcf7ca438992a24bc6b137417a11de0af1d9a33481b6cb5cf71ccc306aff3" exitCode=255 Apr 16 19:54:26.078489 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.078448 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf" event={"ID":"be232f65-8167-4e83-83a8-d40670fbf702","Type":"ContainerDied","Data":"78ffcf7ca438992a24bc6b137417a11de0af1d9a33481b6cb5cf71ccc306aff3"} Apr 16 19:54:26.078698 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.078682 2561 scope.go:117] "RemoveContainer" containerID="78ffcf7ca438992a24bc6b137417a11de0af1d9a33481b6cb5cf71ccc306aff3" Apr 16 19:54:26.085384 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.085349 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-wfp62" podStartSLOduration=18.114632282 podStartE2EDuration="27.085339226s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="2026-04-16 19:54:16.38111187 +0000 UTC m=+34.104157757" lastFinishedPulling="2026-04-16 19:54:25.351818802 +0000 UTC m=+43.074864701" observedRunningTime="2026-04-16 19:54:26.084194486 +0000 UTC m=+43.807240394" watchObservedRunningTime="2026-04-16 19:54:26.085339226 +0000 UTC m=+43.808385155" Apr 16 19:54:26.129320 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.129278 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-mwqfp" podStartSLOduration=11.199732492 podStartE2EDuration="20.129264341s" podCreationTimestamp="2026-04-16 19:54:06 +0000 UTC" firstStartedPulling="2026-04-16 19:54:16.408194521 +0000 UTC m=+34.131240410" lastFinishedPulling="2026-04-16 19:54:25.337726373 +0000 UTC m=+43.060772259" observedRunningTime="2026-04-16 19:54:26.128663055 +0000 UTC m=+43.851708964" watchObservedRunningTime="2026-04-16 19:54:26.129264341 +0000 UTC m=+43.852310249" Apr 16 19:54:26.150309 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.149385 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gjqjl" podStartSLOduration=11.369719031 podStartE2EDuration="20.149370487s" podCreationTimestamp="2026-04-16 19:54:06 +0000 UTC" firstStartedPulling="2026-04-16 19:54:16.382609047 +0000 UTC m=+34.105654938" lastFinishedPulling="2026-04-16 19:54:25.162260495 +0000 UTC m=+42.885306394" observedRunningTime="2026-04-16 19:54:26.149138858 +0000 UTC m=+43.872184765" watchObservedRunningTime="2026-04-16 19:54:26.149370487 +0000 UTC m=+43.872416394" Apr 16 19:54:26.174933 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.174825 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-94k49" podStartSLOduration=11.217803046 podStartE2EDuration="20.174782975s" podCreationTimestamp="2026-04-16 19:54:06 +0000 UTC" firstStartedPulling="2026-04-16 19:54:16.380704389 +0000 UTC m=+34.103750276" lastFinishedPulling="2026-04-16 19:54:25.337684318 +0000 UTC m=+43.060730205" observedRunningTime="2026-04-16 19:54:26.174043002 +0000 UTC m=+43.897088909" 
watchObservedRunningTime="2026-04-16 19:54:26.174782975 +0000 UTC m=+43.897828882" Apr 16 19:54:26.208392 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.208339 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-88wb2" podStartSLOduration=12.089602816 podStartE2EDuration="43.208323612s" podCreationTimestamp="2026-04-16 19:53:43 +0000 UTC" firstStartedPulling="2026-04-16 19:53:45.312901887 +0000 UTC m=+3.035947782" lastFinishedPulling="2026-04-16 19:54:16.43162268 +0000 UTC m=+34.154668578" observedRunningTime="2026-04-16 19:54:26.2069883 +0000 UTC m=+43.930034208" watchObservedRunningTime="2026-04-16 19:54:26.208323612 +0000 UTC m=+43.931369518" Apr 16 19:54:26.258905 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.258860 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qnt55" podStartSLOduration=34.718864039 podStartE2EDuration="43.258844137s" podCreationTimestamp="2026-04-16 19:53:43 +0000 UTC" firstStartedPulling="2026-04-16 19:54:16.799470695 +0000 UTC m=+34.522516585" lastFinishedPulling="2026-04-16 19:54:25.339450786 +0000 UTC m=+43.062496683" observedRunningTime="2026-04-16 19:54:26.228554663 +0000 UTC m=+43.951600570" watchObservedRunningTime="2026-04-16 19:54:26.258844137 +0000 UTC m=+43.981890035" Apr 16 19:54:26.259006 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.258953 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-sqd4v" podStartSLOduration=11.302405762 podStartE2EDuration="20.258945832s" podCreationTimestamp="2026-04-16 19:54:06 +0000 UTC" firstStartedPulling="2026-04-16 19:54:16.383204346 +0000 UTC m=+34.106250233" lastFinishedPulling="2026-04-16 19:54:25.339744418 +0000 UTC m=+43.062790303" observedRunningTime="2026-04-16 19:54:26.257359956 +0000 UTC m=+43.980405862" watchObservedRunningTime="2026-04-16 19:54:26.258945832 +0000 UTC m=+43.981991739" Apr 16 19:54:26.279082 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.279037 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-spvh5" podStartSLOduration=11.349697361 podStartE2EDuration="20.279022232s" podCreationTimestamp="2026-04-16 19:54:06 +0000 UTC" firstStartedPulling="2026-04-16 19:54:16.408334373 +0000 UTC m=+34.131380272" lastFinishedPulling="2026-04-16 19:54:25.337659253 +0000 UTC m=+43.060705143" observedRunningTime="2026-04-16 19:54:26.277500662 +0000 UTC m=+44.000546569" watchObservedRunningTime="2026-04-16 19:54:26.279022232 +0000 UTC m=+44.002068138" Apr 16 19:54:26.660661 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.660628 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-gf47w"] Apr 16 19:54:26.702933 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.702899 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-gf47w"] Apr 16 19:54:26.703097 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.703023 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gf47w" Apr 16 19:54:26.706244 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.705920 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 19:54:26.706244 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.706044 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-9d8rk\"" Apr 16 19:54:26.706244 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.705920 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 19:54:26.727316 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.727292 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g66r\" (UniqueName: \"kubernetes.io/projected/2d72cccc-2e90-4646-b78f-8afabf5aee06-kube-api-access-7g66r\") pod \"migrator-74bb7799d9-gf47w\" (UID: \"2d72cccc-2e90-4646-b78f-8afabf5aee06\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gf47w" Apr 16 19:54:26.828033 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.828000 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7g66r\" (UniqueName: \"kubernetes.io/projected/2d72cccc-2e90-4646-b78f-8afabf5aee06-kube-api-access-7g66r\") pod \"migrator-74bb7799d9-gf47w\" (UID: \"2d72cccc-2e90-4646-b78f-8afabf5aee06\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gf47w" Apr 16 19:54:26.837056 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:26.837033 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g66r\" (UniqueName: \"kubernetes.io/projected/2d72cccc-2e90-4646-b78f-8afabf5aee06-kube-api-access-7g66r\") pod \"migrator-74bb7799d9-gf47w\" (UID: \"2d72cccc-2e90-4646-b78f-8afabf5aee06\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gf47w" Apr 16 19:54:27.014874 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:27.014807 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gf47w" Apr 16 19:54:27.083005 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:27.082985 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/1.log" Apr 16 19:54:27.083433 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:27.083410 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/0.log" Apr 16 19:54:27.083505 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:27.083452 2561 generic.go:358] "Generic (PLEG): container finished" podID="be232f65-8167-4e83-83a8-d40670fbf702" containerID="8d23051cca59d5bc7aec1043baee8b464476c51c7d435626ff1c2d2a6471ed7c" exitCode=255 Apr 16 19:54:27.083856 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:27.083831 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf" event={"ID":"be232f65-8167-4e83-83a8-d40670fbf702","Type":"ContainerDied","Data":"8d23051cca59d5bc7aec1043baee8b464476c51c7d435626ff1c2d2a6471ed7c"} Apr 16 19:54:27.083953 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:27.083897 2561 scope.go:117] "RemoveContainer" containerID="78ffcf7ca438992a24bc6b137417a11de0af1d9a33481b6cb5cf71ccc306aff3" Apr 16 19:54:27.084680 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:27.084528 2561 scope.go:117] "RemoveContainer" containerID="8d23051cca59d5bc7aec1043baee8b464476c51c7d435626ff1c2d2a6471ed7c" Apr 16 19:54:27.085471 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:27.085286 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ln2rf_openshift-console-operator(be232f65-8167-4e83-83a8-d40670fbf702)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf" podUID="be232f65-8167-4e83-83a8-d40670fbf702" Apr 16 19:54:27.137094 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:27.136726 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-gf47w"] Apr 16 19:54:27.137937 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:54:27.137899 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d72cccc_2e90_4646_b78f_8afabf5aee06.slice/crio-f6f9f3429d1a70348ead4fff44c04d933993b2377c0fba8661efd91e108a39b0 WatchSource:0}: Error finding container f6f9f3429d1a70348ead4fff44c04d933993b2377c0fba8661efd91e108a39b0: Status 404 returned error can't find the container with id f6f9f3429d1a70348ead4fff44c04d933993b2377c0fba8661efd91e108a39b0 Apr 16 19:54:28.087456 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:28.087420 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gf47w" event={"ID":"2d72cccc-2e90-4646-b78f-8afabf5aee06","Type":"ContainerStarted","Data":"f6f9f3429d1a70348ead4fff44c04d933993b2377c0fba8661efd91e108a39b0"} Apr 16 19:54:28.088934 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:28.088908 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/1.log" Apr 
Apr 16 19:54:28.089285 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:28.089267 2561 scope.go:117] "RemoveContainer" containerID="8d23051cca59d5bc7aec1043baee8b464476c51c7d435626ff1c2d2a6471ed7c"
Apr 16 19:54:28.089496 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:28.089477 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ln2rf_openshift-console-operator(be232f65-8167-4e83-83a8-d40670fbf702)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf" podUID="be232f65-8167-4e83-83a8-d40670fbf702"
Apr 16 19:54:28.695653 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:28.695635 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hnzkl_4219a57d-3ca0-4b3e-9d49-d7d1178e2c5e/dns-node-resolver/0.log"
Apr 16 19:54:29.093681 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:29.093609 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gf47w" event={"ID":"2d72cccc-2e90-4646-b78f-8afabf5aee06","Type":"ContainerStarted","Data":"51d58d54f54d52715d86160a8299ce1694965b15a457ae2e3725ec0cd918f5e5"}
Apr 16 19:54:29.093681 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:29.093645 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gf47w" event={"ID":"2d72cccc-2e90-4646-b78f-8afabf5aee06","Type":"ContainerStarted","Data":"ab40090599de85363f11425521ec2a8920c1b3fe8645bf6f9e7de913fa20fb46"}
Apr 16 19:54:29.115546 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:29.115502 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gf47w" podStartSLOduration=1.622612213 podStartE2EDuration="3.115488735s" podCreationTimestamp="2026-04-16 19:54:26 +0000 UTC" firstStartedPulling="2026-04-16 19:54:27.139462723 +0000 UTC m=+44.862508610" lastFinishedPulling="2026-04-16 19:54:28.632339245 +0000 UTC m=+46.355385132" observedRunningTime="2026-04-16 19:54:29.113484644 +0000 UTC m=+46.836530551" watchObservedRunningTime="2026-04-16 19:54:29.115488735 +0000 UTC m=+46.838534641"
Apr 16 19:54:29.251808 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:29.251765 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-fvwsf"]
Apr 16 19:54:29.261033 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:29.261011 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-fvwsf"
Apr 16 19:54:29.264318 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:29.264283 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 16 19:54:29.264549 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:29.264532 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-lknsm\""
Apr 16 19:54:29.264825 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:29.264811 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 16 19:54:29.265301 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:29.265286 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 16 19:54:29.265383 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:29.265301 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 16 19:54:29.268718 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:29.268699 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-fvwsf"]
Apr 16 19:54:29.352514 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:29.352466 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/04ed07fe-cef6-429a-9f51-473ffe8e0b9b-signing-cabundle\") pod \"service-ca-865cb79987-fvwsf\" (UID: \"04ed07fe-cef6-429a-9f51-473ffe8e0b9b\") " pod="openshift-service-ca/service-ca-865cb79987-fvwsf"
Apr 16 19:54:29.352514 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:29.352504 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv49b\" (UniqueName: \"kubernetes.io/projected/04ed07fe-cef6-429a-9f51-473ffe8e0b9b-kube-api-access-mv49b\") pod \"service-ca-865cb79987-fvwsf\" (UID: \"04ed07fe-cef6-429a-9f51-473ffe8e0b9b\") " pod="openshift-service-ca/service-ca-865cb79987-fvwsf"
Apr 16 19:54:29.352651 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:29.352633 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/04ed07fe-cef6-429a-9f51-473ffe8e0b9b-signing-key\") pod \"service-ca-865cb79987-fvwsf\" (UID: \"04ed07fe-cef6-429a-9f51-473ffe8e0b9b\") " pod="openshift-service-ca/service-ca-865cb79987-fvwsf"
Apr 16 19:54:29.453073 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:29.453046 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/04ed07fe-cef6-429a-9f51-473ffe8e0b9b-signing-key\") pod \"service-ca-865cb79987-fvwsf\" (UID: \"04ed07fe-cef6-429a-9f51-473ffe8e0b9b\") " pod="openshift-service-ca/service-ca-865cb79987-fvwsf"
Apr 16 19:54:29.453175 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:29.453123 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/04ed07fe-cef6-429a-9f51-473ffe8e0b9b-signing-cabundle\") pod \"service-ca-865cb79987-fvwsf\" (UID: \"04ed07fe-cef6-429a-9f51-473ffe8e0b9b\") " pod="openshift-service-ca/service-ca-865cb79987-fvwsf"
Apr 16 19:54:29.453237 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:29.453176 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mv49b\" (UniqueName: \"kubernetes.io/projected/04ed07fe-cef6-429a-9f51-473ffe8e0b9b-kube-api-access-mv49b\") pod \"service-ca-865cb79987-fvwsf\" (UID: \"04ed07fe-cef6-429a-9f51-473ffe8e0b9b\") " pod="openshift-service-ca/service-ca-865cb79987-fvwsf"
Apr 16 19:54:29.453841 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:29.453811 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/04ed07fe-cef6-429a-9f51-473ffe8e0b9b-signing-cabundle\") pod \"service-ca-865cb79987-fvwsf\" (UID: \"04ed07fe-cef6-429a-9f51-473ffe8e0b9b\") " pod="openshift-service-ca/service-ca-865cb79987-fvwsf"
Apr 16 19:54:29.455383 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:29.455365 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/04ed07fe-cef6-429a-9f51-473ffe8e0b9b-signing-key\") pod \"service-ca-865cb79987-fvwsf\" (UID: \"04ed07fe-cef6-429a-9f51-473ffe8e0b9b\") " pod="openshift-service-ca/service-ca-865cb79987-fvwsf"
Apr 16 19:54:29.462575 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:29.462559 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv49b\" (UniqueName: \"kubernetes.io/projected/04ed07fe-cef6-429a-9f51-473ffe8e0b9b-kube-api-access-mv49b\") pod \"service-ca-865cb79987-fvwsf\" (UID: \"04ed07fe-cef6-429a-9f51-473ffe8e0b9b\") " pod="openshift-service-ca/service-ca-865cb79987-fvwsf"
Apr 16 19:54:29.499084 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:29.499069 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ddx6x_c3753347-dfcc-47be-a251-65c3470b8045/node-ca/0.log"
Apr 16 19:54:29.570526 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:29.570508 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-fvwsf"
Apr 16 19:54:29.696719 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:29.696494 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-fvwsf"]
Apr 16 19:54:29.698986 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:54:29.698960 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04ed07fe_cef6_429a_9f51_473ffe8e0b9b.slice/crio-ad9cf1d843eda7ed1a988d6fbe0192ec06eed78a717af24a502ff3ab3aa524c5 WatchSource:0}: Error finding container ad9cf1d843eda7ed1a988d6fbe0192ec06eed78a717af24a502ff3ab3aa524c5: Status 404 returned error can't find the container with id ad9cf1d843eda7ed1a988d6fbe0192ec06eed78a717af24a502ff3ab3aa524c5
Apr 16 19:54:30.059640 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:30.059612 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18721546-1063-46a1-8715-a40872933b22-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bjppc\" (UID: \"18721546-1063-46a1-8715-a40872933b22\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bjppc"
Apr 16 19:54:30.059815 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:30.059703 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496"
Apr 16 19:54:30.059815 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:30.059758 2561 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 19:54:30.059913 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:30.059828 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 19:54:30.059913 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:30.059839 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c684876c-h8496: secret "image-registry-tls" not found
Apr 16 19:54:30.059913 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:30.059848 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18721546-1063-46a1-8715-a40872933b22-samples-operator-tls podName:18721546-1063-46a1-8715-a40872933b22 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:46.059832985 +0000 UTC m=+63.782878869 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/18721546-1063-46a1-8715-a40872933b22-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bjppc" (UID: "18721546-1063-46a1-8715-a40872933b22") : secret "samples-operator-tls" not found
Apr 16 19:54:30.059913 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:30.059875 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls podName:e54086bc-258c-4204-8105-7a5e491494fa nodeName:}" failed. No retries permitted until 2026-04-16 19:54:46.05986272 +0000 UTC m=+63.782908605 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls") pod "image-registry-c684876c-h8496" (UID: "e54086bc-258c-4204-8105-7a5e491494fa") : secret "image-registry-tls" not found
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls") pod "image-registry-c684876c-h8496" (UID: "e54086bc-258c-4204-8105-7a5e491494fa") : secret "image-registry-tls" not found Apr 16 19:54:30.098030 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:30.097999 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-fvwsf" event={"ID":"04ed07fe-cef6-429a-9f51-473ffe8e0b9b","Type":"ContainerStarted","Data":"fb0101124cf1001af779c6343c067753ede656821dfe608159b80e5b4cbc41a1"} Apr 16 19:54:30.098344 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:30.098055 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-fvwsf" event={"ID":"04ed07fe-cef6-429a-9f51-473ffe8e0b9b","Type":"ContainerStarted","Data":"ad9cf1d843eda7ed1a988d6fbe0192ec06eed78a717af24a502ff3ab3aa524c5"} Apr 16 19:54:30.160236 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:30.160209 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3bac591-c06b-4b40-b42a-f85548b297f0-service-ca-bundle\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq" Apr 16 19:54:30.160362 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:30.160345 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b3bac591-c06b-4b40-b42a-f85548b297f0-service-ca-bundle podName:b3bac591-c06b-4b40-b42a-f85548b297f0 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:46.160320135 +0000 UTC m=+63.883366024 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b3bac591-c06b-4b40-b42a-f85548b297f0-service-ca-bundle") pod "router-default-98598b8fb-wkrmq" (UID: "b3bac591-c06b-4b40-b42a-f85548b297f0") : configmap references non-existent config key: service-ca.crt Apr 16 19:54:30.160435 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:30.160402 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0f95dfef-42c3-454a-9807-3c895a970729-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tpdvb\" (UID: \"0f95dfef-42c3-454a-9807-3c895a970729\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpdvb" Apr 16 19:54:30.160488 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:30.160468 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-metrics-certs\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq" Apr 16 19:54:30.160609 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:30.160589 2561 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 19:54:30.160703 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:30.160646 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-metrics-certs podName:b3bac591-c06b-4b40-b42a-f85548b297f0 nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:46.160630494 +0000 UTC m=+63.883676385 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-metrics-certs") pod "router-default-98598b8fb-wkrmq" (UID: "b3bac591-c06b-4b40-b42a-f85548b297f0") : secret "router-metrics-certs-default" not found Apr 16 19:54:30.160759 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:30.160704 2561 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:30.160829 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:30.160768 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f95dfef-42c3-454a-9807-3c895a970729-cluster-monitoring-operator-tls podName:0f95dfef-42c3-454a-9807-3c895a970729 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:46.160752933 +0000 UTC m=+63.883798818 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/0f95dfef-42c3-454a-9807-3c895a970729-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tpdvb" (UID: "0f95dfef-42c3-454a-9807-3c895a970729") : secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:30.261753 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:30.261728 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/165d7242-2ef1-481d-992a-09e3364e0626-cert\") pod \"ingress-canary-k2dcx\" (UID: \"165d7242-2ef1-481d-992a-09e3364e0626\") " pod="openshift-ingress-canary/ingress-canary-k2dcx" Apr 16 19:54:30.261889 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:30.261874 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:30.261932 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:30.261914 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/165d7242-2ef1-481d-992a-09e3364e0626-cert podName:165d7242-2ef1-481d-992a-09e3364e0626 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:46.261903453 +0000 UTC m=+63.984949342 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/165d7242-2ef1-481d-992a-09e3364e0626-cert") pod "ingress-canary-k2dcx" (UID: "165d7242-2ef1-481d-992a-09e3364e0626") : secret "canary-serving-cert" not found Apr 16 19:54:30.261932 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:30.261914 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8814ad85-70d4-48f0-8e96-6cc0a48c07eb-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xbnwr\" (UID: \"8814ad85-70d4-48f0-8e96-6cc0a48c07eb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xbnwr" Apr 16 19:54:30.262007 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:30.261993 2561 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:54:30.262039 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:30.262035 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8814ad85-70d4-48f0-8e96-6cc0a48c07eb-networking-console-plugin-cert podName:8814ad85-70d4-48f0-8e96-6cc0a48c07eb nodeName:}" failed. No retries permitted until 2026-04-16 19:54:46.262021088 +0000 UTC m=+63.985066976 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8814ad85-70d4-48f0-8e96-6cc0a48c07eb-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-xbnwr" (UID: "8814ad85-70d4-48f0-8e96-6cc0a48c07eb") : secret "networking-console-plugin-cert" not found Apr 16 19:54:30.362546 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:30.362498 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e50b93c-a156-438b-a41f-2b4bac946727-metrics-tls\") pod \"dns-default-cdc8w\" (UID: \"8e50b93c-a156-438b-a41f-2b4bac946727\") " pod="openshift-dns/dns-default-cdc8w" Apr 16 19:54:30.362651 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:30.362636 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:30.362688 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:30.362681 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e50b93c-a156-438b-a41f-2b4bac946727-metrics-tls podName:8e50b93c-a156-438b-a41f-2b4bac946727 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:46.362668385 +0000 UTC m=+64.085714277 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8e50b93c-a156-438b-a41f-2b4bac946727-metrics-tls") pod "dns-default-cdc8w" (UID: "8e50b93c-a156-438b-a41f-2b4bac946727") : secret "dns-default-metrics-tls" not found Apr 16 19:54:30.499985 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:30.499958 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-gf47w_2d72cccc-2e90-4646-b78f-8afabf5aee06/migrator/0.log" Apr 16 19:54:30.695329 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:30.695300 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-gf47w_2d72cccc-2e90-4646-b78f-8afabf5aee06/graceful-termination/0.log" Apr 16 19:54:30.896495 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:30.896470 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-spvh5_3e7e6010-fad1-4881-8816-b024c8853151/kube-storage-version-migrator-operator/0.log" Apr 16 19:54:34.486132 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:34.486088 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf" Apr 16 19:54:34.486132 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:34.486124 2561 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf" Apr 16 19:54:34.486603 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:34.486486 2561 scope.go:117] "RemoveContainer" containerID="8d23051cca59d5bc7aec1043baee8b464476c51c7d435626ff1c2d2a6471ed7c" Apr 16 19:54:34.486664 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:54:34.486646 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ln2rf_openshift-console-operator(be232f65-8167-4e83-83a8-d40670fbf702)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf" podUID="be232f65-8167-4e83-83a8-d40670fbf702" Apr 16 19:54:38.989431 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:38.989398 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k4p2g" Apr 16 19:54:39.019015 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:39.018968 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-fvwsf" podStartSLOduration=10.018954055 podStartE2EDuration="10.018954055s" podCreationTimestamp="2026-04-16 19:54:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:30.124715562 +0000 UTC m=+47.847761467" watchObservedRunningTime="2026-04-16 19:54:39.018954055 +0000 UTC m=+56.741999961" Apr 16 19:54:46.108062 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.108024 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:54:46.108484 ip-10-0-128-201 
Apr 16 19:54:46.108484 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.108110 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18721546-1063-46a1-8715-a40872933b22-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bjppc\" (UID: \"18721546-1063-46a1-8715-a40872933b22\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bjppc"
Apr 16 19:54:46.110381 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.110350 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls\") pod \"image-registry-c684876c-h8496\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " pod="openshift-image-registry/image-registry-c684876c-h8496"
Apr 16 19:54:46.110509 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.110484 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18721546-1063-46a1-8715-a40872933b22-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bjppc\" (UID: \"18721546-1063-46a1-8715-a40872933b22\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bjppc"
Apr 16 19:54:46.209511 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.209474 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-metrics-certs\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq"
Apr 16 19:54:46.209675 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.209575 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3bac591-c06b-4b40-b42a-f85548b297f0-service-ca-bundle\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq"
Apr 16 19:54:46.209675 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.209623 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0f95dfef-42c3-454a-9807-3c895a970729-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tpdvb\" (UID: \"0f95dfef-42c3-454a-9807-3c895a970729\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpdvb"
Apr 16 19:54:46.210295 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.210266 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3bac591-c06b-4b40-b42a-f85548b297f0-service-ca-bundle\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq"
Apr 16 19:54:46.211819 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.211776 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3bac591-c06b-4b40-b42a-f85548b297f0-metrics-certs\") pod \"router-default-98598b8fb-wkrmq\" (UID: \"b3bac591-c06b-4b40-b42a-f85548b297f0\") " pod="openshift-ingress/router-default-98598b8fb-wkrmq"
Apr 16 19:54:46.211995 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.211975 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/0f95dfef-42c3-454a-9807-3c895a970729-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tpdvb\" (UID: \"0f95dfef-42c3-454a-9807-3c895a970729\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpdvb"
Apr 16 19:54:46.273818 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.273767 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jx8h5\""
Apr 16 19:54:46.281614 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.281596 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-c684876c-h8496"
Apr 16 19:54:46.310148 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.310103 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8814ad85-70d4-48f0-8e96-6cc0a48c07eb-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xbnwr\" (UID: \"8814ad85-70d4-48f0-8e96-6cc0a48c07eb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xbnwr"
Apr 16 19:54:46.310304 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.310193 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/165d7242-2ef1-481d-992a-09e3364e0626-cert\") pod \"ingress-canary-k2dcx\" (UID: \"165d7242-2ef1-481d-992a-09e3364e0626\") " pod="openshift-ingress-canary/ingress-canary-k2dcx"
Apr 16 19:54:46.312465 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.312443 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/165d7242-2ef1-481d-992a-09e3364e0626-cert\") pod \"ingress-canary-k2dcx\" (UID: \"165d7242-2ef1-481d-992a-09e3364e0626\") " pod="openshift-ingress-canary/ingress-canary-k2dcx"
Apr 16 19:54:46.312561 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.312549 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8814ad85-70d4-48f0-8e96-6cc0a48c07eb-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xbnwr\" (UID: \"8814ad85-70d4-48f0-8e96-6cc0a48c07eb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xbnwr"
Apr 16 19:54:46.335578 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.335549 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-qgkcf\""
Apr 16 19:54:46.343281 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.343260 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bjppc"
Apr 16 19:54:46.361161 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.361029 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-p7c6q\""
Apr 16 19:54:46.368710 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.368660 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-98598b8fb-wkrmq"
Apr 16 19:54:46.408387 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.407696 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-c684876c-h8496"]
Apr 16 19:54:46.408387 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.408192 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-dwpjk\""
Apr 16 19:54:46.410856 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.410834 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e50b93c-a156-438b-a41f-2b4bac946727-metrics-tls\") pod \"dns-default-cdc8w\" (UID: \"8e50b93c-a156-438b-a41f-2b4bac946727\") " pod="openshift-dns/dns-default-cdc8w"
Apr 16 19:54:46.412834 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:54:46.411550 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode54086bc_258c_4204_8105_7a5e491494fa.slice/crio-0d43cde37db8d99d569bc40e5ff5b1824e6095da0199d6223ddf6787b3cada5b WatchSource:0}: Error finding container 0d43cde37db8d99d569bc40e5ff5b1824e6095da0199d6223ddf6787b3cada5b: Status 404 returned error can't find the container with id 0d43cde37db8d99d569bc40e5ff5b1824e6095da0199d6223ddf6787b3cada5b
Apr 16 19:54:46.414308 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.414131 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpdvb"
Apr 16 19:54:46.427196 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.426131 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e50b93c-a156-438b-a41f-2b4bac946727-metrics-tls\") pod \"dns-default-cdc8w\" (UID: \"8e50b93c-a156-438b-a41f-2b4bac946727\") " pod="openshift-dns/dns-default-cdc8w"
Apr 16 19:54:46.484304 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.484263 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bjppc"]
Apr 16 19:54:46.488343 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.488316 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-z96j5\""
Apr 16 19:54:46.495420 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.495381 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-xbnwr"
Apr 16 19:54:46.521191 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.521137 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-98598b8fb-wkrmq"]
Apr 16 19:54:46.524863 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:54:46.524829 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3bac591_c06b_4b40_b42a_f85548b297f0.slice/crio-6a567ba60ebba1689f457df2d5cc17e78ec03488fee8b6eef740fe2c925d7fad WatchSource:0}: Error finding container 6a567ba60ebba1689f457df2d5cc17e78ec03488fee8b6eef740fe2c925d7fad: Status 404 returned error can't find the container with id 6a567ba60ebba1689f457df2d5cc17e78ec03488fee8b6eef740fe2c925d7fad
Apr 16 19:54:46.535853 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.535826 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mmrlb\""
Apr 16 19:54:46.542489 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.542469 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dknqq\""
Apr 16 19:54:46.543191 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.543174 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k2dcx"
Apr 16 19:54:46.550445 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.550423 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cdc8w"
Apr 16 19:54:46.565812 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.565637 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-tpdvb"]
Apr 16 19:54:46.586638 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:54:46.586440 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f95dfef_42c3_454a_9807_3c895a970729.slice/crio-7202f2fa42e70131d8d57e2f22ac6310cc280297ba1a6be8411bc028e9b5cc91 WatchSource:0}: Error finding container 7202f2fa42e70131d8d57e2f22ac6310cc280297ba1a6be8411bc028e9b5cc91: Status 404 returned error can't find the container with id 7202f2fa42e70131d8d57e2f22ac6310cc280297ba1a6be8411bc028e9b5cc91
Apr 16 19:54:46.664225 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.664164 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-xbnwr"]
Apr 16 19:54:46.669237 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:54:46.669201 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8814ad85_70d4_48f0_8e96_6cc0a48c07eb.slice/crio-f45d01fae45f614345d415c70abaf7cd5836cb8cb0435ff1beaf6e1ed4b23540 WatchSource:0}: Error finding container f45d01fae45f614345d415c70abaf7cd5836cb8cb0435ff1beaf6e1ed4b23540: Status 404 returned error can't find the container with id f45d01fae45f614345d415c70abaf7cd5836cb8cb0435ff1beaf6e1ed4b23540
Apr 16 19:54:46.735852 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.735824 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k2dcx"]
Apr 16 19:54:46.737722 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:54:46.737695 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod165d7242_2ef1_481d_992a_09e3364e0626.slice/crio-2aea93c0b3eb9764052bd9aae6669e6845fce9f782c7c45de5cfdbc028b5c0c6 WatchSource:0}: Error finding container 2aea93c0b3eb9764052bd9aae6669e6845fce9f782c7c45de5cfdbc028b5c0c6: Status 404 returned error can't find the container with id 2aea93c0b3eb9764052bd9aae6669e6845fce9f782c7c45de5cfdbc028b5c0c6
Apr 16 19:54:46.756269 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:46.756242 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cdc8w"]
Apr 16 19:54:46.759294 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:54:46.759263 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e50b93c_a156_438b_a41f_2b4bac946727.slice/crio-d0a4c23931298006b0b091bc8049eb090e3dcc05d613938c07aa494792da9b43 WatchSource:0}: Error finding container d0a4c23931298006b0b091bc8049eb090e3dcc05d613938c07aa494792da9b43: Status 404 returned error can't find the container with id d0a4c23931298006b0b091bc8049eb090e3dcc05d613938c07aa494792da9b43
Apr 16 19:54:47.152146 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:47.151155 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c684876c-h8496" event={"ID":"e54086bc-258c-4204-8105-7a5e491494fa","Type":"ContainerStarted","Data":"c6c6ce8213b2384d8c139d7729ac85e589a116234e86c28b6e1e1b49a04ecd54"}
Apr 16 19:54:47.152146 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:47.151204 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c684876c-h8496" event={"ID":"e54086bc-258c-4204-8105-7a5e491494fa","Type":"ContainerStarted","Data":"0d43cde37db8d99d569bc40e5ff5b1824e6095da0199d6223ddf6787b3cada5b"}
Apr 16 19:54:47.152146 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:47.152087 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-c684876c-h8496"
Apr 16 19:54:47.154290 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:47.154226 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k2dcx" event={"ID":"165d7242-2ef1-481d-992a-09e3364e0626","Type":"ContainerStarted","Data":"2aea93c0b3eb9764052bd9aae6669e6845fce9f782c7c45de5cfdbc028b5c0c6"}
Apr 16 19:54:47.155621 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:47.155597 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-xbnwr" event={"ID":"8814ad85-70d4-48f0-8e96-6cc0a48c07eb","Type":"ContainerStarted","Data":"f45d01fae45f614345d415c70abaf7cd5836cb8cb0435ff1beaf6e1ed4b23540"}
Apr 16 19:54:47.156950 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:47.156927 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bjppc" event={"ID":"18721546-1063-46a1-8715-a40872933b22","Type":"ContainerStarted","Data":"ea663cb9590ed3e72e11b3ac3c891bf54312362e04b688b89c9f768854ab73f6"}
Apr 16 19:54:47.159099 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:47.159055 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cdc8w" event={"ID":"8e50b93c-a156-438b-a41f-2b4bac946727","Type":"ContainerStarted","Data":"d0a4c23931298006b0b091bc8049eb090e3dcc05d613938c07aa494792da9b43"}
Apr 16 19:54:47.161244 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:47.161198
2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpdvb" event={"ID":"0f95dfef-42c3-454a-9807-3c895a970729","Type":"ContainerStarted","Data":"7202f2fa42e70131d8d57e2f22ac6310cc280297ba1a6be8411bc028e9b5cc91"} Apr 16 19:54:47.163742 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:47.163715 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-98598b8fb-wkrmq" event={"ID":"b3bac591-c06b-4b40-b42a-f85548b297f0","Type":"ContainerStarted","Data":"7ec40583cfe978b33e773ba2a6e539a5d60b05a9e6c22ab1c55cd4b69a561ed9"} Apr 16 19:54:47.163742 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:47.163749 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-98598b8fb-wkrmq" event={"ID":"b3bac591-c06b-4b40-b42a-f85548b297f0","Type":"ContainerStarted","Data":"6a567ba60ebba1689f457df2d5cc17e78ec03488fee8b6eef740fe2c925d7fad"} Apr 16 19:54:47.230147 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:47.228637 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-c684876c-h8496" podStartSLOduration=64.228618637 podStartE2EDuration="1m4.228618637s" podCreationTimestamp="2026-04-16 19:53:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:47.193207575 +0000 UTC m=+64.916253484" watchObservedRunningTime="2026-04-16 19:54:47.228618637 +0000 UTC m=+64.951664543" Apr 16 19:54:47.370710 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:47.369839 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-98598b8fb-wkrmq" Apr 16 19:54:47.373569 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:47.373362 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-98598b8fb-wkrmq" Apr 16 19:54:47.395649 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:47.395350 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-98598b8fb-wkrmq" podStartSLOduration=41.39533241 podStartE2EDuration="41.39533241s" podCreationTimestamp="2026-04-16 19:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:47.230117293 +0000 UTC m=+64.953163201" watchObservedRunningTime="2026-04-16 19:54:47.39533241 +0000 UTC m=+65.118378351" Apr 16 19:54:48.167072 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:48.167040 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-98598b8fb-wkrmq" Apr 16 19:54:48.168412 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:48.168389 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-98598b8fb-wkrmq" Apr 16 19:54:48.533135 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:48.533061 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0b9420a-1c3e-47b5-b187-827cb7f39aea-metrics-certs\") pod \"network-metrics-daemon-nx45q\" (UID: \"e0b9420a-1c3e-47b5-b187-827cb7f39aea\") " pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:54:48.535283 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:48.535266 2561 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0b9420a-1c3e-47b5-b187-827cb7f39aea-metrics-certs\") pod \"network-metrics-daemon-nx45q\" (UID: \"e0b9420a-1c3e-47b5-b187-827cb7f39aea\") " pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:54:48.750260 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:48.750228 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-t9ktw\"" Apr 16 19:54:48.757771 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:48.757740 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nx45q" Apr 16 19:54:48.836089 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:48.836012 2561 scope.go:117] "RemoveContainer" containerID="8d23051cca59d5bc7aec1043baee8b464476c51c7d435626ff1c2d2a6471ed7c" Apr 16 19:54:50.624749 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:50.624719 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9wh9h"] Apr 16 19:54:50.636750 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:50.636725 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9wh9h" Apr 16 19:54:50.642994 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:50.642759 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 19:54:50.642994 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:50.642759 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 19:54:50.643164 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:50.642766 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-676kc\"" Apr 16 19:54:50.643365 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:50.643344 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9wh9h"] Apr 16 19:54:50.751905 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:50.751838 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz24w\" (UniqueName: \"kubernetes.io/projected/74e6b4dd-9a81-4310-8471-186da2714610-kube-api-access-rz24w\") pod \"insights-runtime-extractor-9wh9h\" (UID: \"74e6b4dd-9a81-4310-8471-186da2714610\") " pod="openshift-insights/insights-runtime-extractor-9wh9h" Apr 16 19:54:50.751905 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:50.751876 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/74e6b4dd-9a81-4310-8471-186da2714610-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9wh9h\" (UID: \"74e6b4dd-9a81-4310-8471-186da2714610\") " pod="openshift-insights/insights-runtime-extractor-9wh9h" Apr 16 19:54:50.752061 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:50.751905 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/74e6b4dd-9a81-4310-8471-186da2714610-crio-socket\") pod \"insights-runtime-extractor-9wh9h\" (UID: \"74e6b4dd-9a81-4310-8471-186da2714610\") " pod="openshift-insights/insights-runtime-extractor-9wh9h" Apr 
16 19:54:50.752061 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:50.751947 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/74e6b4dd-9a81-4310-8471-186da2714610-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9wh9h\" (UID: \"74e6b4dd-9a81-4310-8471-186da2714610\") " pod="openshift-insights/insights-runtime-extractor-9wh9h" Apr 16 19:54:50.752061 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:50.752011 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/74e6b4dd-9a81-4310-8471-186da2714610-data-volume\") pod \"insights-runtime-extractor-9wh9h\" (UID: \"74e6b4dd-9a81-4310-8471-186da2714610\") " pod="openshift-insights/insights-runtime-extractor-9wh9h" Apr 16 19:54:50.853063 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:50.853033 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/74e6b4dd-9a81-4310-8471-186da2714610-crio-socket\") pod \"insights-runtime-extractor-9wh9h\" (UID: \"74e6b4dd-9a81-4310-8471-186da2714610\") " pod="openshift-insights/insights-runtime-extractor-9wh9h" Apr 16 19:54:50.853213 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:50.853082 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/74e6b4dd-9a81-4310-8471-186da2714610-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9wh9h\" (UID: \"74e6b4dd-9a81-4310-8471-186da2714610\") " pod="openshift-insights/insights-runtime-extractor-9wh9h" Apr 16 19:54:50.853213 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:50.853118 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/74e6b4dd-9a81-4310-8471-186da2714610-data-volume\") pod \"insights-runtime-extractor-9wh9h\" (UID: \"74e6b4dd-9a81-4310-8471-186da2714610\") " pod="openshift-insights/insights-runtime-extractor-9wh9h" Apr 16 19:54:50.853213 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:50.853141 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rz24w\" (UniqueName: \"kubernetes.io/projected/74e6b4dd-9a81-4310-8471-186da2714610-kube-api-access-rz24w\") pod \"insights-runtime-extractor-9wh9h\" (UID: \"74e6b4dd-9a81-4310-8471-186da2714610\") " pod="openshift-insights/insights-runtime-extractor-9wh9h" Apr 16 19:54:50.853213 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:50.853155 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/74e6b4dd-9a81-4310-8471-186da2714610-crio-socket\") pod \"insights-runtime-extractor-9wh9h\" (UID: \"74e6b4dd-9a81-4310-8471-186da2714610\") " pod="openshift-insights/insights-runtime-extractor-9wh9h" Apr 16 19:54:50.853213 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:50.853165 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/74e6b4dd-9a81-4310-8471-186da2714610-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9wh9h\" (UID: \"74e6b4dd-9a81-4310-8471-186da2714610\") " pod="openshift-insights/insights-runtime-extractor-9wh9h" Apr 16 19:54:50.853486 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:50.853470 2561 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/74e6b4dd-9a81-4310-8471-186da2714610-data-volume\") pod \"insights-runtime-extractor-9wh9h\" (UID: \"74e6b4dd-9a81-4310-8471-186da2714610\") " pod="openshift-insights/insights-runtime-extractor-9wh9h" Apr 16 19:54:50.853715 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:50.853689 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/74e6b4dd-9a81-4310-8471-186da2714610-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9wh9h\" (UID: \"74e6b4dd-9a81-4310-8471-186da2714610\") " pod="openshift-insights/insights-runtime-extractor-9wh9h" Apr 16 19:54:50.864168 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:50.864145 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/74e6b4dd-9a81-4310-8471-186da2714610-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9wh9h\" (UID: \"74e6b4dd-9a81-4310-8471-186da2714610\") " pod="openshift-insights/insights-runtime-extractor-9wh9h" Apr 16 19:54:50.871453 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:50.871429 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz24w\" (UniqueName: \"kubernetes.io/projected/74e6b4dd-9a81-4310-8471-186da2714610-kube-api-access-rz24w\") pod \"insights-runtime-extractor-9wh9h\" (UID: \"74e6b4dd-9a81-4310-8471-186da2714610\") " pod="openshift-insights/insights-runtime-extractor-9wh9h" Apr 16 19:54:50.947499 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:50.947477 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9wh9h" Apr 16 19:54:51.121678 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:51.121630 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nx45q"] Apr 16 19:54:51.126035 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:54:51.126010 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0b9420a_1c3e_47b5_b187_827cb7f39aea.slice/crio-c2b4771ba6e90f335a554a636fbcff7f07ab50cd83045866ed7b41108214d5af WatchSource:0}: Error finding container c2b4771ba6e90f335a554a636fbcff7f07ab50cd83045866ed7b41108214d5af: Status 404 returned error can't find the container with id c2b4771ba6e90f335a554a636fbcff7f07ab50cd83045866ed7b41108214d5af Apr 16 19:54:51.147044 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:51.146998 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9wh9h"] Apr 16 19:54:51.154842 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:54:51.154812 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74e6b4dd_9a81_4310_8471_186da2714610.slice/crio-f21d9774d218038a09671c57c7cde2a601f15c7bc190c7b8ddcb41ebc64a7ed5 WatchSource:0}: Error finding container f21d9774d218038a09671c57c7cde2a601f15c7bc190c7b8ddcb41ebc64a7ed5: Status 404 returned error can't find the container with id f21d9774d218038a09671c57c7cde2a601f15c7bc190c7b8ddcb41ebc64a7ed5 Apr 16 19:54:51.177586 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:51.177525 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k2dcx" 
event={"ID":"165d7242-2ef1-481d-992a-09e3364e0626","Type":"ContainerStarted","Data":"9fb671dba43ca4c231cc81bcc78d1a89e522af9ec4e11761c5cc4242001abed9"} Apr 16 19:54:51.179330 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:51.179278 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-xbnwr" event={"ID":"8814ad85-70d4-48f0-8e96-6cc0a48c07eb","Type":"ContainerStarted","Data":"4fa8f7f57c671a6fb5e1f575d9ee58dced18fb6aa1e025888b01209bfd4cfe44"} Apr 16 19:54:51.180662 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:51.180641 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9wh9h" event={"ID":"74e6b4dd-9a81-4310-8471-186da2714610","Type":"ContainerStarted","Data":"f21d9774d218038a09671c57c7cde2a601f15c7bc190c7b8ddcb41ebc64a7ed5"} Apr 16 19:54:51.182263 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:51.182240 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpdvb" event={"ID":"0f95dfef-42c3-454a-9807-3c895a970729","Type":"ContainerStarted","Data":"339327b92965e70e0da4bf4cee1ead89bb30e5ecb72935415e3512fedf666843"} Apr 16 19:54:51.186458 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:51.186077 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/1.log" Apr 16 19:54:51.186458 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:51.186146 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf" event={"ID":"be232f65-8167-4e83-83a8-d40670fbf702","Type":"ContainerStarted","Data":"f747d1fcb421c6cb0f2be42d3b76a3d8d4a198490d38d0171529d265cad7f602"} Apr 16 19:54:51.186944 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:51.186903 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf" Apr 16 19:54:51.191315 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:51.190102 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nx45q" event={"ID":"e0b9420a-1c3e-47b5-b187-827cb7f39aea","Type":"ContainerStarted","Data":"c2b4771ba6e90f335a554a636fbcff7f07ab50cd83045866ed7b41108214d5af"} Apr 16 19:54:51.336058 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:51.335961 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf" podStartSLOduration=36.406443956 podStartE2EDuration="45.335933636s" podCreationTimestamp="2026-04-16 19:54:06 +0000 UTC" firstStartedPulling="2026-04-16 19:54:16.408170428 +0000 UTC m=+34.131216314" lastFinishedPulling="2026-04-16 19:54:25.337660103 +0000 UTC m=+43.060705994" observedRunningTime="2026-04-16 19:54:51.334812734 +0000 UTC m=+69.057858641" watchObservedRunningTime="2026-04-16 19:54:51.335933636 +0000 UTC m=+69.058979537" Apr 16 19:54:51.338972 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:51.337059 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tpdvb" podStartSLOduration=40.970025377 podStartE2EDuration="45.337047326s" podCreationTimestamp="2026-04-16 19:54:06 +0000 UTC" firstStartedPulling="2026-04-16 19:54:46.588947086 +0000 UTC m=+64.311992990" lastFinishedPulling="2026-04-16 19:54:50.955969045 
+0000 UTC m=+68.679014939" observedRunningTime="2026-04-16 19:54:51.235081722 +0000 UTC m=+68.958127630" watchObservedRunningTime="2026-04-16 19:54:51.337047326 +0000 UTC m=+69.060093229" Apr 16 19:54:52.092358 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:52.092327 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-ln2rf" Apr 16 19:54:52.202646 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:52.202535 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bjppc" event={"ID":"18721546-1063-46a1-8715-a40872933b22","Type":"ContainerStarted","Data":"1dd418614ba4ec8be473eeea8db6cab64c33ec9c729312a8863369b70019b537"} Apr 16 19:54:52.202646 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:52.202579 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bjppc" event={"ID":"18721546-1063-46a1-8715-a40872933b22","Type":"ContainerStarted","Data":"486a36bb12160b649f09e6a2bd2cd664996843d26961327c88c5f45e04d90ebb"} Apr 16 19:54:52.204224 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:52.204165 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9wh9h" event={"ID":"74e6b4dd-9a81-4310-8471-186da2714610","Type":"ContainerStarted","Data":"06c3eb71eeca09eff296e53659a4b4499ae158b8891d40cb7edc1c7065b23682"} Apr 16 19:54:52.206887 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:52.206862 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cdc8w" event={"ID":"8e50b93c-a156-438b-a41f-2b4bac946727","Type":"ContainerStarted","Data":"e712fcfe33cc8ecbc738466582bef2bb0488e4e3b5a78145a009e27185ebb3cc"} Apr 16 19:54:52.206992 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:52.206972 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cdc8w" event={"ID":"8e50b93c-a156-438b-a41f-2b4bac946727","Type":"ContainerStarted","Data":"53586298f9e254f75e060594919f9340340ea106921c3aeea8510e8b36274bdc"} Apr 16 19:54:52.234201 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:52.234141 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bjppc" podStartSLOduration=41.812512356 podStartE2EDuration="46.23412263s" podCreationTimestamp="2026-04-16 19:54:06 +0000 UTC" firstStartedPulling="2026-04-16 19:54:46.543128076 +0000 UTC m=+64.266173960" lastFinishedPulling="2026-04-16 19:54:50.964738337 +0000 UTC m=+68.687784234" observedRunningTime="2026-04-16 19:54:52.23264123 +0000 UTC m=+69.955687137" watchObservedRunningTime="2026-04-16 19:54:52.23412263 +0000 UTC m=+69.957168537" Apr 16 19:54:52.298914 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:52.298867 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-k2dcx" podStartSLOduration=34.082278034 podStartE2EDuration="38.298853507s" podCreationTimestamp="2026-04-16 19:54:14 +0000 UTC" firstStartedPulling="2026-04-16 19:54:46.739730416 +0000 UTC m=+64.462776309" lastFinishedPulling="2026-04-16 19:54:50.956305885 +0000 UTC m=+68.679351782" observedRunningTime="2026-04-16 19:54:52.297864377 +0000 UTC m=+70.020910280" watchObservedRunningTime="2026-04-16 19:54:52.298853507 +0000 UTC m=+70.021899413" Apr 16 19:54:52.327431 ip-10-0-128-201 kubenswrapper[2561]: I0416 
19:54:52.327385 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-cdc8w" podStartSLOduration=34.132313995 podStartE2EDuration="38.327368883s" podCreationTimestamp="2026-04-16 19:54:14 +0000 UTC" firstStartedPulling="2026-04-16 19:54:46.761540869 +0000 UTC m=+64.484586755" lastFinishedPulling="2026-04-16 19:54:50.956595744 +0000 UTC m=+68.679641643" observedRunningTime="2026-04-16 19:54:52.326715106 +0000 UTC m=+70.049761016" watchObservedRunningTime="2026-04-16 19:54:52.327368883 +0000 UTC m=+70.050414790" Apr 16 19:54:52.363757 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:52.363653 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-xbnwr" podStartSLOduration=37.081922583 podStartE2EDuration="41.363635811s" podCreationTimestamp="2026-04-16 19:54:11 +0000 UTC" firstStartedPulling="2026-04-16 19:54:46.674144665 +0000 UTC m=+64.397190563" lastFinishedPulling="2026-04-16 19:54:50.955857904 +0000 UTC m=+68.678903791" observedRunningTime="2026-04-16 19:54:52.363477657 +0000 UTC m=+70.086523563" watchObservedRunningTime="2026-04-16 19:54:52.363635811 +0000 UTC m=+70.086681718" Apr 16 19:54:53.211866 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:53.211838 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9wh9h" event={"ID":"74e6b4dd-9a81-4310-8471-186da2714610","Type":"ContainerStarted","Data":"b317464b406fa258522e15c044f6a9457b28d82f25bebde132b17f5d1653ebb1"} Apr 16 19:54:53.213212 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:53.213188 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nx45q" event={"ID":"e0b9420a-1c3e-47b5-b187-827cb7f39aea","Type":"ContainerStarted","Data":"f567e16f456268c5d969adf067888eaf7151ddefd36b14f128c62c9dfb78c8c7"} Apr 16 19:54:53.213608 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:53.213587 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-cdc8w" Apr 16 19:54:54.222609 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:54.222567 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nx45q" event={"ID":"e0b9420a-1c3e-47b5-b187-827cb7f39aea","Type":"ContainerStarted","Data":"b2632962c3a20eb2307638c4f24953ec1ea762e7bc166e50a59bc73fb6ed3513"} Apr 16 19:54:54.242121 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:54.242063 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-nx45q" podStartSLOduration=69.472924779 podStartE2EDuration="1m11.242046113s" podCreationTimestamp="2026-04-16 19:53:43 +0000 UTC" firstStartedPulling="2026-04-16 19:54:51.129443529 +0000 UTC m=+68.852489427" lastFinishedPulling="2026-04-16 19:54:52.898564873 +0000 UTC m=+70.621610761" observedRunningTime="2026-04-16 19:54:54.239762623 +0000 UTC m=+71.962808531" watchObservedRunningTime="2026-04-16 19:54:54.242046113 +0000 UTC m=+71.965092000" Apr 16 19:54:55.229129 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:55.229091 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9wh9h" event={"ID":"74e6b4dd-9a81-4310-8471-186da2714610","Type":"ContainerStarted","Data":"849d60249b41f5f0a54b2316e9cbbe4347e6f26a3a9d3186ab40a8b539b1c109"} Apr 16 19:54:55.251550 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:55.251503 2561 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9wh9h" podStartSLOduration=1.686314745 podStartE2EDuration="5.25148807s" podCreationTimestamp="2026-04-16 19:54:50 +0000 UTC" firstStartedPulling="2026-04-16 19:54:51.289295591 +0000 UTC m=+69.012341483" lastFinishedPulling="2026-04-16 19:54:54.854468917 +0000 UTC m=+72.577514808" observedRunningTime="2026-04-16 19:54:55.251099772 +0000 UTC m=+72.974145689" watchObservedRunningTime="2026-04-16 19:54:55.25148807 +0000 UTC m=+72.974533975" Apr 16 19:54:57.086891 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:54:57.086853 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qnt55" Apr 16 19:55:01.317110 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.316980 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-nsnhg"] Apr 16 19:55:01.320540 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.320513 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.327602 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.327390 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 19:55:01.327602 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.327500 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 19:55:01.327602 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.327519 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 19:55:01.327889 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.327872 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 19:55:01.328723 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.328702 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-6xp24\"" Apr 16 19:55:01.428783 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.428750 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm6p9\" (UniqueName: \"kubernetes.io/projected/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-kube-api-access-dm6p9\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.428911 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.428810 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-sys\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.428911 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.428872 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-node-exporter-tls\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.429024 
ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.428922 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-node-exporter-wtmp\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.429024 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.428956 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-metrics-client-ca\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.429024 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.429018 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.429173 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.429040 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-node-exporter-accelerators-collector-config\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.429173 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.429066 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-node-exporter-textfile\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.429173 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.429087 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-root\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.530091 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.530066 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-node-exporter-tls\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.530217 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.530094 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-node-exporter-wtmp\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.530217 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.530134 2561 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-metrics-client-ca\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.530217 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.530183 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.530217 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.530210 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-node-exporter-accelerators-collector-config\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.530479 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:55:01.530233 2561 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 19:55:01.530479 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.530290 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-node-exporter-wtmp\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.530479 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:55:01.530308 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-node-exporter-tls podName:d24c0325-ce50-47e5-98c8-87e5ba46e3ca nodeName:}" failed. No retries permitted until 2026-04-16 19:55:02.030286537 +0000 UTC m=+79.753332425 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-node-exporter-tls") pod "node-exporter-nsnhg" (UID: "d24c0325-ce50-47e5-98c8-87e5ba46e3ca") : secret "node-exporter-tls" not found Apr 16 19:55:01.530753 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.530238 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-node-exporter-textfile\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.530885 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.530782 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-root\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.530885 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.530841 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dm6p9\" (UniqueName: \"kubernetes.io/projected/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-kube-api-access-dm6p9\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.530885 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.530873 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-metrics-client-ca\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.531041 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.530924 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-root\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.531041 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.530925 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-sys\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.531041 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.530877 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-sys\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.531403 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.531385 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-node-exporter-textfile\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.531561 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.531541 2561 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-node-exporter-accelerators-collector-config\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.532500 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.532485 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:01.542159 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:01.542137 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm6p9\" (UniqueName: \"kubernetes.io/projected/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-kube-api-access-dm6p9\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:02.035765 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:02.035732 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-node-exporter-tls\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:02.038094 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:02.038071 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d24c0325-ce50-47e5-98c8-87e5ba46e3ca-node-exporter-tls\") pod \"node-exporter-nsnhg\" (UID: \"d24c0325-ce50-47e5-98c8-87e5ba46e3ca\") " pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:02.230091 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:02.230062 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-nsnhg" Apr 16 19:55:02.238972 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:55:02.238938 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd24c0325_ce50_47e5_98c8_87e5ba46e3ca.slice/crio-94ff95909f487110d52f0256ed3acb545de33a675edb0f537a743532ab013841 WatchSource:0}: Error finding container 94ff95909f487110d52f0256ed3acb545de33a675edb0f537a743532ab013841: Status 404 returned error can't find the container with id 94ff95909f487110d52f0256ed3acb545de33a675edb0f537a743532ab013841 Apr 16 19:55:02.251970 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:02.251944 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nsnhg" event={"ID":"d24c0325-ce50-47e5-98c8-87e5ba46e3ca","Type":"ContainerStarted","Data":"94ff95909f487110d52f0256ed3acb545de33a675edb0f537a743532ab013841"} Apr 16 19:55:03.224629 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:03.224578 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-cdc8w" Apr 16 19:55:03.255929 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:03.255896 2561 generic.go:358] "Generic (PLEG): container finished" podID="d24c0325-ce50-47e5-98c8-87e5ba46e3ca" containerID="ab0ad7da24ed70b4b10031e82de368f80d06975a3a69096ff3ce58f1d1c48844" exitCode=0 Apr 16 19:55:03.256033 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:03.255971 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nsnhg" event={"ID":"d24c0325-ce50-47e5-98c8-87e5ba46e3ca","Type":"ContainerDied","Data":"ab0ad7da24ed70b4b10031e82de368f80d06975a3a69096ff3ce58f1d1c48844"} Apr 16 19:55:04.261730 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:04.261689 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nsnhg" event={"ID":"d24c0325-ce50-47e5-98c8-87e5ba46e3ca","Type":"ContainerStarted","Data":"eee40acda57d5356cb43f3980b0f037ef6160982b28d7551be5d844e1f514be4"} Apr 16 19:55:04.261730 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:04.261731 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nsnhg" event={"ID":"d24c0325-ce50-47e5-98c8-87e5ba46e3ca","Type":"ContainerStarted","Data":"4016450e5124e75c738abdda298f792524fabc0992fdeea3b16e5c8031adeb3e"} Apr 16 19:55:04.290702 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:04.290660 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-nsnhg" podStartSLOduration=2.57577575 podStartE2EDuration="3.290648526s" podCreationTimestamp="2026-04-16 19:55:01 +0000 UTC" firstStartedPulling="2026-04-16 19:55:02.240605112 +0000 UTC m=+79.963650996" lastFinishedPulling="2026-04-16 19:55:02.955477881 +0000 UTC m=+80.678523772" observedRunningTime="2026-04-16 19:55:04.289073913 +0000 UTC m=+82.012119821" watchObservedRunningTime="2026-04-16 19:55:04.290648526 +0000 UTC m=+82.013694431" Apr 16 19:55:06.286192 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:06.286159 2561 patch_prober.go:28] interesting pod/image-registry-c684876c-h8496 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 19:55:06.286589 ip-10-0-128-201 
kubenswrapper[2561]: I0416 19:55:06.286212 2561 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-c684876c-h8496" podUID="e54086bc-258c-4204-8105-7a5e491494fa" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 19:55:09.173958 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:09.173927 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:55:14.456322 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:14.456293 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-c684876c-h8496"] Apr 16 19:55:31.345364 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:31.345333 2561 generic.go:358] "Generic (PLEG): container finished" podID="3e7e6010-fad1-4881-8816-b024c8853151" containerID="8276f610bd9e2af177600a1040aafd963d7f772c7449b79c39970a9fb0bf1f71" exitCode=0 Apr 16 19:55:31.345725 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:31.345379 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-spvh5" event={"ID":"3e7e6010-fad1-4881-8816-b024c8853151","Type":"ContainerDied","Data":"8276f610bd9e2af177600a1040aafd963d7f772c7449b79c39970a9fb0bf1f71"} Apr 16 19:55:31.345725 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:31.345635 2561 scope.go:117] "RemoveContainer" containerID="8276f610bd9e2af177600a1040aafd963d7f772c7449b79c39970a9fb0bf1f71" Apr 16 19:55:32.350301 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:32.350264 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-spvh5" event={"ID":"3e7e6010-fad1-4881-8816-b024c8853151","Type":"ContainerStarted","Data":"ae53f1d66f9ceb34ff2fa8daa3b092beff0b4fa3ce891dff5bd46c6a4c6fec2b"} Apr 16 19:55:36.362747 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:36.362664 2561 generic.go:358] "Generic (PLEG): container finished" podID="b33484f0-9ae5-4b31-8baa-d4219e39ddd9" containerID="db25546ffd1112c644c80578a1b34789eaf59441cf5c91f4ffd1f1e302c2bd0c" exitCode=0 Apr 16 19:55:36.362747 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:36.362736 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-mwqfp" event={"ID":"b33484f0-9ae5-4b31-8baa-d4219e39ddd9","Type":"ContainerDied","Data":"db25546ffd1112c644c80578a1b34789eaf59441cf5c91f4ffd1f1e302c2bd0c"} Apr 16 19:55:36.363156 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:36.363069 2561 scope.go:117] "RemoveContainer" containerID="db25546ffd1112c644c80578a1b34789eaf59441cf5c91f4ffd1f1e302c2bd0c" Apr 16 19:55:37.028523 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:37.028492 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-98598b8fb-wkrmq_b3bac591-c06b-4b40-b42a-f85548b297f0/router/0.log" Apr 16 19:55:37.051327 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:37.051303 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-k2dcx_165d7242-2ef1-481d-992a-09e3364e0626/serve-healthcheck-canary/0.log" Apr 16 19:55:37.367300 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:37.367226 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-mwqfp" 
event={"ID":"b33484f0-9ae5-4b31-8baa-d4219e39ddd9","Type":"ContainerStarted","Data":"80d8c03dbad127ff24cd2b6637230b3ceb8bdbe324cab3c45cb8b3b7ed07ad1d"} Apr 16 19:55:39.475208 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.475164 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-c684876c-h8496" podUID="e54086bc-258c-4204-8105-7a5e491494fa" containerName="registry" containerID="cri-o://c6c6ce8213b2384d8c139d7729ac85e589a116234e86c28b6e1e1b49a04ecd54" gracePeriod=30 Apr 16 19:55:39.709089 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.709064 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:55:39.803045 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.802952 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e54086bc-258c-4204-8105-7a5e491494fa-ca-trust-extracted\") pod \"e54086bc-258c-4204-8105-7a5e491494fa\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " Apr 16 19:55:39.803045 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.803005 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e54086bc-258c-4204-8105-7a5e491494fa-registry-certificates\") pod \"e54086bc-258c-4204-8105-7a5e491494fa\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " Apr 16 19:55:39.803045 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.803027 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e54086bc-258c-4204-8105-7a5e491494fa-trusted-ca\") pod \"e54086bc-258c-4204-8105-7a5e491494fa\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " Apr 16 19:55:39.803318 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.803055 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-bound-sa-token\") pod \"e54086bc-258c-4204-8105-7a5e491494fa\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " Apr 16 19:55:39.803318 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.803113 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh8tt\" (UniqueName: \"kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-kube-api-access-gh8tt\") pod \"e54086bc-258c-4204-8105-7a5e491494fa\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " Apr 16 19:55:39.803318 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.803138 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e54086bc-258c-4204-8105-7a5e491494fa-installation-pull-secrets\") pod \"e54086bc-258c-4204-8105-7a5e491494fa\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " Apr 16 19:55:39.803318 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.803213 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e54086bc-258c-4204-8105-7a5e491494fa-image-registry-private-configuration\") pod \"e54086bc-258c-4204-8105-7a5e491494fa\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " Apr 16 19:55:39.803318 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.803242 2561 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls\") pod \"e54086bc-258c-4204-8105-7a5e491494fa\" (UID: \"e54086bc-258c-4204-8105-7a5e491494fa\") " Apr 16 19:55:39.803546 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.803403 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e54086bc-258c-4204-8105-7a5e491494fa-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e54086bc-258c-4204-8105-7a5e491494fa" (UID: "e54086bc-258c-4204-8105-7a5e491494fa"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:55:39.803546 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.803418 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e54086bc-258c-4204-8105-7a5e491494fa-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e54086bc-258c-4204-8105-7a5e491494fa" (UID: "e54086bc-258c-4204-8105-7a5e491494fa"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:55:39.805620 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.805590 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e54086bc-258c-4204-8105-7a5e491494fa" (UID: "e54086bc-258c-4204-8105-7a5e491494fa"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:55:39.805957 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.805924 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e54086bc-258c-4204-8105-7a5e491494fa-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "e54086bc-258c-4204-8105-7a5e491494fa" (UID: "e54086bc-258c-4204-8105-7a5e491494fa"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:55:39.806080 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.806015 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e54086bc-258c-4204-8105-7a5e491494fa" (UID: "e54086bc-258c-4204-8105-7a5e491494fa"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:55:39.806080 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.806040 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e54086bc-258c-4204-8105-7a5e491494fa-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e54086bc-258c-4204-8105-7a5e491494fa" (UID: "e54086bc-258c-4204-8105-7a5e491494fa"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:55:39.806156 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.806087 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-kube-api-access-gh8tt" (OuterVolumeSpecName: "kube-api-access-gh8tt") pod "e54086bc-258c-4204-8105-7a5e491494fa" (UID: "e54086bc-258c-4204-8105-7a5e491494fa"). 
InnerVolumeSpecName "kube-api-access-gh8tt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:55:39.813854 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.813828 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e54086bc-258c-4204-8105-7a5e491494fa-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e54086bc-258c-4204-8105-7a5e491494fa" (UID: "e54086bc-258c-4204-8105-7a5e491494fa"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:55:39.904162 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.904127 2561 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e54086bc-258c-4204-8105-7a5e491494fa-image-registry-private-configuration\") on node \"ip-10-0-128-201.ec2.internal\" DevicePath \"\"" Apr 16 19:55:39.904162 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.904157 2561 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-registry-tls\") on node \"ip-10-0-128-201.ec2.internal\" DevicePath \"\"" Apr 16 19:55:39.904162 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.904168 2561 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e54086bc-258c-4204-8105-7a5e491494fa-ca-trust-extracted\") on node \"ip-10-0-128-201.ec2.internal\" DevicePath \"\"" Apr 16 19:55:39.904384 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.904176 2561 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e54086bc-258c-4204-8105-7a5e491494fa-registry-certificates\") on node \"ip-10-0-128-201.ec2.internal\" DevicePath \"\"" Apr 16 19:55:39.904384 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.904185 2561 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e54086bc-258c-4204-8105-7a5e491494fa-trusted-ca\") on node \"ip-10-0-128-201.ec2.internal\" DevicePath \"\"" Apr 16 19:55:39.904384 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.904194 2561 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-bound-sa-token\") on node \"ip-10-0-128-201.ec2.internal\" DevicePath \"\"" Apr 16 19:55:39.904384 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.904202 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gh8tt\" (UniqueName: \"kubernetes.io/projected/e54086bc-258c-4204-8105-7a5e491494fa-kube-api-access-gh8tt\") on node \"ip-10-0-128-201.ec2.internal\" DevicePath \"\"" Apr 16 19:55:39.904384 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:39.904211 2561 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e54086bc-258c-4204-8105-7a5e491494fa-installation-pull-secrets\") on node \"ip-10-0-128-201.ec2.internal\" DevicePath \"\"" Apr 16 19:55:40.375585 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:40.375548 2561 generic.go:358] "Generic (PLEG): container finished" podID="e54086bc-258c-4204-8105-7a5e491494fa" containerID="c6c6ce8213b2384d8c139d7729ac85e589a116234e86c28b6e1e1b49a04ecd54" exitCode=0 Apr 16 19:55:40.375751 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:40.375608 2561 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-image-registry/image-registry-c684876c-h8496" Apr 16 19:55:40.375751 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:40.375626 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c684876c-h8496" event={"ID":"e54086bc-258c-4204-8105-7a5e491494fa","Type":"ContainerDied","Data":"c6c6ce8213b2384d8c139d7729ac85e589a116234e86c28b6e1e1b49a04ecd54"} Apr 16 19:55:40.375751 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:40.375666 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c684876c-h8496" event={"ID":"e54086bc-258c-4204-8105-7a5e491494fa","Type":"ContainerDied","Data":"0d43cde37db8d99d569bc40e5ff5b1824e6095da0199d6223ddf6787b3cada5b"} Apr 16 19:55:40.375751 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:40.375681 2561 scope.go:117] "RemoveContainer" containerID="c6c6ce8213b2384d8c139d7729ac85e589a116234e86c28b6e1e1b49a04ecd54" Apr 16 19:55:40.386414 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:40.386022 2561 scope.go:117] "RemoveContainer" containerID="c6c6ce8213b2384d8c139d7729ac85e589a116234e86c28b6e1e1b49a04ecd54" Apr 16 19:55:40.386657 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:55:40.386488 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6c6ce8213b2384d8c139d7729ac85e589a116234e86c28b6e1e1b49a04ecd54\": container with ID starting with c6c6ce8213b2384d8c139d7729ac85e589a116234e86c28b6e1e1b49a04ecd54 not found: ID does not exist" containerID="c6c6ce8213b2384d8c139d7729ac85e589a116234e86c28b6e1e1b49a04ecd54" Apr 16 19:55:40.386657 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:40.386528 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6c6ce8213b2384d8c139d7729ac85e589a116234e86c28b6e1e1b49a04ecd54"} err="failed to get container status \"c6c6ce8213b2384d8c139d7729ac85e589a116234e86c28b6e1e1b49a04ecd54\": rpc error: code = NotFound desc = could not find container \"c6c6ce8213b2384d8c139d7729ac85e589a116234e86c28b6e1e1b49a04ecd54\": container with ID starting with c6c6ce8213b2384d8c139d7729ac85e589a116234e86c28b6e1e1b49a04ecd54 not found: ID does not exist" Apr 16 19:55:40.400997 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:40.400970 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-c684876c-h8496"] Apr 16 19:55:40.404329 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:40.404311 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-c684876c-h8496"] Apr 16 19:55:40.839466 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:55:40.839424 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e54086bc-258c-4204-8105-7a5e491494fa" path="/var/lib/kubelet/pods/e54086bc-258c-4204-8105-7a5e491494fa/volumes" Apr 16 19:56:01.438942 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:56:01.438908 2561 generic.go:358] "Generic (PLEG): container finished" podID="3e23a4d9-ff98-49a4-a888-9d26648f61cf" containerID="5cef1d6ba75b88775ff27aaafb0ca8e90c5314adc98542ef8d896f0852129b02" exitCode=0 Apr 16 19:56:01.439293 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:56:01.438961 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-94k49" 
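The block above is an API-initiated pod deletion seen from the node: kubelet kills the "registry" container with the pod's 30-second grace period, unmounts and detaches each volume, and finally removes the orphaned volumes directory. A minimal sketch of the delete call that starts this sequence, assuming cluster access via the official Kubernetes Python client (the pod name and namespace are taken from the log; nothing here is part of the log itself):

    # Hypothetical reproduction of the API DELETE that precedes the
    # "Killing container with a grace period" ... gracePeriod=30 entry.
    from kubernetes import client, config

    config.load_kube_config()  # assumes a reachable cluster/kubeconfig
    v1 = client.CoreV1Api()
    v1.delete_namespaced_pod(
        name="image-registry-c684876c-h8496",
        namespace="openshift-image-registry",
        grace_period_seconds=30,  # matches gracePeriod=30 in the log
    )

After the container exits (exitCode=0 above), kubelet reports ContainerDied for both the container and its sandbox; the later NotFound "ContainerStatus from runtime service failed" error is a benign race where the second RemoveContainer call finds the container already gone from CRI-O.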
event={"ID":"3e23a4d9-ff98-49a4-a888-9d26648f61cf","Type":"ContainerDied","Data":"5cef1d6ba75b88775ff27aaafb0ca8e90c5314adc98542ef8d896f0852129b02"} Apr 16 19:56:01.439293 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:56:01.439261 2561 scope.go:117] "RemoveContainer" containerID="5cef1d6ba75b88775ff27aaafb0ca8e90c5314adc98542ef8d896f0852129b02" Apr 16 19:56:02.443015 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:56:02.442975 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-94k49" event={"ID":"3e23a4d9-ff98-49a4-a888-9d26648f61cf","Type":"ContainerStarted","Data":"167d8e993876012b36829da393ff67ca1e85bb0e95ea39078fc6b353636f0db7"} Apr 16 19:57:45.132908 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:45.132871 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4nrns"] Apr 16 19:57:45.133312 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:45.133151 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e54086bc-258c-4204-8105-7a5e491494fa" containerName="registry" Apr 16 19:57:45.133312 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:45.133161 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="e54086bc-258c-4204-8105-7a5e491494fa" containerName="registry" Apr 16 19:57:45.133312 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:45.133211 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="e54086bc-258c-4204-8105-7a5e491494fa" containerName="registry" Apr 16 19:57:45.134967 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:45.134951 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4nrns" Apr 16 19:57:45.137960 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:45.137934 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-vc4s8\"" Apr 16 19:57:45.138089 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:45.137935 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 19:57:45.138089 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:45.137945 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 19:57:45.138089 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:45.138008 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 19:57:45.146329 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:45.146311 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4nrns"] Apr 16 19:57:45.229688 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:45.229652 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z85rx\" (UniqueName: \"kubernetes.io/projected/09b22423-6955-4fb3-ad8c-39bf3fe6ea43-kube-api-access-z85rx\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4nrns\" (UID: \"09b22423-6955-4fb3-ad8c-39bf3fe6ea43\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4nrns" Apr 16 19:57:45.229854 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:45.229764 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" 
(UniqueName: \"kubernetes.io/secret/09b22423-6955-4fb3-ad8c-39bf3fe6ea43-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4nrns\" (UID: \"09b22423-6955-4fb3-ad8c-39bf3fe6ea43\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4nrns" Apr 16 19:57:45.330550 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:45.330514 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z85rx\" (UniqueName: \"kubernetes.io/projected/09b22423-6955-4fb3-ad8c-39bf3fe6ea43-kube-api-access-z85rx\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4nrns\" (UID: \"09b22423-6955-4fb3-ad8c-39bf3fe6ea43\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4nrns" Apr 16 19:57:45.330700 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:45.330598 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/09b22423-6955-4fb3-ad8c-39bf3fe6ea43-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4nrns\" (UID: \"09b22423-6955-4fb3-ad8c-39bf3fe6ea43\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4nrns" Apr 16 19:57:45.332908 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:45.332887 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/09b22423-6955-4fb3-ad8c-39bf3fe6ea43-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4nrns\" (UID: \"09b22423-6955-4fb3-ad8c-39bf3fe6ea43\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4nrns" Apr 16 19:57:45.339456 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:45.339437 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z85rx\" (UniqueName: \"kubernetes.io/projected/09b22423-6955-4fb3-ad8c-39bf3fe6ea43-kube-api-access-z85rx\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4nrns\" (UID: \"09b22423-6955-4fb3-ad8c-39bf3fe6ea43\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4nrns" Apr 16 19:57:45.444778 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:45.444705 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4nrns" Apr 16 19:57:45.568095 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:45.568066 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4nrns"] Apr 16 19:57:45.571324 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:57:45.571287 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09b22423_6955_4fb3_ad8c_39bf3fe6ea43.slice/crio-955ae5c0150fcf08ac320d1690031a2310d852cbed2a1f1668a18c9ce9db91ca WatchSource:0}: Error finding container 955ae5c0150fcf08ac320d1690031a2310d852cbed2a1f1668a18c9ce9db91ca: Status 404 returned error can't find the container with id 955ae5c0150fcf08ac320d1690031a2310d852cbed2a1f1668a18c9ce9db91ca Apr 16 19:57:45.728082 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:45.728007 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4nrns" event={"ID":"09b22423-6955-4fb3-ad8c-39bf3fe6ea43","Type":"ContainerStarted","Data":"955ae5c0150fcf08ac320d1690031a2310d852cbed2a1f1668a18c9ce9db91ca"} Apr 16 19:57:49.745855 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:49.745813 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4nrns" event={"ID":"09b22423-6955-4fb3-ad8c-39bf3fe6ea43","Type":"ContainerStarted","Data":"f68cfe6d9ae01b5bf656379fa5cd4c43acf4c942f7b6448b8b0c153b4c7e4b74"} Apr 16 19:57:49.746261 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:49.745968 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4nrns" Apr 16 19:57:49.768219 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:49.768164 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4nrns" podStartSLOduration=0.808079652 podStartE2EDuration="4.768147586s" podCreationTimestamp="2026-04-16 19:57:45 +0000 UTC" firstStartedPulling="2026-04-16 19:57:45.573176349 +0000 UTC m=+243.296222234" lastFinishedPulling="2026-04-16 19:57:49.533244281 +0000 UTC m=+247.256290168" observedRunningTime="2026-04-16 19:57:49.765512042 +0000 UTC m=+247.488557948" watchObservedRunningTime="2026-04-16 19:57:49.768147586 +0000 UTC m=+247.491193494" Apr 16 19:57:50.111211 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.111180 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-xbh8v"] Apr 16 19:57:50.113645 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.113626 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-xbh8v" Apr 16 19:57:50.116419 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.116400 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 19:57:50.116548 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.116529 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-fjh4h\"" Apr 16 19:57:50.116699 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.116680 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 19:57:50.124259 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.124233 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-xbh8v"] Apr 16 19:57:50.272032 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.272003 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/29a5f938-134b-49fd-a510-e46c12ff91c6-certificates\") pod \"keda-operator-ffbb595cb-xbh8v\" (UID: \"29a5f938-134b-49fd-a510-e46c12ff91c6\") " pod="openshift-keda/keda-operator-ffbb595cb-xbh8v" Apr 16 19:57:50.272162 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.272058 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/29a5f938-134b-49fd-a510-e46c12ff91c6-cabundle0\") pod \"keda-operator-ffbb595cb-xbh8v\" (UID: \"29a5f938-134b-49fd-a510-e46c12ff91c6\") " pod="openshift-keda/keda-operator-ffbb595cb-xbh8v" Apr 16 19:57:50.272162 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.272074 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knfdg\" (UniqueName: \"kubernetes.io/projected/29a5f938-134b-49fd-a510-e46c12ff91c6-kube-api-access-knfdg\") pod \"keda-operator-ffbb595cb-xbh8v\" (UID: \"29a5f938-134b-49fd-a510-e46c12ff91c6\") " pod="openshift-keda/keda-operator-ffbb595cb-xbh8v" Apr 16 19:57:50.337242 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.337215 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-qcf8b"] Apr 16 19:57:50.339457 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.339441 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qcf8b" Apr 16 19:57:50.342098 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.342079 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 19:57:50.350621 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.350600 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-qcf8b"] Apr 16 19:57:50.373450 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.373396 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/29a5f938-134b-49fd-a510-e46c12ff91c6-certificates\") pod \"keda-operator-ffbb595cb-xbh8v\" (UID: \"29a5f938-134b-49fd-a510-e46c12ff91c6\") " pod="openshift-keda/keda-operator-ffbb595cb-xbh8v" Apr 16 19:57:50.373450 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.373443 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/29a5f938-134b-49fd-a510-e46c12ff91c6-cabundle0\") pod \"keda-operator-ffbb595cb-xbh8v\" (UID: \"29a5f938-134b-49fd-a510-e46c12ff91c6\") " pod="openshift-keda/keda-operator-ffbb595cb-xbh8v" Apr 16 19:57:50.373574 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.373459 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-knfdg\" (UniqueName: \"kubernetes.io/projected/29a5f938-134b-49fd-a510-e46c12ff91c6-kube-api-access-knfdg\") pod \"keda-operator-ffbb595cb-xbh8v\" (UID: \"29a5f938-134b-49fd-a510-e46c12ff91c6\") " pod="openshift-keda/keda-operator-ffbb595cb-xbh8v" Apr 16 19:57:50.373574 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:57:50.373556 2561 secret.go:281] references non-existent secret key: ca.crt Apr 16 19:57:50.373647 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:57:50.373577 2561 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 19:57:50.373647 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:57:50.373588 2561 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-xbh8v: references non-existent secret key: ca.crt Apr 16 19:57:50.373647 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:57:50.373644 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29a5f938-134b-49fd-a510-e46c12ff91c6-certificates podName:29a5f938-134b-49fd-a510-e46c12ff91c6 nodeName:}" failed. No retries permitted until 2026-04-16 19:57:50.873625876 +0000 UTC m=+248.596671774 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/29a5f938-134b-49fd-a510-e46c12ff91c6-certificates") pod "keda-operator-ffbb595cb-xbh8v" (UID: "29a5f938-134b-49fd-a510-e46c12ff91c6") : references non-existent secret key: ca.crt Apr 16 19:57:50.374041 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.374023 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/29a5f938-134b-49fd-a510-e46c12ff91c6-cabundle0\") pod \"keda-operator-ffbb595cb-xbh8v\" (UID: \"29a5f938-134b-49fd-a510-e46c12ff91c6\") " pod="openshift-keda/keda-operator-ffbb595cb-xbh8v" Apr 16 19:57:50.381691 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.381667 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-knfdg\" (UniqueName: \"kubernetes.io/projected/29a5f938-134b-49fd-a510-e46c12ff91c6-kube-api-access-knfdg\") pod \"keda-operator-ffbb595cb-xbh8v\" (UID: \"29a5f938-134b-49fd-a510-e46c12ff91c6\") " pod="openshift-keda/keda-operator-ffbb595cb-xbh8v" Apr 16 19:57:50.473953 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.473930 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-259hr\" (UniqueName: \"kubernetes.io/projected/69eb2653-988a-4311-a6cc-908e3069bf9f-kube-api-access-259hr\") pod \"keda-metrics-apiserver-7c9f485588-qcf8b\" (UID: \"69eb2653-988a-4311-a6cc-908e3069bf9f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qcf8b" Apr 16 19:57:50.474057 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.473980 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/69eb2653-988a-4311-a6cc-908e3069bf9f-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qcf8b\" (UID: \"69eb2653-988a-4311-a6cc-908e3069bf9f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qcf8b" Apr 16 19:57:50.474057 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.474005 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/69eb2653-988a-4311-a6cc-908e3069bf9f-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-qcf8b\" (UID: \"69eb2653-988a-4311-a6cc-908e3069bf9f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qcf8b" Apr 16 19:57:50.575276 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.575248 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/69eb2653-988a-4311-a6cc-908e3069bf9f-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qcf8b\" (UID: \"69eb2653-988a-4311-a6cc-908e3069bf9f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qcf8b" Apr 16 19:57:50.575408 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.575303 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/69eb2653-988a-4311-a6cc-908e3069bf9f-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-qcf8b\" (UID: \"69eb2653-988a-4311-a6cc-908e3069bf9f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qcf8b" Apr 16 19:57:50.575408 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.575379 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-259hr\" (UniqueName: 
\"kubernetes.io/projected/69eb2653-988a-4311-a6cc-908e3069bf9f-kube-api-access-259hr\") pod \"keda-metrics-apiserver-7c9f485588-qcf8b\" (UID: \"69eb2653-988a-4311-a6cc-908e3069bf9f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qcf8b" Apr 16 19:57:50.575921 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:57:50.575574 2561 secret.go:281] references non-existent secret key: tls.crt Apr 16 19:57:50.575921 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:57:50.575599 2561 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 19:57:50.575921 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:57:50.575621 2561 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 16 19:57:50.575921 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:57:50.575643 2561 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-qcf8b: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 16 19:57:50.575921 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:57:50.575700 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/69eb2653-988a-4311-a6cc-908e3069bf9f-certificates podName:69eb2653-988a-4311-a6cc-908e3069bf9f nodeName:}" failed. No retries permitted until 2026-04-16 19:57:51.075682146 +0000 UTC m=+248.798728046 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/69eb2653-988a-4311-a6cc-908e3069bf9f-certificates") pod "keda-metrics-apiserver-7c9f485588-qcf8b" (UID: "69eb2653-988a-4311-a6cc-908e3069bf9f") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 16 19:57:50.575921 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.575882 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/69eb2653-988a-4311-a6cc-908e3069bf9f-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-qcf8b\" (UID: \"69eb2653-988a-4311-a6cc-908e3069bf9f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qcf8b" Apr 16 19:57:50.584625 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.584604 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-259hr\" (UniqueName: \"kubernetes.io/projected/69eb2653-988a-4311-a6cc-908e3069bf9f-kube-api-access-259hr\") pod \"keda-metrics-apiserver-7c9f485588-qcf8b\" (UID: \"69eb2653-988a-4311-a6cc-908e3069bf9f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qcf8b" Apr 16 19:57:50.877179 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:50.877150 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/29a5f938-134b-49fd-a510-e46c12ff91c6-certificates\") pod \"keda-operator-ffbb595cb-xbh8v\" (UID: \"29a5f938-134b-49fd-a510-e46c12ff91c6\") " pod="openshift-keda/keda-operator-ffbb595cb-xbh8v" Apr 16 19:57:50.877541 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:57:50.877251 2561 secret.go:281] references non-existent secret key: ca.crt Apr 16 19:57:50.877541 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:57:50.877263 2561 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 19:57:50.877541 ip-10-0-128-201 kubenswrapper[2561]: 
E0416 19:57:50.877271 2561 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-xbh8v: references non-existent secret key: ca.crt Apr 16 19:57:50.877541 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:57:50.877313 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29a5f938-134b-49fd-a510-e46c12ff91c6-certificates podName:29a5f938-134b-49fd-a510-e46c12ff91c6 nodeName:}" failed. No retries permitted until 2026-04-16 19:57:51.877301166 +0000 UTC m=+249.600347050 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/29a5f938-134b-49fd-a510-e46c12ff91c6-certificates") pod "keda-operator-ffbb595cb-xbh8v" (UID: "29a5f938-134b-49fd-a510-e46c12ff91c6") : references non-existent secret key: ca.crt Apr 16 19:57:51.079104 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:51.079076 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/69eb2653-988a-4311-a6cc-908e3069bf9f-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qcf8b\" (UID: \"69eb2653-988a-4311-a6cc-908e3069bf9f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qcf8b" Apr 16 19:57:51.081330 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:51.081312 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/69eb2653-988a-4311-a6cc-908e3069bf9f-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qcf8b\" (UID: \"69eb2653-988a-4311-a6cc-908e3069bf9f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qcf8b" Apr 16 19:57:51.252109 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:51.252012 2561 util.go:30] "No sandbox for pod can be found. 
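The MountVolume.SetUp failures above are a startup race rather than a misconfiguration: the projected "certificates" volumes reference keys (ca.crt for the operator, tls.crt for the metrics apiserver) that the kedaorg-certs and keda-metrics-apiserver-certs secrets do not yet contain, so kubelet backs off (durationBeforeRetry 500ms, then 1s) and the retried mount succeeds once the issuing controller populates them. A hypothetical diagnostic along the same lines, assuming cluster access via the Kubernetes Python client (none of this is in the log):

    # Hypothetical check: poll a secret until the keys that a projected
    # volume references exist. Kubelet's retries above stop once this holds.
    import time
    from kubernetes import client, config
    from kubernetes.client.rest import ApiException

    def wait_for_secret_keys(v1, namespace, name, keys, timeout=60.0):
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            try:
                data = v1.read_namespaced_secret(name, namespace).data or {}
                missing = [k for k in keys if k not in data]
                if not missing:
                    return
                print(f"{namespace}/{name}: missing keys {missing}")
            except ApiException as e:
                if e.status != 404:
                    raise
                print(f"{namespace}/{name}: secret not found yet")
            time.sleep(0.5)
        raise TimeoutError(f"{namespace}/{name}: keys {keys} never appeared")

    config.load_kube_config()
    v1 = client.CoreV1Api()
    # Key names taken from the errors above.
    wait_for_secret_keys(v1, "openshift-keda", "kedaorg-certs", ["tls.crt", "ca.crt"])

The 500ms-to-1s progression visible in the nestedpendingoperations entries is kubelet's per-volume exponential backoff; once the keys exist, the very next retry mounts cleanly, as the succeeding entries show.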
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qcf8b" Apr 16 19:57:51.375684 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:51.375660 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-qcf8b"] Apr 16 19:57:51.377828 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:57:51.377800 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69eb2653_988a_4311_a6cc_908e3069bf9f.slice/crio-613906dd59c04d91810c611e64abe2dbac7d96e7340b3c0c69dcf2ffa33a2166 WatchSource:0}: Error finding container 613906dd59c04d91810c611e64abe2dbac7d96e7340b3c0c69dcf2ffa33a2166: Status 404 returned error can't find the container with id 613906dd59c04d91810c611e64abe2dbac7d96e7340b3c0c69dcf2ffa33a2166 Apr 16 19:57:51.753219 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:51.753183 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qcf8b" event={"ID":"69eb2653-988a-4311-a6cc-908e3069bf9f","Type":"ContainerStarted","Data":"613906dd59c04d91810c611e64abe2dbac7d96e7340b3c0c69dcf2ffa33a2166"} Apr 16 19:57:51.887770 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:51.887735 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/29a5f938-134b-49fd-a510-e46c12ff91c6-certificates\") pod \"keda-operator-ffbb595cb-xbh8v\" (UID: \"29a5f938-134b-49fd-a510-e46c12ff91c6\") " pod="openshift-keda/keda-operator-ffbb595cb-xbh8v" Apr 16 19:57:51.890143 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:51.890121 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/29a5f938-134b-49fd-a510-e46c12ff91c6-certificates\") pod \"keda-operator-ffbb595cb-xbh8v\" (UID: \"29a5f938-134b-49fd-a510-e46c12ff91c6\") " pod="openshift-keda/keda-operator-ffbb595cb-xbh8v" Apr 16 19:57:51.923664 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:51.923638 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-xbh8v" Apr 16 19:57:52.047212 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:52.047184 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-xbh8v"] Apr 16 19:57:52.049631 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:57:52.049599 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29a5f938_134b_49fd_a510_e46c12ff91c6.slice/crio-4bcd73f42694503f7df1ae30887375bec2397cb9f7ab0f1f9c575e38388c3236 WatchSource:0}: Error finding container 4bcd73f42694503f7df1ae30887375bec2397cb9f7ab0f1f9c575e38388c3236: Status 404 returned error can't find the container with id 4bcd73f42694503f7df1ae30887375bec2397cb9f7ab0f1f9c575e38388c3236 Apr 16 19:57:52.756976 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:52.756936 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-xbh8v" event={"ID":"29a5f938-134b-49fd-a510-e46c12ff91c6","Type":"ContainerStarted","Data":"4bcd73f42694503f7df1ae30887375bec2397cb9f7ab0f1f9c575e38388c3236"} Apr 16 19:57:54.765776 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:54.765698 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qcf8b" event={"ID":"69eb2653-988a-4311-a6cc-908e3069bf9f","Type":"ContainerStarted","Data":"d79a367a204a5ee45273db0ec0ed87f7587b95449017a530370fcd6622919726"} Apr 16 19:57:54.766196 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:54.765822 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qcf8b" Apr 16 19:57:54.783649 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:54.783593 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qcf8b" podStartSLOduration=1.67812037 podStartE2EDuration="4.783579504s" podCreationTimestamp="2026-04-16 19:57:50 +0000 UTC" firstStartedPulling="2026-04-16 19:57:51.379143768 +0000 UTC m=+249.102189651" lastFinishedPulling="2026-04-16 19:57:54.484602897 +0000 UTC m=+252.207648785" observedRunningTime="2026-04-16 19:57:54.782917118 +0000 UTC m=+252.505963025" watchObservedRunningTime="2026-04-16 19:57:54.783579504 +0000 UTC m=+252.506625437" Apr 16 19:57:56.775844 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:56.775808 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-xbh8v" event={"ID":"29a5f938-134b-49fd-a510-e46c12ff91c6","Type":"ContainerStarted","Data":"cd949ff848f6de02ed5bf5a81842162c3cd1547be2c114ed34858ae003c67a77"} Apr 16 19:57:56.776232 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:56.776020 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-xbh8v" Apr 16 19:57:56.794017 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:57:56.793968 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-xbh8v" podStartSLOduration=3.134745322 podStartE2EDuration="6.793956688s" podCreationTimestamp="2026-04-16 19:57:50 +0000 UTC" firstStartedPulling="2026-04-16 19:57:52.051137592 +0000 UTC m=+249.774183481" lastFinishedPulling="2026-04-16 19:57:55.71034895 +0000 UTC m=+253.433394847" observedRunningTime="2026-04-16 19:57:56.792722737 +0000 UTC m=+254.515768643" watchObservedRunningTime="2026-04-16 
19:57:56.793956688 +0000 UTC m=+254.517002595" Apr 16 19:58:05.774950 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:05.774871 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qcf8b" Apr 16 19:58:10.751468 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:10.751435 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4nrns" Apr 16 19:58:17.781558 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:17.781524 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-xbh8v" Apr 16 19:58:42.757720 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:42.757684 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/1.log" Apr 16 19:58:42.758284 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:42.758200 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/1.log" Apr 16 19:58:42.762723 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:42.762699 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-acl-logging/0.log" Apr 16 19:58:42.762944 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:42.762923 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-acl-logging/0.log" Apr 16 19:58:42.769079 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:42.769061 2561 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 19:58:58.877648 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:58.877610 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-c7gx8"] Apr 16 19:58:58.880922 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:58.880905 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-c7gx8" Apr 16 19:58:58.884979 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:58.884928 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 19:58:58.884979 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:58.884965 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 19:58:58.885160 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:58.885062 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 19:58:58.886522 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:58.886504 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-sz52s\"" Apr 16 19:58:58.893760 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:58.893741 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-c7gx8"] Apr 16 19:58:58.922023 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:58.921999 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-jgbp8"] Apr 16 19:58:58.925419 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:58.925402 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-jgbp8" Apr 16 19:58:58.928278 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:58.928259 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 19:58:58.928397 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:58.928374 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-xf4q2\"" Apr 16 19:58:58.935839 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:58.935818 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-jgbp8"] Apr 16 19:58:58.984700 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:58.984671 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkggp\" (UniqueName: \"kubernetes.io/projected/eb93b256-25cf-46a1-b413-2e64d58db62b-kube-api-access-lkggp\") pod \"seaweedfs-86cc847c5c-jgbp8\" (UID: \"eb93b256-25cf-46a1-b413-2e64d58db62b\") " pod="kserve/seaweedfs-86cc847c5c-jgbp8" Apr 16 19:58:58.984858 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:58.984729 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f8ec62b-bea2-402c-86d7-d060b85a0a65-cert\") pod \"kserve-controller-manager-659c8cbdc-c7gx8\" (UID: \"9f8ec62b-bea2-402c-86d7-d060b85a0a65\") " pod="kserve/kserve-controller-manager-659c8cbdc-c7gx8" Apr 16 19:58:58.984858 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:58.984765 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tldkw\" (UniqueName: \"kubernetes.io/projected/9f8ec62b-bea2-402c-86d7-d060b85a0a65-kube-api-access-tldkw\") pod \"kserve-controller-manager-659c8cbdc-c7gx8\" (UID: \"9f8ec62b-bea2-402c-86d7-d060b85a0a65\") " pod="kserve/kserve-controller-manager-659c8cbdc-c7gx8" Apr 16 19:58:58.984858 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:58.984807 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"data\" (UniqueName: \"kubernetes.io/empty-dir/eb93b256-25cf-46a1-b413-2e64d58db62b-data\") pod \"seaweedfs-86cc847c5c-jgbp8\" (UID: \"eb93b256-25cf-46a1-b413-2e64d58db62b\") " pod="kserve/seaweedfs-86cc847c5c-jgbp8" Apr 16 19:58:59.085873 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:59.085837 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lkggp\" (UniqueName: \"kubernetes.io/projected/eb93b256-25cf-46a1-b413-2e64d58db62b-kube-api-access-lkggp\") pod \"seaweedfs-86cc847c5c-jgbp8\" (UID: \"eb93b256-25cf-46a1-b413-2e64d58db62b\") " pod="kserve/seaweedfs-86cc847c5c-jgbp8" Apr 16 19:58:59.086031 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:59.085903 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f8ec62b-bea2-402c-86d7-d060b85a0a65-cert\") pod \"kserve-controller-manager-659c8cbdc-c7gx8\" (UID: \"9f8ec62b-bea2-402c-86d7-d060b85a0a65\") " pod="kserve/kserve-controller-manager-659c8cbdc-c7gx8" Apr 16 19:58:59.086031 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:59.085939 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tldkw\" (UniqueName: \"kubernetes.io/projected/9f8ec62b-bea2-402c-86d7-d060b85a0a65-kube-api-access-tldkw\") pod \"kserve-controller-manager-659c8cbdc-c7gx8\" (UID: \"9f8ec62b-bea2-402c-86d7-d060b85a0a65\") " pod="kserve/kserve-controller-manager-659c8cbdc-c7gx8" Apr 16 19:58:59.086031 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:59.085967 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/eb93b256-25cf-46a1-b413-2e64d58db62b-data\") pod \"seaweedfs-86cc847c5c-jgbp8\" (UID: \"eb93b256-25cf-46a1-b413-2e64d58db62b\") " pod="kserve/seaweedfs-86cc847c5c-jgbp8" Apr 16 19:58:59.086188 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:58:59.086036 2561 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 16 19:58:59.086188 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:58:59.086106 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f8ec62b-bea2-402c-86d7-d060b85a0a65-cert podName:9f8ec62b-bea2-402c-86d7-d060b85a0a65 nodeName:}" failed. No retries permitted until 2026-04-16 19:58:59.586085088 +0000 UTC m=+317.309130982 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9f8ec62b-bea2-402c-86d7-d060b85a0a65-cert") pod "kserve-controller-manager-659c8cbdc-c7gx8" (UID: "9f8ec62b-bea2-402c-86d7-d060b85a0a65") : secret "kserve-webhook-server-cert" not found Apr 16 19:58:59.086367 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:59.086351 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/eb93b256-25cf-46a1-b413-2e64d58db62b-data\") pod \"seaweedfs-86cc847c5c-jgbp8\" (UID: \"eb93b256-25cf-46a1-b413-2e64d58db62b\") " pod="kserve/seaweedfs-86cc847c5c-jgbp8" Apr 16 19:58:59.097507 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:59.097477 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkggp\" (UniqueName: \"kubernetes.io/projected/eb93b256-25cf-46a1-b413-2e64d58db62b-kube-api-access-lkggp\") pod \"seaweedfs-86cc847c5c-jgbp8\" (UID: \"eb93b256-25cf-46a1-b413-2e64d58db62b\") " pod="kserve/seaweedfs-86cc847c5c-jgbp8" Apr 16 19:58:59.098686 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:59.098662 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tldkw\" (UniqueName: \"kubernetes.io/projected/9f8ec62b-bea2-402c-86d7-d060b85a0a65-kube-api-access-tldkw\") pod \"kserve-controller-manager-659c8cbdc-c7gx8\" (UID: \"9f8ec62b-bea2-402c-86d7-d060b85a0a65\") " pod="kserve/kserve-controller-manager-659c8cbdc-c7gx8" Apr 16 19:58:59.236775 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:59.236683 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-jgbp8" Apr 16 19:58:59.361549 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:59.361525 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-jgbp8"] Apr 16 19:58:59.363825 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:58:59.363798 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb93b256_25cf_46a1_b413_2e64d58db62b.slice/crio-1be44c341f8a842233571b60220eed128a4e39fa20167b011723852f3c264c6b WatchSource:0}: Error finding container 1be44c341f8a842233571b60220eed128a4e39fa20167b011723852f3c264c6b: Status 404 returned error can't find the container with id 1be44c341f8a842233571b60220eed128a4e39fa20167b011723852f3c264c6b Apr 16 19:58:59.365091 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:59.365072 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:58:59.590113 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:59.590026 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f8ec62b-bea2-402c-86d7-d060b85a0a65-cert\") pod \"kserve-controller-manager-659c8cbdc-c7gx8\" (UID: \"9f8ec62b-bea2-402c-86d7-d060b85a0a65\") " pod="kserve/kserve-controller-manager-659c8cbdc-c7gx8" Apr 16 19:58:59.592436 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:59.592417 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f8ec62b-bea2-402c-86d7-d060b85a0a65-cert\") pod \"kserve-controller-manager-659c8cbdc-c7gx8\" (UID: \"9f8ec62b-bea2-402c-86d7-d060b85a0a65\") " pod="kserve/kserve-controller-manager-659c8cbdc-c7gx8" Apr 16 19:58:59.791240 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:59.791186 2561 util.go:30] "No sandbox for pod can be found. 
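The kserve webhook certificate shows the same race: the first mount of "cert" fails because kserve-webhook-server-cert does not exist yet, and the 500ms retry succeeds once it has been created. Failures like these also surface as FailedMount events on the pod, which is often the easier place to look than the node journal; a hypothetical query for them (pod name and namespace from the log, API access assumed):

    # Hypothetical: list the FailedMount events kubelet recorded for the
    # pod during the retry window above.
    from kubernetes import client, config

    config.load_kube_config()
    v1 = client.CoreV1Api()
    events = v1.list_namespaced_event(
        "kserve",
        field_selector="involvedObject.name=kserve-controller-manager-659c8cbdc-c7gx8,reason=FailedMount",
    )
    for ev in events.items:
        print(ev.last_timestamp, ev.message)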
Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-c7gx8" Apr 16 19:58:59.964148 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:58:59.964109 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-jgbp8" event={"ID":"eb93b256-25cf-46a1-b413-2e64d58db62b","Type":"ContainerStarted","Data":"1be44c341f8a842233571b60220eed128a4e39fa20167b011723852f3c264c6b"} Apr 16 19:59:00.007310 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:00.007282 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-c7gx8"] Apr 16 19:59:00.008879 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:59:00.008844 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f8ec62b_bea2_402c_86d7_d060b85a0a65.slice/crio-466d92b6c6ee5544564437deec85ce311a3f2f898fd950ba738a7706afc52e5e WatchSource:0}: Error finding container 466d92b6c6ee5544564437deec85ce311a3f2f898fd950ba738a7706afc52e5e: Status 404 returned error can't find the container with id 466d92b6c6ee5544564437deec85ce311a3f2f898fd950ba738a7706afc52e5e Apr 16 19:59:00.971755 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:00.971703 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-c7gx8" event={"ID":"9f8ec62b-bea2-402c-86d7-d060b85a0a65","Type":"ContainerStarted","Data":"466d92b6c6ee5544564437deec85ce311a3f2f898fd950ba738a7706afc52e5e"} Apr 16 19:59:03.984241 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:03.984204 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-jgbp8" event={"ID":"eb93b256-25cf-46a1-b413-2e64d58db62b","Type":"ContainerStarted","Data":"6dbe8d8953d63b702836f907c9e07325c67fc01207d165cc3269e5ee699747ba"} Apr 16 19:59:03.984680 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:03.984273 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-jgbp8" Apr 16 19:59:03.985445 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:03.985423 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-c7gx8" event={"ID":"9f8ec62b-bea2-402c-86d7-d060b85a0a65","Type":"ContainerStarted","Data":"a519eba9db7ace12eb6a139c4e5fb15dd47d8539b4725569934c060b7f624f80"} Apr 16 19:59:03.985594 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:03.985580 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-659c8cbdc-c7gx8" Apr 16 19:59:04.002723 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:04.002683 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-jgbp8" podStartSLOduration=1.775949666 podStartE2EDuration="6.002667775s" podCreationTimestamp="2026-04-16 19:58:58 +0000 UTC" firstStartedPulling="2026-04-16 19:58:59.365190464 +0000 UTC m=+317.088236348" lastFinishedPulling="2026-04-16 19:59:03.591908567 +0000 UTC m=+321.314954457" observedRunningTime="2026-04-16 19:59:04.002290732 +0000 UTC m=+321.725336663" watchObservedRunningTime="2026-04-16 19:59:04.002667775 +0000 UTC m=+321.725713683" Apr 16 19:59:04.019882 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:04.019837 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-659c8cbdc-c7gx8" podStartSLOduration=2.497283316 podStartE2EDuration="6.019824349s" podCreationTimestamp="2026-04-16 19:58:58 +0000 
UTC" firstStartedPulling="2026-04-16 19:59:00.010306554 +0000 UTC m=+317.733352441" lastFinishedPulling="2026-04-16 19:59:03.532847587 +0000 UTC m=+321.255893474" observedRunningTime="2026-04-16 19:59:04.018277556 +0000 UTC m=+321.741323462" watchObservedRunningTime="2026-04-16 19:59:04.019824349 +0000 UTC m=+321.742870254" Apr 16 19:59:09.991466 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:09.991440 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-jgbp8" Apr 16 19:59:34.368550 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:34.368452 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-c7gx8"] Apr 16 19:59:34.369026 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:34.368759 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-659c8cbdc-c7gx8" podUID="9f8ec62b-bea2-402c-86d7-d060b85a0a65" containerName="manager" containerID="cri-o://a519eba9db7ace12eb6a139c4e5fb15dd47d8539b4725569934c060b7f624f80" gracePeriod=10 Apr 16 19:59:34.373968 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:34.373942 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-659c8cbdc-c7gx8" Apr 16 19:59:34.398573 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:34.398552 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-rz7sf"] Apr 16 19:59:34.401860 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:34.401841 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-rz7sf" Apr 16 19:59:34.411768 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:34.411730 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-rz7sf"] Apr 16 19:59:34.453115 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:34.453089 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d003760d-7c1d-443a-85f8-14280df4b2cd-cert\") pod \"kserve-controller-manager-659c8cbdc-rz7sf\" (UID: \"d003760d-7c1d-443a-85f8-14280df4b2cd\") " pod="kserve/kserve-controller-manager-659c8cbdc-rz7sf" Apr 16 19:59:34.453221 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:34.453129 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbzln\" (UniqueName: \"kubernetes.io/projected/d003760d-7c1d-443a-85f8-14280df4b2cd-kube-api-access-mbzln\") pod \"kserve-controller-manager-659c8cbdc-rz7sf\" (UID: \"d003760d-7c1d-443a-85f8-14280df4b2cd\") " pod="kserve/kserve-controller-manager-659c8cbdc-rz7sf" Apr 16 19:59:34.554035 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:34.554007 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d003760d-7c1d-443a-85f8-14280df4b2cd-cert\") pod \"kserve-controller-manager-659c8cbdc-rz7sf\" (UID: \"d003760d-7c1d-443a-85f8-14280df4b2cd\") " pod="kserve/kserve-controller-manager-659c8cbdc-rz7sf" Apr 16 19:59:34.554164 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:34.554051 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbzln\" (UniqueName: \"kubernetes.io/projected/d003760d-7c1d-443a-85f8-14280df4b2cd-kube-api-access-mbzln\") pod \"kserve-controller-manager-659c8cbdc-rz7sf\" (UID: 
\"d003760d-7c1d-443a-85f8-14280df4b2cd\") " pod="kserve/kserve-controller-manager-659c8cbdc-rz7sf" Apr 16 19:59:34.556657 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:34.556631 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d003760d-7c1d-443a-85f8-14280df4b2cd-cert\") pod \"kserve-controller-manager-659c8cbdc-rz7sf\" (UID: \"d003760d-7c1d-443a-85f8-14280df4b2cd\") " pod="kserve/kserve-controller-manager-659c8cbdc-rz7sf" Apr 16 19:59:34.563715 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:34.563692 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbzln\" (UniqueName: \"kubernetes.io/projected/d003760d-7c1d-443a-85f8-14280df4b2cd-kube-api-access-mbzln\") pod \"kserve-controller-manager-659c8cbdc-rz7sf\" (UID: \"d003760d-7c1d-443a-85f8-14280df4b2cd\") " pod="kserve/kserve-controller-manager-659c8cbdc-rz7sf" Apr 16 19:59:34.603055 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:34.603035 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-c7gx8" Apr 16 19:59:34.655140 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:34.655116 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f8ec62b-bea2-402c-86d7-d060b85a0a65-cert\") pod \"9f8ec62b-bea2-402c-86d7-d060b85a0a65\" (UID: \"9f8ec62b-bea2-402c-86d7-d060b85a0a65\") " Apr 16 19:59:34.655303 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:34.655199 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tldkw\" (UniqueName: \"kubernetes.io/projected/9f8ec62b-bea2-402c-86d7-d060b85a0a65-kube-api-access-tldkw\") pod \"9f8ec62b-bea2-402c-86d7-d060b85a0a65\" (UID: \"9f8ec62b-bea2-402c-86d7-d060b85a0a65\") " Apr 16 19:59:34.657323 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:34.657291 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f8ec62b-bea2-402c-86d7-d060b85a0a65-cert" (OuterVolumeSpecName: "cert") pod "9f8ec62b-bea2-402c-86d7-d060b85a0a65" (UID: "9f8ec62b-bea2-402c-86d7-d060b85a0a65"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:59:34.657427 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:34.657346 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f8ec62b-bea2-402c-86d7-d060b85a0a65-kube-api-access-tldkw" (OuterVolumeSpecName: "kube-api-access-tldkw") pod "9f8ec62b-bea2-402c-86d7-d060b85a0a65" (UID: "9f8ec62b-bea2-402c-86d7-d060b85a0a65"). InnerVolumeSpecName "kube-api-access-tldkw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:59:34.742072 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:34.742039 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-rz7sf" Apr 16 19:59:34.756181 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:34.756151 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tldkw\" (UniqueName: \"kubernetes.io/projected/9f8ec62b-bea2-402c-86d7-d060b85a0a65-kube-api-access-tldkw\") on node \"ip-10-0-128-201.ec2.internal\" DevicePath \"\"" Apr 16 19:59:34.756277 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:34.756187 2561 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f8ec62b-bea2-402c-86d7-d060b85a0a65-cert\") on node \"ip-10-0-128-201.ec2.internal\" DevicePath \"\"" Apr 16 19:59:34.878048 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:34.878022 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-rz7sf"] Apr 16 19:59:34.880861 ip-10-0-128-201 kubenswrapper[2561]: W0416 19:59:34.880841 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd003760d_7c1d_443a_85f8_14280df4b2cd.slice/crio-b2acf15bfc9668c2d7345620a07e610fd32e046424c3f4c6845f4625042bf708 WatchSource:0}: Error finding container b2acf15bfc9668c2d7345620a07e610fd32e046424c3f4c6845f4625042bf708: Status 404 returned error can't find the container with id b2acf15bfc9668c2d7345620a07e610fd32e046424c3f4c6845f4625042bf708 Apr 16 19:59:35.080925 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:35.080832 2561 generic.go:358] "Generic (PLEG): container finished" podID="9f8ec62b-bea2-402c-86d7-d060b85a0a65" containerID="a519eba9db7ace12eb6a139c4e5fb15dd47d8539b4725569934c060b7f624f80" exitCode=0 Apr 16 19:59:35.080925 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:35.080896 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-659c8cbdc-c7gx8" Apr 16 19:59:35.080925 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:35.080917 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-c7gx8" event={"ID":"9f8ec62b-bea2-402c-86d7-d060b85a0a65","Type":"ContainerDied","Data":"a519eba9db7ace12eb6a139c4e5fb15dd47d8539b4725569934c060b7f624f80"} Apr 16 19:59:35.081169 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:35.080960 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-c7gx8" event={"ID":"9f8ec62b-bea2-402c-86d7-d060b85a0a65","Type":"ContainerDied","Data":"466d92b6c6ee5544564437deec85ce311a3f2f898fd950ba738a7706afc52e5e"} Apr 16 19:59:35.081169 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:35.080982 2561 scope.go:117] "RemoveContainer" containerID="a519eba9db7ace12eb6a139c4e5fb15dd47d8539b4725569934c060b7f624f80" Apr 16 19:59:35.082078 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:35.082059 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-rz7sf" event={"ID":"d003760d-7c1d-443a-85f8-14280df4b2cd","Type":"ContainerStarted","Data":"b2acf15bfc9668c2d7345620a07e610fd32e046424c3f4c6845f4625042bf708"} Apr 16 19:59:35.088622 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:35.088606 2561 scope.go:117] "RemoveContainer" containerID="a519eba9db7ace12eb6a139c4e5fb15dd47d8539b4725569934c060b7f624f80" Apr 16 19:59:35.088899 ip-10-0-128-201 kubenswrapper[2561]: E0416 19:59:35.088873 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a519eba9db7ace12eb6a139c4e5fb15dd47d8539b4725569934c060b7f624f80\": container with ID starting with a519eba9db7ace12eb6a139c4e5fb15dd47d8539b4725569934c060b7f624f80 not found: ID does not exist" containerID="a519eba9db7ace12eb6a139c4e5fb15dd47d8539b4725569934c060b7f624f80" Apr 16 19:59:35.088968 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:35.088907 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a519eba9db7ace12eb6a139c4e5fb15dd47d8539b4725569934c060b7f624f80"} err="failed to get container status \"a519eba9db7ace12eb6a139c4e5fb15dd47d8539b4725569934c060b7f624f80\": rpc error: code = NotFound desc = could not find container \"a519eba9db7ace12eb6a139c4e5fb15dd47d8539b4725569934c060b7f624f80\": container with ID starting with a519eba9db7ace12eb6a139c4e5fb15dd47d8539b4725569934c060b7f624f80 not found: ID does not exist" Apr 16 19:59:35.098903 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:35.098885 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-c7gx8"] Apr 16 19:59:35.104879 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:35.104860 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-c7gx8"] Apr 16 19:59:36.086211 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:36.086181 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-rz7sf" event={"ID":"d003760d-7c1d-443a-85f8-14280df4b2cd","Type":"ContainerStarted","Data":"ec5578863220ec0642a1ca530680d527e9e12eaa7bae47089930ed6e8d3b384e"} Apr 16 19:59:36.086637 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:36.086239 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-659c8cbdc-rz7sf" Apr 16 
Apr 16 19:59:35.098903 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:35.098885 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-c7gx8"]
Apr 16 19:59:35.104879 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:35.104860 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-659c8cbdc-c7gx8"]
Apr 16 19:59:36.086211 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:36.086181 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-659c8cbdc-rz7sf" event={"ID":"d003760d-7c1d-443a-85f8-14280df4b2cd","Type":"ContainerStarted","Data":"ec5578863220ec0642a1ca530680d527e9e12eaa7bae47089930ed6e8d3b384e"}
Apr 16 19:59:36.086637 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:36.086239 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-659c8cbdc-rz7sf"
Apr 16 19:59:36.104626 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:36.104585 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-659c8cbdc-rz7sf" podStartSLOduration=1.8032771699999999 podStartE2EDuration="2.10457282s" podCreationTimestamp="2026-04-16 19:59:34 +0000 UTC" firstStartedPulling="2026-04-16 19:59:34.882076447 +0000 UTC m=+352.605122331" lastFinishedPulling="2026-04-16 19:59:35.183372097 +0000 UTC m=+352.906417981" observedRunningTime="2026-04-16 19:59:36.103471145 +0000 UTC m=+353.826517052" watchObservedRunningTime="2026-04-16 19:59:36.10457282 +0000 UTC m=+353.827618725"
Apr 16 19:59:36.839566 ip-10-0-128-201 kubenswrapper[2561]: I0416 19:59:36.839535 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f8ec62b-bea2-402c-86d7-d060b85a0a65" path="/var/lib/kubelet/pods/9f8ec62b-bea2-402c-86d7-d060b85a0a65/volumes"
Apr 16 20:00:07.094661 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:07.094633 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-659c8cbdc-rz7sf"
Apr 16 20:00:07.965926 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:07.965891 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-dz25p"]
Apr 16 20:00:07.966279 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:07.966263 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f8ec62b-bea2-402c-86d7-d060b85a0a65" containerName="manager"
Apr 16 20:00:07.966279 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:07.966279 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f8ec62b-bea2-402c-86d7-d060b85a0a65" containerName="manager"
Apr 16 20:00:07.966423 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:07.966353 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f8ec62b-bea2-402c-86d7-d060b85a0a65" containerName="manager"
Apr 16 20:00:07.969202 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:07.969185 2561 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-dz25p" Apr 16 20:00:07.976079 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:07.975879 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 20:00:07.976079 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:07.975928 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-pq22w\"" Apr 16 20:00:07.984054 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:07.984028 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-dz25p"] Apr 16 20:00:08.104648 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:08.104613 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nd6m\" (UniqueName: \"kubernetes.io/projected/32c8dc29-f0b2-42a4-93c0-6adc40019f95-kube-api-access-2nd6m\") pod \"model-serving-api-86f7b4b499-dz25p\" (UID: \"32c8dc29-f0b2-42a4-93c0-6adc40019f95\") " pod="kserve/model-serving-api-86f7b4b499-dz25p" Apr 16 20:00:08.104996 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:08.104668 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/32c8dc29-f0b2-42a4-93c0-6adc40019f95-tls-certs\") pod \"model-serving-api-86f7b4b499-dz25p\" (UID: \"32c8dc29-f0b2-42a4-93c0-6adc40019f95\") " pod="kserve/model-serving-api-86f7b4b499-dz25p" Apr 16 20:00:08.205398 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:08.205370 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/32c8dc29-f0b2-42a4-93c0-6adc40019f95-tls-certs\") pod \"model-serving-api-86f7b4b499-dz25p\" (UID: \"32c8dc29-f0b2-42a4-93c0-6adc40019f95\") " pod="kserve/model-serving-api-86f7b4b499-dz25p" Apr 16 20:00:08.205545 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:08.205474 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nd6m\" (UniqueName: \"kubernetes.io/projected/32c8dc29-f0b2-42a4-93c0-6adc40019f95-kube-api-access-2nd6m\") pod \"model-serving-api-86f7b4b499-dz25p\" (UID: \"32c8dc29-f0b2-42a4-93c0-6adc40019f95\") " pod="kserve/model-serving-api-86f7b4b499-dz25p" Apr 16 20:00:08.205545 ip-10-0-128-201 kubenswrapper[2561]: E0416 20:00:08.205503 2561 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 16 20:00:08.205653 ip-10-0-128-201 kubenswrapper[2561]: E0416 20:00:08.205594 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32c8dc29-f0b2-42a4-93c0-6adc40019f95-tls-certs podName:32c8dc29-f0b2-42a4-93c0-6adc40019f95 nodeName:}" failed. No retries permitted until 2026-04-16 20:00:08.70557761 +0000 UTC m=+386.428623494 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/32c8dc29-f0b2-42a4-93c0-6adc40019f95-tls-certs") pod "model-serving-api-86f7b4b499-dz25p" (UID: "32c8dc29-f0b2-42a4-93c0-6adc40019f95") : secret "model-serving-api-tls" not found
Apr 16 20:00:08.220141 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:08.220076 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nd6m\" (UniqueName: \"kubernetes.io/projected/32c8dc29-f0b2-42a4-93c0-6adc40019f95-kube-api-access-2nd6m\") pod \"model-serving-api-86f7b4b499-dz25p\" (UID: \"32c8dc29-f0b2-42a4-93c0-6adc40019f95\") " pod="kserve/model-serving-api-86f7b4b499-dz25p"
Apr 16 20:00:08.708823 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:08.708775 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/32c8dc29-f0b2-42a4-93c0-6adc40019f95-tls-certs\") pod \"model-serving-api-86f7b4b499-dz25p\" (UID: \"32c8dc29-f0b2-42a4-93c0-6adc40019f95\") " pod="kserve/model-serving-api-86f7b4b499-dz25p"
Apr 16 20:00:08.711196 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:08.711178 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/32c8dc29-f0b2-42a4-93c0-6adc40019f95-tls-certs\") pod \"model-serving-api-86f7b4b499-dz25p\" (UID: \"32c8dc29-f0b2-42a4-93c0-6adc40019f95\") " pod="kserve/model-serving-api-86f7b4b499-dz25p"
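
Note: the tls-certs sequence above is the kubelet's standard handling of a volume whose Secret does not exist yet: SetUp fails with secret "model-serving-api-tls" not found, the operation is re-queued with a 500ms durationBeforeRetry (backed off exponentially on repeated failures), and the retry at 20:00:08.708 succeeds once the Secret has been created. A rough sketch of that retry loop, with mountSecretVolume as a hypothetical stand-in for the real operation:

    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    // mountSecretVolume is a hypothetical stand-in: it fails until the
    // Secret exists, the way "model-serving-api-tls" did in the log.
    func mountSecretVolume(attempt int) error {
        if attempt < 3 {
            return errors.New(`secret "model-serving-api-tls" not found`)
        }
        return nil
    }

    func main() {
        backoff := 500 * time.Millisecond // initial durationBeforeRetry seen above
        for attempt := 1; ; attempt++ {
            err := mountSecretVolume(attempt)
            if err == nil {
                fmt.Printf("attempt %d: MountVolume.SetUp succeeded\n", attempt)
                return
            }
            fmt.Printf("attempt %d failed: %v; retrying in %v\n", attempt, err, backoff)
            time.Sleep(backoff)
            backoff *= 2 // the real operation executor also caps this
        }
    }
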
Apr 16 20:00:08.888779 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:08.888733 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-dz25p"
Apr 16 20:00:09.012778 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:09.012753 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-dz25p"]
Apr 16 20:00:09.015196 ip-10-0-128-201 kubenswrapper[2561]: W0416 20:00:09.015169 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32c8dc29_f0b2_42a4_93c0_6adc40019f95.slice/crio-29f1f4e05d2caaf320d1e792b0a198cc84ab8b30be65e4bf73083458e7206cee WatchSource:0}: Error finding container 29f1f4e05d2caaf320d1e792b0a198cc84ab8b30be65e4bf73083458e7206cee: Status 404 returned error can't find the container with id 29f1f4e05d2caaf320d1e792b0a198cc84ab8b30be65e4bf73083458e7206cee
Apr 16 20:00:09.194146 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:09.194113 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-dz25p" event={"ID":"32c8dc29-f0b2-42a4-93c0-6adc40019f95","Type":"ContainerStarted","Data":"29f1f4e05d2caaf320d1e792b0a198cc84ab8b30be65e4bf73083458e7206cee"}
Apr 16 20:00:11.201215 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:11.201177 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-dz25p" event={"ID":"32c8dc29-f0b2-42a4-93c0-6adc40019f95","Type":"ContainerStarted","Data":"1fdaa3e3a305fbe989cdfa3780390da1895027805cf63d61fe426285e91b8a08"}
Apr 16 20:00:11.201649 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:11.201296 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-dz25p"
Apr 16 20:00:11.220386 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:11.220340 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-dz25p" podStartSLOduration=2.475680501 podStartE2EDuration="4.220328755s" podCreationTimestamp="2026-04-16 20:00:07 +0000 UTC" firstStartedPulling="2026-04-16 20:00:09.016905283 +0000 UTC m=+386.739951170" lastFinishedPulling="2026-04-16 20:00:10.761553529 +0000 UTC m=+388.484599424" observedRunningTime="2026-04-16 20:00:11.219353937 +0000 UTC m=+388.942399834" watchObservedRunningTime="2026-04-16 20:00:11.220328755 +0000 UTC m=+388.943374660"
Apr 16 20:00:22.208946 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:22.208916 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-dz25p"
Apr 16 20:00:44.408810 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:44.408756 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2"]
Apr 16 20:00:44.411378 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:44.411360 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2"
Apr 16 20:00:44.414719 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:44.414694 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-jlhx9\""
Apr 16 20:00:44.421781 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:44.421754 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2"]
Apr 16 20:00:44.422253 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:44.422236 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2"
Apr 16 20:00:44.572190 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:44.572157 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2"]
Apr 16 20:00:44.576475 ip-10-0-128-201 kubenswrapper[2561]: W0416 20:00:44.576446 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b09c279_8214_4252_86e3_632d35882386.slice/crio-48060787362f8ebe1231383a71419439a8c4cdce87b60839b622134eb000dfe0 WatchSource:0}: Error finding container 48060787362f8ebe1231383a71419439a8c4cdce87b60839b622134eb000dfe0: Status 404 returned error can't find the container with id 48060787362f8ebe1231383a71419439a8c4cdce87b60839b622134eb000dfe0
Apr 16 20:00:44.606474 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:44.606445 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj"]
Apr 16 20:00:44.618596 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:44.616180 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj"
Apr 16 20:00:44.622727 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:44.621085 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj"]
Apr 16 20:00:44.709120 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:44.708333 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr"]
Apr 16 20:00:44.713196 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:44.713168 2561 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr" Apr 16 20:00:44.729902 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:44.729866 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr" Apr 16 20:00:44.732282 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:44.730762 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr"] Apr 16 20:00:44.807777 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:44.807630 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a77ee7fc-37ba-4669-867d-7b3660de3c9b-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj\" (UID: \"a77ee7fc-37ba-4669-867d-7b3660de3c9b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" Apr 16 20:00:44.896214 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:44.896184 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr"] Apr 16 20:00:44.898210 ip-10-0-128-201 kubenswrapper[2561]: W0416 20:00:44.898170 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11d79e04_a4b3_4879_93e8_8b8729614e6f.slice/crio-32a54e5a06f31cddf4e05434155b7fdd9d5d816457086f2c83f1e8690c68af06 WatchSource:0}: Error finding container 32a54e5a06f31cddf4e05434155b7fdd9d5d816457086f2c83f1e8690c68af06: Status 404 returned error can't find the container with id 32a54e5a06f31cddf4e05434155b7fdd9d5d816457086f2c83f1e8690c68af06 Apr 16 20:00:44.908438 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:44.908413 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a77ee7fc-37ba-4669-867d-7b3660de3c9b-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj\" (UID: \"a77ee7fc-37ba-4669-867d-7b3660de3c9b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" Apr 16 20:00:44.908723 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:44.908707 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a77ee7fc-37ba-4669-867d-7b3660de3c9b-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj\" (UID: \"a77ee7fc-37ba-4669-867d-7b3660de3c9b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" Apr 16 20:00:44.934597 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:44.934529 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" Apr 16 20:00:45.071344 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:45.071207 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj"] Apr 16 20:00:45.074624 ip-10-0-128-201 kubenswrapper[2561]: W0416 20:00:45.074587 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda77ee7fc_37ba_4669_867d_7b3660de3c9b.slice/crio-0823753646e42194b05680be31942b78e7fb7f804b2f155b9e1178e23b69e554 WatchSource:0}: Error finding container 0823753646e42194b05680be31942b78e7fb7f804b2f155b9e1178e23b69e554: Status 404 returned error can't find the container with id 0823753646e42194b05680be31942b78e7fb7f804b2f155b9e1178e23b69e554 Apr 16 20:00:45.321262 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:45.321144 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" event={"ID":"a77ee7fc-37ba-4669-867d-7b3660de3c9b","Type":"ContainerStarted","Data":"0823753646e42194b05680be31942b78e7fb7f804b2f155b9e1178e23b69e554"} Apr 16 20:00:45.324042 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:45.323970 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr" event={"ID":"11d79e04-a4b3-4879-93e8-8b8729614e6f","Type":"ContainerStarted","Data":"32a54e5a06f31cddf4e05434155b7fdd9d5d816457086f2c83f1e8690c68af06"} Apr 16 20:00:45.326500 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:45.326449 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2" event={"ID":"4b09c279-8214-4252-86e3-632d35882386","Type":"ContainerStarted","Data":"48060787362f8ebe1231383a71419439a8c4cdce87b60839b622134eb000dfe0"} Apr 16 20:00:51.365075 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:51.364839 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" event={"ID":"a77ee7fc-37ba-4669-867d-7b3660de3c9b","Type":"ContainerStarted","Data":"a0d159240991f20d61729c45a207448342a0b83374f4dffe39a39bcda84c5811"} Apr 16 20:00:56.385308 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:56.385271 2561 generic.go:358] "Generic (PLEG): container finished" podID="a77ee7fc-37ba-4669-867d-7b3660de3c9b" containerID="a0d159240991f20d61729c45a207448342a0b83374f4dffe39a39bcda84c5811" exitCode=0 Apr 16 20:00:56.385748 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:56.385318 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" event={"ID":"a77ee7fc-37ba-4669-867d-7b3660de3c9b","Type":"ContainerDied","Data":"a0d159240991f20d61729c45a207448342a0b83374f4dffe39a39bcda84c5811"} Apr 16 20:00:59.399344 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:59.399288 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr" event={"ID":"11d79e04-a4b3-4879-93e8-8b8729614e6f","Type":"ContainerStarted","Data":"65d2ba5d1dbb604e6d7ca401abc9c41d0a7e8189a4fea9d693d2e3c84516f636"} Apr 16 20:00:59.399893 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:59.399504 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr" Apr 16 
20:00:59.401051 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:59.400998 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr" podUID="11d79e04-a4b3-4879-93e8-8b8729614e6f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 20:00:59.401193 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:59.401076 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2" event={"ID":"4b09c279-8214-4252-86e3-632d35882386","Type":"ContainerStarted","Data":"47f71634ccbabadc78e44b951717b2726823baff68f9dcf4f2813fa7e8d646cf"} Apr 16 20:00:59.401349 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:59.401328 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2" Apr 16 20:00:59.402560 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:59.402524 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2" podUID="4b09c279-8214-4252-86e3-632d35882386" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 20:00:59.423585 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:59.423532 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr" podStartSLOduration=1.110511149 podStartE2EDuration="15.423514561s" podCreationTimestamp="2026-04-16 20:00:44 +0000 UTC" firstStartedPulling="2026-04-16 20:00:44.90015789 +0000 UTC m=+422.623203774" lastFinishedPulling="2026-04-16 20:00:59.213161299 +0000 UTC m=+436.936207186" observedRunningTime="2026-04-16 20:00:59.422104918 +0000 UTC m=+437.145150824" watchObservedRunningTime="2026-04-16 20:00:59.423514561 +0000 UTC m=+437.146560469" Apr 16 20:00:59.451703 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:00:59.451637 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2" podStartSLOduration=0.815091421 podStartE2EDuration="15.451621657s" podCreationTimestamp="2026-04-16 20:00:44 +0000 UTC" firstStartedPulling="2026-04-16 20:00:44.578830277 +0000 UTC m=+422.301876177" lastFinishedPulling="2026-04-16 20:00:59.215360529 +0000 UTC m=+436.938406413" observedRunningTime="2026-04-16 20:00:59.450201186 +0000 UTC m=+437.173247094" watchObservedRunningTime="2026-04-16 20:00:59.451621657 +0000 UTC m=+437.174667563" Apr 16 20:01:00.405523 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:01:00.405480 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr" podUID="11d79e04-a4b3-4879-93e8-8b8729614e6f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 20:01:00.405970 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:01:00.405481 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2" podUID="4b09c279-8214-4252-86e3-632d35882386" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 20:01:05.428407 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:01:05.428366 2561 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" event={"ID":"a77ee7fc-37ba-4669-867d-7b3660de3c9b","Type":"ContainerStarted","Data":"e9f0a9cfeac970e2df3cd099d6b4ca353581e78661687f0db9595bbd978c3a29"} Apr 16 20:01:05.428864 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:01:05.428700 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" Apr 16 20:01:05.429994 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:01:05.429964 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" podUID="a77ee7fc-37ba-4669-867d-7b3660de3c9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 20:01:05.445957 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:01:05.445914 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" podStartSLOduration=1.872152012 podStartE2EDuration="21.445901151s" podCreationTimestamp="2026-04-16 20:00:44 +0000 UTC" firstStartedPulling="2026-04-16 20:00:45.077151789 +0000 UTC m=+422.800197673" lastFinishedPulling="2026-04-16 20:01:04.650900912 +0000 UTC m=+442.373946812" observedRunningTime="2026-04-16 20:01:05.445440058 +0000 UTC m=+443.168485966" watchObservedRunningTime="2026-04-16 20:01:05.445901151 +0000 UTC m=+443.168947127" Apr 16 20:01:06.432611 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:01:06.432562 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" podUID="a77ee7fc-37ba-4669-867d-7b3660de3c9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 20:01:10.405997 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:01:10.405956 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2" podUID="4b09c279-8214-4252-86e3-632d35882386" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 20:01:10.406381 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:01:10.405965 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr" podUID="11d79e04-a4b3-4879-93e8-8b8729614e6f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 20:01:16.432709 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:01:16.432658 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" podUID="a77ee7fc-37ba-4669-867d-7b3660de3c9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 20:01:20.406425 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:01:20.406378 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr" podUID="11d79e04-a4b3-4879-93e8-8b8729614e6f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 20:01:20.406903 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:01:20.406380 2561 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2" podUID="4b09c279-8214-4252-86e3-632d35882386" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 20:01:26.432965 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:01:26.432919 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" podUID="a77ee7fc-37ba-4669-867d-7b3660de3c9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 20:01:30.406219 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:01:30.406179 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr" podUID="11d79e04-a4b3-4879-93e8-8b8729614e6f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 20:01:30.406589 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:01:30.406179 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2" podUID="4b09c279-8214-4252-86e3-632d35882386" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 20:01:36.433508 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:01:36.433458 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" podUID="a77ee7fc-37ba-4669-867d-7b3660de3c9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 20:01:40.405615 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:01:40.405572 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr" podUID="11d79e04-a4b3-4879-93e8-8b8729614e6f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 20:01:40.406008 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:01:40.405580 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2" podUID="4b09c279-8214-4252-86e3-632d35882386" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 20:01:46.433551 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:01:46.433502 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" podUID="a77ee7fc-37ba-4669-867d-7b3660de3c9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 20:01:50.406931 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:01:50.406901 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2" Apr 16 20:01:50.407298 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:01:50.406950 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr" Apr 16 20:01:56.432865 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:01:56.432815 2561 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" podUID="a77ee7fc-37ba-4669-867d-7b3660de3c9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 20:02:06.433107 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:06.433064 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" podUID="a77ee7fc-37ba-4669-867d-7b3660de3c9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 20:02:16.434022 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:16.433984 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" Apr 16 20:02:18.802381 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:18.802346 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2"] Apr 16 20:02:18.802825 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:18.802636 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2" podUID="4b09c279-8214-4252-86e3-632d35882386" containerName="kserve-container" containerID="cri-o://47f71634ccbabadc78e44b951717b2726823baff68f9dcf4f2813fa7e8d646cf" gracePeriod=30 Apr 16 20:02:18.841507 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:18.841476 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3d0d2-predictor-7967c5b545-lq669"] Apr 16 20:02:18.844022 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:18.844002 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3d0d2-predictor-7967c5b545-lq669" Apr 16 20:02:18.852023 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:18.851999 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3d0d2-predictor-7967c5b545-lq669"] Apr 16 20:02:18.860266 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:18.859432 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3d0d2-predictor-7967c5b545-lq669" Apr 16 20:02:18.889459 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:18.889432 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr"] Apr 16 20:02:18.890212 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:18.889825 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr" podUID="11d79e04-a4b3-4879-93e8-8b8729614e6f" containerName="kserve-container" containerID="cri-o://65d2ba5d1dbb604e6d7ca401abc9c41d0a7e8189a4fea9d693d2e3c84516f636" gracePeriod=30 Apr 16 20:02:18.891584 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:18.891558 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3d0d2-predictor-78c799bd44-wt5c2"] Apr 16 20:02:18.895023 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:18.895005 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3d0d2-predictor-78c799bd44-wt5c2" Apr 16 20:02:18.900259 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:18.900222 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3d0d2-predictor-78c799bd44-wt5c2"] Apr 16 20:02:18.909693 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:18.909669 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3d0d2-predictor-78c799bd44-wt5c2" Apr 16 20:02:19.021833 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:19.021770 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3d0d2-predictor-7967c5b545-lq669"] Apr 16 20:02:19.023873 ip-10-0-128-201 kubenswrapper[2561]: W0416 20:02:19.023749 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod979d7b3d_8bf4_413a_80e4_606aec94cc9a.slice/crio-a8dc8a2a7cdbb591a28f5489364c1e3eeb45a63b2cfa7ae797391f298f893f33 WatchSource:0}: Error finding container a8dc8a2a7cdbb591a28f5489364c1e3eeb45a63b2cfa7ae797391f298f893f33: Status 404 returned error can't find the container with id a8dc8a2a7cdbb591a28f5489364c1e3eeb45a63b2cfa7ae797391f298f893f33 Apr 16 20:02:19.058521 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:19.058498 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3d0d2-predictor-78c799bd44-wt5c2"] Apr 16 20:02:19.061224 ip-10-0-128-201 kubenswrapper[2561]: W0416 20:02:19.061199 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1076105_94f3_4033_9308_97f6a7eb942b.slice/crio-0c41045af25d174eab0f8d814be7c423103f0a9f400b82bdb449cae01f5fda45 WatchSource:0}: Error finding container 0c41045af25d174eab0f8d814be7c423103f0a9f400b82bdb449cae01f5fda45: Status 404 returned error can't find the container with id 0c41045af25d174eab0f8d814be7c423103f0a9f400b82bdb449cae01f5fda45 Apr 16 20:02:19.673959 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:19.673916 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3d0d2-predictor-78c799bd44-wt5c2" event={"ID":"f1076105-94f3-4033-9308-97f6a7eb942b","Type":"ContainerStarted","Data":"9bfd896bffc0ae9a2ca9c00b6aaafbaf2caa1c9e1ea49a7538414c3e4b516159"} Apr 16 20:02:19.673959 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:19.673959 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3d0d2-predictor-78c799bd44-wt5c2" event={"ID":"f1076105-94f3-4033-9308-97f6a7eb942b","Type":"ContainerStarted","Data":"0c41045af25d174eab0f8d814be7c423103f0a9f400b82bdb449cae01f5fda45"} Apr 16 20:02:19.674210 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:19.674093 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-3d0d2-predictor-78c799bd44-wt5c2" Apr 16 20:02:19.675359 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:19.675328 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3d0d2-predictor-7967c5b545-lq669" event={"ID":"979d7b3d-8bf4-413a-80e4-606aec94cc9a","Type":"ContainerStarted","Data":"afb9b92ed4f11438eaaa79d219a32cd166104ecd2618b298ee165d4f9a17603e"} Apr 16 20:02:19.675623 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:19.675359 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/success-200-isvc-3d0d2-predictor-7967c5b545-lq669" event={"ID":"979d7b3d-8bf4-413a-80e4-606aec94cc9a","Type":"ContainerStarted","Data":"a8dc8a2a7cdbb591a28f5489364c1e3eeb45a63b2cfa7ae797391f298f893f33"} Apr 16 20:02:19.675623 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:19.675515 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-3d0d2-predictor-7967c5b545-lq669" Apr 16 20:02:19.675721 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:19.675641 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3d0d2-predictor-78c799bd44-wt5c2" podUID="f1076105-94f3-4033-9308-97f6a7eb942b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 20:02:19.676296 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:19.676278 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3d0d2-predictor-7967c5b545-lq669" podUID="979d7b3d-8bf4-413a-80e4-606aec94cc9a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 20:02:19.689282 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:19.689210 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-3d0d2-predictor-78c799bd44-wt5c2" podStartSLOduration=1.689198925 podStartE2EDuration="1.689198925s" podCreationTimestamp="2026-04-16 20:02:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:02:19.688009793 +0000 UTC m=+517.411055696" watchObservedRunningTime="2026-04-16 20:02:19.689198925 +0000 UTC m=+517.412244831" Apr 16 20:02:19.703822 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:19.703766 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-3d0d2-predictor-7967c5b545-lq669" podStartSLOduration=1.703757783 podStartE2EDuration="1.703757783s" podCreationTimestamp="2026-04-16 20:02:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:02:19.702387627 +0000 UTC m=+517.425433536" watchObservedRunningTime="2026-04-16 20:02:19.703757783 +0000 UTC m=+517.426803688" Apr 16 20:02:20.406094 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:20.406051 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr" podUID="11d79e04-a4b3-4879-93e8-8b8729614e6f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 20:02:20.406473 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:20.406055 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2" podUID="4b09c279-8214-4252-86e3-632d35882386" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 20:02:20.679059 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:20.678966 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3d0d2-predictor-7967c5b545-lq669" podUID="979d7b3d-8bf4-413a-80e4-606aec94cc9a" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.32:8080: connect: connection refused" Apr 16 20:02:20.679059 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:20.679002 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3d0d2-predictor-78c799bd44-wt5c2" podUID="f1076105-94f3-4033-9308-97f6a7eb942b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 20:02:21.946594 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:21.946571 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr" Apr 16 20:02:22.338502 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:22.338479 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2" Apr 16 20:02:22.688225 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:22.688191 2561 generic.go:358] "Generic (PLEG): container finished" podID="11d79e04-a4b3-4879-93e8-8b8729614e6f" containerID="65d2ba5d1dbb604e6d7ca401abc9c41d0a7e8189a4fea9d693d2e3c84516f636" exitCode=0 Apr 16 20:02:22.688376 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:22.688248 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr" Apr 16 20:02:22.688376 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:22.688280 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr" event={"ID":"11d79e04-a4b3-4879-93e8-8b8729614e6f","Type":"ContainerDied","Data":"65d2ba5d1dbb604e6d7ca401abc9c41d0a7e8189a4fea9d693d2e3c84516f636"} Apr 16 20:02:22.688376 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:22.688316 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr" event={"ID":"11d79e04-a4b3-4879-93e8-8b8729614e6f","Type":"ContainerDied","Data":"32a54e5a06f31cddf4e05434155b7fdd9d5d816457086f2c83f1e8690c68af06"} Apr 16 20:02:22.688376 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:22.688344 2561 scope.go:117] "RemoveContainer" containerID="65d2ba5d1dbb604e6d7ca401abc9c41d0a7e8189a4fea9d693d2e3c84516f636" Apr 16 20:02:22.689443 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:22.689421 2561 generic.go:358] "Generic (PLEG): container finished" podID="4b09c279-8214-4252-86e3-632d35882386" containerID="47f71634ccbabadc78e44b951717b2726823baff68f9dcf4f2813fa7e8d646cf" exitCode=0 Apr 16 20:02:22.689539 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:22.689488 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2" Apr 16 20:02:22.689539 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:22.689509 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2" event={"ID":"4b09c279-8214-4252-86e3-632d35882386","Type":"ContainerDied","Data":"47f71634ccbabadc78e44b951717b2726823baff68f9dcf4f2813fa7e8d646cf"} Apr 16 20:02:22.689654 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:22.689552 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2" event={"ID":"4b09c279-8214-4252-86e3-632d35882386","Type":"ContainerDied","Data":"48060787362f8ebe1231383a71419439a8c4cdce87b60839b622134eb000dfe0"} Apr 16 20:02:22.696811 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:22.696641 2561 scope.go:117] "RemoveContainer" containerID="65d2ba5d1dbb604e6d7ca401abc9c41d0a7e8189a4fea9d693d2e3c84516f636" Apr 16 20:02:22.697025 ip-10-0-128-201 kubenswrapper[2561]: E0416 20:02:22.696943 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65d2ba5d1dbb604e6d7ca401abc9c41d0a7e8189a4fea9d693d2e3c84516f636\": container with ID starting with 65d2ba5d1dbb604e6d7ca401abc9c41d0a7e8189a4fea9d693d2e3c84516f636 not found: ID does not exist" containerID="65d2ba5d1dbb604e6d7ca401abc9c41d0a7e8189a4fea9d693d2e3c84516f636" Apr 16 20:02:22.697025 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:22.696977 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65d2ba5d1dbb604e6d7ca401abc9c41d0a7e8189a4fea9d693d2e3c84516f636"} err="failed to get container status \"65d2ba5d1dbb604e6d7ca401abc9c41d0a7e8189a4fea9d693d2e3c84516f636\": rpc error: code = NotFound desc = could not find container \"65d2ba5d1dbb604e6d7ca401abc9c41d0a7e8189a4fea9d693d2e3c84516f636\": container with ID starting with 65d2ba5d1dbb604e6d7ca401abc9c41d0a7e8189a4fea9d693d2e3c84516f636 not found: ID does not exist" Apr 16 20:02:22.697025 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:22.697004 2561 scope.go:117] "RemoveContainer" containerID="47f71634ccbabadc78e44b951717b2726823baff68f9dcf4f2813fa7e8d646cf" Apr 16 20:02:22.704154 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:22.704138 2561 scope.go:117] "RemoveContainer" containerID="47f71634ccbabadc78e44b951717b2726823baff68f9dcf4f2813fa7e8d646cf" Apr 16 20:02:22.704399 ip-10-0-128-201 kubenswrapper[2561]: E0416 20:02:22.704382 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47f71634ccbabadc78e44b951717b2726823baff68f9dcf4f2813fa7e8d646cf\": container with ID starting with 47f71634ccbabadc78e44b951717b2726823baff68f9dcf4f2813fa7e8d646cf not found: ID does not exist" containerID="47f71634ccbabadc78e44b951717b2726823baff68f9dcf4f2813fa7e8d646cf" Apr 16 20:02:22.704452 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:22.704405 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47f71634ccbabadc78e44b951717b2726823baff68f9dcf4f2813fa7e8d646cf"} err="failed to get container status \"47f71634ccbabadc78e44b951717b2726823baff68f9dcf4f2813fa7e8d646cf\": rpc error: code = NotFound desc = could not find container \"47f71634ccbabadc78e44b951717b2726823baff68f9dcf4f2813fa7e8d646cf\": container with ID starting with 
47f71634ccbabadc78e44b951717b2726823baff68f9dcf4f2813fa7e8d646cf not found: ID does not exist" Apr 16 20:02:22.719487 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:22.719463 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr"] Apr 16 20:02:22.724727 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:22.724706 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1bf21-predictor-6b9945bd54-cggbr"] Apr 16 20:02:22.738563 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:22.738544 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2"] Apr 16 20:02:22.745550 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:22.745530 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1bf21-predictor-6544f99f87-pskk2"] Apr 16 20:02:22.841047 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:22.841020 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11d79e04-a4b3-4879-93e8-8b8729614e6f" path="/var/lib/kubelet/pods/11d79e04-a4b3-4879-93e8-8b8729614e6f/volumes" Apr 16 20:02:22.841258 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:22.841245 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b09c279-8214-4252-86e3-632d35882386" path="/var/lib/kubelet/pods/4b09c279-8214-4252-86e3-632d35882386/volumes" Apr 16 20:02:30.679649 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:30.679612 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3d0d2-predictor-7967c5b545-lq669" podUID="979d7b3d-8bf4-413a-80e4-606aec94cc9a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 20:02:30.680047 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:30.679619 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3d0d2-predictor-78c799bd44-wt5c2" podUID="f1076105-94f3-4033-9308-97f6a7eb942b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 20:02:40.680069 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:40.679971 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3d0d2-predictor-7967c5b545-lq669" podUID="979d7b3d-8bf4-413a-80e4-606aec94cc9a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 20:02:40.680069 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:40.679971 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3d0d2-predictor-78c799bd44-wt5c2" podUID="f1076105-94f3-4033-9308-97f6a7eb942b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 20:02:50.679901 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:50.679848 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3d0d2-predictor-78c799bd44-wt5c2" podUID="f1076105-94f3-4033-9308-97f6a7eb942b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 20:02:50.680379 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:50.679848 2561 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-3d0d2-predictor-7967c5b545-lq669" podUID="979d7b3d-8bf4-413a-80e4-606aec94cc9a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 20:02:54.611768 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:54.611735 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj"] Apr 16 20:02:54.612172 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:54.612049 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" podUID="a77ee7fc-37ba-4669-867d-7b3660de3c9b" containerName="kserve-container" containerID="cri-o://e9f0a9cfeac970e2df3cd099d6b4ca353581e78661687f0db9595bbd978c3a29" gracePeriod=30 Apr 16 20:02:54.622239 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:54.622212 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-99283-predictor-5b9fd86cdf-j7k6b"] Apr 16 20:02:54.622739 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:54.622724 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11d79e04-a4b3-4879-93e8-8b8729614e6f" containerName="kserve-container" Apr 16 20:02:54.622823 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:54.622742 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="11d79e04-a4b3-4879-93e8-8b8729614e6f" containerName="kserve-container" Apr 16 20:02:54.622823 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:54.622764 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b09c279-8214-4252-86e3-632d35882386" containerName="kserve-container" Apr 16 20:02:54.622823 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:54.622772 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b09c279-8214-4252-86e3-632d35882386" containerName="kserve-container" Apr 16 20:02:54.622938 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:54.622880 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b09c279-8214-4252-86e3-632d35882386" containerName="kserve-container" Apr 16 20:02:54.622938 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:54.622898 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="11d79e04-a4b3-4879-93e8-8b8729614e6f" containerName="kserve-container" Apr 16 20:02:54.625842 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:54.625824 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-99283-predictor-5b9fd86cdf-j7k6b" Apr 16 20:02:54.633266 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:54.633123 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-99283-predictor-5b9fd86cdf-j7k6b"] Apr 16 20:02:54.640711 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:54.639229 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-99283-predictor-5b9fd86cdf-j7k6b" Apr 16 20:02:54.719484 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:54.719452 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-99283-predictor-75c5d8c576-2d7bl"] Apr 16 20:02:54.722351 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:54.722276 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-99283-predictor-75c5d8c576-2d7bl" Apr 16 20:02:54.732707 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:54.732658 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-99283-predictor-75c5d8c576-2d7bl"] Apr 16 20:02:54.744725 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:54.741882 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-99283-predictor-75c5d8c576-2d7bl" Apr 16 20:02:54.795916 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:54.795890 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-99283-predictor-5b9fd86cdf-j7k6b"] Apr 16 20:02:54.804941 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:54.804881 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-99283-predictor-5b9fd86cdf-j7k6b" event={"ID":"000ed769-3a85-4a78-a087-5b689ec85d24","Type":"ContainerStarted","Data":"d761dc81a05aad9f7b2036af9d73069eff396838fa78b717e831e5a554dc3c8c"} Apr 16 20:02:54.887935 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:54.887911 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-99283-predictor-75c5d8c576-2d7bl"] Apr 16 20:02:54.890123 ip-10-0-128-201 kubenswrapper[2561]: W0416 20:02:54.890099 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod816ba564_9c86_4b31_9a10_bd1bbad5fe9d.slice/crio-8e11a94b86f59f8548dc37cab8097118b2faec4c64926ec29ce190e4232336e2 WatchSource:0}: Error finding container 8e11a94b86f59f8548dc37cab8097118b2faec4c64926ec29ce190e4232336e2: Status 404 returned error can't find the container with id 8e11a94b86f59f8548dc37cab8097118b2faec4c64926ec29ce190e4232336e2 Apr 16 20:02:55.809473 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:55.809431 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-99283-predictor-75c5d8c576-2d7bl" event={"ID":"816ba564-9c86-4b31-9a10-bd1bbad5fe9d","Type":"ContainerStarted","Data":"c894b9d195f65078b8b2279c62cbb984b477c499f43f6b063f1088a970a3a262"} Apr 16 20:02:55.809473 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:55.809476 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-99283-predictor-75c5d8c576-2d7bl" event={"ID":"816ba564-9c86-4b31-9a10-bd1bbad5fe9d","Type":"ContainerStarted","Data":"8e11a94b86f59f8548dc37cab8097118b2faec4c64926ec29ce190e4232336e2"} Apr 16 20:02:55.810003 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:55.809619 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-99283-predictor-75c5d8c576-2d7bl" Apr 16 20:02:55.810805 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:55.810769 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-99283-predictor-5b9fd86cdf-j7k6b" event={"ID":"000ed769-3a85-4a78-a087-5b689ec85d24","Type":"ContainerStarted","Data":"2fabb793da6a4734783a84a8ee67f7edabd7dba1621e39876091d1d512b5157a"} Apr 16 20:02:55.810930 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:55.810842 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-99283-predictor-75c5d8c576-2d7bl" podUID="816ba564-9c86-4b31-9a10-bd1bbad5fe9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: 
connect: connection refused" Apr 16 20:02:55.811015 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:55.810997 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-99283-predictor-5b9fd86cdf-j7k6b" Apr 16 20:02:55.811938 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:55.811914 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-99283-predictor-5b9fd86cdf-j7k6b" podUID="000ed769-3a85-4a78-a087-5b689ec85d24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 20:02:55.827418 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:55.827371 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-99283-predictor-75c5d8c576-2d7bl" podStartSLOduration=1.827356124 podStartE2EDuration="1.827356124s" podCreationTimestamp="2026-04-16 20:02:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:02:55.825354742 +0000 UTC m=+553.548400648" watchObservedRunningTime="2026-04-16 20:02:55.827356124 +0000 UTC m=+553.550402031" Apr 16 20:02:55.840914 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:55.840869 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-99283-predictor-5b9fd86cdf-j7k6b" podStartSLOduration=1.8408572589999999 podStartE2EDuration="1.840857259s" podCreationTimestamp="2026-04-16 20:02:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:02:55.839323722 +0000 UTC m=+553.562369632" watchObservedRunningTime="2026-04-16 20:02:55.840857259 +0000 UTC m=+553.563903158" Apr 16 20:02:56.432871 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:56.432822 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" podUID="a77ee7fc-37ba-4669-867d-7b3660de3c9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 20:02:56.813811 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:56.813704 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-99283-predictor-5b9fd86cdf-j7k6b" podUID="000ed769-3a85-4a78-a087-5b689ec85d24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 20:02:56.813811 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:56.813716 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-99283-predictor-75c5d8c576-2d7bl" podUID="816ba564-9c86-4b31-9a10-bd1bbad5fe9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 20:02:59.827604 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:59.827548 2561 generic.go:358] "Generic (PLEG): container finished" podID="a77ee7fc-37ba-4669-867d-7b3660de3c9b" containerID="e9f0a9cfeac970e2df3cd099d6b4ca353581e78661687f0db9595bbd978c3a29" exitCode=0 Apr 16 20:02:59.827997 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:59.827706 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" 
event={"ID":"a77ee7fc-37ba-4669-867d-7b3660de3c9b","Type":"ContainerDied","Data":"e9f0a9cfeac970e2df3cd099d6b4ca353581e78661687f0db9595bbd978c3a29"} Apr 16 20:02:59.963256 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:02:59.963234 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" Apr 16 20:03:00.077833 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:00.077731 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a77ee7fc-37ba-4669-867d-7b3660de3c9b-kserve-provision-location\") pod \"a77ee7fc-37ba-4669-867d-7b3660de3c9b\" (UID: \"a77ee7fc-37ba-4669-867d-7b3660de3c9b\") " Apr 16 20:03:00.078097 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:00.078075 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a77ee7fc-37ba-4669-867d-7b3660de3c9b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a77ee7fc-37ba-4669-867d-7b3660de3c9b" (UID: "a77ee7fc-37ba-4669-867d-7b3660de3c9b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:03:00.179001 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:00.178965 2561 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a77ee7fc-37ba-4669-867d-7b3660de3c9b-kserve-provision-location\") on node \"ip-10-0-128-201.ec2.internal\" DevicePath \"\"" Apr 16 20:03:00.679758 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:00.679714 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3d0d2-predictor-7967c5b545-lq669" podUID="979d7b3d-8bf4-413a-80e4-606aec94cc9a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 20:03:00.679954 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:00.679714 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3d0d2-predictor-78c799bd44-wt5c2" podUID="f1076105-94f3-4033-9308-97f6a7eb942b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 20:03:00.836231 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:00.836198 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" Apr 16 20:03:00.840011 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:00.839982 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj" event={"ID":"a77ee7fc-37ba-4669-867d-7b3660de3c9b","Type":"ContainerDied","Data":"0823753646e42194b05680be31942b78e7fb7f804b2f155b9e1178e23b69e554"} Apr 16 20:03:00.840143 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:00.840026 2561 scope.go:117] "RemoveContainer" containerID="e9f0a9cfeac970e2df3cd099d6b4ca353581e78661687f0db9595bbd978c3a29" Apr 16 20:03:00.848476 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:00.848459 2561 scope.go:117] "RemoveContainer" containerID="a0d159240991f20d61729c45a207448342a0b83374f4dffe39a39bcda84c5811" Apr 16 20:03:00.860517 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:00.860489 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj"] Apr 16 20:03:00.866484 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:00.866463 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-74d68d6bdc-vzbzj"] Apr 16 20:03:02.840460 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:02.840427 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a77ee7fc-37ba-4669-867d-7b3660de3c9b" path="/var/lib/kubelet/pods/a77ee7fc-37ba-4669-867d-7b3660de3c9b/volumes" Apr 16 20:03:06.814738 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:06.814698 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-99283-predictor-5b9fd86cdf-j7k6b" podUID="000ed769-3a85-4a78-a087-5b689ec85d24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 20:03:06.815129 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:06.814697 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-99283-predictor-75c5d8c576-2d7bl" podUID="816ba564-9c86-4b31-9a10-bd1bbad5fe9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 20:03:10.679998 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:10.679959 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-3d0d2-predictor-7967c5b545-lq669" Apr 16 20:03:10.680404 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:10.680028 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-3d0d2-predictor-78c799bd44-wt5c2" Apr 16 20:03:16.814327 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:16.814277 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-99283-predictor-75c5d8c576-2d7bl" podUID="816ba564-9c86-4b31-9a10-bd1bbad5fe9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 20:03:16.814688 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:16.814277 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-99283-predictor-5b9fd86cdf-j7k6b" podUID="000ed769-3a85-4a78-a087-5b689ec85d24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 20:03:26.814705 ip-10-0-128-201 
kubenswrapper[2561]: I0416 20:03:26.814659 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-99283-predictor-5b9fd86cdf-j7k6b" podUID="000ed769-3a85-4a78-a087-5b689ec85d24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 20:03:26.815214 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:26.814659 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-99283-predictor-75c5d8c576-2d7bl" podUID="816ba564-9c86-4b31-9a10-bd1bbad5fe9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 20:03:36.814738 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:36.814690 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-99283-predictor-75c5d8c576-2d7bl" podUID="816ba564-9c86-4b31-9a10-bd1bbad5fe9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 20:03:36.815166 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:36.814687 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-99283-predictor-5b9fd86cdf-j7k6b" podUID="000ed769-3a85-4a78-a087-5b689ec85d24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 20:03:42.779610 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:42.779583 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/1.log" Apr 16 20:03:42.781708 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:42.781687 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/1.log" Apr 16 20:03:42.783930 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:42.783900 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-acl-logging/0.log" Apr 16 20:03:42.785983 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:42.785964 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-acl-logging/0.log" Apr 16 20:03:46.814538 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:46.814504 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-99283-predictor-75c5d8c576-2d7bl" Apr 16 20:03:46.815034 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:03:46.814964 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-99283-predictor-5b9fd86cdf-j7k6b" Apr 16 20:08:42.801004 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:08:42.800928 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/1.log" Apr 16 20:08:42.804846 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:08:42.804822 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/1.log" Apr 16 20:08:42.805897 ip-10-0-128-201 
kubenswrapper[2561]: I0416 20:08:42.805874 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-acl-logging/0.log" Apr 16 20:08:42.809253 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:08:42.809235 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-acl-logging/0.log" Apr 16 20:11:43.917126 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:43.917053 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3d0d2-predictor-7967c5b545-lq669"] Apr 16 20:11:43.917603 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:43.917244 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-3d0d2-predictor-7967c5b545-lq669" podUID="979d7b3d-8bf4-413a-80e4-606aec94cc9a" containerName="kserve-container" containerID="cri-o://afb9b92ed4f11438eaaa79d219a32cd166104ecd2618b298ee165d4f9a17603e" gracePeriod=30 Apr 16 20:11:43.981491 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:43.981459 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx"] Apr 16 20:11:43.982194 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:43.982166 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a77ee7fc-37ba-4669-867d-7b3660de3c9b" containerName="storage-initializer" Apr 16 20:11:43.982194 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:43.982196 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="a77ee7fc-37ba-4669-867d-7b3660de3c9b" containerName="storage-initializer" Apr 16 20:11:43.982362 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:43.982226 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a77ee7fc-37ba-4669-867d-7b3660de3c9b" containerName="kserve-container" Apr 16 20:11:43.982362 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:43.982235 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="a77ee7fc-37ba-4669-867d-7b3660de3c9b" containerName="kserve-container" Apr 16 20:11:43.982362 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:43.982332 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="a77ee7fc-37ba-4669-867d-7b3660de3c9b" containerName="kserve-container" Apr 16 20:11:43.985927 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:43.985895 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx" Apr 16 20:11:43.997684 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:43.997664 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx" Apr 16 20:11:44.002928 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:44.002905 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3d0d2-predictor-78c799bd44-wt5c2"] Apr 16 20:11:44.003220 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:44.003193 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-3d0d2-predictor-78c799bd44-wt5c2" podUID="f1076105-94f3-4033-9308-97f6a7eb942b" containerName="kserve-container" containerID="cri-o://9bfd896bffc0ae9a2ca9c00b6aaafbaf2caa1c9e1ea49a7538414c3e4b516159" gracePeriod=30 Apr 16 20:11:44.006500 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:44.006476 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx"] Apr 16 20:11:44.032542 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:44.032511 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k"] Apr 16 20:11:44.035960 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:44.035941 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k" Apr 16 20:11:44.048032 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:44.048004 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k"] Apr 16 20:11:44.050850 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:44.050512 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k" Apr 16 20:11:44.161473 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:44.161143 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx"] Apr 16 20:11:44.167571 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:44.166685 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:11:44.215976 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:44.215931 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k"] Apr 16 20:11:44.217892 ip-10-0-128-201 kubenswrapper[2561]: W0416 20:11:44.217857 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddacdd8d2_efd1_47c1_b6d9_4f87a369513a.slice/crio-8b74528e5eca8581dea3842278bfd18928b8f9de528e845186d7e053bb74c425 WatchSource:0}: Error finding container 8b74528e5eca8581dea3842278bfd18928b8f9de528e845186d7e053bb74c425: Status 404 returned error can't find the container with id 8b74528e5eca8581dea3842278bfd18928b8f9de528e845186d7e053bb74c425 Apr 16 20:11:44.578844 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:44.578744 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k" event={"ID":"dacdd8d2-efd1-47c1-b6d9-4f87a369513a","Type":"ContainerStarted","Data":"fca3a265fc228ff7158539e1904d2de2e2e67b1c8e18d8b5de6ea0a68ecab4ad"} Apr 16 20:11:44.578844 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:44.578780 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k" 
event={"ID":"dacdd8d2-efd1-47c1-b6d9-4f87a369513a","Type":"ContainerStarted","Data":"8b74528e5eca8581dea3842278bfd18928b8f9de528e845186d7e053bb74c425"} Apr 16 20:11:44.579084 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:44.579011 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k" Apr 16 20:11:44.580300 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:44.580264 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k" podUID="dacdd8d2-efd1-47c1-b6d9-4f87a369513a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 20:11:44.580300 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:44.580281 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx" event={"ID":"70a7fd93-0d3e-4360-bfdf-56ad797ba3c6","Type":"ContainerStarted","Data":"f2b6832dd8cec8a78e717c2fae8aa27728511ab869f5bdb94bff024adcc90f31"} Apr 16 20:11:44.580481 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:44.580312 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx" event={"ID":"70a7fd93-0d3e-4360-bfdf-56ad797ba3c6","Type":"ContainerStarted","Data":"890d73f9db809b58c735d4a4f5b5fdcea225c9c487a1fb658fc3e511d1f1ec5e"} Apr 16 20:11:44.580481 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:44.580432 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx" Apr 16 20:11:44.581525 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:44.581496 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx" podUID="70a7fd93-0d3e-4360-bfdf-56ad797ba3c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 20:11:44.611301 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:44.611248 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k" podStartSLOduration=0.611232928 podStartE2EDuration="611.232928ms" podCreationTimestamp="2026-04-16 20:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:11:44.596155368 +0000 UTC m=+1082.319201275" watchObservedRunningTime="2026-04-16 20:11:44.611232928 +0000 UTC m=+1082.334278836" Apr 16 20:11:44.613027 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:44.612996 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx" podStartSLOduration=1.6129876300000001 podStartE2EDuration="1.61298763s" podCreationTimestamp="2026-04-16 20:11:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:11:44.610929058 +0000 UTC m=+1082.333974963" watchObservedRunningTime="2026-04-16 20:11:44.61298763 +0000 UTC m=+1082.336033536" Apr 16 20:11:45.584174 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:45.584130 2561 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx" podUID="70a7fd93-0d3e-4360-bfdf-56ad797ba3c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 20:11:45.584612 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:45.584248 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k" podUID="dacdd8d2-efd1-47c1-b6d9-4f87a369513a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 20:11:47.155182 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:47.155162 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3d0d2-predictor-78c799bd44-wt5c2" Apr 16 20:11:47.264714 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:47.264690 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3d0d2-predictor-7967c5b545-lq669" Apr 16 20:11:47.592292 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:47.592215 2561 generic.go:358] "Generic (PLEG): container finished" podID="f1076105-94f3-4033-9308-97f6a7eb942b" containerID="9bfd896bffc0ae9a2ca9c00b6aaafbaf2caa1c9e1ea49a7538414c3e4b516159" exitCode=0 Apr 16 20:11:47.592292 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:47.592280 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3d0d2-predictor-78c799bd44-wt5c2" Apr 16 20:11:47.592506 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:47.592294 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3d0d2-predictor-78c799bd44-wt5c2" event={"ID":"f1076105-94f3-4033-9308-97f6a7eb942b","Type":"ContainerDied","Data":"9bfd896bffc0ae9a2ca9c00b6aaafbaf2caa1c9e1ea49a7538414c3e4b516159"} Apr 16 20:11:47.592506 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:47.592335 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3d0d2-predictor-78c799bd44-wt5c2" event={"ID":"f1076105-94f3-4033-9308-97f6a7eb942b","Type":"ContainerDied","Data":"0c41045af25d174eab0f8d814be7c423103f0a9f400b82bdb449cae01f5fda45"} Apr 16 20:11:47.592506 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:47.592355 2561 scope.go:117] "RemoveContainer" containerID="9bfd896bffc0ae9a2ca9c00b6aaafbaf2caa1c9e1ea49a7538414c3e4b516159" Apr 16 20:11:47.593429 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:47.593391 2561 generic.go:358] "Generic (PLEG): container finished" podID="979d7b3d-8bf4-413a-80e4-606aec94cc9a" containerID="afb9b92ed4f11438eaaa79d219a32cd166104ecd2618b298ee165d4f9a17603e" exitCode=0 Apr 16 20:11:47.593509 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:47.593454 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3d0d2-predictor-7967c5b545-lq669" Apr 16 20:11:47.593509 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:47.593453 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3d0d2-predictor-7967c5b545-lq669" event={"ID":"979d7b3d-8bf4-413a-80e4-606aec94cc9a","Type":"ContainerDied","Data":"afb9b92ed4f11438eaaa79d219a32cd166104ecd2618b298ee165d4f9a17603e"} Apr 16 20:11:47.593509 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:47.593494 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3d0d2-predictor-7967c5b545-lq669" event={"ID":"979d7b3d-8bf4-413a-80e4-606aec94cc9a","Type":"ContainerDied","Data":"a8dc8a2a7cdbb591a28f5489364c1e3eeb45a63b2cfa7ae797391f298f893f33"} Apr 16 20:11:47.601228 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:47.601211 2561 scope.go:117] "RemoveContainer" containerID="9bfd896bffc0ae9a2ca9c00b6aaafbaf2caa1c9e1ea49a7538414c3e4b516159" Apr 16 20:11:47.601468 ip-10-0-128-201 kubenswrapper[2561]: E0416 20:11:47.601450 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bfd896bffc0ae9a2ca9c00b6aaafbaf2caa1c9e1ea49a7538414c3e4b516159\": container with ID starting with 9bfd896bffc0ae9a2ca9c00b6aaafbaf2caa1c9e1ea49a7538414c3e4b516159 not found: ID does not exist" containerID="9bfd896bffc0ae9a2ca9c00b6aaafbaf2caa1c9e1ea49a7538414c3e4b516159" Apr 16 20:11:47.601538 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:47.601475 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bfd896bffc0ae9a2ca9c00b6aaafbaf2caa1c9e1ea49a7538414c3e4b516159"} err="failed to get container status \"9bfd896bffc0ae9a2ca9c00b6aaafbaf2caa1c9e1ea49a7538414c3e4b516159\": rpc error: code = NotFound desc = could not find container \"9bfd896bffc0ae9a2ca9c00b6aaafbaf2caa1c9e1ea49a7538414c3e4b516159\": container with ID starting with 9bfd896bffc0ae9a2ca9c00b6aaafbaf2caa1c9e1ea49a7538414c3e4b516159 not found: ID does not exist" Apr 16 20:11:47.601538 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:47.601492 2561 scope.go:117] "RemoveContainer" containerID="afb9b92ed4f11438eaaa79d219a32cd166104ecd2618b298ee165d4f9a17603e" Apr 16 20:11:47.608598 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:47.608581 2561 scope.go:117] "RemoveContainer" containerID="afb9b92ed4f11438eaaa79d219a32cd166104ecd2618b298ee165d4f9a17603e" Apr 16 20:11:47.608848 ip-10-0-128-201 kubenswrapper[2561]: E0416 20:11:47.608826 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afb9b92ed4f11438eaaa79d219a32cd166104ecd2618b298ee165d4f9a17603e\": container with ID starting with afb9b92ed4f11438eaaa79d219a32cd166104ecd2618b298ee165d4f9a17603e not found: ID does not exist" containerID="afb9b92ed4f11438eaaa79d219a32cd166104ecd2618b298ee165d4f9a17603e" Apr 16 20:11:47.608946 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:47.608852 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afb9b92ed4f11438eaaa79d219a32cd166104ecd2618b298ee165d4f9a17603e"} err="failed to get container status \"afb9b92ed4f11438eaaa79d219a32cd166104ecd2618b298ee165d4f9a17603e\": rpc error: code = NotFound desc = could not find container \"afb9b92ed4f11438eaaa79d219a32cd166104ecd2618b298ee165d4f9a17603e\": container with ID starting with 
afb9b92ed4f11438eaaa79d219a32cd166104ecd2618b298ee165d4f9a17603e not found: ID does not exist" Apr 16 20:11:47.617899 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:47.617879 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3d0d2-predictor-7967c5b545-lq669"] Apr 16 20:11:47.622065 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:47.622046 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3d0d2-predictor-7967c5b545-lq669"] Apr 16 20:11:47.632933 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:47.632908 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3d0d2-predictor-78c799bd44-wt5c2"] Apr 16 20:11:47.636126 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:47.636103 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3d0d2-predictor-78c799bd44-wt5c2"] Apr 16 20:11:48.839428 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:48.839393 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="979d7b3d-8bf4-413a-80e4-606aec94cc9a" path="/var/lib/kubelet/pods/979d7b3d-8bf4-413a-80e4-606aec94cc9a/volumes" Apr 16 20:11:48.839769 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:48.839632 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1076105-94f3-4033-9308-97f6a7eb942b" path="/var/lib/kubelet/pods/f1076105-94f3-4033-9308-97f6a7eb942b/volumes" Apr 16 20:11:55.584589 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:55.584546 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx" podUID="70a7fd93-0d3e-4360-bfdf-56ad797ba3c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 20:11:55.585007 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:11:55.584546 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k" podUID="dacdd8d2-efd1-47c1-b6d9-4f87a369513a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 20:12:05.585234 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:05.585186 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx" podUID="70a7fd93-0d3e-4360-bfdf-56ad797ba3c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 20:12:05.585635 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:05.585196 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k" podUID="dacdd8d2-efd1-47c1-b6d9-4f87a369513a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 20:12:15.584940 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:15.584898 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx" podUID="70a7fd93-0d3e-4360-bfdf-56ad797ba3c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 20:12:15.585389 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:15.584905 2561 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k" podUID="dacdd8d2-efd1-47c1-b6d9-4f87a369513a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 20:12:19.518855 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:19.518823 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-99283-predictor-75c5d8c576-2d7bl"] Apr 16 20:12:19.519286 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:19.519116 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-99283-predictor-75c5d8c576-2d7bl" podUID="816ba564-9c86-4b31-9a10-bd1bbad5fe9d" containerName="kserve-container" containerID="cri-o://c894b9d195f65078b8b2279c62cbb984b477c499f43f6b063f1088a970a3a262" gracePeriod=30 Apr 16 20:12:19.568373 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:19.568340 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-99283-predictor-5b9fd86cdf-j7k6b"] Apr 16 20:12:19.568653 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:19.568629 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-99283-predictor-5b9fd86cdf-j7k6b" podUID="000ed769-3a85-4a78-a087-5b689ec85d24" containerName="kserve-container" containerID="cri-o://2fabb793da6a4734783a84a8ee67f7edabd7dba1621e39876091d1d512b5157a" gracePeriod=30 Apr 16 20:12:19.578822 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:19.578776 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr"] Apr 16 20:12:19.579145 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:19.579133 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="979d7b3d-8bf4-413a-80e4-606aec94cc9a" containerName="kserve-container" Apr 16 20:12:19.579187 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:19.579146 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="979d7b3d-8bf4-413a-80e4-606aec94cc9a" containerName="kserve-container" Apr 16 20:12:19.579187 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:19.579159 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1076105-94f3-4033-9308-97f6a7eb942b" containerName="kserve-container" Apr 16 20:12:19.579187 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:19.579164 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1076105-94f3-4033-9308-97f6a7eb942b" containerName="kserve-container" Apr 16 20:12:19.579304 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:19.579233 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="979d7b3d-8bf4-413a-80e4-606aec94cc9a" containerName="kserve-container" Apr 16 20:12:19.579304 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:19.579243 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1076105-94f3-4033-9308-97f6a7eb942b" containerName="kserve-container" Apr 16 20:12:19.582390 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:19.582373 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr" Apr 16 20:12:19.591410 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:19.591389 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr"] Apr 16 20:12:19.593460 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:19.593445 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr" Apr 16 20:12:19.657126 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:19.657103 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2"] Apr 16 20:12:19.661680 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:19.661661 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2" Apr 16 20:12:19.669688 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:19.669405 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2"] Apr 16 20:12:19.681137 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:19.681106 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2" Apr 16 20:12:19.741956 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:19.741924 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr"] Apr 16 20:12:19.744169 ip-10-0-128-201 kubenswrapper[2561]: W0416 20:12:19.744137 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod824c77c0_f514_44fa_9c4a_fe78e3de0d02.slice/crio-fd322350887213679a700c318f6fa1f258c3246b2b87156e370a14363ea3bc83 WatchSource:0}: Error finding container fd322350887213679a700c318f6fa1f258c3246b2b87156e370a14363ea3bc83: Status 404 returned error can't find the container with id fd322350887213679a700c318f6fa1f258c3246b2b87156e370a14363ea3bc83 Apr 16 20:12:19.869820 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:19.869772 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2"] Apr 16 20:12:19.872114 ip-10-0-128-201 kubenswrapper[2561]: W0416 20:12:19.872089 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2767c4e9_3ec6_4973_8a0d_7a65a375385e.slice/crio-4dc9136b5b8e355c8c47985417966f1de1841c4bd27267d6ba7a6a2c84bbd523 WatchSource:0}: Error finding container 4dc9136b5b8e355c8c47985417966f1de1841c4bd27267d6ba7a6a2c84bbd523: Status 404 returned error can't find the container with id 4dc9136b5b8e355c8c47985417966f1de1841c4bd27267d6ba7a6a2c84bbd523 Apr 16 20:12:20.705589 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:20.705553 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2" event={"ID":"2767c4e9-3ec6-4973-8a0d-7a65a375385e","Type":"ContainerStarted","Data":"acedbbc976bbcf7cfda077236d96b60dedcde32bf2295d26f7c5913f1c097b1d"} Apr 16 20:12:20.705589 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:20.705589 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2" 
event={"ID":"2767c4e9-3ec6-4973-8a0d-7a65a375385e","Type":"ContainerStarted","Data":"4dc9136b5b8e355c8c47985417966f1de1841c4bd27267d6ba7a6a2c84bbd523"} Apr 16 20:12:20.706067 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:20.705605 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2" Apr 16 20:12:20.707018 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:20.706984 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr" event={"ID":"824c77c0-f514-44fa-9c4a-fe78e3de0d02","Type":"ContainerStarted","Data":"45ebb94b0015904ed91034f8bf45bd515eee48e2517e79ca14644ff92577ef03"} Apr 16 20:12:20.707151 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:20.707022 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr" event={"ID":"824c77c0-f514-44fa-9c4a-fe78e3de0d02","Type":"ContainerStarted","Data":"fd322350887213679a700c318f6fa1f258c3246b2b87156e370a14363ea3bc83"} Apr 16 20:12:20.707222 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:20.707187 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr" Apr 16 20:12:20.707277 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:20.707252 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2" podUID="2767c4e9-3ec6-4973-8a0d-7a65a375385e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 20:12:20.708039 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:20.708018 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr" podUID="824c77c0-f514-44fa-9c4a-fe78e3de0d02" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 20:12:20.723976 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:20.723937 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2" podStartSLOduration=1.723927148 podStartE2EDuration="1.723927148s" podCreationTimestamp="2026-04-16 20:12:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:12:20.722038105 +0000 UTC m=+1118.445084011" watchObservedRunningTime="2026-04-16 20:12:20.723927148 +0000 UTC m=+1118.446973053" Apr 16 20:12:20.737728 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:20.737689 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr" podStartSLOduration=1.737677632 podStartE2EDuration="1.737677632s" podCreationTimestamp="2026-04-16 20:12:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:12:20.737240246 +0000 UTC m=+1118.460286154" watchObservedRunningTime="2026-04-16 20:12:20.737677632 +0000 UTC m=+1118.460723608" Apr 16 20:12:21.711319 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:21.711276 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2" 
podUID="2767c4e9-3ec6-4973-8a0d-7a65a375385e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 20:12:21.711319 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:21.711275 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr" podUID="824c77c0-f514-44fa-9c4a-fe78e3de0d02" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 20:12:23.380060 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:23.380035 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-99283-predictor-75c5d8c576-2d7bl" Apr 16 20:12:23.433184 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:23.433156 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-99283-predictor-5b9fd86cdf-j7k6b" Apr 16 20:12:23.719073 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:23.718992 2561 generic.go:358] "Generic (PLEG): container finished" podID="816ba564-9c86-4b31-9a10-bd1bbad5fe9d" containerID="c894b9d195f65078b8b2279c62cbb984b477c499f43f6b063f1088a970a3a262" exitCode=0 Apr 16 20:12:23.719073 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:23.719054 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-99283-predictor-75c5d8c576-2d7bl" Apr 16 20:12:23.719294 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:23.719083 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-99283-predictor-75c5d8c576-2d7bl" event={"ID":"816ba564-9c86-4b31-9a10-bd1bbad5fe9d","Type":"ContainerDied","Data":"c894b9d195f65078b8b2279c62cbb984b477c499f43f6b063f1088a970a3a262"} Apr 16 20:12:23.719294 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:23.719120 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-99283-predictor-75c5d8c576-2d7bl" event={"ID":"816ba564-9c86-4b31-9a10-bd1bbad5fe9d","Type":"ContainerDied","Data":"8e11a94b86f59f8548dc37cab8097118b2faec4c64926ec29ce190e4232336e2"} Apr 16 20:12:23.719294 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:23.719145 2561 scope.go:117] "RemoveContainer" containerID="c894b9d195f65078b8b2279c62cbb984b477c499f43f6b063f1088a970a3a262" Apr 16 20:12:23.720136 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:23.720108 2561 generic.go:358] "Generic (PLEG): container finished" podID="000ed769-3a85-4a78-a087-5b689ec85d24" containerID="2fabb793da6a4734783a84a8ee67f7edabd7dba1621e39876091d1d512b5157a" exitCode=0 Apr 16 20:12:23.720207 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:23.720154 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-99283-predictor-5b9fd86cdf-j7k6b" Apr 16 20:12:23.720207 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:23.720187 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-99283-predictor-5b9fd86cdf-j7k6b" event={"ID":"000ed769-3a85-4a78-a087-5b689ec85d24","Type":"ContainerDied","Data":"2fabb793da6a4734783a84a8ee67f7edabd7dba1621e39876091d1d512b5157a"} Apr 16 20:12:23.720275 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:23.720221 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-99283-predictor-5b9fd86cdf-j7k6b" event={"ID":"000ed769-3a85-4a78-a087-5b689ec85d24","Type":"ContainerDied","Data":"d761dc81a05aad9f7b2036af9d73069eff396838fa78b717e831e5a554dc3c8c"} Apr 16 20:12:23.726820 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:23.726782 2561 scope.go:117] "RemoveContainer" containerID="c894b9d195f65078b8b2279c62cbb984b477c499f43f6b063f1088a970a3a262" Apr 16 20:12:23.729480 ip-10-0-128-201 kubenswrapper[2561]: E0416 20:12:23.727489 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c894b9d195f65078b8b2279c62cbb984b477c499f43f6b063f1088a970a3a262\": container with ID starting with c894b9d195f65078b8b2279c62cbb984b477c499f43f6b063f1088a970a3a262 not found: ID does not exist" containerID="c894b9d195f65078b8b2279c62cbb984b477c499f43f6b063f1088a970a3a262" Apr 16 20:12:23.729480 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:23.727525 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c894b9d195f65078b8b2279c62cbb984b477c499f43f6b063f1088a970a3a262"} err="failed to get container status \"c894b9d195f65078b8b2279c62cbb984b477c499f43f6b063f1088a970a3a262\": rpc error: code = NotFound desc = could not find container \"c894b9d195f65078b8b2279c62cbb984b477c499f43f6b063f1088a970a3a262\": container with ID starting with c894b9d195f65078b8b2279c62cbb984b477c499f43f6b063f1088a970a3a262 not found: ID does not exist" Apr 16 20:12:23.729480 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:23.727549 2561 scope.go:117] "RemoveContainer" containerID="2fabb793da6a4734783a84a8ee67f7edabd7dba1621e39876091d1d512b5157a" Apr 16 20:12:23.736975 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:23.736952 2561 scope.go:117] "RemoveContainer" containerID="2fabb793da6a4734783a84a8ee67f7edabd7dba1621e39876091d1d512b5157a" Apr 16 20:12:23.737227 ip-10-0-128-201 kubenswrapper[2561]: E0416 20:12:23.737210 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fabb793da6a4734783a84a8ee67f7edabd7dba1621e39876091d1d512b5157a\": container with ID starting with 2fabb793da6a4734783a84a8ee67f7edabd7dba1621e39876091d1d512b5157a not found: ID does not exist" containerID="2fabb793da6a4734783a84a8ee67f7edabd7dba1621e39876091d1d512b5157a" Apr 16 20:12:23.737288 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:23.737232 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fabb793da6a4734783a84a8ee67f7edabd7dba1621e39876091d1d512b5157a"} err="failed to get container status \"2fabb793da6a4734783a84a8ee67f7edabd7dba1621e39876091d1d512b5157a\": rpc error: code = NotFound desc = could not find container \"2fabb793da6a4734783a84a8ee67f7edabd7dba1621e39876091d1d512b5157a\": container with ID starting with 
2fabb793da6a4734783a84a8ee67f7edabd7dba1621e39876091d1d512b5157a not found: ID does not exist"
Apr 16 20:12:23.747071 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:23.747050 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-99283-predictor-75c5d8c576-2d7bl"]
Apr 16 20:12:23.750740 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:23.750720 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-99283-predictor-75c5d8c576-2d7bl"]
Apr 16 20:12:23.762383 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:23.762357 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-99283-predictor-5b9fd86cdf-j7k6b"]
Apr 16 20:12:23.764470 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:23.764447 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-99283-predictor-5b9fd86cdf-j7k6b"]
Apr 16 20:12:24.840039 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:24.840005 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="000ed769-3a85-4a78-a087-5b689ec85d24" path="/var/lib/kubelet/pods/000ed769-3a85-4a78-a087-5b689ec85d24/volumes"
Apr 16 20:12:24.840379 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:24.840225 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="816ba564-9c86-4b31-9a10-bd1bbad5fe9d" path="/var/lib/kubelet/pods/816ba564-9c86-4b31-9a10-bd1bbad5fe9d/volumes"
Apr 16 20:12:25.585185 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:25.585145 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k" podUID="dacdd8d2-efd1-47c1-b6d9-4f87a369513a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 16 20:12:25.585379 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:25.585144 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx" podUID="70a7fd93-0d3e-4360-bfdf-56ad797ba3c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 16 20:12:31.711831 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:31.711763 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2" podUID="2767c4e9-3ec6-4973-8a0d-7a65a375385e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused"
Apr 16 20:12:31.712273 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:31.711765 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr" podUID="824c77c0-f514-44fa-9c4a-fe78e3de0d02" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused"
Apr 16 20:12:35.585403 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:35.585365 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx"
Apr 16 20:12:35.585897 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:35.585760 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k"
Apr 16 20:12:41.711388 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:41.711340 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr" podUID="824c77c0-f514-44fa-9c4a-fe78e3de0d02" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused"
Apr 16 20:12:41.711887 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:41.711341 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2" podUID="2767c4e9-3ec6-4973-8a0d-7a65a375385e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused"
Apr 16 20:12:51.712053 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:51.712005 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2" podUID="2767c4e9-3ec6-4973-8a0d-7a65a375385e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused"
Apr 16 20:12:51.712536 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:12:51.712005 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr" podUID="824c77c0-f514-44fa-9c4a-fe78e3de0d02" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused"
Apr 16 20:13:01.711635 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:01.711532 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2" podUID="2767c4e9-3ec6-4973-8a0d-7a65a375385e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused"
Apr 16 20:13:01.712038 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:01.711539 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr" podUID="824c77c0-f514-44fa-9c4a-fe78e3de0d02" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused"
Apr 16 20:13:04.177012 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.176973 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k"]
Apr 16 20:13:04.177416 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.177269 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k" podUID="dacdd8d2-efd1-47c1-b6d9-4f87a369513a" containerName="kserve-container" containerID="cri-o://fca3a265fc228ff7158539e1904d2de2e2e67b1c8e18d8b5de6ea0a68ecab4ad" gracePeriod=30
Apr 16 20:13:04.198281 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.198243 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-12b34-predictor-954f5b995-lqhrg"]
Apr 16 20:13:04.198632 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.198611 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="000ed769-3a85-4a78-a087-5b689ec85d24" containerName="kserve-container"
Apr 16 20:13:04.198632 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.198631 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="000ed769-3a85-4a78-a087-5b689ec85d24" containerName="kserve-container"
Apr 16 20:13:04.198872 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.198657 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="816ba564-9c86-4b31-9a10-bd1bbad5fe9d" containerName="kserve-container"
Apr 16 20:13:04.198872 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.198666 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="816ba564-9c86-4b31-9a10-bd1bbad5fe9d" containerName="kserve-container"
Apr 16 20:13:04.198872 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.198733 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="000ed769-3a85-4a78-a087-5b689ec85d24" containerName="kserve-container"
Apr 16 20:13:04.198872 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.198747 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="816ba564-9c86-4b31-9a10-bd1bbad5fe9d" containerName="kserve-container"
Apr 16 20:13:04.203920 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.203890 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-12b34-predictor-954f5b995-lqhrg"
Apr 16 20:13:04.212930 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.212491 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-12b34-predictor-954f5b995-lqhrg"]
Apr 16 20:13:04.217291 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.217265 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-12b34-predictor-954f5b995-lqhrg"
Apr 16 20:13:04.234957 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.234916 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx"]
Apr 16 20:13:04.235281 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.235227 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx" podUID="70a7fd93-0d3e-4360-bfdf-56ad797ba3c6" containerName="kserve-container" containerID="cri-o://f2b6832dd8cec8a78e717c2fae8aa27728511ab869f5bdb94bff024adcc90f31" gracePeriod=30
Apr 16 20:13:04.283488 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.283156 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-12b34-predictor-558d6f8bf8-mzwqn"]
Apr 16 20:13:04.288522 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.288436 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-12b34-predictor-558d6f8bf8-mzwqn"
Apr 16 20:13:04.295678 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.295630 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-12b34-predictor-558d6f8bf8-mzwqn"]
Apr 16 20:13:04.306758 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.306345 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-12b34-predictor-558d6f8bf8-mzwqn"
Apr 16 20:13:04.395650 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.395627 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-12b34-predictor-954f5b995-lqhrg"]
Apr 16 20:13:04.398184 ip-10-0-128-201 kubenswrapper[2561]: W0416 20:13:04.398152 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc73502e_f8bc_4e27_b5b5_8e91aab00243.slice/crio-10b9f73ac4968195f9702997e831f8f1770babca4b8d22971654828047e976a1 WatchSource:0}: Error finding container 10b9f73ac4968195f9702997e831f8f1770babca4b8d22971654828047e976a1: Status 404 returned error can't find the container with id 10b9f73ac4968195f9702997e831f8f1770babca4b8d22971654828047e976a1
Apr 16 20:13:04.479644 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.479617 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-12b34-predictor-558d6f8bf8-mzwqn"]
Apr 16 20:13:04.482162 ip-10-0-128-201 kubenswrapper[2561]: W0416 20:13:04.482131 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb181d931_60a8_48dc_abce_f5911d25b5b9.slice/crio-413fb151dfd284706abb85c788cf49f550e6a94d227fc3bb44c78be9c9aa0a4a WatchSource:0}: Error finding container 413fb151dfd284706abb85c788cf49f550e6a94d227fc3bb44c78be9c9aa0a4a: Status 404 returned error can't find the container with id 413fb151dfd284706abb85c788cf49f550e6a94d227fc3bb44c78be9c9aa0a4a
Apr 16 20:13:04.865859 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.865753 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-12b34-predictor-558d6f8bf8-mzwqn" event={"ID":"b181d931-60a8-48dc-abce-f5911d25b5b9","Type":"ContainerStarted","Data":"f4c6a183d89e6f19ae3f4c3776acfd9cddfaea5e19fd82eac6486431bfbd8b95"}
Apr 16 20:13:04.865859 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.865822 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-12b34-predictor-558d6f8bf8-mzwqn" event={"ID":"b181d931-60a8-48dc-abce-f5911d25b5b9","Type":"ContainerStarted","Data":"413fb151dfd284706abb85c788cf49f550e6a94d227fc3bb44c78be9c9aa0a4a"}
Apr 16 20:13:04.866092 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.865952 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-12b34-predictor-558d6f8bf8-mzwqn"
Apr 16 20:13:04.867070 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.867045 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-12b34-predictor-954f5b995-lqhrg" event={"ID":"fc73502e-f8bc-4e27-b5b5-8e91aab00243","Type":"ContainerStarted","Data":"cb91761e18a1f1cb2a4c76a69dc0397be9faef882b3ac615cb5128c6fb1d28df"}
Apr 16 20:13:04.867187 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.867078 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-12b34-predictor-954f5b995-lqhrg" event={"ID":"fc73502e-f8bc-4e27-b5b5-8e91aab00243","Type":"ContainerStarted","Data":"10b9f73ac4968195f9702997e831f8f1770babca4b8d22971654828047e976a1"}
Apr 16 20:13:04.867312 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.867288 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-12b34-predictor-954f5b995-lqhrg"
Apr 16 20:13:04.867432 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.867339 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-12b34-predictor-558d6f8bf8-mzwqn" podUID="b181d931-60a8-48dc-abce-f5911d25b5b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 16 20:13:04.868340 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.868314 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-12b34-predictor-954f5b995-lqhrg" podUID="fc73502e-f8bc-4e27-b5b5-8e91aab00243" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 16 20:13:04.885694 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.885641 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-12b34-predictor-558d6f8bf8-mzwqn" podStartSLOduration=0.88562434 podStartE2EDuration="885.62434ms" podCreationTimestamp="2026-04-16 20:13:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:13:04.88476921 +0000 UTC m=+1162.607815115" watchObservedRunningTime="2026-04-16 20:13:04.88562434 +0000 UTC m=+1162.608670246"
Apr 16 20:13:04.901952 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:04.901906 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-12b34-predictor-954f5b995-lqhrg" podStartSLOduration=0.901889816 podStartE2EDuration="901.889816ms" podCreationTimestamp="2026-04-16 20:13:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:13:04.90030473 +0000 UTC m=+1162.623350638" watchObservedRunningTime="2026-04-16 20:13:04.901889816 +0000 UTC m=+1162.624935718"
Apr 16 20:13:05.584970 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:05.584933 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx" podUID="70a7fd93-0d3e-4360-bfdf-56ad797ba3c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 16 20:13:05.585333 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:05.584933 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k" podUID="dacdd8d2-efd1-47c1-b6d9-4f87a369513a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 16 20:13:05.870736 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:05.870638 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-12b34-predictor-954f5b995-lqhrg" podUID="fc73502e-f8bc-4e27-b5b5-8e91aab00243" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 16 20:13:05.870736 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:05.870670 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-12b34-predictor-558d6f8bf8-mzwqn" podUID="b181d931-60a8-48dc-abce-f5911d25b5b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 16 20:13:07.619886 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:07.619862 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k"
Apr 16 20:13:07.877651 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:07.877619 2561 generic.go:358] "Generic (PLEG): container finished" podID="dacdd8d2-efd1-47c1-b6d9-4f87a369513a" containerID="fca3a265fc228ff7158539e1904d2de2e2e67b1c8e18d8b5de6ea0a68ecab4ad" exitCode=0
Apr 16 20:13:07.877816 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:07.877686 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k"
Apr 16 20:13:07.877816 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:07.877707 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k" event={"ID":"dacdd8d2-efd1-47c1-b6d9-4f87a369513a","Type":"ContainerDied","Data":"fca3a265fc228ff7158539e1904d2de2e2e67b1c8e18d8b5de6ea0a68ecab4ad"}
Apr 16 20:13:07.877816 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:07.877748 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k" event={"ID":"dacdd8d2-efd1-47c1-b6d9-4f87a369513a","Type":"ContainerDied","Data":"8b74528e5eca8581dea3842278bfd18928b8f9de528e845186d7e053bb74c425"}
Apr 16 20:13:07.877816 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:07.877764 2561 scope.go:117] "RemoveContainer" containerID="fca3a265fc228ff7158539e1904d2de2e2e67b1c8e18d8b5de6ea0a68ecab4ad"
Apr 16 20:13:07.885507 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:07.885484 2561 scope.go:117] "RemoveContainer" containerID="fca3a265fc228ff7158539e1904d2de2e2e67b1c8e18d8b5de6ea0a68ecab4ad"
Apr 16 20:13:07.885841 ip-10-0-128-201 kubenswrapper[2561]: E0416 20:13:07.885823 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fca3a265fc228ff7158539e1904d2de2e2e67b1c8e18d8b5de6ea0a68ecab4ad\": container with ID starting with fca3a265fc228ff7158539e1904d2de2e2e67b1c8e18d8b5de6ea0a68ecab4ad not found: ID does not exist" containerID="fca3a265fc228ff7158539e1904d2de2e2e67b1c8e18d8b5de6ea0a68ecab4ad"
Apr 16 20:13:07.885922 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:07.885852 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fca3a265fc228ff7158539e1904d2de2e2e67b1c8e18d8b5de6ea0a68ecab4ad"} err="failed to get container status \"fca3a265fc228ff7158539e1904d2de2e2e67b1c8e18d8b5de6ea0a68ecab4ad\": rpc error: code = NotFound desc = could not find container \"fca3a265fc228ff7158539e1904d2de2e2e67b1c8e18d8b5de6ea0a68ecab4ad\": container with ID starting with fca3a265fc228ff7158539e1904d2de2e2e67b1c8e18d8b5de6ea0a68ecab4ad not found: ID does not exist"
Apr 16 20:13:07.900946 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:07.900924 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k"]
Apr 16 20:13:07.904136 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:07.904115 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-de58a-predictor-56d6b9bf76-pws6k"]
Apr 16 20:13:08.284658 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:08.284638 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx"
Apr 16 20:13:08.840175 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:08.840141 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dacdd8d2-efd1-47c1-b6d9-4f87a369513a" path="/var/lib/kubelet/pods/dacdd8d2-efd1-47c1-b6d9-4f87a369513a/volumes"
Apr 16 20:13:08.882478 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:08.882449 2561 generic.go:358] "Generic (PLEG): container finished" podID="70a7fd93-0d3e-4360-bfdf-56ad797ba3c6" containerID="f2b6832dd8cec8a78e717c2fae8aa27728511ab869f5bdb94bff024adcc90f31" exitCode=0
Apr 16 20:13:08.882590 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:08.882505 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx"
Apr 16 20:13:08.882590 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:08.882529 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx" event={"ID":"70a7fd93-0d3e-4360-bfdf-56ad797ba3c6","Type":"ContainerDied","Data":"f2b6832dd8cec8a78e717c2fae8aa27728511ab869f5bdb94bff024adcc90f31"}
Apr 16 20:13:08.882590 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:08.882562 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx" event={"ID":"70a7fd93-0d3e-4360-bfdf-56ad797ba3c6","Type":"ContainerDied","Data":"890d73f9db809b58c735d4a4f5b5fdcea225c9c487a1fb658fc3e511d1f1ec5e"}
Apr 16 20:13:08.882590 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:08.882579 2561 scope.go:117] "RemoveContainer" containerID="f2b6832dd8cec8a78e717c2fae8aa27728511ab869f5bdb94bff024adcc90f31"
Apr 16 20:13:08.890822 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:08.890803 2561 scope.go:117] "RemoveContainer" containerID="f2b6832dd8cec8a78e717c2fae8aa27728511ab869f5bdb94bff024adcc90f31"
Apr 16 20:13:08.891118 ip-10-0-128-201 kubenswrapper[2561]: E0416 20:13:08.891100 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2b6832dd8cec8a78e717c2fae8aa27728511ab869f5bdb94bff024adcc90f31\": container with ID starting with f2b6832dd8cec8a78e717c2fae8aa27728511ab869f5bdb94bff024adcc90f31 not found: ID does not exist" containerID="f2b6832dd8cec8a78e717c2fae8aa27728511ab869f5bdb94bff024adcc90f31"
Apr 16 20:13:08.891204 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:08.891125 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2b6832dd8cec8a78e717c2fae8aa27728511ab869f5bdb94bff024adcc90f31"} err="failed to get container status \"f2b6832dd8cec8a78e717c2fae8aa27728511ab869f5bdb94bff024adcc90f31\": rpc error: code = NotFound desc = could not find container \"f2b6832dd8cec8a78e717c2fae8aa27728511ab869f5bdb94bff024adcc90f31\": container with ID starting with f2b6832dd8cec8a78e717c2fae8aa27728511ab869f5bdb94bff024adcc90f31 not found: ID does not exist"
Apr 16 20:13:08.902179 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:08.902151 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx"]
Apr 16 20:13:08.904706 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:08.904687 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-de58a-predictor-5959f57c5c-pfhmx"]
Apr 16 20:13:10.839572 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:10.839539 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70a7fd93-0d3e-4360-bfdf-56ad797ba3c6" path="/var/lib/kubelet/pods/70a7fd93-0d3e-4360-bfdf-56ad797ba3c6/volumes"
Apr 16 20:13:11.712458 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:11.712430 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr"
Apr 16 20:13:11.712914 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:11.712897 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2"
Apr 16 20:13:15.871401 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:15.871364 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-12b34-predictor-954f5b995-lqhrg" podUID="fc73502e-f8bc-4e27-b5b5-8e91aab00243" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 16 20:13:15.871784 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:15.871365 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-12b34-predictor-558d6f8bf8-mzwqn" podUID="b181d931-60a8-48dc-abce-f5911d25b5b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 16 20:13:25.870903 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:25.870860 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-12b34-predictor-954f5b995-lqhrg" podUID="fc73502e-f8bc-4e27-b5b5-8e91aab00243" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 16 20:13:25.871386 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:25.870860 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-12b34-predictor-558d6f8bf8-mzwqn" podUID="b181d931-60a8-48dc-abce-f5911d25b5b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 16 20:13:35.871186 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:35.871137 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-12b34-predictor-954f5b995-lqhrg" podUID="fc73502e-f8bc-4e27-b5b5-8e91aab00243" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 16 20:13:35.871654 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:35.871146 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-12b34-predictor-558d6f8bf8-mzwqn" podUID="b181d931-60a8-48dc-abce-f5911d25b5b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 16 20:13:39.798086 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:39.798042 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr"]
Apr 16 20:13:39.798531 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:39.798334 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr" podUID="824c77c0-f514-44fa-9c4a-fe78e3de0d02" containerName="kserve-container" containerID="cri-o://45ebb94b0015904ed91034f8bf45bd515eee48e2517e79ca14644ff92577ef03" gracePeriod=30
Apr 16 20:13:39.813349 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:39.813316 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1f9ad-predictor-84d585c7c-4pvwn"]
Apr 16 20:13:39.813678 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:39.813665 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70a7fd93-0d3e-4360-bfdf-56ad797ba3c6" containerName="kserve-container"
Apr 16 20:13:39.813739 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:39.813679 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a7fd93-0d3e-4360-bfdf-56ad797ba3c6" containerName="kserve-container"
Apr 16 20:13:39.813739 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:39.813707 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dacdd8d2-efd1-47c1-b6d9-4f87a369513a" containerName="kserve-container"
Apr 16 20:13:39.813739 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:39.813713 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="dacdd8d2-efd1-47c1-b6d9-4f87a369513a" containerName="kserve-container"
Apr 16 20:13:39.813922 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:39.813758 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="70a7fd93-0d3e-4360-bfdf-56ad797ba3c6" containerName="kserve-container"
Apr 16 20:13:39.813922 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:39.813767 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="dacdd8d2-efd1-47c1-b6d9-4f87a369513a" containerName="kserve-container"
Apr 16 20:13:39.816709 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:39.816691 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1f9ad-predictor-84d585c7c-4pvwn"
Apr 16 20:13:39.824930 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:39.824902 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1f9ad-predictor-84d585c7c-4pvwn"]
Apr 16 20:13:39.827747 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:39.827725 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1f9ad-predictor-84d585c7c-4pvwn"
Apr 16 20:13:39.882823 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:39.882739 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2"]
Apr 16 20:13:39.883068 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:39.883039 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2" podUID="2767c4e9-3ec6-4973-8a0d-7a65a375385e" containerName="kserve-container" containerID="cri-o://acedbbc976bbcf7cfda077236d96b60dedcde32bf2295d26f7c5913f1c097b1d" gracePeriod=30
Apr 16 20:13:39.909215 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:39.909156 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1f9ad-predictor-6f68d9c676-2l6lt"]
Apr 16 20:13:39.914628 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:39.914607 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1f9ad-predictor-6f68d9c676-2l6lt"
Apr 16 20:13:39.922516 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:39.922465 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1f9ad-predictor-6f68d9c676-2l6lt"]
Apr 16 20:13:39.934535 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:39.934512 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1f9ad-predictor-6f68d9c676-2l6lt"
Apr 16 20:13:39.998569 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:39.998538 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1f9ad-predictor-84d585c7c-4pvwn"]
Apr 16 20:13:40.003346 ip-10-0-128-201 kubenswrapper[2561]: W0416 20:13:40.003312 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57f8f750_7681_45d3_b864_ed5e9e150e2a.slice/crio-7c4e6ae04d2853638d521f77d0935025cc8a2012cb202a34cfdb380bd91841de WatchSource:0}: Error finding container 7c4e6ae04d2853638d521f77d0935025cc8a2012cb202a34cfdb380bd91841de: Status 404 returned error can't find the container with id 7c4e6ae04d2853638d521f77d0935025cc8a2012cb202a34cfdb380bd91841de
Apr 16 20:13:40.114661 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:40.114487 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1f9ad-predictor-6f68d9c676-2l6lt"]
Apr 16 20:13:40.117234 ip-10-0-128-201 kubenswrapper[2561]: W0416 20:13:40.117202 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb615d820_99ec_4104_b645_5a38c0747077.slice/crio-99ca145e04995b461db28207163a9cb17fa37199cd4994f77b7b06e5634926e2 WatchSource:0}: Error finding container 99ca145e04995b461db28207163a9cb17fa37199cd4994f77b7b06e5634926e2: Status 404 returned error can't find the container with id 99ca145e04995b461db28207163a9cb17fa37199cd4994f77b7b06e5634926e2
Apr 16 20:13:40.991606 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:40.991561 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1f9ad-predictor-84d585c7c-4pvwn" event={"ID":"57f8f750-7681-45d3-b864-ed5e9e150e2a","Type":"ContainerStarted","Data":"70cf3f4ee4b6a69138e74639a0eedb10ecb58a0c49cccc902cfe61cad672fff5"}
Apr 16 20:13:40.991606 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:40.991605 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1f9ad-predictor-84d585c7c-4pvwn" event={"ID":"57f8f750-7681-45d3-b864-ed5e9e150e2a","Type":"ContainerStarted","Data":"7c4e6ae04d2853638d521f77d0935025cc8a2012cb202a34cfdb380bd91841de"}
Apr 16 20:13:40.992081 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:40.991762 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-1f9ad-predictor-84d585c7c-4pvwn"
Apr 16 20:13:40.992970 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:40.992948 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1f9ad-predictor-6f68d9c676-2l6lt" event={"ID":"b615d820-99ec-4104-b645-5a38c0747077","Type":"ContainerStarted","Data":"609bf4f809ce9e49059dbf01d045255df388f991fd95a2a9bda926dd603d4f20"}
Apr 16 20:13:40.993093 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:40.992973 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1f9ad-predictor-6f68d9c676-2l6lt" event={"ID":"b615d820-99ec-4104-b645-5a38c0747077","Type":"ContainerStarted","Data":"99ca145e04995b461db28207163a9cb17fa37199cd4994f77b7b06e5634926e2"}
Apr 16 20:13:40.993093 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:40.993018 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1f9ad-predictor-84d585c7c-4pvwn" podUID="57f8f750-7681-45d3-b864-ed5e9e150e2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused"
Apr 16 20:13:40.993168 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:40.993095 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-1f9ad-predictor-6f68d9c676-2l6lt"
Apr 16 20:13:40.993970 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:40.993949 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1f9ad-predictor-6f68d9c676-2l6lt" podUID="b615d820-99ec-4104-b645-5a38c0747077" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 16 20:13:41.011504 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:41.011460 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-1f9ad-predictor-84d585c7c-4pvwn" podStartSLOduration=2.011449215 podStartE2EDuration="2.011449215s" podCreationTimestamp="2026-04-16 20:13:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:13:41.010480058 +0000 UTC m=+1198.733525978" watchObservedRunningTime="2026-04-16 20:13:41.011449215 +0000 UTC m=+1198.734495122"
Apr 16 20:13:41.026715 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:41.026678 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-1f9ad-predictor-6f68d9c676-2l6lt" podStartSLOduration=2.02666748 podStartE2EDuration="2.02666748s" podCreationTimestamp="2026-04-16 20:13:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:13:41.025592396 +0000 UTC m=+1198.748638302" watchObservedRunningTime="2026-04-16 20:13:41.02666748 +0000 UTC m=+1198.749713386"
Apr 16 20:13:41.711948 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:41.711912 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2" podUID="2767c4e9-3ec6-4973-8a0d-7a65a375385e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused"
Apr 16 20:13:41.712121 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:41.711911 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr" podUID="824c77c0-f514-44fa-9c4a-fe78e3de0d02" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused"
Apr 16 20:13:41.996015 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:41.995932 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1f9ad-predictor-84d585c7c-4pvwn" podUID="57f8f750-7681-45d3-b864-ed5e9e150e2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused"
Apr 16 20:13:41.996015 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:41.995932 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1f9ad-predictor-6f68d9c676-2l6lt" podUID="b615d820-99ec-4104-b645-5a38c0747077" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 16 20:13:42.823107 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:42.823075 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/1.log"
Apr 16 20:13:42.827414 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:42.827388 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-acl-logging/0.log"
Apr 16 20:13:42.827834 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:42.827578 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/1.log"
Apr 16 20:13:42.839598 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:42.839578 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-acl-logging/0.log"
Apr 16 20:13:43.649734 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:43.649706 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr"
Apr 16 20:13:43.720369 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:43.720344 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2"
Apr 16 20:13:44.004132 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:44.004044 2561 generic.go:358] "Generic (PLEG): container finished" podID="824c77c0-f514-44fa-9c4a-fe78e3de0d02" containerID="45ebb94b0015904ed91034f8bf45bd515eee48e2517e79ca14644ff92577ef03" exitCode=0
Apr 16 20:13:44.004297 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:44.004126 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr" event={"ID":"824c77c0-f514-44fa-9c4a-fe78e3de0d02","Type":"ContainerDied","Data":"45ebb94b0015904ed91034f8bf45bd515eee48e2517e79ca14644ff92577ef03"}
Apr 16 20:13:44.004297 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:44.004145 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr"
Apr 16 20:13:44.004297 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:44.004166 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr" event={"ID":"824c77c0-f514-44fa-9c4a-fe78e3de0d02","Type":"ContainerDied","Data":"fd322350887213679a700c318f6fa1f258c3246b2b87156e370a14363ea3bc83"}
Apr 16 20:13:44.004297 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:44.004185 2561 scope.go:117] "RemoveContainer" containerID="45ebb94b0015904ed91034f8bf45bd515eee48e2517e79ca14644ff92577ef03"
Apr 16 20:13:44.005321 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:44.005202 2561 generic.go:358] "Generic (PLEG): container finished" podID="2767c4e9-3ec6-4973-8a0d-7a65a375385e" containerID="acedbbc976bbcf7cfda077236d96b60dedcde32bf2295d26f7c5913f1c097b1d" exitCode=0
Apr 16 20:13:44.005321 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:44.005281 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2"
Apr 16 20:13:44.005321 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:44.005289 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2" event={"ID":"2767c4e9-3ec6-4973-8a0d-7a65a375385e","Type":"ContainerDied","Data":"acedbbc976bbcf7cfda077236d96b60dedcde32bf2295d26f7c5913f1c097b1d"}
Apr 16 20:13:44.005321 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:44.005313 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2" event={"ID":"2767c4e9-3ec6-4973-8a0d-7a65a375385e","Type":"ContainerDied","Data":"4dc9136b5b8e355c8c47985417966f1de1841c4bd27267d6ba7a6a2c84bbd523"}
Apr 16 20:13:44.011763 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:44.011742 2561 scope.go:117] "RemoveContainer" containerID="45ebb94b0015904ed91034f8bf45bd515eee48e2517e79ca14644ff92577ef03"
Apr 16 20:13:44.012028 ip-10-0-128-201 kubenswrapper[2561]: E0416 20:13:44.012003 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45ebb94b0015904ed91034f8bf45bd515eee48e2517e79ca14644ff92577ef03\": container with ID starting with 45ebb94b0015904ed91034f8bf45bd515eee48e2517e79ca14644ff92577ef03 not found: ID does not exist" containerID="45ebb94b0015904ed91034f8bf45bd515eee48e2517e79ca14644ff92577ef03"
Apr 16 20:13:44.012123 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:44.012036 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ebb94b0015904ed91034f8bf45bd515eee48e2517e79ca14644ff92577ef03"} err="failed to get container status \"45ebb94b0015904ed91034f8bf45bd515eee48e2517e79ca14644ff92577ef03\": rpc error: code = NotFound desc = could not find container \"45ebb94b0015904ed91034f8bf45bd515eee48e2517e79ca14644ff92577ef03\": container with ID starting with 45ebb94b0015904ed91034f8bf45bd515eee48e2517e79ca14644ff92577ef03 not found: ID does not exist"
Apr 16 20:13:44.012123 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:44.012053 2561 scope.go:117] "RemoveContainer" containerID="acedbbc976bbcf7cfda077236d96b60dedcde32bf2295d26f7c5913f1c097b1d"
Apr 16 20:13:44.019654 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:44.019639 2561 scope.go:117] "RemoveContainer" containerID="acedbbc976bbcf7cfda077236d96b60dedcde32bf2295d26f7c5913f1c097b1d"
Apr 16 20:13:44.019959 ip-10-0-128-201 kubenswrapper[2561]: E0416 20:13:44.019939 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acedbbc976bbcf7cfda077236d96b60dedcde32bf2295d26f7c5913f1c097b1d\": container with ID starting with acedbbc976bbcf7cfda077236d96b60dedcde32bf2295d26f7c5913f1c097b1d not found: ID does not exist" containerID="acedbbc976bbcf7cfda077236d96b60dedcde32bf2295d26f7c5913f1c097b1d"
Apr 16 20:13:44.020052 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:44.019964 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acedbbc976bbcf7cfda077236d96b60dedcde32bf2295d26f7c5913f1c097b1d"} err="failed to get container status \"acedbbc976bbcf7cfda077236d96b60dedcde32bf2295d26f7c5913f1c097b1d\": rpc error: code = NotFound desc = could not find container \"acedbbc976bbcf7cfda077236d96b60dedcde32bf2295d26f7c5913f1c097b1d\": container with ID starting with acedbbc976bbcf7cfda077236d96b60dedcde32bf2295d26f7c5913f1c097b1d not found: ID does not exist"
Apr 16 20:13:44.031093 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:44.031032 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr"]
Apr 16 20:13:44.034392 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:44.034365 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6b9e1-predictor-77fcc484d9-s29lr"]
Apr 16 20:13:44.044470 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:44.044439 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2"]
Apr 16 20:13:44.048969 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:44.048948 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6b9e1-predictor-d9f68c7f6-rhbb2"]
Apr 16 20:13:44.841532 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:44.841503 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2767c4e9-3ec6-4973-8a0d-7a65a375385e" path="/var/lib/kubelet/pods/2767c4e9-3ec6-4973-8a0d-7a65a375385e/volumes"
Apr 16 20:13:44.841912 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:44.841719 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="824c77c0-f514-44fa-9c4a-fe78e3de0d02" path="/var/lib/kubelet/pods/824c77c0-f514-44fa-9c4a-fe78e3de0d02/volumes"
Apr 16 20:13:45.871175 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:45.871133 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-12b34-predictor-558d6f8bf8-mzwqn" podUID="b181d931-60a8-48dc-abce-f5911d25b5b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 16 20:13:45.871553 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:45.871133 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-12b34-predictor-954f5b995-lqhrg" podUID="fc73502e-f8bc-4e27-b5b5-8e91aab00243" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 16 20:13:51.996181 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:51.996133 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1f9ad-predictor-84d585c7c-4pvwn" podUID="57f8f750-7681-45d3-b864-ed5e9e150e2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused"
Apr 16 20:13:51.996600 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:51.996144 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1f9ad-predictor-6f68d9c676-2l6lt" podUID="b615d820-99ec-4104-b645-5a38c0747077" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 16 20:13:55.871485 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:55.871452 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-12b34-predictor-954f5b995-lqhrg"
Apr 16 20:13:55.872006 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:13:55.871855 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-12b34-predictor-558d6f8bf8-mzwqn"
Apr 16 20:14:01.996837 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:14:01.996768 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1f9ad-predictor-6f68d9c676-2l6lt" podUID="b615d820-99ec-4104-b645-5a38c0747077" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 16 20:14:01.997393 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:14:01.996759 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1f9ad-predictor-84d585c7c-4pvwn" podUID="57f8f750-7681-45d3-b864-ed5e9e150e2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused"
Apr 16 20:14:11.996130 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:14:11.996081 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1f9ad-predictor-6f68d9c676-2l6lt" podUID="b615d820-99ec-4104-b645-5a38c0747077" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 16 20:14:11.996541 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:14:11.996090 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1f9ad-predictor-84d585c7c-4pvwn" podUID="57f8f750-7681-45d3-b864-ed5e9e150e2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused"
Apr 16 20:14:21.996474 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:14:21.996422 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1f9ad-predictor-6f68d9c676-2l6lt" podUID="b615d820-99ec-4104-b645-5a38c0747077" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 16 20:14:21.996891 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:14:21.996422 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1f9ad-predictor-84d585c7c-4pvwn" podUID="57f8f750-7681-45d3-b864-ed5e9e150e2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused"
Apr 16 20:14:31.997453 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:14:31.997371 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-1f9ad-predictor-6f68d9c676-2l6lt"
Apr 16 20:14:31.997957 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:14:31.997936 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-1f9ad-predictor-84d585c7c-4pvwn"
Apr 16 20:18:42.851687 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:18:42.851657 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/1.log"
Apr 16 20:18:42.856501 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:18:42.856479 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-acl-logging/0.log"
Apr 16 20:18:42.858368 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:18:42.858347 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/1.log"
Apr 16 20:18:42.862510 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:18:42.862494 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-acl-logging/0.log"
Apr 16 20:22:29.062988 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.062946 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-12b34-predictor-954f5b995-lqhrg"]
Apr 16 20:22:29.065527 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.063256 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-12b34-predictor-954f5b995-lqhrg" podUID="fc73502e-f8bc-4e27-b5b5-8e91aab00243" containerName="kserve-container" containerID="cri-o://cb91761e18a1f1cb2a4c76a69dc0397be9faef882b3ac615cb5128c6fb1d28df" gracePeriod=30
Apr 16 20:22:29.125667 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.125635 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb"]
Apr 16 20:22:29.126173 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.126151 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2767c4e9-3ec6-4973-8a0d-7a65a375385e" containerName="kserve-container"
Apr 16 20:22:29.126173 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.126172 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="2767c4e9-3ec6-4973-8a0d-7a65a375385e" containerName="kserve-container"
Apr 16 20:22:29.126348 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.126201 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="824c77c0-f514-44fa-9c4a-fe78e3de0d02" containerName="kserve-container"
Apr 16 20:22:29.126348 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.126210 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="824c77c0-f514-44fa-9c4a-fe78e3de0d02" containerName="kserve-container"
Apr 16 20:22:29.126348 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.126287 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="824c77c0-f514-44fa-9c4a-fe78e3de0d02" containerName="kserve-container"
Apr 16 20:22:29.126348 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.126298 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="2767c4e9-3ec6-4973-8a0d-7a65a375385e" containerName="kserve-container"
Apr 16 20:22:29.129761 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.129735 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb"
Apr 16 20:22:29.142416 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.142398 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb"
Apr 16 20:22:29.155364 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.155327 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-12b34-predictor-558d6f8bf8-mzwqn"]
Apr 16 20:22:29.156656 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.155689 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-12b34-predictor-558d6f8bf8-mzwqn" podUID="b181d931-60a8-48dc-abce-f5911d25b5b9" containerName="kserve-container" containerID="cri-o://f4c6a183d89e6f19ae3f4c3776acfd9cddfaea5e19fd82eac6486431bfbd8b95" gracePeriod=30
Apr 16 20:22:29.161000 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.160974 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb"]
Apr 16 20:22:29.167683 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.163990 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr"]
Apr 16 20:22:29.168490 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.168472 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr"
Apr 16 20:22:29.174752 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.174722 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr"]
Apr 16 20:22:29.183840 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.183779 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr"
Apr 16 20:22:29.294532 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.294507 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb"]
Apr 16 20:22:29.298329 ip-10-0-128-201 kubenswrapper[2561]: W0416 20:22:29.298295 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbee9c44_d672_4fc6_81bf_67c5c6b666c9.slice/crio-3c8d93f772e4c5554e6dba11912680caa9d7a156d2d4749405a23fae876b1dea WatchSource:0}: Error finding container 3c8d93f772e4c5554e6dba11912680caa9d7a156d2d4749405a23fae876b1dea: Status 404 returned error can't find the container with id 3c8d93f772e4c5554e6dba11912680caa9d7a156d2d4749405a23fae876b1dea
Apr 16 20:22:29.300953 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.300650 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:22:29.329943 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.329908 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr"]
Apr 16 20:22:29.335539 ip-10-0-128-201 kubenswrapper[2561]: W0416 20:22:29.335505 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d448a51_f3bb_4062_981e_316ed8f1f8dc.slice/crio-4655658e1fb192bc903d85c4441a2901f658301fc493ae69af8c8ab8afdd4278 WatchSource:0}: Error finding container 4655658e1fb192bc903d85c4441a2901f658301fc493ae69af8c8ab8afdd4278: Status 404 returned error can't find the container with id 4655658e1fb192bc903d85c4441a2901f658301fc493ae69af8c8ab8afdd4278
Apr 16 20:22:29.707337 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.707301 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb" event={"ID":"cbee9c44-d672-4fc6-81bf-67c5c6b666c9","Type":"ContainerStarted","Data":"5510c91b82b43dd921becf259cedd7084ee125d593d0504af6ea008532d1d9a3"}
Apr 16 20:22:29.707513 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.707345 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb" event={"ID":"cbee9c44-d672-4fc6-81bf-67c5c6b666c9","Type":"ContainerStarted","Data":"3c8d93f772e4c5554e6dba11912680caa9d7a156d2d4749405a23fae876b1dea"}
Apr 16 20:22:29.707513 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.707484 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb"
Apr 16 20:22:29.709056 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.709017 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb" podUID="cbee9c44-d672-4fc6-81bf-67c5c6b666c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused"
Apr 16 20:22:29.709182 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.709152 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr" event={"ID":"3d448a51-f3bb-4062-981e-316ed8f1f8dc","Type":"ContainerStarted","Data":"a1eb534e909f753cfd1ecf35aefa6d9918c0e8521f9fa54fbeefa6ad58867699"}
Apr 16 20:22:29.709230 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.709194 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr" event={"ID":"3d448a51-f3bb-4062-981e-316ed8f1f8dc","Type":"ContainerStarted","Data":"4655658e1fb192bc903d85c4441a2901f658301fc493ae69af8c8ab8afdd4278"}
Apr 16 20:22:29.709318 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.709300 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr"
Apr 16 20:22:29.710135 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.710117 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr" podUID="3d448a51-f3bb-4062-981e-316ed8f1f8dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 16 20:22:29.724130 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.724095 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb" podStartSLOduration=0.724084751 podStartE2EDuration="724.084751ms" podCreationTimestamp="2026-04-16 20:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:22:29.723040398 +0000 UTC m=+1727.446086326" watchObservedRunningTime="2026-04-16 20:22:29.724084751 +0000 UTC m=+1727.447130656"
Apr 16 20:22:29.738311 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:29.738274 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr" podStartSLOduration=0.738262154 podStartE2EDuration="738.262154ms" podCreationTimestamp="2026-04-16 20:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:22:29.737315544 +0000 UTC m=+1727.460361462" watchObservedRunningTime="2026-04-16 20:22:29.738262154 +0000 UTC m=+1727.461308058"
Apr 16 20:22:30.712889 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:30.712848 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb" podUID="cbee9c44-d672-4fc6-81bf-67c5c6b666c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused"
Apr 16 20:22:30.713241 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:30.712848 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr" podUID="3d448a51-f3bb-4062-981e-316ed8f1f8dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 16 20:22:32.014425 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:32.014406 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-12b34-predictor-954f5b995-lqhrg"
Apr 16 20:22:32.199291 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:32.199273 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-12b34-predictor-558d6f8bf8-mzwqn"
Apr 16 20:22:32.719944 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:32.719910 2561 generic.go:358] "Generic (PLEG): container finished" podID="b181d931-60a8-48dc-abce-f5911d25b5b9" containerID="f4c6a183d89e6f19ae3f4c3776acfd9cddfaea5e19fd82eac6486431bfbd8b95" exitCode=0
Apr 16 20:22:32.720120 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:32.719966 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-12b34-predictor-558d6f8bf8-mzwqn"
Apr 16 20:22:32.720120 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:32.719986 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-12b34-predictor-558d6f8bf8-mzwqn" event={"ID":"b181d931-60a8-48dc-abce-f5911d25b5b9","Type":"ContainerDied","Data":"f4c6a183d89e6f19ae3f4c3776acfd9cddfaea5e19fd82eac6486431bfbd8b95"}
Apr 16 20:22:32.720120 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:32.720014 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-12b34-predictor-558d6f8bf8-mzwqn" event={"ID":"b181d931-60a8-48dc-abce-f5911d25b5b9","Type":"ContainerDied","Data":"413fb151dfd284706abb85c788cf49f550e6a94d227fc3bb44c78be9c9aa0a4a"}
Apr 16 20:22:32.720120 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:32.720028 2561 scope.go:117] "RemoveContainer" containerID="f4c6a183d89e6f19ae3f4c3776acfd9cddfaea5e19fd82eac6486431bfbd8b95"
Apr 16 20:22:32.721114 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:32.721090 2561 generic.go:358] "Generic (PLEG): container finished" podID="fc73502e-f8bc-4e27-b5b5-8e91aab00243" containerID="cb91761e18a1f1cb2a4c76a69dc0397be9faef882b3ac615cb5128c6fb1d28df" exitCode=0
Apr 16 20:22:32.721214 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:32.721134 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-12b34-predictor-954f5b995-lqhrg"
Apr 16 20:22:32.721214 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:32.721163 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-12b34-predictor-954f5b995-lqhrg" event={"ID":"fc73502e-f8bc-4e27-b5b5-8e91aab00243","Type":"ContainerDied","Data":"cb91761e18a1f1cb2a4c76a69dc0397be9faef882b3ac615cb5128c6fb1d28df"}
Apr 16 20:22:32.721214 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:32.721187 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-12b34-predictor-954f5b995-lqhrg" event={"ID":"fc73502e-f8bc-4e27-b5b5-8e91aab00243","Type":"ContainerDied","Data":"10b9f73ac4968195f9702997e831f8f1770babca4b8d22971654828047e976a1"}
Apr 16 20:22:32.728615 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:32.728586 2561 scope.go:117] "RemoveContainer" containerID="f4c6a183d89e6f19ae3f4c3776acfd9cddfaea5e19fd82eac6486431bfbd8b95"
Apr 16 20:22:32.729176 ip-10-0-128-201 kubenswrapper[2561]: E0416 20:22:32.729153 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4c6a183d89e6f19ae3f4c3776acfd9cddfaea5e19fd82eac6486431bfbd8b95\": container with ID starting with f4c6a183d89e6f19ae3f4c3776acfd9cddfaea5e19fd82eac6486431bfbd8b95 not found: ID does not exist" containerID="f4c6a183d89e6f19ae3f4c3776acfd9cddfaea5e19fd82eac6486431bfbd8b95"
Apr 16 20:22:32.729254 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:32.729187 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4c6a183d89e6f19ae3f4c3776acfd9cddfaea5e19fd82eac6486431bfbd8b95"} err="failed to get container status \"f4c6a183d89e6f19ae3f4c3776acfd9cddfaea5e19fd82eac6486431bfbd8b95\": rpc error: code = NotFound desc = could not find container \"f4c6a183d89e6f19ae3f4c3776acfd9cddfaea5e19fd82eac6486431bfbd8b95\": container with ID starting with f4c6a183d89e6f19ae3f4c3776acfd9cddfaea5e19fd82eac6486431bfbd8b95 not found: ID does not exist"
Apr 16 20:22:32.729254 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:32.729210 2561 scope.go:117] "RemoveContainer" containerID="cb91761e18a1f1cb2a4c76a69dc0397be9faef882b3ac615cb5128c6fb1d28df"
Apr 16 20:22:32.738133 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:32.738113 2561 scope.go:117] "RemoveContainer" containerID="cb91761e18a1f1cb2a4c76a69dc0397be9faef882b3ac615cb5128c6fb1d28df"
Apr 16 20:22:32.738356 ip-10-0-128-201 kubenswrapper[2561]: E0416 20:22:32.738335 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb91761e18a1f1cb2a4c76a69dc0397be9faef882b3ac615cb5128c6fb1d28df\": container with ID starting with cb91761e18a1f1cb2a4c76a69dc0397be9faef882b3ac615cb5128c6fb1d28df not found: ID does not exist" containerID="cb91761e18a1f1cb2a4c76a69dc0397be9faef882b3ac615cb5128c6fb1d28df"
Apr 16 20:22:32.738419 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:32.738366 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb91761e18a1f1cb2a4c76a69dc0397be9faef882b3ac615cb5128c6fb1d28df"} err="failed to get container status \"cb91761e18a1f1cb2a4c76a69dc0397be9faef882b3ac615cb5128c6fb1d28df\": rpc error: code = NotFound desc = could not find container \"cb91761e18a1f1cb2a4c76a69dc0397be9faef882b3ac615cb5128c6fb1d28df\": container with ID starting with
cb91761e18a1f1cb2a4c76a69dc0397be9faef882b3ac615cb5128c6fb1d28df not found: ID does not exist" Apr 16 20:22:32.744637 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:32.744616 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-12b34-predictor-954f5b995-lqhrg"] Apr 16 20:22:32.748233 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:32.748212 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-12b34-predictor-954f5b995-lqhrg"] Apr 16 20:22:32.758513 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:32.758491 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-12b34-predictor-558d6f8bf8-mzwqn"] Apr 16 20:22:32.762279 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:32.762258 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-12b34-predictor-558d6f8bf8-mzwqn"] Apr 16 20:22:32.840721 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:32.840692 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b181d931-60a8-48dc-abce-f5911d25b5b9" path="/var/lib/kubelet/pods/b181d931-60a8-48dc-abce-f5911d25b5b9/volumes" Apr 16 20:22:32.840960 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:32.840946 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc73502e-f8bc-4e27-b5b5-8e91aab00243" path="/var/lib/kubelet/pods/fc73502e-f8bc-4e27-b5b5-8e91aab00243/volumes" Apr 16 20:22:40.713635 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:40.713598 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr" podUID="3d448a51-f3bb-4062-981e-316ed8f1f8dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 16 20:22:40.714040 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:40.713600 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb" podUID="cbee9c44-d672-4fc6-81bf-67c5c6b666c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 16 20:22:50.713342 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:50.713293 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr" podUID="3d448a51-f3bb-4062-981e-316ed8f1f8dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 16 20:22:50.713721 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:22:50.713296 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb" podUID="cbee9c44-d672-4fc6-81bf-67c5c6b666c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 16 20:23:00.713378 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:00.713332 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb" podUID="cbee9c44-d672-4fc6-81bf-67c5c6b666c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 16 20:23:00.713865 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:00.713331 2561 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr" podUID="3d448a51-f3bb-4062-981e-316ed8f1f8dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 16 20:23:04.699139 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:04.699107 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1f9ad-predictor-6f68d9c676-2l6lt"] Apr 16 20:23:04.699517 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:04.699344 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-1f9ad-predictor-6f68d9c676-2l6lt" podUID="b615d820-99ec-4104-b645-5a38c0747077" containerName="kserve-container" containerID="cri-o://609bf4f809ce9e49059dbf01d045255df388f991fd95a2a9bda926dd603d4f20" gracePeriod=30 Apr 16 20:23:04.753010 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:04.752979 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1f9ad-predictor-84d585c7c-4pvwn"] Apr 16 20:23:04.753243 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:04.753220 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-1f9ad-predictor-84d585c7c-4pvwn" podUID="57f8f750-7681-45d3-b864-ed5e9e150e2a" containerName="kserve-container" containerID="cri-o://70cf3f4ee4b6a69138e74639a0eedb10ecb58a0c49cccc902cfe61cad672fff5" gracePeriod=30 Apr 16 20:23:04.761781 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:04.761757 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk"] Apr 16 20:23:04.762148 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:04.762132 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b181d931-60a8-48dc-abce-f5911d25b5b9" containerName="kserve-container" Apr 16 20:23:04.762203 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:04.762149 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="b181d931-60a8-48dc-abce-f5911d25b5b9" containerName="kserve-container" Apr 16 20:23:04.762203 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:04.762169 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc73502e-f8bc-4e27-b5b5-8e91aab00243" containerName="kserve-container" Apr 16 20:23:04.762203 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:04.762174 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc73502e-f8bc-4e27-b5b5-8e91aab00243" containerName="kserve-container" Apr 16 20:23:04.762301 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:04.762224 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="b181d931-60a8-48dc-abce-f5911d25b5b9" containerName="kserve-container" Apr 16 20:23:04.762301 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:04.762234 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc73502e-f8bc-4e27-b5b5-8e91aab00243" containerName="kserve-container" Apr 16 20:23:04.766452 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:04.766433 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk" Apr 16 20:23:04.772476 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:04.772456 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk"] Apr 16 20:23:04.777000 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:04.776983 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk" Apr 16 20:23:04.855050 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:04.853286 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-653df-predictor-fd849f8f8-w5j6t"] Apr 16 20:23:04.859115 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:04.859091 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-653df-predictor-fd849f8f8-w5j6t" Apr 16 20:23:04.862357 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:04.862329 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-653df-predictor-fd849f8f8-w5j6t"] Apr 16 20:23:04.872990 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:04.872967 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-653df-predictor-fd849f8f8-w5j6t" Apr 16 20:23:04.941610 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:04.941549 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk"] Apr 16 20:23:05.035382 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:05.035351 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-653df-predictor-fd849f8f8-w5j6t"] Apr 16 20:23:05.842852 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:05.842810 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk" event={"ID":"48d83598-87ee-44ad-b07e-45af3b6338dc","Type":"ContainerStarted","Data":"cbee0f7293493f8afe0e5a99487b50b49ca23eff07fc54d8e451d07b97514fa9"} Apr 16 20:23:05.842852 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:05.842857 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk" event={"ID":"48d83598-87ee-44ad-b07e-45af3b6338dc","Type":"ContainerStarted","Data":"5e487cc089f448381b33b140425937c083670c82dcc599bd2a3b502dcd29ac19"} Apr 16 20:23:05.843628 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:05.843034 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk" Apr 16 20:23:05.844572 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:05.844540 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk" podUID="48d83598-87ee-44ad-b07e-45af3b6338dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 20:23:05.844680 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:05.844658 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-653df-predictor-fd849f8f8-w5j6t" event={"ID":"6793b1e0-04e8-4184-8df4-4a3694e50d66","Type":"ContainerStarted","Data":"4847852b4b94d92c959883d212ae45b6f7c1e15629be008d3854f4e6ac8c5d66"} Apr 16 
20:23:05.844762 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:05.844692 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-653df-predictor-fd849f8f8-w5j6t" event={"ID":"6793b1e0-04e8-4184-8df4-4a3694e50d66","Type":"ContainerStarted","Data":"730a6eb32ec0bdd804c4697b52c8b64aa79cd5a191f8fdd1741b4cc040c5ac60"} Apr 16 20:23:05.844829 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:05.844816 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-653df-predictor-fd849f8f8-w5j6t" Apr 16 20:23:05.845762 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:05.845740 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-653df-predictor-fd849f8f8-w5j6t" podUID="6793b1e0-04e8-4184-8df4-4a3694e50d66" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 20:23:05.861653 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:05.861612 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk" podStartSLOduration=1.861601073 podStartE2EDuration="1.861601073s" podCreationTimestamp="2026-04-16 20:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:23:05.859373228 +0000 UTC m=+1763.582419282" watchObservedRunningTime="2026-04-16 20:23:05.861601073 +0000 UTC m=+1763.584646978" Apr 16 20:23:05.876469 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:05.876190 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-653df-predictor-fd849f8f8-w5j6t" podStartSLOduration=1.8761761529999998 podStartE2EDuration="1.876176153s" podCreationTimestamp="2026-04-16 20:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:23:05.875520656 +0000 UTC m=+1763.598566564" watchObservedRunningTime="2026-04-16 20:23:05.876176153 +0000 UTC m=+1763.599222060" Apr 16 20:23:06.848206 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:06.848166 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-653df-predictor-fd849f8f8-w5j6t" podUID="6793b1e0-04e8-4184-8df4-4a3694e50d66" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 20:23:06.848579 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:06.848166 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk" podUID="48d83598-87ee-44ad-b07e-45af3b6338dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 20:23:08.767366 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:08.767324 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1f9ad-predictor-84d585c7c-4pvwn" Apr 16 20:23:08.770390 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:08.770372 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1f9ad-predictor-6f68d9c676-2l6lt" Apr 16 20:23:08.857770 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:08.857742 2561 generic.go:358] "Generic (PLEG): container finished" podID="57f8f750-7681-45d3-b864-ed5e9e150e2a" containerID="70cf3f4ee4b6a69138e74639a0eedb10ecb58a0c49cccc902cfe61cad672fff5" exitCode=0 Apr 16 20:23:08.857902 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:08.857821 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1f9ad-predictor-84d585c7c-4pvwn" event={"ID":"57f8f750-7681-45d3-b864-ed5e9e150e2a","Type":"ContainerDied","Data":"70cf3f4ee4b6a69138e74639a0eedb10ecb58a0c49cccc902cfe61cad672fff5"} Apr 16 20:23:08.857902 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:08.857834 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1f9ad-predictor-84d585c7c-4pvwn" Apr 16 20:23:08.857902 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:08.857863 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1f9ad-predictor-84d585c7c-4pvwn" event={"ID":"57f8f750-7681-45d3-b864-ed5e9e150e2a","Type":"ContainerDied","Data":"7c4e6ae04d2853638d521f77d0935025cc8a2012cb202a34cfdb380bd91841de"} Apr 16 20:23:08.857902 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:08.857893 2561 scope.go:117] "RemoveContainer" containerID="70cf3f4ee4b6a69138e74639a0eedb10ecb58a0c49cccc902cfe61cad672fff5" Apr 16 20:23:08.858966 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:08.858938 2561 generic.go:358] "Generic (PLEG): container finished" podID="b615d820-99ec-4104-b645-5a38c0747077" containerID="609bf4f809ce9e49059dbf01d045255df388f991fd95a2a9bda926dd603d4f20" exitCode=0 Apr 16 20:23:08.859082 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:08.859023 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1f9ad-predictor-6f68d9c676-2l6lt" Apr 16 20:23:08.859082 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:08.859024 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1f9ad-predictor-6f68d9c676-2l6lt" event={"ID":"b615d820-99ec-4104-b645-5a38c0747077","Type":"ContainerDied","Data":"609bf4f809ce9e49059dbf01d045255df388f991fd95a2a9bda926dd603d4f20"} Apr 16 20:23:08.859082 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:08.859056 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1f9ad-predictor-6f68d9c676-2l6lt" event={"ID":"b615d820-99ec-4104-b645-5a38c0747077","Type":"ContainerDied","Data":"99ca145e04995b461db28207163a9cb17fa37199cd4994f77b7b06e5634926e2"} Apr 16 20:23:08.865726 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:08.865711 2561 scope.go:117] "RemoveContainer" containerID="70cf3f4ee4b6a69138e74639a0eedb10ecb58a0c49cccc902cfe61cad672fff5" Apr 16 20:23:08.865973 ip-10-0-128-201 kubenswrapper[2561]: E0416 20:23:08.865955 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70cf3f4ee4b6a69138e74639a0eedb10ecb58a0c49cccc902cfe61cad672fff5\": container with ID starting with 70cf3f4ee4b6a69138e74639a0eedb10ecb58a0c49cccc902cfe61cad672fff5 not found: ID does not exist" containerID="70cf3f4ee4b6a69138e74639a0eedb10ecb58a0c49cccc902cfe61cad672fff5" Apr 16 20:23:08.866017 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:08.865982 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70cf3f4ee4b6a69138e74639a0eedb10ecb58a0c49cccc902cfe61cad672fff5"} err="failed to get container status \"70cf3f4ee4b6a69138e74639a0eedb10ecb58a0c49cccc902cfe61cad672fff5\": rpc error: code = NotFound desc = could not find container \"70cf3f4ee4b6a69138e74639a0eedb10ecb58a0c49cccc902cfe61cad672fff5\": container with ID starting with 70cf3f4ee4b6a69138e74639a0eedb10ecb58a0c49cccc902cfe61cad672fff5 not found: ID does not exist" Apr 16 20:23:08.866017 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:08.865996 2561 scope.go:117] "RemoveContainer" containerID="609bf4f809ce9e49059dbf01d045255df388f991fd95a2a9bda926dd603d4f20" Apr 16 20:23:08.873248 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:08.873230 2561 scope.go:117] "RemoveContainer" containerID="609bf4f809ce9e49059dbf01d045255df388f991fd95a2a9bda926dd603d4f20" Apr 16 20:23:08.873486 ip-10-0-128-201 kubenswrapper[2561]: E0416 20:23:08.873464 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"609bf4f809ce9e49059dbf01d045255df388f991fd95a2a9bda926dd603d4f20\": container with ID starting with 609bf4f809ce9e49059dbf01d045255df388f991fd95a2a9bda926dd603d4f20 not found: ID does not exist" containerID="609bf4f809ce9e49059dbf01d045255df388f991fd95a2a9bda926dd603d4f20" Apr 16 20:23:08.873576 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:08.873490 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"609bf4f809ce9e49059dbf01d045255df388f991fd95a2a9bda926dd603d4f20"} err="failed to get container status \"609bf4f809ce9e49059dbf01d045255df388f991fd95a2a9bda926dd603d4f20\": rpc error: code = NotFound desc = could not find container \"609bf4f809ce9e49059dbf01d045255df388f991fd95a2a9bda926dd603d4f20\": container with ID starting with 
609bf4f809ce9e49059dbf01d045255df388f991fd95a2a9bda926dd603d4f20 not found: ID does not exist" Apr 16 20:23:08.874894 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:08.874871 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1f9ad-predictor-84d585c7c-4pvwn"] Apr 16 20:23:08.876262 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:08.876244 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1f9ad-predictor-84d585c7c-4pvwn"] Apr 16 20:23:08.889115 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:08.889087 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1f9ad-predictor-6f68d9c676-2l6lt"] Apr 16 20:23:08.892667 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:08.892647 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1f9ad-predictor-6f68d9c676-2l6lt"] Apr 16 20:23:10.713127 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:10.713084 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb" podUID="cbee9c44-d672-4fc6-81bf-67c5c6b666c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 16 20:23:10.713508 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:10.713082 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr" podUID="3d448a51-f3bb-4062-981e-316ed8f1f8dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 16 20:23:10.840732 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:10.840704 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57f8f750-7681-45d3-b864-ed5e9e150e2a" path="/var/lib/kubelet/pods/57f8f750-7681-45d3-b864-ed5e9e150e2a/volumes" Apr 16 20:23:10.840957 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:10.840942 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b615d820-99ec-4104-b645-5a38c0747077" path="/var/lib/kubelet/pods/b615d820-99ec-4104-b645-5a38c0747077/volumes" Apr 16 20:23:16.848810 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:16.848745 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk" podUID="48d83598-87ee-44ad-b07e-45af3b6338dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 20:23:16.849200 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:16.848745 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-653df-predictor-fd849f8f8-w5j6t" podUID="6793b1e0-04e8-4184-8df4-4a3694e50d66" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 20:23:20.714439 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:20.714397 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr" Apr 16 20:23:20.714960 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:20.714940 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb" Apr 16 20:23:26.849039 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:26.848997 2561 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-653df-predictor-fd849f8f8-w5j6t" podUID="6793b1e0-04e8-4184-8df4-4a3694e50d66" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 20:23:26.849405 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:26.848996 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk" podUID="48d83598-87ee-44ad-b07e-45af3b6338dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 20:23:36.848380 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:36.848297 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk" podUID="48d83598-87ee-44ad-b07e-45af3b6338dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 20:23:36.848855 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:36.848297 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-653df-predictor-fd849f8f8-w5j6t" podUID="6793b1e0-04e8-4184-8df4-4a3694e50d66" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 20:23:42.873553 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:42.873525 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/1.log" Apr 16 20:23:42.877945 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:42.877922 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-acl-logging/0.log" Apr 16 20:23:42.880293 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:42.880277 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/1.log" Apr 16 20:23:42.884344 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:42.884326 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-acl-logging/0.log" Apr 16 20:23:46.848734 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:46.848693 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-653df-predictor-fd849f8f8-w5j6t" podUID="6793b1e0-04e8-4184-8df4-4a3694e50d66" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 20:23:46.849132 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:46.848693 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk" podUID="48d83598-87ee-44ad-b07e-45af3b6338dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 20:23:49.359234 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.359200 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr"] Apr 16 20:23:49.359725 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.359664 2561 kuberuntime_container.go:864] "Killing container with a grace 
period" pod="kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr" podUID="3d448a51-f3bb-4062-981e-316ed8f1f8dc" containerName="kserve-container" containerID="cri-o://a1eb534e909f753cfd1ecf35aefa6d9918c0e8521f9fa54fbeefa6ad58867699" gracePeriod=30 Apr 16 20:23:49.410230 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.410195 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9a649-predictor-b857f97b5-fnrvk"] Apr 16 20:23:49.411082 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.411062 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b615d820-99ec-4104-b645-5a38c0747077" containerName="kserve-container" Apr 16 20:23:49.411234 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.411223 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="b615d820-99ec-4104-b645-5a38c0747077" containerName="kserve-container" Apr 16 20:23:49.411335 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.411325 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57f8f750-7681-45d3-b864-ed5e9e150e2a" containerName="kserve-container" Apr 16 20:23:49.411423 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.411414 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f8f750-7681-45d3-b864-ed5e9e150e2a" containerName="kserve-container" Apr 16 20:23:49.411602 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.411592 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="b615d820-99ec-4104-b645-5a38c0747077" containerName="kserve-container" Apr 16 20:23:49.411708 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.411699 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="57f8f750-7681-45d3-b864-ed5e9e150e2a" containerName="kserve-container" Apr 16 20:23:49.423327 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.419074 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb"] Apr 16 20:23:49.423327 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.419324 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb" podUID="cbee9c44-d672-4fc6-81bf-67c5c6b666c9" containerName="kserve-container" containerID="cri-o://5510c91b82b43dd921becf259cedd7084ee125d593d0504af6ea008532d1d9a3" gracePeriod=30 Apr 16 20:23:49.423327 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.419509 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9a649-predictor-b857f97b5-fnrvk" Apr 16 20:23:49.426859 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.426070 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9a649-predictor-b857f97b5-fnrvk"] Apr 16 20:23:49.443212 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.442901 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9a649-predictor-b857f97b5-fnrvk" Apr 16 20:23:49.491651 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.491574 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9a649-predictor-779bcbc86d-k8kff"] Apr 16 20:23:49.495605 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.495583 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9a649-predictor-779bcbc86d-k8kff" Apr 16 20:23:49.507514 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.506710 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9a649-predictor-779bcbc86d-k8kff"] Apr 16 20:23:49.514752 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.514352 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9a649-predictor-779bcbc86d-k8kff" Apr 16 20:23:49.605185 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.605139 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9a649-predictor-b857f97b5-fnrvk"] Apr 16 20:23:49.608375 ip-10-0-128-201 kubenswrapper[2561]: W0416 20:23:49.608339 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bf97a5e_af1d_42e1_937a_53ff1be8f1a5.slice/crio-19ced008f2c9755642291593c4a05b5a4d8ea7d1a367d0dce6255a9a15b888b4 WatchSource:0}: Error finding container 19ced008f2c9755642291593c4a05b5a4d8ea7d1a367d0dce6255a9a15b888b4: Status 404 returned error can't find the container with id 19ced008f2c9755642291593c4a05b5a4d8ea7d1a367d0dce6255a9a15b888b4 Apr 16 20:23:49.685522 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.685489 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9a649-predictor-779bcbc86d-k8kff"] Apr 16 20:23:49.688014 ip-10-0-128-201 kubenswrapper[2561]: W0416 20:23:49.687979 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc34f3dfa_55e4_4781_9399_d716b78583d0.slice/crio-d49635483b9577017f3a1cbc3a0fabd15302df6128e4af7392252ac7d361d850 WatchSource:0}: Error finding container d49635483b9577017f3a1cbc3a0fabd15302df6128e4af7392252ac7d361d850: Status 404 returned error can't find the container with id d49635483b9577017f3a1cbc3a0fabd15302df6128e4af7392252ac7d361d850 Apr 16 20:23:49.997931 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.997894 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9a649-predictor-779bcbc86d-k8kff" event={"ID":"c34f3dfa-55e4-4781-9399-d716b78583d0","Type":"ContainerStarted","Data":"acd30573fb07ff745694a9d8e7607f3c982670896dc9d8f59f6b303a27beae20"} Apr 16 20:23:49.997931 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.997937 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9a649-predictor-779bcbc86d-k8kff" event={"ID":"c34f3dfa-55e4-4781-9399-d716b78583d0","Type":"ContainerStarted","Data":"d49635483b9577017f3a1cbc3a0fabd15302df6128e4af7392252ac7d361d850"} Apr 16 20:23:49.998177 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.998060 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-9a649-predictor-779bcbc86d-k8kff" Apr 16 20:23:49.999282 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.999254 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9a649-predictor-b857f97b5-fnrvk" event={"ID":"5bf97a5e-af1d-42e1-937a-53ff1be8f1a5","Type":"ContainerStarted","Data":"7a366bb4f53d2ff102e2714f433098cdb295ac9b3669fdbdf411a2c33715af2c"} Apr 16 20:23:49.999411 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.999290 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/success-200-isvc-9a649-predictor-b857f97b5-fnrvk" event={"ID":"5bf97a5e-af1d-42e1-937a-53ff1be8f1a5","Type":"ContainerStarted","Data":"19ced008f2c9755642291593c4a05b5a4d8ea7d1a367d0dce6255a9a15b888b4"} Apr 16 20:23:49.999476 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.999432 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-9a649-predictor-b857f97b5-fnrvk" Apr 16 20:23:49.999516 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:49.999490 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9a649-predictor-779bcbc86d-k8kff" podUID="c34f3dfa-55e4-4781-9399-d716b78583d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 16 20:23:50.000323 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:50.000300 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9a649-predictor-b857f97b5-fnrvk" podUID="5bf97a5e-af1d-42e1-937a-53ff1be8f1a5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 20:23:50.012073 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:50.012033 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-9a649-predictor-779bcbc86d-k8kff" podStartSLOduration=1.012022507 podStartE2EDuration="1.012022507s" podCreationTimestamp="2026-04-16 20:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:23:50.01152794 +0000 UTC m=+1807.734573846" watchObservedRunningTime="2026-04-16 20:23:50.012022507 +0000 UTC m=+1807.735068410" Apr 16 20:23:50.025264 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:50.025229 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-9a649-predictor-b857f97b5-fnrvk" podStartSLOduration=1.025218317 podStartE2EDuration="1.025218317s" podCreationTimestamp="2026-04-16 20:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:23:50.024532142 +0000 UTC m=+1807.747578048" watchObservedRunningTime="2026-04-16 20:23:50.025218317 +0000 UTC m=+1807.748264226" Apr 16 20:23:50.713387 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:50.713350 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb" podUID="cbee9c44-d672-4fc6-81bf-67c5c6b666c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 16 20:23:50.713774 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:50.713351 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr" podUID="3d448a51-f3bb-4062-981e-316ed8f1f8dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 16 20:23:51.002915 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:51.002825 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9a649-predictor-779bcbc86d-k8kff" podUID="c34f3dfa-55e4-4781-9399-d716b78583d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: 
connect: connection refused" Apr 16 20:23:51.002915 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:51.002831 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9a649-predictor-b857f97b5-fnrvk" podUID="5bf97a5e-af1d-42e1-937a-53ff1be8f1a5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 20:23:52.788670 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:52.788647 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb" Apr 16 20:23:53.009684 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:53.009603 2561 generic.go:358] "Generic (PLEG): container finished" podID="cbee9c44-d672-4fc6-81bf-67c5c6b666c9" containerID="5510c91b82b43dd921becf259cedd7084ee125d593d0504af6ea008532d1d9a3" exitCode=0 Apr 16 20:23:53.009684 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:53.009660 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb" Apr 16 20:23:53.009885 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:53.009687 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb" event={"ID":"cbee9c44-d672-4fc6-81bf-67c5c6b666c9","Type":"ContainerDied","Data":"5510c91b82b43dd921becf259cedd7084ee125d593d0504af6ea008532d1d9a3"} Apr 16 20:23:53.009885 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:53.009720 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb" event={"ID":"cbee9c44-d672-4fc6-81bf-67c5c6b666c9","Type":"ContainerDied","Data":"3c8d93f772e4c5554e6dba11912680caa9d7a156d2d4749405a23fae876b1dea"} Apr 16 20:23:53.009885 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:53.009733 2561 scope.go:117] "RemoveContainer" containerID="5510c91b82b43dd921becf259cedd7084ee125d593d0504af6ea008532d1d9a3" Apr 16 20:23:53.017438 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:53.017420 2561 scope.go:117] "RemoveContainer" containerID="5510c91b82b43dd921becf259cedd7084ee125d593d0504af6ea008532d1d9a3" Apr 16 20:23:53.017678 ip-10-0-128-201 kubenswrapper[2561]: E0416 20:23:53.017657 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5510c91b82b43dd921becf259cedd7084ee125d593d0504af6ea008532d1d9a3\": container with ID starting with 5510c91b82b43dd921becf259cedd7084ee125d593d0504af6ea008532d1d9a3 not found: ID does not exist" containerID="5510c91b82b43dd921becf259cedd7084ee125d593d0504af6ea008532d1d9a3" Apr 16 20:23:53.017728 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:53.017688 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5510c91b82b43dd921becf259cedd7084ee125d593d0504af6ea008532d1d9a3"} err="failed to get container status \"5510c91b82b43dd921becf259cedd7084ee125d593d0504af6ea008532d1d9a3\": rpc error: code = NotFound desc = could not find container \"5510c91b82b43dd921becf259cedd7084ee125d593d0504af6ea008532d1d9a3\": container with ID starting with 5510c91b82b43dd921becf259cedd7084ee125d593d0504af6ea008532d1d9a3 not found: ID does not exist" Apr 16 20:23:53.027383 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:53.027358 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb"] Apr 16 20:23:53.031117 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:53.031097 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e8b6e-predictor-66bb45779f-wmmzb"] Apr 16 20:23:53.201289 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:53.201268 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr" Apr 16 20:23:54.015424 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:54.015396 2561 generic.go:358] "Generic (PLEG): container finished" podID="3d448a51-f3bb-4062-981e-316ed8f1f8dc" containerID="a1eb534e909f753cfd1ecf35aefa6d9918c0e8521f9fa54fbeefa6ad58867699" exitCode=0 Apr 16 20:23:54.015849 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:54.015446 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr" event={"ID":"3d448a51-f3bb-4062-981e-316ed8f1f8dc","Type":"ContainerDied","Data":"a1eb534e909f753cfd1ecf35aefa6d9918c0e8521f9fa54fbeefa6ad58867699"} Apr 16 20:23:54.015849 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:54.015458 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr" Apr 16 20:23:54.015849 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:54.015465 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr" event={"ID":"3d448a51-f3bb-4062-981e-316ed8f1f8dc","Type":"ContainerDied","Data":"4655658e1fb192bc903d85c4441a2901f658301fc493ae69af8c8ab8afdd4278"} Apr 16 20:23:54.015849 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:54.015481 2561 scope.go:117] "RemoveContainer" containerID="a1eb534e909f753cfd1ecf35aefa6d9918c0e8521f9fa54fbeefa6ad58867699" Apr 16 20:23:54.025469 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:54.025446 2561 scope.go:117] "RemoveContainer" containerID="a1eb534e909f753cfd1ecf35aefa6d9918c0e8521f9fa54fbeefa6ad58867699" Apr 16 20:23:54.026107 ip-10-0-128-201 kubenswrapper[2561]: E0416 20:23:54.025729 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1eb534e909f753cfd1ecf35aefa6d9918c0e8521f9fa54fbeefa6ad58867699\": container with ID starting with a1eb534e909f753cfd1ecf35aefa6d9918c0e8521f9fa54fbeefa6ad58867699 not found: ID does not exist" containerID="a1eb534e909f753cfd1ecf35aefa6d9918c0e8521f9fa54fbeefa6ad58867699" Apr 16 20:23:54.026107 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:54.025762 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1eb534e909f753cfd1ecf35aefa6d9918c0e8521f9fa54fbeefa6ad58867699"} err="failed to get container status \"a1eb534e909f753cfd1ecf35aefa6d9918c0e8521f9fa54fbeefa6ad58867699\": rpc error: code = NotFound desc = could not find container \"a1eb534e909f753cfd1ecf35aefa6d9918c0e8521f9fa54fbeefa6ad58867699\": container with ID starting with a1eb534e909f753cfd1ecf35aefa6d9918c0e8521f9fa54fbeefa6ad58867699 not found: ID does not exist" Apr 16 20:23:54.039632 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:54.039603 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr"] Apr 16 20:23:54.041492 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:54.041464 2561 
kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e8b6e-predictor-64c44c8fcd-kllhr"] Apr 16 20:23:54.840880 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:54.840843 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d448a51-f3bb-4062-981e-316ed8f1f8dc" path="/var/lib/kubelet/pods/3d448a51-f3bb-4062-981e-316ed8f1f8dc/volumes" Apr 16 20:23:54.841095 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:54.841083 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbee9c44-d672-4fc6-81bf-67c5c6b666c9" path="/var/lib/kubelet/pods/cbee9c44-d672-4fc6-81bf-67c5c6b666c9/volumes" Apr 16 20:23:56.849840 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:56.849809 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-653df-predictor-fd849f8f8-w5j6t" Apr 16 20:23:56.850211 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:23:56.849854 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk" Apr 16 20:24:01.003717 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:24:01.003678 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9a649-predictor-b857f97b5-fnrvk" podUID="5bf97a5e-af1d-42e1-937a-53ff1be8f1a5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 20:24:01.004115 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:24:01.003678 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9a649-predictor-779bcbc86d-k8kff" podUID="c34f3dfa-55e4-4781-9399-d716b78583d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 16 20:24:11.003706 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:24:11.003666 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9a649-predictor-b857f97b5-fnrvk" podUID="5bf97a5e-af1d-42e1-937a-53ff1be8f1a5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 20:24:11.004171 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:24:11.003668 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9a649-predictor-779bcbc86d-k8kff" podUID="c34f3dfa-55e4-4781-9399-d716b78583d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 16 20:24:21.003101 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:24:21.003056 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9a649-predictor-b857f97b5-fnrvk" podUID="5bf97a5e-af1d-42e1-937a-53ff1be8f1a5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 20:24:21.003101 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:24:21.003055 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9a649-predictor-779bcbc86d-k8kff" podUID="c34f3dfa-55e4-4781-9399-d716b78583d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 16 20:24:31.003318 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:24:31.003276 2561 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-9a649-predictor-779bcbc86d-k8kff" podUID="c34f3dfa-55e4-4781-9399-d716b78583d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 16 20:24:31.003711 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:24:31.003274 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9a649-predictor-b857f97b5-fnrvk" podUID="5bf97a5e-af1d-42e1-937a-53ff1be8f1a5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 20:24:41.003863 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:24:41.003833 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-9a649-predictor-779bcbc86d-k8kff" Apr 16 20:24:41.004356 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:24:41.004338 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-9a649-predictor-b857f97b5-fnrvk" Apr 16 20:28:42.897659 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:28:42.897633 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/1.log" Apr 16 20:28:42.901402 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:28:42.901379 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-acl-logging/0.log" Apr 16 20:28:42.903760 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:28:42.903743 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/1.log" Apr 16 20:28:42.907709 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:28:42.907695 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-acl-logging/0.log" Apr 16 20:33:14.349925 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:14.349895 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9a649-predictor-b857f97b5-fnrvk"] Apr 16 20:33:14.350379 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:14.350119 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-9a649-predictor-b857f97b5-fnrvk" podUID="5bf97a5e-af1d-42e1-937a-53ff1be8f1a5" containerName="kserve-container" containerID="cri-o://7a366bb4f53d2ff102e2714f433098cdb295ac9b3669fdbdf411a2c33715af2c" gracePeriod=30 Apr 16 20:33:14.391653 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:14.391622 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9a649-predictor-779bcbc86d-k8kff"] Apr 16 20:33:14.391896 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:14.391859 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-9a649-predictor-779bcbc86d-k8kff" podUID="c34f3dfa-55e4-4781-9399-d716b78583d0" containerName="kserve-container" containerID="cri-o://acd30573fb07ff745694a9d8e7607f3c982670896dc9d8f59f6b303a27beae20" gracePeriod=30 Apr 16 20:33:17.812027 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:17.812007 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9a649-predictor-b857f97b5-fnrvk" Apr 16 20:33:17.864149 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:17.864119 2561 generic.go:358] "Generic (PLEG): container finished" podID="c34f3dfa-55e4-4781-9399-d716b78583d0" containerID="acd30573fb07ff745694a9d8e7607f3c982670896dc9d8f59f6b303a27beae20" exitCode=0 Apr 16 20:33:17.864288 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:17.864194 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9a649-predictor-779bcbc86d-k8kff" event={"ID":"c34f3dfa-55e4-4781-9399-d716b78583d0","Type":"ContainerDied","Data":"acd30573fb07ff745694a9d8e7607f3c982670896dc9d8f59f6b303a27beae20"} Apr 16 20:33:17.865272 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:17.865247 2561 generic.go:358] "Generic (PLEG): container finished" podID="5bf97a5e-af1d-42e1-937a-53ff1be8f1a5" containerID="7a366bb4f53d2ff102e2714f433098cdb295ac9b3669fdbdf411a2c33715af2c" exitCode=0 Apr 16 20:33:17.865402 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:17.865318 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9a649-predictor-b857f97b5-fnrvk" Apr 16 20:33:17.865402 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:17.865330 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9a649-predictor-b857f97b5-fnrvk" event={"ID":"5bf97a5e-af1d-42e1-937a-53ff1be8f1a5","Type":"ContainerDied","Data":"7a366bb4f53d2ff102e2714f433098cdb295ac9b3669fdbdf411a2c33715af2c"} Apr 16 20:33:17.865402 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:17.865367 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9a649-predictor-b857f97b5-fnrvk" event={"ID":"5bf97a5e-af1d-42e1-937a-53ff1be8f1a5","Type":"ContainerDied","Data":"19ced008f2c9755642291593c4a05b5a4d8ea7d1a367d0dce6255a9a15b888b4"} Apr 16 20:33:17.865402 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:17.865389 2561 scope.go:117] "RemoveContainer" containerID="7a366bb4f53d2ff102e2714f433098cdb295ac9b3669fdbdf411a2c33715af2c" Apr 16 20:33:17.874012 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:17.873981 2561 scope.go:117] "RemoveContainer" containerID="7a366bb4f53d2ff102e2714f433098cdb295ac9b3669fdbdf411a2c33715af2c" Apr 16 20:33:17.874375 ip-10-0-128-201 kubenswrapper[2561]: E0416 20:33:17.874344 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a366bb4f53d2ff102e2714f433098cdb295ac9b3669fdbdf411a2c33715af2c\": container with ID starting with 7a366bb4f53d2ff102e2714f433098cdb295ac9b3669fdbdf411a2c33715af2c not found: ID does not exist" containerID="7a366bb4f53d2ff102e2714f433098cdb295ac9b3669fdbdf411a2c33715af2c" Apr 16 20:33:17.874471 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:17.874390 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a366bb4f53d2ff102e2714f433098cdb295ac9b3669fdbdf411a2c33715af2c"} err="failed to get container status \"7a366bb4f53d2ff102e2714f433098cdb295ac9b3669fdbdf411a2c33715af2c\": rpc error: code = NotFound desc = could not find container \"7a366bb4f53d2ff102e2714f433098cdb295ac9b3669fdbdf411a2c33715af2c\": container with ID starting with 7a366bb4f53d2ff102e2714f433098cdb295ac9b3669fdbdf411a2c33715af2c not found: ID does not exist" Apr 16 20:33:17.888387 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:17.888318 2561 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9a649-predictor-b857f97b5-fnrvk"] Apr 16 20:33:17.889686 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:17.889656 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9a649-predictor-b857f97b5-fnrvk"] Apr 16 20:33:17.937993 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:17.937971 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9a649-predictor-779bcbc86d-k8kff" Apr 16 20:33:18.840219 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:18.840187 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bf97a5e-af1d-42e1-937a-53ff1be8f1a5" path="/var/lib/kubelet/pods/5bf97a5e-af1d-42e1-937a-53ff1be8f1a5/volumes" Apr 16 20:33:18.875694 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:18.875655 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9a649-predictor-779bcbc86d-k8kff" event={"ID":"c34f3dfa-55e4-4781-9399-d716b78583d0","Type":"ContainerDied","Data":"d49635483b9577017f3a1cbc3a0fabd15302df6128e4af7392252ac7d361d850"} Apr 16 20:33:18.875844 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:18.875704 2561 scope.go:117] "RemoveContainer" containerID="acd30573fb07ff745694a9d8e7607f3c982670896dc9d8f59f6b303a27beae20" Apr 16 20:33:18.875844 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:18.875737 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9a649-predictor-779bcbc86d-k8kff" Apr 16 20:33:18.896143 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:18.896083 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9a649-predictor-779bcbc86d-k8kff"] Apr 16 20:33:18.898215 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:18.898195 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9a649-predictor-779bcbc86d-k8kff"] Apr 16 20:33:20.839995 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:20.839960 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c34f3dfa-55e4-4781-9399-d716b78583d0" path="/var/lib/kubelet/pods/c34f3dfa-55e4-4781-9399-d716b78583d0/volumes" Apr 16 20:33:42.917549 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:42.917445 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/1.log" Apr 16 20:33:42.934562 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:42.922164 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-acl-logging/0.log" Apr 16 20:33:42.934562 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:42.925638 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/1.log" Apr 16 20:33:42.934562 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:33:42.930056 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-acl-logging/0.log" Apr 16 20:38:42.939355 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:38:42.939255 2561 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/1.log" Apr 16 20:38:42.944127 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:38:42.944103 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-acl-logging/0.log" Apr 16 20:38:42.948021 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:38:42.947998 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/1.log" Apr 16 20:38:42.952253 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:38:42.952234 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-acl-logging/0.log" Apr 16 20:40:34.392074 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:34.392033 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk"] Apr 16 20:40:34.392639 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:34.392331 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk" podUID="48d83598-87ee-44ad-b07e-45af3b6338dc" containerName="kserve-container" containerID="cri-o://cbee0f7293493f8afe0e5a99487b50b49ca23eff07fc54d8e451d07b97514fa9" gracePeriod=30 Apr 16 20:40:34.446147 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:34.446116 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-653df-predictor-fd849f8f8-w5j6t"] Apr 16 20:40:34.446366 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:34.446337 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-653df-predictor-fd849f8f8-w5j6t" podUID="6793b1e0-04e8-4184-8df4-4a3694e50d66" containerName="kserve-container" containerID="cri-o://4847852b4b94d92c959883d212ae45b6f7c1e15629be008d3854f4e6ac8c5d66" gracePeriod=30 Apr 16 20:40:36.848617 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:36.848578 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk" podUID="48d83598-87ee-44ad-b07e-45af3b6338dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 20:40:36.849085 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:36.848578 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-653df-predictor-fd849f8f8-w5j6t" podUID="6793b1e0-04e8-4184-8df4-4a3694e50d66" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 20:40:37.497325 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:37.497257 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk" Apr 16 20:40:37.594262 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:37.594235 2561 util.go:48] "No ready sandbox for pod can be found. 
Apr 16 20:40:38.273396 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:38.273361 2561 generic.go:358] "Generic (PLEG): container finished" podID="48d83598-87ee-44ad-b07e-45af3b6338dc" containerID="cbee0f7293493f8afe0e5a99487b50b49ca23eff07fc54d8e451d07b97514fa9" exitCode=0
Apr 16 20:40:38.273852 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:38.273427 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk"
Apr 16 20:40:38.273852 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:38.273451 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk" event={"ID":"48d83598-87ee-44ad-b07e-45af3b6338dc","Type":"ContainerDied","Data":"cbee0f7293493f8afe0e5a99487b50b49ca23eff07fc54d8e451d07b97514fa9"}
Apr 16 20:40:38.273852 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:38.273496 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk" event={"ID":"48d83598-87ee-44ad-b07e-45af3b6338dc","Type":"ContainerDied","Data":"5e487cc089f448381b33b140425937c083670c82dcc599bd2a3b502dcd29ac19"}
Apr 16 20:40:38.273852 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:38.273517 2561 scope.go:117] "RemoveContainer" containerID="cbee0f7293493f8afe0e5a99487b50b49ca23eff07fc54d8e451d07b97514fa9"
Apr 16 20:40:38.274559 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:38.274494 2561 generic.go:358] "Generic (PLEG): container finished" podID="6793b1e0-04e8-4184-8df4-4a3694e50d66" containerID="4847852b4b94d92c959883d212ae45b6f7c1e15629be008d3854f4e6ac8c5d66" exitCode=0
Apr 16 20:40:38.274559 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:38.274520 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-653df-predictor-fd849f8f8-w5j6t" event={"ID":"6793b1e0-04e8-4184-8df4-4a3694e50d66","Type":"ContainerDied","Data":"4847852b4b94d92c959883d212ae45b6f7c1e15629be008d3854f4e6ac8c5d66"}
Apr 16 20:40:38.274559 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:38.274551 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-653df-predictor-fd849f8f8-w5j6t" event={"ID":"6793b1e0-04e8-4184-8df4-4a3694e50d66","Type":"ContainerDied","Data":"730a6eb32ec0bdd804c4697b52c8b64aa79cd5a191f8fdd1741b4cc040c5ac60"}
Apr 16 20:40:38.274680 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:38.274573 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-653df-predictor-fd849f8f8-w5j6t"
Apr 16 20:40:38.281390 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:38.281369 2561 scope.go:117] "RemoveContainer" containerID="cbee0f7293493f8afe0e5a99487b50b49ca23eff07fc54d8e451d07b97514fa9"
Apr 16 20:40:38.281636 ip-10-0-128-201 kubenswrapper[2561]: E0416 20:40:38.281617 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbee0f7293493f8afe0e5a99487b50b49ca23eff07fc54d8e451d07b97514fa9\": container with ID starting with cbee0f7293493f8afe0e5a99487b50b49ca23eff07fc54d8e451d07b97514fa9 not found: ID does not exist" containerID="cbee0f7293493f8afe0e5a99487b50b49ca23eff07fc54d8e451d07b97514fa9"
Apr 16 20:40:38.281708 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:38.281644 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbee0f7293493f8afe0e5a99487b50b49ca23eff07fc54d8e451d07b97514fa9"} err="failed to get container status \"cbee0f7293493f8afe0e5a99487b50b49ca23eff07fc54d8e451d07b97514fa9\": rpc error: code = NotFound desc = could not find container \"cbee0f7293493f8afe0e5a99487b50b49ca23eff07fc54d8e451d07b97514fa9\": container with ID starting with cbee0f7293493f8afe0e5a99487b50b49ca23eff07fc54d8e451d07b97514fa9 not found: ID does not exist"
Apr 16 20:40:38.281708 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:38.281659 2561 scope.go:117] "RemoveContainer" containerID="4847852b4b94d92c959883d212ae45b6f7c1e15629be008d3854f4e6ac8c5d66"
Apr 16 20:40:38.288697 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:38.288682 2561 scope.go:117] "RemoveContainer" containerID="4847852b4b94d92c959883d212ae45b6f7c1e15629be008d3854f4e6ac8c5d66"
Apr 16 20:40:38.288936 ip-10-0-128-201 kubenswrapper[2561]: E0416 20:40:38.288918 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4847852b4b94d92c959883d212ae45b6f7c1e15629be008d3854f4e6ac8c5d66\": container with ID starting with 4847852b4b94d92c959883d212ae45b6f7c1e15629be008d3854f4e6ac8c5d66 not found: ID does not exist" containerID="4847852b4b94d92c959883d212ae45b6f7c1e15629be008d3854f4e6ac8c5d66"
Apr 16 20:40:38.288994 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:38.288942 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4847852b4b94d92c959883d212ae45b6f7c1e15629be008d3854f4e6ac8c5d66"} err="failed to get container status \"4847852b4b94d92c959883d212ae45b6f7c1e15629be008d3854f4e6ac8c5d66\": rpc error: code = NotFound desc = could not find container \"4847852b4b94d92c959883d212ae45b6f7c1e15629be008d3854f4e6ac8c5d66\": container with ID starting with 4847852b4b94d92c959883d212ae45b6f7c1e15629be008d3854f4e6ac8c5d66 not found: ID does not exist"
Apr 16 20:40:38.297924 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:38.297905 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-653df-predictor-fd849f8f8-w5j6t"]
Apr 16 20:40:38.299967 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:38.299941 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-653df-predictor-fd849f8f8-w5j6t"]
Apr 16 20:40:38.309359 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:38.309337 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk"]
Apr 16 20:40:38.311632 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:38.311613 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-653df-predictor-7db8685f5-75qnk"]
Apr 16 20:40:38.839486 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:38.839450 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48d83598-87ee-44ad-b07e-45af3b6338dc" path="/var/lib/kubelet/pods/48d83598-87ee-44ad-b07e-45af3b6338dc/volumes"
Apr 16 20:40:38.839782 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:40:38.839764 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6793b1e0-04e8-4184-8df4-4a3694e50d66" path="/var/lib/kubelet/pods/6793b1e0-04e8-4184-8df4-4a3694e50d66/volumes"
Apr 16 20:41:02.476943 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:02.476913 2561 ???:1] "http: TLS handshake error from 10.0.140.191:48730: EOF"
Apr 16 20:41:02.482489 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:02.482467 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-wfp62_8593da00-be3a-459a-9bf6-ee2f4988af66/global-pull-secret-syncer/0.log"
Apr 16 20:41:02.605697 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:02.605671 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-wm2x4_54ec5af6-8d3b-4667-ae54-83fef36ee26c/konnectivity-agent/0.log"
Apr 16 20:41:02.625307 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:02.625286 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-201.ec2.internal_d9f1723847b6ccf58b5c375746506d34/haproxy/0.log"
Apr 16 20:41:05.994584 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:05.994553 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-tpdvb_0f95dfef-42c3-454a-9807-3c895a970729/cluster-monitoring-operator/0.log"
Apr 16 20:41:06.235364 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:06.235337 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nsnhg_d24c0325-ce50-47e5-98c8-87e5ba46e3ca/node-exporter/0.log"
Apr 16 20:41:06.256484 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:06.256427 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nsnhg_d24c0325-ce50-47e5-98c8-87e5ba46e3ca/kube-rbac-proxy/0.log"
Apr 16 20:41:06.277836 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:06.277816 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nsnhg_d24c0325-ce50-47e5-98c8-87e5ba46e3ca/init-textfile/0.log"
Apr 16 20:41:08.109217 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:08.109185 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-xbnwr_8814ad85-70d4-48f0-8e96-6cc0a48c07eb/networking-console-plugin/0.log"
Apr 16 20:41:08.567544 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:08.567517 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/1.log"
Apr 16 20:41:08.571908 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:08.571887 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln2rf_be232f65-8167-4e83-83a8-d40670fbf702/console-operator/2.log"
Apr 16 20:41:09.355529 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.355500 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-gjqjl_fe409e83-4bf3-40c8-b46d-61a088fdae77/volume-data-source-validator/0.log"
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-gjqjl_fe409e83-4bf3-40c8-b46d-61a088fdae77/volume-data-source-validator/0.log" Apr 16 20:41:09.647421 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.647388 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb"] Apr 16 20:41:09.648055 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.648032 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cbee9c44-d672-4fc6-81bf-67c5c6b666c9" containerName="kserve-container" Apr 16 20:41:09.648055 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.648058 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbee9c44-d672-4fc6-81bf-67c5c6b666c9" containerName="kserve-container" Apr 16 20:41:09.648228 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.648095 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48d83598-87ee-44ad-b07e-45af3b6338dc" containerName="kserve-container" Apr 16 20:41:09.648228 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.648104 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="48d83598-87ee-44ad-b07e-45af3b6338dc" containerName="kserve-container" Apr 16 20:41:09.648228 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.648125 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6793b1e0-04e8-4184-8df4-4a3694e50d66" containerName="kserve-container" Apr 16 20:41:09.648228 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.648135 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="6793b1e0-04e8-4184-8df4-4a3694e50d66" containerName="kserve-container" Apr 16 20:41:09.648228 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.648153 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d448a51-f3bb-4062-981e-316ed8f1f8dc" containerName="kserve-container" Apr 16 20:41:09.648228 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.648163 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d448a51-f3bb-4062-981e-316ed8f1f8dc" containerName="kserve-container" Apr 16 20:41:09.648228 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.648191 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5bf97a5e-af1d-42e1-937a-53ff1be8f1a5" containerName="kserve-container" Apr 16 20:41:09.648228 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.648201 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bf97a5e-af1d-42e1-937a-53ff1be8f1a5" containerName="kserve-container" Apr 16 20:41:09.648228 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.648222 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c34f3dfa-55e4-4781-9399-d716b78583d0" containerName="kserve-container" Apr 16 20:41:09.648228 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.648231 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="c34f3dfa-55e4-4781-9399-d716b78583d0" containerName="kserve-container" Apr 16 20:41:09.648906 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.648381 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="5bf97a5e-af1d-42e1-937a-53ff1be8f1a5" containerName="kserve-container" Apr 16 20:41:09.648906 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.648403 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="cbee9c44-d672-4fc6-81bf-67c5c6b666c9" 
containerName="kserve-container" Apr 16 20:41:09.648906 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.648415 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="6793b1e0-04e8-4184-8df4-4a3694e50d66" containerName="kserve-container" Apr 16 20:41:09.648906 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.648433 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d448a51-f3bb-4062-981e-316ed8f1f8dc" containerName="kserve-container" Apr 16 20:41:09.648906 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.648462 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="48d83598-87ee-44ad-b07e-45af3b6338dc" containerName="kserve-container" Apr 16 20:41:09.648906 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.648476 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="c34f3dfa-55e4-4781-9399-d716b78583d0" containerName="kserve-container" Apr 16 20:41:09.652146 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.652125 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb" Apr 16 20:41:09.655894 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.655868 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-zn2n8\"/\"default-dockercfg-n9vjb\"" Apr 16 20:41:09.656002 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.655873 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zn2n8\"/\"openshift-service-ca.crt\"" Apr 16 20:41:09.656002 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.655958 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zn2n8\"/\"kube-root-ca.crt\"" Apr 16 20:41:09.656384 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.656363 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb"] Apr 16 20:41:09.800483 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.800455 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2q78\" (UniqueName: \"kubernetes.io/projected/d712622b-8ec0-4743-a5da-9ddf50985afb-kube-api-access-m2q78\") pod \"perf-node-gather-daemonset-8sggb\" (UID: \"d712622b-8ec0-4743-a5da-9ddf50985afb\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb" Apr 16 20:41:09.800595 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.800490 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d712622b-8ec0-4743-a5da-9ddf50985afb-podres\") pod \"perf-node-gather-daemonset-8sggb\" (UID: \"d712622b-8ec0-4743-a5da-9ddf50985afb\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb" Apr 16 20:41:09.800595 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.800527 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d712622b-8ec0-4743-a5da-9ddf50985afb-lib-modules\") pod \"perf-node-gather-daemonset-8sggb\" (UID: \"d712622b-8ec0-4743-a5da-9ddf50985afb\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb" Apr 16 20:41:09.800595 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.800560 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sys\" (UniqueName: \"kubernetes.io/host-path/d712622b-8ec0-4743-a5da-9ddf50985afb-sys\") pod \"perf-node-gather-daemonset-8sggb\" (UID: \"d712622b-8ec0-4743-a5da-9ddf50985afb\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb" Apr 16 20:41:09.800595 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.800575 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d712622b-8ec0-4743-a5da-9ddf50985afb-proc\") pod \"perf-node-gather-daemonset-8sggb\" (UID: \"d712622b-8ec0-4743-a5da-9ddf50985afb\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb" Apr 16 20:41:09.901311 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.901236 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2q78\" (UniqueName: \"kubernetes.io/projected/d712622b-8ec0-4743-a5da-9ddf50985afb-kube-api-access-m2q78\") pod \"perf-node-gather-daemonset-8sggb\" (UID: \"d712622b-8ec0-4743-a5da-9ddf50985afb\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb" Apr 16 20:41:09.901311 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.901268 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d712622b-8ec0-4743-a5da-9ddf50985afb-podres\") pod \"perf-node-gather-daemonset-8sggb\" (UID: \"d712622b-8ec0-4743-a5da-9ddf50985afb\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb" Apr 16 20:41:09.901311 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.901298 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d712622b-8ec0-4743-a5da-9ddf50985afb-lib-modules\") pod \"perf-node-gather-daemonset-8sggb\" (UID: \"d712622b-8ec0-4743-a5da-9ddf50985afb\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb" Apr 16 20:41:09.901519 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.901335 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d712622b-8ec0-4743-a5da-9ddf50985afb-sys\") pod \"perf-node-gather-daemonset-8sggb\" (UID: \"d712622b-8ec0-4743-a5da-9ddf50985afb\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb" Apr 16 20:41:09.901519 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.901360 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d712622b-8ec0-4743-a5da-9ddf50985afb-proc\") pod \"perf-node-gather-daemonset-8sggb\" (UID: \"d712622b-8ec0-4743-a5da-9ddf50985afb\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb" Apr 16 20:41:09.901519 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.901435 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d712622b-8ec0-4743-a5da-9ddf50985afb-lib-modules\") pod \"perf-node-gather-daemonset-8sggb\" (UID: \"d712622b-8ec0-4743-a5da-9ddf50985afb\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb" Apr 16 20:41:09.901519 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.901435 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d712622b-8ec0-4743-a5da-9ddf50985afb-proc\") pod \"perf-node-gather-daemonset-8sggb\" (UID: 
\"d712622b-8ec0-4743-a5da-9ddf50985afb\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb" Apr 16 20:41:09.901519 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.901463 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d712622b-8ec0-4743-a5da-9ddf50985afb-sys\") pod \"perf-node-gather-daemonset-8sggb\" (UID: \"d712622b-8ec0-4743-a5da-9ddf50985afb\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb" Apr 16 20:41:09.901519 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.901477 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d712622b-8ec0-4743-a5da-9ddf50985afb-podres\") pod \"perf-node-gather-daemonset-8sggb\" (UID: \"d712622b-8ec0-4743-a5da-9ddf50985afb\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb" Apr 16 20:41:09.909113 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.909092 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2q78\" (UniqueName: \"kubernetes.io/projected/d712622b-8ec0-4743-a5da-9ddf50985afb-kube-api-access-m2q78\") pod \"perf-node-gather-daemonset-8sggb\" (UID: \"d712622b-8ec0-4743-a5da-9ddf50985afb\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb" Apr 16 20:41:09.962198 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.962159 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb" Apr 16 20:41:09.994263 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:09.994233 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cdc8w_8e50b93c-a156-438b-a41f-2b4bac946727/dns/0.log" Apr 16 20:41:10.015513 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:10.015471 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cdc8w_8e50b93c-a156-438b-a41f-2b4bac946727/kube-rbac-proxy/0.log" Apr 16 20:41:10.083120 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:10.083060 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb"] Apr 16 20:41:10.085738 ip-10-0-128-201 kubenswrapper[2561]: W0416 20:41:10.085709 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd712622b_8ec0_4743_a5da_9ddf50985afb.slice/crio-8b0e6c81937038358fb913fd66916a5898d9d11588f5ded15f2ca01a40584cec WatchSource:0}: Error finding container 8b0e6c81937038358fb913fd66916a5898d9d11588f5ded15f2ca01a40584cec: Status 404 returned error can't find the container with id 8b0e6c81937038358fb913fd66916a5898d9d11588f5ded15f2ca01a40584cec Apr 16 20:41:10.087445 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:10.087426 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:41:10.165533 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:10.165473 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hnzkl_4219a57d-3ca0-4b3e-9d49-d7d1178e2c5e/dns-node-resolver/0.log" Apr 16 20:41:10.378428 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:10.378394 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb" 
event={"ID":"d712622b-8ec0-4743-a5da-9ddf50985afb","Type":"ContainerStarted","Data":"a8db56e45c656f6d29d9f427f36644f912394b9260523260180c33ab0f569cbf"} Apr 16 20:41:10.378428 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:10.378430 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb" event={"ID":"d712622b-8ec0-4743-a5da-9ddf50985afb","Type":"ContainerStarted","Data":"8b0e6c81937038358fb913fd66916a5898d9d11588f5ded15f2ca01a40584cec"} Apr 16 20:41:10.378879 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:10.378477 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb" Apr 16 20:41:10.395737 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:10.395687 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb" podStartSLOduration=1.39567456 podStartE2EDuration="1.39567456s" podCreationTimestamp="2026-04-16 20:41:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:41:10.393505776 +0000 UTC m=+2848.116551682" watchObservedRunningTime="2026-04-16 20:41:10.39567456 +0000 UTC m=+2848.118720466" Apr 16 20:41:10.590749 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:10.590673 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ddx6x_c3753347-dfcc-47be-a251-65c3470b8045/node-ca/0.log" Apr 16 20:41:11.314875 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:11.314843 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-98598b8fb-wkrmq_b3bac591-c06b-4b40-b42a-f85548b297f0/router/0.log" Apr 16 20:41:11.656862 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:11.656836 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-k2dcx_165d7242-2ef1-481d-992a-09e3364e0626/serve-healthcheck-canary/0.log" Apr 16 20:41:12.007029 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:12.006945 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-mwqfp_b33484f0-9ae5-4b31-8baa-d4219e39ddd9/insights-operator/0.log" Apr 16 20:41:12.007709 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:12.007690 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-mwqfp_b33484f0-9ae5-4b31-8baa-d4219e39ddd9/insights-operator/1.log" Apr 16 20:41:12.105682 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:12.105662 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9wh9h_74e6b4dd-9a81-4310-8471-186da2714610/kube-rbac-proxy/0.log" Apr 16 20:41:12.126984 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:12.126957 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9wh9h_74e6b4dd-9a81-4310-8471-186da2714610/exporter/0.log" Apr 16 20:41:12.147518 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:12.147497 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9wh9h_74e6b4dd-9a81-4310-8471-186da2714610/extractor/0.log" Apr 16 20:41:14.240129 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:14.240101 2561 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve_kserve-controller-manager-659c8cbdc-rz7sf_d003760d-7c1d-443a-85f8-14280df4b2cd/manager/0.log" Apr 16 20:41:14.284160 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:14.284138 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-dz25p_32c8dc29-f0b2-42a4-93c0-6adc40019f95/server/0.log" Apr 16 20:41:14.776956 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:14.776929 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-jgbp8_eb93b256-25cf-46a1-b413-2e64d58db62b/seaweedfs/0.log" Apr 16 20:41:16.390997 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:16.390970 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-8sggb" Apr 16 20:41:18.472837 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:18.472808 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-gf47w_2d72cccc-2e90-4646-b78f-8afabf5aee06/migrator/0.log" Apr 16 20:41:18.494219 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:18.494194 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-gf47w_2d72cccc-2e90-4646-b78f-8afabf5aee06/graceful-termination/0.log" Apr 16 20:41:18.873386 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:18.873359 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-spvh5_3e7e6010-fad1-4881-8816-b024c8853151/kube-storage-version-migrator-operator/1.log" Apr 16 20:41:18.874127 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:18.874107 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-spvh5_3e7e6010-fad1-4881-8816-b024c8853151/kube-storage-version-migrator-operator/0.log" Apr 16 20:41:19.837560 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:19.837533 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2wctn_8012fac3-113c-4958-9ea3-7cdbc5e5c6e9/kube-multus/0.log" Apr 16 20:41:19.916725 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:19.916700 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-88wb2_6f07541a-6ad1-43d0-9a04-540a16f67cec/kube-multus-additional-cni-plugins/0.log" Apr 16 20:41:19.936807 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:19.936773 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-88wb2_6f07541a-6ad1-43d0-9a04-540a16f67cec/egress-router-binary-copy/0.log" Apr 16 20:41:19.956875 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:19.956851 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-88wb2_6f07541a-6ad1-43d0-9a04-540a16f67cec/cni-plugins/0.log" Apr 16 20:41:19.975421 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:19.975400 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-88wb2_6f07541a-6ad1-43d0-9a04-540a16f67cec/bond-cni-plugin/0.log" Apr 16 20:41:19.995339 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:19.995320 2561 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-88wb2_6f07541a-6ad1-43d0-9a04-540a16f67cec/routeoverride-cni/0.log" Apr 16 20:41:20.013906 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:20.013889 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-88wb2_6f07541a-6ad1-43d0-9a04-540a16f67cec/whereabouts-cni-bincopy/0.log" Apr 16 20:41:20.037177 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:20.037158 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-88wb2_6f07541a-6ad1-43d0-9a04-540a16f67cec/whereabouts-cni/0.log" Apr 16 20:41:20.437043 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:20.437017 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nx45q_e0b9420a-1c3e-47b5-b187-827cb7f39aea/network-metrics-daemon/0.log" Apr 16 20:41:20.460017 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:20.459995 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nx45q_e0b9420a-1c3e-47b5-b187-827cb7f39aea/kube-rbac-proxy/0.log" Apr 16 20:41:21.633271 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:21.633237 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-controller/0.log" Apr 16 20:41:21.648980 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:21.648958 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-acl-logging/0.log" Apr 16 20:41:21.661274 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:21.661251 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovn-acl-logging/1.log" Apr 16 20:41:21.678144 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:21.678122 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/kube-rbac-proxy-node/0.log" Apr 16 20:41:21.702410 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:21.702388 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 20:41:21.720065 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:21.720044 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/northd/0.log" Apr 16 20:41:21.740125 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:21.740108 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/nbdb/0.log" Apr 16 20:41:21.758779 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:21.758763 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/sbdb/0.log" Apr 16 20:41:21.853317 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:21.853284 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k4p2g_356a3ae0-1448-42b5-a8eb-eb35ac7b6f96/ovnkube-controller/0.log" Apr 16 20:41:23.047745 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:23.047715 2561 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-sqd4v_da823131-b1c7-41ae-a0e6-3fa763f3d110/check-endpoints/0.log" Apr 16 20:41:23.117442 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:23.117417 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-qnt55_58737d84-3f2c-46d7-b1e4-6b7f5cda8b4f/network-check-target-container/0.log" Apr 16 20:41:23.973464 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:23.973435 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-84k6z_24e2ccb6-cefd-4e6b-baff-95b016092cf8/iptables-alerter/0.log" Apr 16 20:41:24.613377 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:24.613343 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-rlrsk_09d46eec-98b3-409a-adf0-e27e7e7fa496/tuned/0.log" Apr 16 20:41:26.238985 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:26.238950 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-bjppc_18721546-1063-46a1-8715-a40872933b22/cluster-samples-operator/0.log" Apr 16 20:41:26.257548 ip-10-0-128-201 kubenswrapper[2561]: I0416 20:41:26.257523 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-bjppc_18721546-1063-46a1-8715-a40872933b22/cluster-samples-operator-watch/0.log"