Apr 16 18:14:58.984106 ip-10-0-134-167 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 18:14:58.984115 ip-10-0-134-167 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 18:14:58.984122 ip-10-0-134-167 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 18:14:58.984342 ip-10-0-134-167 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 18:15:09.041171 ip-10-0-134-167 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 18:15:09.041189 ip-10-0-134-167 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot fcda48893be1481393f3f37fdb39be32 --
Apr 16 18:17:35.020524 ip-10-0-134-167 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:17:35.519937 ip-10-0-134-167 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:17:35.519937 ip-10-0-134-167 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:17:35.519937 ip-10-0-134-167 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:17:35.519937 ip-10-0-134-167 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:17:35.519937 ip-10-0-134-167 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:17:35.523930 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.523844 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:17:35.527988 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.527970 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:35.527988 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.527986 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:17:35.527988 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.527991 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:35.528178 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.527998 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:17:35.528178 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528002 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:17:35.528178 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528006 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:35.528178 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528010 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:35.528178 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528015 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:35.528178 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528019 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:17:35.528178 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528023 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:35.528178 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528028 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:35.528178 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528032 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:17:35.528178 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528035 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:35.528178 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528040 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:35.528178 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528044 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:35.528178 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528048 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:17:35.528178 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528052 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:35.528178 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528056 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:35.528178 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528059 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:17:35.528178 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528064 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:17:35.528178 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528067 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:17:35.528178 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528071 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:35.528178 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528075 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:17:35.528992 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528079 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:17:35.528992 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528083 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:17:35.528992 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528088 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:35.528992 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528092 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:35.528992 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528096 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:35.528992 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528100 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:17:35.528992 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528104 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:17:35.528992 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528109 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:17:35.528992 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528113 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:35.528992 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528117 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:35.528992 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528121 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:35.528992 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528126 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:17:35.528992 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528130 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:35.528992 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528136 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:35.528992 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528140 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:35.528992 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528144 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:17:35.528992 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528149 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:17:35.528992 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528153 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:35.528992 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528157 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:17:35.528992 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528162 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:35.529765 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528166 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:17:35.529765 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528170 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:35.529765 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528174 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:35.529765 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528180 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:17:35.529765 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528184 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:17:35.529765 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528188 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:35.529765 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528192 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:17:35.529765 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528196 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:35.529765 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528200 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:35.529765 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528207 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:17:35.529765 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528213 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:17:35.529765 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528218 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:35.529765 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528222 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:35.529765 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528227 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:35.529765 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528234 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:17:35.529765 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528239 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:17:35.529765 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528244 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:17:35.529765 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528248 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:35.529765 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528252 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:17:35.530229 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528257 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:35.530229 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528261 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:17:35.530229 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528283 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:17:35.530229 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528289 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:17:35.530229 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528294 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:17:35.530229 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528298 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:35.530229 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528306 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:35.530229 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528311 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:35.530229 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528328 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:35.530229 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528333 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:17:35.530229 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528337 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:17:35.530229 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528341 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:17:35.530229 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528345 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:35.530229 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528350 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:17:35.530229 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528354 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:35.530229 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528358 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:35.530229 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528363 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:17:35.530229 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528367 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:17:35.530229 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528371 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:17:35.530229 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528375 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:17:35.531064 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528379 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:17:35.531064 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528383 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:17:35.531064 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528387 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:17:35.531064 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.528392 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:35.531064 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529007 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:35.531064 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529016 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:17:35.531064 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529021 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:35.531064 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529027 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:17:35.531064 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529033 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:35.531064 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529037 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:17:35.531064 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529042 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:35.531064 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529046 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:35.531064 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529049 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:35.531064 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529053 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:17:35.531064 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529058 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:35.531064 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529062 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:17:35.531064 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529067 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:17:35.531064 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529072 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:17:35.531064 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529077 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:17:35.531064 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529081 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:35.531864 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529085 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:17:35.531864 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529089 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:35.531864 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529094 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:35.531864 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529098 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:35.531864 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529102 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:17:35.531864 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529105 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:35.531864 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529110 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:35.531864 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529114 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:17:35.531864 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529118 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:35.531864 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529122 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:17:35.531864 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529126 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:17:35.531864 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529130 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:35.531864 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529135 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:35.531864 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529139 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:17:35.531864 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529143 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:35.531864 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529147 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:35.531864 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529152 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:17:35.531864 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529156 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:35.531864 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529160 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:35.532374 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529164 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:35.532374 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529168 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:17:35.532374 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529172 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:17:35.532374 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529177 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:17:35.532374 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529181 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:35.532374 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529185 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:17:35.532374 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529190 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:17:35.532374 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529194 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:17:35.532374 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529198 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:35.532374 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529202 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:17:35.532374 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529210 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:17:35.532374 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529216 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:35.532374 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529221 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:35.532374 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529225 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:17:35.532374 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529229 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:17:35.532374 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529234 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:35.532374 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529238 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:17:35.532374 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529242 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:17:35.532374 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529246 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:35.532929 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529251 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:35.532929 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529255 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:35.532929 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529259 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:17:35.532929 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529263 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:35.532929 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529286 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:17:35.532929 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529290 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:35.532929 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529295 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:35.532929 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529299 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:17:35.532929 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529303 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:35.532929 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529307 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:17:35.532929 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529311 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:17:35.532929 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529315 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:17:35.532929 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529321 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:17:35.532929 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529326 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:35.532929 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529330 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:17:35.532929 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529335 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:17:35.532929 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529339 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:17:35.532929 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529344 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:35.532929 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529348 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:35.532929 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529373 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:17:35.533608 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529380 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:35.533608 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529387 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:17:35.533608 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529396 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:35.533608 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529401 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:17:35.533608 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529409 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:17:35.533608 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529415 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:17:35.533608 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529419 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:17:35.533608 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529423 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:35.533608 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529427 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:35.533608 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529431 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:17:35.533608 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529436 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:35.533608 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.529440 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:35.533608 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530347 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:17:35.533608 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530362 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:17:35.533608 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530374 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:17:35.533608 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530381 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:17:35.533608 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530388 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:17:35.533608 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530393 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:17:35.533608 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530399 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:17:35.533608 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530407 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:17:35.533608 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530412 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530417 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530423 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530428 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530433 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530438 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530442 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530447 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530452 2579 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530459 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530464 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530470 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530475 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530480 2579 flags.go:64] FLAG: --config-dir=""
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530486 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530491 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530498 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530504 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530509 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530514 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530519 2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530523 2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530528 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530533 2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530538 2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:17:35.534150 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530545 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530550 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530554 2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530560 2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530564 2579 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530569 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530577 2579 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530582 2579 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530587 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530592 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530596 2579 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530602 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530607 2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530612 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530617 2579 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530622 2579 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530629 2579 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530634 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530638 2579 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530643 2579 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530648 2579 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530654 2579 flags.go:64] FLAG: --feature-gates=""
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530660 2579 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530666 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530671 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 18:17:35.534831 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530677 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 18:17:35.535491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530682 2579 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 18:17:35.535491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530687 2579 flags.go:64] FLAG: --help="false"
Apr 16 18:17:35.535491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530691 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-134-167.ec2.internal"
Apr 16 18:17:35.535491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530697 2579 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 18:17:35.535491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530702 2579 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 18:17:35.535491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530707 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 18:17:35.535491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530713 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 18:17:35.535491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530719 2579 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 18:17:35.535491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530723 2579 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 18:17:35.535491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530728 2579 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 18:17:35.535491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530733 2579 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 18:17:35.535491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530738 2579 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 18:17:35.535491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530742 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 18:17:35.535491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530748 2579 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 18:17:35.535491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530752 2579 flags.go:64] FLAG: --kube-reserved=""
Apr 16 18:17:35.535491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530757 2579 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 18:17:35.535491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530761 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 18:17:35.535491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530766 2579 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 18:17:35.535491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530771 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 18:17:35.535491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530775 2579 flags.go:64] FLAG: --lock-file=""
Apr 16 18:17:35.535491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530780 2579 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 18:17:35.535491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530784 2579 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 18:17:35.535491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530791 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 18:17:35.535491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530800 2579 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 18:17:35.536124 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530805 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 18:17:35.536124 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530809 2579 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 18:17:35.536124 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530814 2579 flags.go:64] FLAG: --logging-format="text"
Apr 16 18:17:35.536124 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530821 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 18:17:35.536124 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530826 2579 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 18:17:35.536124 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530831 2579 flags.go:64] FLAG: --manifest-url=""
Apr 16 18:17:35.536124 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530836 2579 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 18:17:35.536124 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530843 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 18:17:35.536124 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530848 2579 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 18:17:35.536124 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530854 2579 flags.go:64] FLAG: --max-pods="110"
Apr 16 18:17:35.536124 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530859 2579 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 18:17:35.536124 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530864 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 18:17:35.536124 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530868 2579 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 18:17:35.536124 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530873 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 18:17:35.536124 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530878 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 18:17:35.536124 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530883 2579 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 18:17:35.536124 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530888 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 18:17:35.536124 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530899 2579 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 18:17:35.536124 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530904 2579 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 18:17:35.536124 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530909 2579 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 18:17:35.536124 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530914 2579 flags.go:64] FLAG: --pod-cidr=""
Apr 16 18:17:35.536124 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530919 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec"
Apr 16 18:17:35.536124 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530928 2579 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530932 2579 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530937 2579 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530942 2579 flags.go:64] FLAG: --port="10250"
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530947 2579 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530952 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a659c08268af2082"
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530957 2579 flags.go:64] FLAG: --qos-reserved=""
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530962 2579 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530968 2579 flags.go:64] FLAG: --register-node="true"
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530973 2579 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530978 2579 flags.go:64] FLAG: --register-with-taints=""
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530984 2579 flags.go:64] FLAG: --registry-burst="10"
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530988 2579 flags.go:64] FLAG: --registry-qps="5"
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530993 2579 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.530999 2579 flags.go:64] FLAG: --reserved-memory=""
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531006 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531010 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531015 2579 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531020 2579 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531025 2579 flags.go:64] FLAG: --runonce="false"
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531030 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531036 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531040 2579 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531045 2579 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531050 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531055 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 18:17:35.536702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531060 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 18:17:35.537375 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531065 2579 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 18:17:35.537375 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531069 2579 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 18:17:35.537375 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531074 2579 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 18:17:35.537375 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531078 2579 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 18:17:35.537375 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531083 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 18:17:35.537375 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531088 2579 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 18:17:35.537375 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531092 2579 flags.go:64] FLAG: --system-cgroups=""
Apr 16 18:17:35.537375 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531097 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 18:17:35.537375 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531106 2579 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 18:17:35.537375 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531111 2579 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 18:17:35.537375 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531115 2579 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 18:17:35.537375 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531122 2579 flags.go:64] FLAG: --tls-min-version=""
Apr 16 18:17:35.537375 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531126 2579 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 18:17:35.537375 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531131 2579 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 18:17:35.537375 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531136 2579 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 18:17:35.537375 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531141 2579 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 18:17:35.537375 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531146 2579 flags.go:64] FLAG: --v="2"
Apr 16 18:17:35.537375 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531152 2579 flags.go:64] FLAG: --version="false"
Apr 16 18:17:35.537375 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531158 2579 flags.go:64] FLAG: --vmodule=""
Apr 16 18:17:35.537375 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531164 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 18:17:35.537375 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.531170 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 18:17:35.537375 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531334 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:35.537375 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531342 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:17:35.537375 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531347 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:35.538031 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531351 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:17:35.538031 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531356 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:35.538031 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531360 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:17:35.538031 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531364 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:17:35.538031 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531368 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:35.538031 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531372 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:17:35.538031 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531377 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:17:35.538031 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531381 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:17:35.538031 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531385 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:35.538031 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531389 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:17:35.538031 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531396 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:17:35.538031 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531402 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:17:35.538031 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531406 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:35.538031 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531410 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:17:35.538031 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531414 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:35.538031 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531419 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:35.538031 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531423 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:35.538031 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531427 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:17:35.538031 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531431 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:17:35.538573 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531435 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:17:35.538573 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531439 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:35.538573 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531443 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:17:35.538573 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531448 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:35.538573 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531453 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:35.538573 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531457 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:17:35.538573 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531461 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:35.538573 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531465 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:17:35.538573 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531469 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:35.538573 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531474 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:17:35.538573 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531477 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:35.538573 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531481 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:35.538573 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531486 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:17:35.538573 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531490 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:35.538573 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531494 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:17:35.538573 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531503 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:35.538573 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531508 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:35.538573 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531512 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:17:35.538573 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531516 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:35.538573 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531521 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:17:35.539059 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531525 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:17:35.539059 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531529 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:35.539059 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531533 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:17:35.539059 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531537 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:35.539059 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531541 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:17:35.539059 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531545 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:35.539059 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531549 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:17:35.539059 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531556 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:17:35.539059 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531561 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:35.539059 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531567 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:17:35.539059 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531571 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:17:35.539059 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531576 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:35.539059 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531580 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:17:35.539059 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531585 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:35.539059 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531589 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:17:35.539059 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531593 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:35.539059 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531598 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:17:35.539059 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531602 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:35.539059 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531606 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:17:35.539059 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531611 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:35.539563 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531615 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:35.539563 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531619 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:35.539563 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531623 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:35.539563 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531627 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:35.539563 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531639 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:17:35.539563 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531643 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:17:35.539563 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531647 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:35.539563 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531651 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:35.539563 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531656 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:17:35.539563 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531660 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:35.539563 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531664 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:17:35.539563 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531668 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:17:35.539563 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531672 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:17:35.539563 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531676 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:35.539563 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531681 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:35.539563 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531685 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:35.539563 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531689 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:35.539563 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531693 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:35.539563 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531697 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:17:35.539563 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531701 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:17:35.540047 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531705 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:17:35.540047 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531709 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:17:35.540047 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531713 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:35.540047 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.531717 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:17:35.540047 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.532536 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:17:35.540637 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.540618 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 18:17:35.540668 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.540638 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 18:17:35.540701 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540684 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:17:35.540701 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540689 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:35.540701 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540693 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:17:35.540701 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540696 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:17:35.540701 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540699 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:17:35.540701 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540702 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:35.540851 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540705 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:35.540851 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540709 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:35.540851 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540712 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:17:35.540851 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540714 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:35.540851 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540717 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:35.540851 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540719 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:35.540851 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540722 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:35.540851 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540725 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:35.540851 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540728 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:35.540851 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540730 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:17:35.540851 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540733 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:17:35.540851 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540736 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:35.540851 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540739 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:35.540851 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540742 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:35.540851 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540745 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:17:35.540851 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540748 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:35.540851 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540751 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:35.540851 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540753 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:35.540851 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540756 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:35.540851 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540758 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:35.541350 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540761 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:35.541350 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540764 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:35.541350 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540767 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:35.541350 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540770 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:17:35.541350 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540772 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:35.541350 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540776 2579 feature_gate.go:328] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:17:35.541350 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540779 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:17:35.541350 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540781 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:17:35.541350 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540784 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:17:35.541350 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540787 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:17:35.541350 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540789 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:17:35.541350 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540792 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:17:35.541350 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540794 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:17:35.541350 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540798 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:17:35.541350 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540801 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:17:35.541350 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540803 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:17:35.541350 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540806 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:17:35.541350 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540809 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:17:35.541350 ip-10-0-134-167 kubenswrapper[2579]: W0416 
18:17:35.540811 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:17:35.541350 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540814 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:17:35.541863 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540816 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:17:35.541863 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540819 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:17:35.541863 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540821 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:17:35.541863 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540824 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:17:35.541863 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540826 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:17:35.541863 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540829 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:17:35.541863 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540831 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:17:35.541863 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540834 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:17:35.541863 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540837 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:17:35.541863 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540839 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:17:35.541863 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540842 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:17:35.541863 ip-10-0-134-167 
kubenswrapper[2579]: W0416 18:17:35.540845 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:17:35.541863 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540847 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:17:35.541863 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540850 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:17:35.541863 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540853 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:17:35.541863 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540856 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:17:35.541863 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540860 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 18:17:35.541863 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540864 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:17:35.541863 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540868 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:17:35.541863 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540870 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:17:35.542460 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540873 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:17:35.542460 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540875 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:17:35.542460 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540878 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:17:35.542460 ip-10-0-134-167 kubenswrapper[2579]: 
W0416 18:17:35.540881 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:17:35.542460 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540883 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:17:35.542460 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540886 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:17:35.542460 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540889 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:17:35.542460 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540892 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:17:35.542460 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540894 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:17:35.542460 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540897 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:17:35.542460 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540900 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:17:35.542460 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540902 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:17:35.542460 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540906 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 18:17:35.542460 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540911 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:17:35.542460 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540914 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:35.542460 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540917 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:35.542460 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540920 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:35.542460 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540924 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:35.542460 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540926 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:35.542912 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.540929 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:35.542912 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.540934 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:17:35.542912 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541026 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:35.542912 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541031 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:35.542912 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541034 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:35.542912 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541037 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:17:35.542912 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541040 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:35.542912 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541043 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:35.542912 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541046 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:17:35.542912 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541048 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:17:35.542912 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541051 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:35.542912 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541054 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:17:35.542912 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541058 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:17:35.542912 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541062 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:35.542912 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541065 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:35.543303 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541068 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:17:35.543303 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541071 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:17:35.543303 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541073 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:35.543303 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541076 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:35.543303 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541079 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:35.543303 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541082 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:17:35.543303 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541084 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:17:35.543303 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541087 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:35.543303 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541089 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:35.543303 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541092 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:35.543303 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541096 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:17:35.543303 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541099 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:35.543303 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541102 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:17:35.543303 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541105 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:17:35.543303 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541107 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:35.543303 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541110 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:17:35.543303 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541113 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:17:35.543303 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541115 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:17:35.543303 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541118 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:17:35.543303 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541120 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:17:35.543806 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541122 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:35.543806 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541125 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:17:35.543806 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541128 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:35.543806 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541130 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:35.543806 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541133 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:17:35.543806 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541136 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:17:35.543806 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541139 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:17:35.543806 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541141 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:17:35.543806 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541144 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:35.543806 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541147 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:35.543806 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541150 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:17:35.543806 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541153 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:35.543806 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541155 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:17:35.543806 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541158 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:35.543806 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541160 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:17:35.543806 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541164 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:35.543806 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541166 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:17:35.543806 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541169 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:35.543806 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541172 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:17:35.543806 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541174 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:35.544317 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541177 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:17:35.544317 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541180 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:35.544317 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541182 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:35.544317 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541185 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:35.544317 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541187 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:35.544317 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541190 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:17:35.544317 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541192 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:17:35.544317 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541195 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:17:35.544317 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541198 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:35.544317 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541200 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:17:35.544317 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541202 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:35.544317 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541205 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:35.544317 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541208 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:35.544317 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541210 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:35.544317 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541213 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:35.544317 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541215 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:35.544317 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541218 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:35.544317 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541221 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:17:35.544317 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541224 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:17:35.544770 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541226 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:17:35.544770 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541229 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:17:35.544770 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541231 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:17:35.544770 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541234 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:17:35.544770 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541237 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:17:35.544770 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541239 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:17:35.544770 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541242 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:17:35.544770 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541244 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:35.544770 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541247 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:35.544770 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541249 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:17:35.544770 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541252 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:17:35.544770 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541254 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:35.544770 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541257 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:35.544770 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:35.541259 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:17:35.544770 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.541264 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:17:35.545136 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.542807 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:17:35.545239 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.545225 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 18:17:35.546443 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.546432 2579 server.go:1019] "Starting client certificate rotation"
Apr 16 18:17:35.546579 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.546563 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:17:35.546613 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.546602 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:17:35.574932 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.574916 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:17:35.583028 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.583014 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:17:35.599670 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.599653 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 16 18:17:35.606039 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.606024 2579 log.go:25] "Validated CRI v1 image API"
Apr 16 18:17:35.607515 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.607499 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 18:17:35.609006 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.608989 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:17:35.611708 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.611690 2579 fs.go:135] Filesystem UUIDs: map[47bbfd41-27f8-4d9b-b34b-4893dacb05d4:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 dd1e4964-f801-4399-83c1-9f74e5654979:/dev/nvme0n1p3]
Apr 16 18:17:35.611767 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.611708 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 18:17:35.617296 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.617177 2579 manager.go:217] Machine: {Timestamp:2026-04-16 18:17:35.615370899 +0000 UTC m=+0.462531382 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099756 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23e34effe39b4ca03cfd4cd1186937 SystemUUID:ec23e34e-ffe3-9b4c-a03c-fd4cd1186937 BootID:fcda4889-3be1-4813-93f3-f37fdb39be32 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:8a:12:b3:9e:f9 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:8a:12:b3:9e:f9 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:d2:85:64:d5:06:74 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:17:35.617296 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.617294 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 18:17:35.617400 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.617368 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:17:35.618484 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.618459 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:17:35.618612 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.618488 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-167.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessTh
an","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 18:17:35.618657 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.618621 2579 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 18:17:35.618657 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.618630 2579 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 18:17:35.618657 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.618642 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:17:35.619612 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.619602 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:17:35.621845 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.621835 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:17:35.621946 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.621937 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 18:17:35.624357 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.624347 2579 kubelet.go:491] "Attempting to sync node with API server" Apr 16 18:17:35.624396 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.624360 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 18:17:35.624396 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.624374 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 18:17:35.624396 ip-10-0-134-167 kubenswrapper[2579]: I0416 
18:17:35.624383 2579 kubelet.go:397] "Adding apiserver pod source" Apr 16 18:17:35.624396 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.624392 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 18:17:35.625414 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.625402 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:17:35.625445 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.625421 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:17:35.628793 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.628774 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 18:17:35.630616 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.630600 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 18:17:35.631957 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.631946 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 18:17:35.632017 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.631963 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 18:17:35.632017 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.631969 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 18:17:35.632017 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.631975 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 18:17:35.632017 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.631980 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 18:17:35.632017 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.631986 2579 plugins.go:616] "Loaded volume 
plugin" pluginName="kubernetes.io/secret" Apr 16 18:17:35.632017 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.631992 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 18:17:35.632017 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.631998 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 18:17:35.632017 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.632005 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 18:17:35.632017 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.632011 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 18:17:35.632017 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.632020 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 18:17:35.632295 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.632029 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 18:17:35.632981 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.632971 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 18:17:35.632981 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.632981 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 18:17:35.634870 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:35.634832 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 18:17:35.634935 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:35.634867 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-167.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the 
cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 18:17:35.636549 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.636536 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 18:17:35.636585 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.636572 2579 server.go:1295] "Started kubelet" Apr 16 18:17:35.636714 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.636674 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 18:17:35.636768 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.636692 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 18:17:35.636768 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.636737 2579 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 18:17:35.638524 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.638499 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 18:17:35.638809 ip-10-0-134-167 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 18:17:35.640821 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.640806 2579 server.go:317] "Adding debug handlers to kubelet server" Apr 16 18:17:35.646074 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.646059 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-167.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:17:35.646757 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.646733 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 18:17:35.646757 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.646746 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 18:17:35.647314 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:35.646302 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-167.ec2.internal.18a6e926bd33af67 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-167.ec2.internal,UID:ip-10-0-134-167.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-167.ec2.internal,},FirstTimestamp:2026-04-16 18:17:35.636549479 +0000 UTC m=+0.483709961,LastTimestamp:2026-04-16 18:17:35.636549479 +0000 UTC m=+0.483709961,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-167.ec2.internal,}" Apr 16 18:17:35.647568 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:35.647552 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"ip-10-0-134-167.ec2.internal\" not found" Apr 16 18:17:35.647717 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.647701 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 18:17:35.647717 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.647702 2579 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 18:17:35.647838 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.647727 2579 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 18:17:35.647838 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.647757 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 18:17:35.647838 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.647771 2579 factory.go:55] Registering systemd factory Apr 16 18:17:35.647838 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.647781 2579 factory.go:223] Registration of the systemd container factory successfully Apr 16 18:17:35.647838 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.647795 2579 reconstruct.go:97] "Volume reconstruction finished" Apr 16 18:17:35.647838 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.647804 2579 reconciler.go:26] "Reconciler: start to sync state" Apr 16 18:17:35.648075 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.648040 2579 factory.go:153] Registering CRI-O factory Apr 16 18:17:35.648075 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.648053 2579 factory.go:223] Registration of the crio container factory successfully Apr 16 18:17:35.648075 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.648075 2579 factory.go:103] Registering Raw factory Apr 16 18:17:35.648173 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.648089 2579 manager.go:1196] Started watching for new ooms in manager Apr 16 18:17:35.648670 
ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.648651 2579 manager.go:319] Starting recovery of all containers Apr 16 18:17:35.650132 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:35.650113 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 18:17:35.659461 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:35.659307 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 18:17:35.659461 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:35.659404 2579 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-167.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 18:17:35.659604 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.659515 2579 manager.go:324] Recovery completed Apr 16 18:17:35.660459 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.660440 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gjf85" Apr 16 18:17:35.663802 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.663789 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:17:35.666244 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.666229 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-167.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:17:35.666316 ip-10-0-134-167 kubenswrapper[2579]: I0416 
18:17:35.666255 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-167.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:17:35.666316 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.666280 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-167.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:17:35.666749 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.666730 2579 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 18:17:35.666749 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.666748 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 18:17:35.666860 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.666765 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:17:35.666913 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.666900 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gjf85" Apr 16 18:17:35.669505 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.669492 2579 policy_none.go:49] "None policy: Start" Apr 16 18:17:35.669566 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.669535 2579 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 18:17:35.669566 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.669546 2579 state_mem.go:35] "Initializing new in-memory state store" Apr 16 18:17:35.718636 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.705428 2579 manager.go:341] "Starting Device Plugin manager" Apr 16 18:17:35.718636 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:35.705452 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 18:17:35.718636 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.705461 2579 server.go:85] "Starting device plugin registration server" Apr 16 18:17:35.718636 ip-10-0-134-167 kubenswrapper[2579]: I0416 
18:17:35.705694 2579 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 18:17:35.718636 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.705708 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 18:17:35.718636 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.705793 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 18:17:35.718636 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.706032 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 18:17:35.718636 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.706047 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 18:17:35.718636 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:35.706976 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 18:17:35.718636 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:35.707042 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-167.ec2.internal\" not found" Apr 16 18:17:35.748818 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.748785 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 18:17:35.749911 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.749892 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 18:17:35.749990 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.749916 2579 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 18:17:35.749990 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.749932 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 18:17:35.749990 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.749938 2579 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 18:17:35.749990 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:35.749974 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 18:17:35.753616 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.753595 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:17:35.806417 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.806379 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:17:35.807064 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.807048 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-167.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:17:35.807126 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.807074 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-167.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:17:35.807126 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.807084 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-167.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:17:35.807126 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.807103 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-167.ec2.internal" Apr 16 18:17:35.816293 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.816262 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-167.ec2.internal" Apr 16 18:17:35.816351 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:35.816295 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-167.ec2.internal\": node \"ip-10-0-134-167.ec2.internal\" not found" Apr 16 
18:17:35.831430 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:35.831412 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-167.ec2.internal\" not found" Apr 16 18:17:35.850316 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.850299 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-167.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-167.ec2.internal"] Apr 16 18:17:35.850370 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.850349 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:17:35.850988 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.850975 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-167.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:17:35.851040 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.850999 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-167.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:17:35.851040 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.851009 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-167.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:17:35.852128 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.852117 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:17:35.852263 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.852248 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-167.ec2.internal" Apr 16 18:17:35.852317 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.852290 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:17:35.852735 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.852720 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-167.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:17:35.852797 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.852745 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-167.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:17:35.852797 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.852759 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-167.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:17:35.852797 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.852728 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-167.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:17:35.852933 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.852817 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-167.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:17:35.852933 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.852834 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-167.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:17:35.854475 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.854458 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-167.ec2.internal" Apr 16 18:17:35.854554 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.854486 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:17:35.855127 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.855112 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-167.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:17:35.855202 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.855136 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-167.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:17:35.855202 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.855149 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-167.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:17:35.877791 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:35.877766 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-167.ec2.internal\" not found" node="ip-10-0-134-167.ec2.internal" Apr 16 18:17:35.881931 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:35.881917 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-167.ec2.internal\" not found" node="ip-10-0-134-167.ec2.internal" Apr 16 18:17:35.932124 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:35.932105 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-167.ec2.internal\" not found" Apr 16 18:17:35.949565 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.949545 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/69d25570a0cabb0e86b33b5d502c3716-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-134-167.ec2.internal\" (UID: \"69d25570a0cabb0e86b33b5d502c3716\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-167.ec2.internal" Apr 16 18:17:35.949626 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.949570 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/69d25570a0cabb0e86b33b5d502c3716-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-167.ec2.internal\" (UID: \"69d25570a0cabb0e86b33b5d502c3716\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-167.ec2.internal" Apr 16 18:17:35.949626 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:35.949592 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/69e28dff0cb5be54806a6f5d3d910a4e-config\") pod \"kube-apiserver-proxy-ip-10-0-134-167.ec2.internal\" (UID: \"69e28dff0cb5be54806a6f5d3d910a4e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-167.ec2.internal" Apr 16 18:17:36.033004 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:36.032974 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-167.ec2.internal\" not found" Apr 16 18:17:36.050410 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:36.050388 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/69d25570a0cabb0e86b33b5d502c3716-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-167.ec2.internal\" (UID: \"69d25570a0cabb0e86b33b5d502c3716\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-167.ec2.internal" Apr 16 18:17:36.050491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:36.050421 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/69e28dff0cb5be54806a6f5d3d910a4e-config\") pod \"kube-apiserver-proxy-ip-10-0-134-167.ec2.internal\" (UID: \"69e28dff0cb5be54806a6f5d3d910a4e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-167.ec2.internal" Apr 16 18:17:36.050491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:36.050444 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/69d25570a0cabb0e86b33b5d502c3716-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-167.ec2.internal\" (UID: \"69d25570a0cabb0e86b33b5d502c3716\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-167.ec2.internal" Apr 16 18:17:36.050491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:36.050471 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/69d25570a0cabb0e86b33b5d502c3716-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-167.ec2.internal\" (UID: \"69d25570a0cabb0e86b33b5d502c3716\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-167.ec2.internal" Apr 16 18:17:36.050491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:36.050388 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/69d25570a0cabb0e86b33b5d502c3716-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-167.ec2.internal\" (UID: \"69d25570a0cabb0e86b33b5d502c3716\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-167.ec2.internal" Apr 16 18:17:36.050636 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:36.050527 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/69e28dff0cb5be54806a6f5d3d910a4e-config\") pod \"kube-apiserver-proxy-ip-10-0-134-167.ec2.internal\" (UID: \"69e28dff0cb5be54806a6f5d3d910a4e\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-134-167.ec2.internal"
Apr 16 18:17:36.133820 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:36.133769 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-167.ec2.internal\" not found"
Apr 16 18:17:36.181230 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:36.181207 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-167.ec2.internal"
Apr 16 18:17:36.185303 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:36.185260 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-167.ec2.internal"
Apr 16 18:17:36.234571 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:36.234553 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-167.ec2.internal\" not found"
Apr 16 18:17:36.335131 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:36.335105 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-167.ec2.internal\" not found"
Apr 16 18:17:36.435699 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:36.435645 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-167.ec2.internal\" not found"
Apr 16 18:17:36.536131 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:36.536095 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-167.ec2.internal\" not found"
Apr 16 18:17:36.540491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:36.540473 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:17:36.546130 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:36.546117 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 18:17:36.546289 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:36.546257 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:17:36.546333 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:36.546299 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:17:36.637036 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:36.637011 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-167.ec2.internal\" not found"
Apr 16 18:17:36.647178 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:36.647156 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 18:17:36.665578 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:36.665553 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:17:36.670162 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:36.670134 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:12:35 +0000 UTC" deadline="2027-11-10 09:59:52.025594196 +0000 UTC"
Apr 16 18:17:36.670162 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:36.670159 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13743h42m15.355438595s"
Apr 16 18:17:36.688901 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:36.688832 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-vlkvs"
Apr 16 18:17:36.697304 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:36.697286 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-vlkvs"
Apr 16 18:17:36.737500 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:36.737477 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-167.ec2.internal\" not found"
Apr 16 18:17:36.806101 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:36.806077 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69e28dff0cb5be54806a6f5d3d910a4e.slice/crio-807eddf20d62d7fd0d45703de7d0139981eccf0c09d47a1c3dff4cf0487c3d92 WatchSource:0}: Error finding container 807eddf20d62d7fd0d45703de7d0139981eccf0c09d47a1c3dff4cf0487c3d92: Status 404 returned error can't find the container with id 807eddf20d62d7fd0d45703de7d0139981eccf0c09d47a1c3dff4cf0487c3d92
Apr 16 18:17:36.806646 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:36.806615 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69d25570a0cabb0e86b33b5d502c3716.slice/crio-9e998872e07e7dfaf40a320b79aeb7fdc212e2268a4e6c8bfe89deabd4b9696d WatchSource:0}: Error finding container 9e998872e07e7dfaf40a320b79aeb7fdc212e2268a4e6c8bfe89deabd4b9696d: Status 404 returned error can't find the container with id 9e998872e07e7dfaf40a320b79aeb7fdc212e2268a4e6c8bfe89deabd4b9696d
Apr 16 18:17:36.810741 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:36.810727 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:17:36.838543 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:36.838522 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-167.ec2.internal\" not found"
Apr 16 18:17:36.939101 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:36.939045 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-167.ec2.internal\" not found"
Apr 16 18:17:36.962015 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:36.961994 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:17:37.047380 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.047357 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-167.ec2.internal"
Apr 16 18:17:37.063424 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.063401 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:17:37.065112 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.065098 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-167.ec2.internal"
Apr 16 18:17:37.074750 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.074729 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:17:37.080756 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.080735 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:17:37.625514 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.625486 2579 apiserver.go:52] "Watching apiserver"
Apr 16 18:17:37.631129 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.631107 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 18:17:37.632320 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.632294 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-g8hdh","openshift-multus/multus-additional-cni-plugins-nw6hr","openshift-multus/multus-mtzwt","openshift-network-diagnostics/network-check-target-sq4vb","openshift-network-operator/iptables-alerter-7b4wl","kube-system/kube-apiserver-proxy-ip-10-0-134-167.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j","openshift-dns/node-resolver-mt4jc","openshift-image-registry/node-ca-l4nhg","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-167.ec2.internal","openshift-multus/network-metrics-daemon-k6s9z","openshift-ovn-kubernetes/ovnkube-node-md54p","kube-system/konnectivity-agent-q8spv"]
Apr 16 18:17:37.634462 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.634345 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mt4jc"
Apr 16 18:17:37.635741 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.635663 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nw6hr"
Apr 16 18:17:37.636950 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.636823 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.636950 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.636836 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 18:17:37.637090 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.636959 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 18:17:37.637090 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.636972 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qhm2v\""
Apr 16 18:17:37.637633 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.637616 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 18:17:37.637858 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.637811 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 18:17:37.638066 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.638046 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 18:17:37.638140 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.638049 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6s9z"
Apr 16 18:17:37.638187 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:37.638150 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6s9z" podUID="e2bb1680-b343-4014-bde1-6cc6bcd9805c"
Apr 16 18:17:37.638463 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.638446 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 18:17:37.638548 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.638463 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 18:17:37.638764 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.638745 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-cshbk\""
Apr 16 18:17:37.638857 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.638783 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-rf6nn\""
Apr 16 18:17:37.638989 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.638973 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 18:17:37.639592 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.639257 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7b4wl"
Apr 16 18:17:37.640596 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.640578 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-l4nhg"
Apr 16 18:17:37.641192 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.641172 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:17:37.641327 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.641293 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 18:17:37.641327 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.641319 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 18:17:37.641463 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.641401 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-v79tc\""
Apr 16 18:17:37.641745 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.641728 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j"
Apr 16 18:17:37.642442 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.642422 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 18:17:37.642524 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.642469 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5n5rs\""
Apr 16 18:17:37.642610 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.642594 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 18:17:37.642689 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.642675 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 18:17:37.643793 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.643775 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 18:17:37.643888 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.643792 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 18:17:37.643946 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.643929 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 18:17:37.644003 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.643929 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7w8xx\""
Apr 16 18:17:37.644471 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.644455 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq4vb"
Apr 16 18:17:37.644558 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:37.644539 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq4vb" podUID="17b7bd66-ef40-4ff5-89de-2c7c3408fdc6"
Apr 16 18:17:37.644616 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.644573 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.645825 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.645808 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.646490 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.646438 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:17:37.646490 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.646460 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 18:17:37.646659 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.646528 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-j2zj2\""
Apr 16 18:17:37.647256 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.647236 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-q8spv"
Apr 16 18:17:37.648915 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.648845 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 18:17:37.648915 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.648899 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 18:17:37.649207 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.649140 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 18:17:37.649396 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.649377 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 18:17:37.649491 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.649383 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 18:17:37.650419 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.650399 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 18:17:37.650419 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.650416 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 18:17:37.650548 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.650461 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gjs6b\""
Apr 16 18:17:37.650648 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.650630 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-ql4vk\""
Apr 16 18:17:37.650705 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.650658 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 18:17:37.652794 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.652768 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 18:17:37.658819 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.658770 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnqsq\" (UniqueName: \"kubernetes.io/projected/140c4c84-432b-4ef8-b103-3d2b0694f222-kube-api-access-vnqsq\") pod \"aws-ebs-csi-driver-node-8hv5j\" (UID: \"140c4c84-432b-4ef8-b103-3d2b0694f222\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j"
Apr 16 18:17:37.658819 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.658809 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-var-lib-kubelet\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.658999 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.658836 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e6e3d626-b1d8-4140-83e2-92db90a4eae4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nw6hr\" (UID: \"e6e3d626-b1d8-4140-83e2-92db90a4eae4\") " pod="openshift-multus/multus-additional-cni-plugins-nw6hr"
Apr 16 18:17:37.658999 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.658883 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e6e3d626-b1d8-4140-83e2-92db90a4eae4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-nw6hr\" (UID: \"e6e3d626-b1d8-4140-83e2-92db90a4eae4\") " pod="openshift-multus/multus-additional-cni-plugins-nw6hr"
Apr 16 18:17:37.658999 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.658909 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-etc-sysctl-d\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.658999 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.658934 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/eec98933-e01a-4818-94e6-b087062491fa-iptables-alerter-script\") pod \"iptables-alerter-7b4wl\" (UID: \"eec98933-e01a-4818-94e6-b087062491fa\") " pod="openshift-network-operator/iptables-alerter-7b4wl"
Apr 16 18:17:37.658999 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.658977 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-host-run-multus-certs\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.659232 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.659026 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.659232 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.659056 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e302771b-8af3-42dd-92e1-04faaff1c6e8-ovnkube-config\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.659232 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.659083 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e302771b-8af3-42dd-92e1-04faaff1c6e8-ovnkube-script-lib\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.659578 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.659472 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3ea5a78e-da64-4bee-b206-3f22bfd07fbc-agent-certs\") pod \"konnectivity-agent-q8spv\" (UID: \"3ea5a78e-da64-4bee-b206-3f22bfd07fbc\") " pod="kube-system/konnectivity-agent-q8spv"
Apr 16 18:17:37.659578 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.659554 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e6e3d626-b1d8-4140-83e2-92db90a4eae4-system-cni-dir\") pod \"multus-additional-cni-plugins-nw6hr\" (UID: \"e6e3d626-b1d8-4140-83e2-92db90a4eae4\") " pod="openshift-multus/multus-additional-cni-plugins-nw6hr"
Apr 16 18:17:37.659725 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.659588 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e6e3d626-b1d8-4140-83e2-92db90a4eae4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nw6hr\" (UID: \"e6e3d626-b1d8-4140-83e2-92db90a4eae4\") " pod="openshift-multus/multus-additional-cni-plugins-nw6hr"
Apr 16 18:17:37.659725 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.659619 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmbz6\" (UniqueName: \"kubernetes.io/projected/e6e3d626-b1d8-4140-83e2-92db90a4eae4-kube-api-access-gmbz6\") pod \"multus-additional-cni-plugins-nw6hr\" (UID: \"e6e3d626-b1d8-4140-83e2-92db90a4eae4\") " pod="openshift-multus/multus-additional-cni-plugins-nw6hr"
Apr 16 18:17:37.659725 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.659676 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5d102faf-ea74-4afa-95c1-4133f4d71f8b-cni-binary-copy\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.660033 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.659997 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-host-var-lib-cni-bin\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.660103 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.660052 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-host-var-lib-cni-multus\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.660103 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.660086 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/140c4c84-432b-4ef8-b103-3d2b0694f222-device-dir\") pod \"aws-ebs-csi-driver-node-8hv5j\" (UID: \"140c4c84-432b-4ef8-b103-3d2b0694f222\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j"
Apr 16 18:17:37.660201 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.660123 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/140c4c84-432b-4ef8-b103-3d2b0694f222-sys-fs\") pod \"aws-ebs-csi-driver-node-8hv5j\" (UID: \"140c4c84-432b-4ef8-b103-3d2b0694f222\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j"
Apr 16 18:17:37.660201 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.660155 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-etc-tuned\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.660201 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.660191 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-var-lib-openvswitch\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.660336 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.660235 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-etc-kubernetes\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.660336 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.660319 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b843015-ad80-4dc0-aad1-a22e5a3909f6-host\") pod \"node-ca-l4nhg\" (UID: \"2b843015-ad80-4dc0-aad1-a22e5a3909f6\") " pod="openshift-image-registry/node-ca-l4nhg"
Apr 16 18:17:37.660430 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.660353 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-host-slash\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.660430 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.660407 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9e454ceb-b1f7-44d0-899f-7d0a1be98b35-hosts-file\") pod \"node-resolver-mt4jc\" (UID: \"9e454ceb-b1f7-44d0-899f-7d0a1be98b35\") " pod="openshift-dns/node-resolver-mt4jc"
Apr 16 18:17:37.660579 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.660440 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e6e3d626-b1d8-4140-83e2-92db90a4eae4-cnibin\") pod \"multus-additional-cni-plugins-nw6hr\" (UID: \"e6e3d626-b1d8-4140-83e2-92db90a4eae4\") " pod="openshift-multus/multus-additional-cni-plugins-nw6hr"
Apr 16 18:17:37.660579 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.660471 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-host-var-lib-kubelet\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.660579 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.660502 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs\") pod \"network-metrics-daemon-k6s9z\" (UID: \"e2bb1680-b343-4014-bde1-6cc6bcd9805c\") " pod="openshift-multus/network-metrics-daemon-k6s9z"
Apr 16 18:17:37.660710 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.660576 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xg5v\" (UniqueName: \"kubernetes.io/projected/2b843015-ad80-4dc0-aad1-a22e5a3909f6-kube-api-access-8xg5v\") pod \"node-ca-l4nhg\" (UID: \"2b843015-ad80-4dc0-aad1-a22e5a3909f6\") " pod="openshift-image-registry/node-ca-l4nhg"
Apr 16 18:17:37.660710 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.660652 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-multus-cni-dir\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.660710 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.660700 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-os-release\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.660869 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.660734 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-etc-modprobe-d\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.660925 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.660873 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-run\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.661018 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.660993 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e6e3d626-b1d8-4140-83e2-92db90a4eae4-os-release\") pod \"multus-additional-cni-plugins-nw6hr\" (UID: \"e6e3d626-b1d8-4140-83e2-92db90a4eae4\") " pod="openshift-multus/multus-additional-cni-plugins-nw6hr"
Apr 16 18:17:37.661079 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661058 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/140c4c84-432b-4ef8-b103-3d2b0694f222-etc-selinux\") pod \"aws-ebs-csi-driver-node-8hv5j\" (UID: \"140c4c84-432b-4ef8-b103-3d2b0694f222\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j"
Apr 16 18:17:37.661134 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661112 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-etc-sysconfig\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.661180 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661152 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-run-openvswitch\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.661225 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661180 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-host-run-ovn-kubernetes\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.661225 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661202 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-etc-sysctl-conf\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.661339 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661227 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-lib-modules\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.661339 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661290 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-run-ovn\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.661339 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661318 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e6e3d626-b1d8-4140-83e2-92db90a4eae4-cni-binary-copy\") pod \"multus-additional-cni-plugins-nw6hr\" (UID: \"e6e3d626-b1d8-4140-83e2-92db90a4eae4\") " pod="openshift-multus/multus-additional-cni-plugins-nw6hr"
Apr 16 18:17:37.661466 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661359 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-multus-socket-dir-parent\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.661466 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661381 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjssq\" (UniqueName: \"kubernetes.io/projected/5d102faf-ea74-4afa-95c1-4133f4d71f8b-kube-api-access-sjssq\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.661466 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661407 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-host\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.661466 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661447 2579
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8stvz\" (UniqueName: \"kubernetes.io/projected/eec98933-e01a-4818-94e6-b087062491fa-kube-api-access-8stvz\") pod \"iptables-alerter-7b4wl\" (UID: \"eec98933-e01a-4818-94e6-b087062491fa\") " pod="openshift-network-operator/iptables-alerter-7b4wl" Apr 16 18:17:37.661640 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661490 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2b843015-ad80-4dc0-aad1-a22e5a3909f6-serviceca\") pod \"node-ca-l4nhg\" (UID: \"2b843015-ad80-4dc0-aad1-a22e5a3909f6\") " pod="openshift-image-registry/node-ca-l4nhg" Apr 16 18:17:37.661640 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661518 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-multus-conf-dir\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt" Apr 16 18:17:37.661640 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661542 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-host-kubelet\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:17:37.661640 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661589 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-cnibin\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt" Apr 16 18:17:37.661640 
ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661614 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-host-run-netns\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt" Apr 16 18:17:37.661848 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661667 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-host-cni-netd\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:17:37.661848 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661695 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/140c4c84-432b-4ef8-b103-3d2b0694f222-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8hv5j\" (UID: \"140c4c84-432b-4ef8-b103-3d2b0694f222\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j" Apr 16 18:17:37.661848 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661727 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-system-cni-dir\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt" Apr 16 18:17:37.661848 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661753 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-hostroot\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " 
pod="openshift-multus/multus-mtzwt" Apr 16 18:17:37.661848 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661775 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-sys\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh" Apr 16 18:17:37.661848 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661803 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eec98933-e01a-4818-94e6-b087062491fa-host-slash\") pod \"iptables-alerter-7b4wl\" (UID: \"eec98933-e01a-4818-94e6-b087062491fa\") " pod="openshift-network-operator/iptables-alerter-7b4wl" Apr 16 18:17:37.661848 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661831 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fgjj\" (UniqueName: \"kubernetes.io/projected/17b7bd66-ef40-4ff5-89de-2c7c3408fdc6-kube-api-access-5fgjj\") pod \"network-check-target-sq4vb\" (UID: \"17b7bd66-ef40-4ff5-89de-2c7c3408fdc6\") " pod="openshift-network-diagnostics/network-check-target-sq4vb" Apr 16 18:17:37.662139 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661860 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-run-systemd\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:17:37.662139 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661886 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-node-log\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:17:37.662139 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661910 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9e454ceb-b1f7-44d0-899f-7d0a1be98b35-tmp-dir\") pod \"node-resolver-mt4jc\" (UID: \"9e454ceb-b1f7-44d0-899f-7d0a1be98b35\") " pod="openshift-dns/node-resolver-mt4jc" Apr 16 18:17:37.662139 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661937 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6zjr\" (UniqueName: \"kubernetes.io/projected/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-kube-api-access-m6zjr\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh" Apr 16 18:17:37.662139 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661967 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-host-run-netns\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:17:37.662139 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.661997 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e302771b-8af3-42dd-92e1-04faaff1c6e8-ovn-node-metrics-cert\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:17:37.662139 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.662026 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssc9m\" (UniqueName: \"kubernetes.io/projected/e302771b-8af3-42dd-92e1-04faaff1c6e8-kube-api-access-ssc9m\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:17:37.662139 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.662054 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3ea5a78e-da64-4bee-b206-3f22bfd07fbc-konnectivity-ca\") pod \"konnectivity-agent-q8spv\" (UID: \"3ea5a78e-da64-4bee-b206-3f22bfd07fbc\") " pod="kube-system/konnectivity-agent-q8spv" Apr 16 18:17:37.662139 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.662082 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-host-run-k8s-cni-cncf-io\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt" Apr 16 18:17:37.662139 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.662113 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5d102faf-ea74-4afa-95c1-4133f4d71f8b-multus-daemon-config\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt" Apr 16 18:17:37.662583 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.662142 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/140c4c84-432b-4ef8-b103-3d2b0694f222-socket-dir\") pod \"aws-ebs-csi-driver-node-8hv5j\" (UID: \"140c4c84-432b-4ef8-b103-3d2b0694f222\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j" Apr 16 18:17:37.662583 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.662181 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-systemd-units\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:17:37.662583 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.662209 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-etc-openvswitch\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:17:37.662583 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.662237 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-host-cni-bin\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:17:37.662583 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.662260 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6sdw\" (UniqueName: \"kubernetes.io/projected/e2bb1680-b343-4014-bde1-6cc6bcd9805c-kube-api-access-q6sdw\") pod \"network-metrics-daemon-k6s9z\" (UID: \"e2bb1680-b343-4014-bde1-6cc6bcd9805c\") " pod="openshift-multus/network-metrics-daemon-k6s9z" Apr 16 18:17:37.662583 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.662304 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/140c4c84-432b-4ef8-b103-3d2b0694f222-registration-dir\") pod \"aws-ebs-csi-driver-node-8hv5j\" (UID: \"140c4c84-432b-4ef8-b103-3d2b0694f222\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j" Apr 16 18:17:37.662583 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.662331 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-etc-kubernetes\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh" Apr 16 18:17:37.662583 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.662359 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-etc-systemd\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh" Apr 16 18:17:37.662583 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.662385 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-tmp\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh" Apr 16 18:17:37.662583 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.662406 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-log-socket\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:17:37.662583 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.662433 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e302771b-8af3-42dd-92e1-04faaff1c6e8-env-overrides\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:17:37.662583 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.662461 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcj62\" (UniqueName: \"kubernetes.io/projected/9e454ceb-b1f7-44d0-899f-7d0a1be98b35-kube-api-access-tcj62\") pod \"node-resolver-mt4jc\" (UID: \"9e454ceb-b1f7-44d0-899f-7d0a1be98b35\") " pod="openshift-dns/node-resolver-mt4jc" Apr 16 18:17:37.698133 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.698104 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:12:36 +0000 UTC" deadline="2027-11-04 20:20:26.57377789 +0000 UTC" Apr 16 18:17:37.698133 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.698132 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13610h2m48.875649497s" Apr 16 18:17:37.753816 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.753769 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-167.ec2.internal" event={"ID":"69e28dff0cb5be54806a6f5d3d910a4e","Type":"ContainerStarted","Data":"807eddf20d62d7fd0d45703de7d0139981eccf0c09d47a1c3dff4cf0487c3d92"} Apr 16 18:17:37.754860 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.754839 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-167.ec2.internal" 
event={"ID":"69d25570a0cabb0e86b33b5d502c3716","Type":"ContainerStarted","Data":"9e998872e07e7dfaf40a320b79aeb7fdc212e2268a4e6c8bfe89deabd4b9696d"} Apr 16 18:17:37.762714 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.762688 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-host-slash\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:17:37.762714 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.762717 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9e454ceb-b1f7-44d0-899f-7d0a1be98b35-hosts-file\") pod \"node-resolver-mt4jc\" (UID: \"9e454ceb-b1f7-44d0-899f-7d0a1be98b35\") " pod="openshift-dns/node-resolver-mt4jc" Apr 16 18:17:37.762854 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.762733 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e6e3d626-b1d8-4140-83e2-92db90a4eae4-cnibin\") pod \"multus-additional-cni-plugins-nw6hr\" (UID: \"e6e3d626-b1d8-4140-83e2-92db90a4eae4\") " pod="openshift-multus/multus-additional-cni-plugins-nw6hr" Apr 16 18:17:37.762854 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.762753 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-host-var-lib-kubelet\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt" Apr 16 18:17:37.762854 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.762778 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs\") pod \"network-metrics-daemon-k6s9z\" (UID: \"e2bb1680-b343-4014-bde1-6cc6bcd9805c\") " pod="openshift-multus/network-metrics-daemon-k6s9z" Apr 16 18:17:37.762854 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.762802 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xg5v\" (UniqueName: \"kubernetes.io/projected/2b843015-ad80-4dc0-aad1-a22e5a3909f6-kube-api-access-8xg5v\") pod \"node-ca-l4nhg\" (UID: \"2b843015-ad80-4dc0-aad1-a22e5a3909f6\") " pod="openshift-image-registry/node-ca-l4nhg" Apr 16 18:17:37.762854 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.762814 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e6e3d626-b1d8-4140-83e2-92db90a4eae4-cnibin\") pod \"multus-additional-cni-plugins-nw6hr\" (UID: \"e6e3d626-b1d8-4140-83e2-92db90a4eae4\") " pod="openshift-multus/multus-additional-cni-plugins-nw6hr" Apr 16 18:17:37.762854 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.762824 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-multus-cni-dir\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt" Apr 16 18:17:37.762854 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.762825 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-host-var-lib-kubelet\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt" Apr 16 18:17:37.762854 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.762814 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/9e454ceb-b1f7-44d0-899f-7d0a1be98b35-hosts-file\") pod \"node-resolver-mt4jc\" (UID: \"9e454ceb-b1f7-44d0-899f-7d0a1be98b35\") " pod="openshift-dns/node-resolver-mt4jc" Apr 16 18:17:37.763215 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.762881 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-host-slash\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:17:37.763215 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.762936 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-os-release\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt" Apr 16 18:17:37.763215 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:37.762951 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:37.763215 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.762997 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-multus-cni-dir\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt" Apr 16 18:17:37.763215 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763022 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-etc-modprobe-d\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh" Apr 16 18:17:37.763215 
ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:37.763040 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs podName:e2bb1680-b343-4014-bde1-6cc6bcd9805c nodeName:}" failed. No retries permitted until 2026-04-16 18:17:38.263015262 +0000 UTC m=+3.110175754 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs") pod "network-metrics-daemon-k6s9z" (UID: "e2bb1680-b343-4014-bde1-6cc6bcd9805c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:37.763215 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763092 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-run\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh" Apr 16 18:17:37.763215 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763124 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e6e3d626-b1d8-4140-83e2-92db90a4eae4-os-release\") pod \"multus-additional-cni-plugins-nw6hr\" (UID: \"e6e3d626-b1d8-4140-83e2-92db90a4eae4\") " pod="openshift-multus/multus-additional-cni-plugins-nw6hr" Apr 16 18:17:37.763215 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763138 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-etc-modprobe-d\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh" Apr 16 18:17:37.763215 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763149 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/140c4c84-432b-4ef8-b103-3d2b0694f222-etc-selinux\") pod \"aws-ebs-csi-driver-node-8hv5j\" (UID: \"140c4c84-432b-4ef8-b103-3d2b0694f222\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j" Apr 16 18:17:37.763215 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763190 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-run\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh" Apr 16 18:17:37.763215 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763193 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-etc-sysconfig\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh" Apr 16 18:17:37.763215 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763205 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e6e3d626-b1d8-4140-83e2-92db90a4eae4-os-release\") pod \"multus-additional-cni-plugins-nw6hr\" (UID: \"e6e3d626-b1d8-4140-83e2-92db90a4eae4\") " pod="openshift-multus/multus-additional-cni-plugins-nw6hr" Apr 16 18:17:37.763852 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763227 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-etc-sysconfig\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh" Apr 16 18:17:37.763852 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763230 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-run-openvswitch\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.763852 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763243 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/140c4c84-432b-4ef8-b103-3d2b0694f222-etc-selinux\") pod \"aws-ebs-csi-driver-node-8hv5j\" (UID: \"140c4c84-432b-4ef8-b103-3d2b0694f222\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j"
Apr 16 18:17:37.763852 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763259 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-host-run-ovn-kubernetes\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.763852 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763282 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-run-openvswitch\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.763852 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763299 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-etc-sysctl-conf\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.763852 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763321 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-lib-modules\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.763852 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763328 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-host-run-ovn-kubernetes\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.763852 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763336 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-run-ovn\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.763852 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763367 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-run-ovn\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.763852 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763383 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e6e3d626-b1d8-4140-83e2-92db90a4eae4-cni-binary-copy\") pod \"multus-additional-cni-plugins-nw6hr\" (UID: \"e6e3d626-b1d8-4140-83e2-92db90a4eae4\") " pod="openshift-multus/multus-additional-cni-plugins-nw6hr"
Apr 16 18:17:37.763852 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763407 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-multus-socket-dir-parent\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.763852 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763431 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjssq\" (UniqueName: \"kubernetes.io/projected/5d102faf-ea74-4afa-95c1-4133f4d71f8b-kube-api-access-sjssq\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.763852 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763447 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-lib-modules\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.763852 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763451 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-host\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.763852 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763485 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8stvz\" (UniqueName: \"kubernetes.io/projected/eec98933-e01a-4818-94e6-b087062491fa-kube-api-access-8stvz\") pod \"iptables-alerter-7b4wl\" (UID: \"eec98933-e01a-4818-94e6-b087062491fa\") " pod="openshift-network-operator/iptables-alerter-7b4wl"
Apr 16 18:17:37.763852 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763488 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-etc-sysctl-conf\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.764614 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763493 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-multus-socket-dir-parent\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.764614 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763515 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2b843015-ad80-4dc0-aad1-a22e5a3909f6-serviceca\") pod \"node-ca-l4nhg\" (UID: \"2b843015-ad80-4dc0-aad1-a22e5a3909f6\") " pod="openshift-image-registry/node-ca-l4nhg"
Apr 16 18:17:37.764614 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763561 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-multus-conf-dir\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.764614 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763585 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-host-kubelet\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.764614 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763609 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-cnibin\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.764614 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763639 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-host-run-netns\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.764614 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763664 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-host-cni-netd\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.764614 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763687 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/140c4c84-432b-4ef8-b103-3d2b0694f222-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8hv5j\" (UID: \"140c4c84-432b-4ef8-b103-3d2b0694f222\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j"
Apr 16 18:17:37.764614 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763695 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-host\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.764614 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763714 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-system-cni-dir\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.764614 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763743 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-hostroot\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.764614 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763766 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-sys\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.764614 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763778 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-cnibin\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.764614 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763815 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eec98933-e01a-4818-94e6-b087062491fa-host-slash\") pod \"iptables-alerter-7b4wl\" (UID: \"eec98933-e01a-4818-94e6-b087062491fa\") " pod="openshift-network-operator/iptables-alerter-7b4wl"
Apr 16 18:17:37.764614 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763835 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-host-run-netns\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.764614 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763850 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fgjj\" (UniqueName: \"kubernetes.io/projected/17b7bd66-ef40-4ff5-89de-2c7c3408fdc6-kube-api-access-5fgjj\") pod \"network-check-target-sq4vb\" (UID: \"17b7bd66-ef40-4ff5-89de-2c7c3408fdc6\") " pod="openshift-network-diagnostics/network-check-target-sq4vb"
Apr 16 18:17:37.764614 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763880 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-host-cni-netd\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.764614 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763904 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eec98933-e01a-4818-94e6-b087062491fa-host-slash\") pod \"iptables-alerter-7b4wl\" (UID: \"eec98933-e01a-4818-94e6-b087062491fa\") " pod="openshift-network-operator/iptables-alerter-7b4wl"
Apr 16 18:17:37.765376 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763910 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-host-kubelet\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.765376 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763944 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-run-systemd\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.765376 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763968 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-system-cni-dir\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.765376 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763989 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/140c4c84-432b-4ef8-b103-3d2b0694f222-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8hv5j\" (UID: \"140c4c84-432b-4ef8-b103-3d2b0694f222\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j"
Apr 16 18:17:37.765376 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763907 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-run-systemd\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.765376 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764000 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2b843015-ad80-4dc0-aad1-a22e5a3909f6-serviceca\") pod \"node-ca-l4nhg\" (UID: \"2b843015-ad80-4dc0-aad1-a22e5a3909f6\") " pod="openshift-image-registry/node-ca-l4nhg"
Apr 16 18:17:37.765376 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764011 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-hostroot\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.765376 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764024 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-node-log\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.765376 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764051 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9e454ceb-b1f7-44d0-899f-7d0a1be98b35-tmp-dir\") pod \"node-resolver-mt4jc\" (UID: \"9e454ceb-b1f7-44d0-899f-7d0a1be98b35\") " pod="openshift-dns/node-resolver-mt4jc"
Apr 16 18:17:37.765376 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764056 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-sys\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.765376 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.763348 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-os-release\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.765376 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764084 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6zjr\" (UniqueName: \"kubernetes.io/projected/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-kube-api-access-m6zjr\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.765376 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764087 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-node-log\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.765376 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764125 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-multus-conf-dir\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.765376 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764204 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-host-run-netns\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.765376 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764232 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e302771b-8af3-42dd-92e1-04faaff1c6e8-ovn-node-metrics-cert\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.765376 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764260 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ssc9m\" (UniqueName: \"kubernetes.io/projected/e302771b-8af3-42dd-92e1-04faaff1c6e8-kube-api-access-ssc9m\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.765376 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764300 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3ea5a78e-da64-4bee-b206-3f22bfd07fbc-konnectivity-ca\") pod \"konnectivity-agent-q8spv\" (UID: \"3ea5a78e-da64-4bee-b206-3f22bfd07fbc\") " pod="kube-system/konnectivity-agent-q8spv"
Apr 16 18:17:37.766579 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764329 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-host-run-k8s-cni-cncf-io\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.766579 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764354 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5d102faf-ea74-4afa-95c1-4133f4d71f8b-multus-daemon-config\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.766579 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764353 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-host-run-netns\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.766579 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764406 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9e454ceb-b1f7-44d0-899f-7d0a1be98b35-tmp-dir\") pod \"node-resolver-mt4jc\" (UID: \"9e454ceb-b1f7-44d0-899f-7d0a1be98b35\") " pod="openshift-dns/node-resolver-mt4jc"
Apr 16 18:17:37.766579 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764501 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-host-run-k8s-cni-cncf-io\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.766579 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764532 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/140c4c84-432b-4ef8-b103-3d2b0694f222-socket-dir\") pod \"aws-ebs-csi-driver-node-8hv5j\" (UID: \"140c4c84-432b-4ef8-b103-3d2b0694f222\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j"
Apr 16 18:17:37.766579 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764561 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-systemd-units\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.766579 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764610 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-etc-openvswitch\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.766579 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764627 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-host-cni-bin\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.766579 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764646 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e6e3d626-b1d8-4140-83e2-92db90a4eae4-cni-binary-copy\") pod \"multus-additional-cni-plugins-nw6hr\" (UID: \"e6e3d626-b1d8-4140-83e2-92db90a4eae4\") " pod="openshift-multus/multus-additional-cni-plugins-nw6hr"
Apr 16 18:17:37.766579 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764656 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q6sdw\" (UniqueName: \"kubernetes.io/projected/e2bb1680-b343-4014-bde1-6cc6bcd9805c-kube-api-access-q6sdw\") pod \"network-metrics-daemon-k6s9z\" (UID: \"e2bb1680-b343-4014-bde1-6cc6bcd9805c\") " pod="openshift-multus/network-metrics-daemon-k6s9z"
Apr 16 18:17:37.766579 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764669 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/140c4c84-432b-4ef8-b103-3d2b0694f222-socket-dir\") pod \"aws-ebs-csi-driver-node-8hv5j\" (UID: \"140c4c84-432b-4ef8-b103-3d2b0694f222\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j"
Apr 16 18:17:37.766579 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764680 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/140c4c84-432b-4ef8-b103-3d2b0694f222-registration-dir\") pod \"aws-ebs-csi-driver-node-8hv5j\" (UID: \"140c4c84-432b-4ef8-b103-3d2b0694f222\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j"
Apr 16 18:17:37.766579 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764720 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/140c4c84-432b-4ef8-b103-3d2b0694f222-registration-dir\") pod \"aws-ebs-csi-driver-node-8hv5j\" (UID: \"140c4c84-432b-4ef8-b103-3d2b0694f222\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j"
Apr 16 18:17:37.766579 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764721 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-etc-kubernetes\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.766579 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764745 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-systemd-units\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.766579 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764749 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-etc-systemd\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.766579 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764746 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 18:17:37.767434 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764779 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-tmp\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.767434 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764799 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-host-cni-bin\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.767434 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764805 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-log-socket\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.767434 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764831 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e302771b-8af3-42dd-92e1-04faaff1c6e8-env-overrides\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.767434 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764877 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tcj62\" (UniqueName: \"kubernetes.io/projected/9e454ceb-b1f7-44d0-899f-7d0a1be98b35-kube-api-access-tcj62\") pod \"node-resolver-mt4jc\" (UID: \"9e454ceb-b1f7-44d0-899f-7d0a1be98b35\") " pod="openshift-dns/node-resolver-mt4jc"
Apr 16 18:17:37.767434 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764898 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3ea5a78e-da64-4bee-b206-3f22bfd07fbc-konnectivity-ca\") pod \"konnectivity-agent-q8spv\" (UID: \"3ea5a78e-da64-4bee-b206-3f22bfd07fbc\") " pod="kube-system/konnectivity-agent-q8spv"
Apr 16 18:17:37.767434 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764906 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnqsq\" (UniqueName: \"kubernetes.io/projected/140c4c84-432b-4ef8-b103-3d2b0694f222-kube-api-access-vnqsq\") pod \"aws-ebs-csi-driver-node-8hv5j\" (UID: \"140c4c84-432b-4ef8-b103-3d2b0694f222\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j"
Apr 16 18:17:37.767434 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764781 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-etc-openvswitch\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.767434 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764933 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-var-lib-kubelet\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.767434 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764960 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e6e3d626-b1d8-4140-83e2-92db90a4eae4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nw6hr\" (UID: \"e6e3d626-b1d8-4140-83e2-92db90a4eae4\") " pod="openshift-multus/multus-additional-cni-plugins-nw6hr"
Apr 16 18:17:37.767434 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764979 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-etc-kubernetes\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.767434 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.764986 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e6e3d626-b1d8-4140-83e2-92db90a4eae4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-nw6hr\" (UID: \"e6e3d626-b1d8-4140-83e2-92db90a4eae4\") " pod="openshift-multus/multus-additional-cni-plugins-nw6hr"
Apr 16 18:17:37.767434 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765015 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-etc-sysctl-d\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.767434 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765022 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-etc-systemd\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.767434 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765041 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/eec98933-e01a-4818-94e6-b087062491fa-iptables-alerter-script\") pod \"iptables-alerter-7b4wl\" (UID: \"eec98933-e01a-4818-94e6-b087062491fa\") " pod="openshift-network-operator/iptables-alerter-7b4wl"
Apr 16 18:17:37.767434 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765067 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-host-run-multus-certs\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.767434 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765100 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.768310 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765112 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5d102faf-ea74-4afa-95c1-4133f4d71f8b-multus-daemon-config\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.768310 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765141 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e302771b-8af3-42dd-92e1-04faaff1c6e8-ovnkube-config\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.768310 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765169 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e302771b-8af3-42dd-92e1-04faaff1c6e8-ovnkube-script-lib\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.768310 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765190 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-var-lib-kubelet\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.768310 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765194 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3ea5a78e-da64-4bee-b206-3f22bfd07fbc-agent-certs\") pod \"konnectivity-agent-q8spv\" (UID: \"3ea5a78e-da64-4bee-b206-3f22bfd07fbc\") " pod="kube-system/konnectivity-agent-q8spv"
Apr 16 18:17:37.768310 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765221 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e6e3d626-b1d8-4140-83e2-92db90a4eae4-system-cni-dir\") pod \"multus-additional-cni-plugins-nw6hr\" (UID: \"e6e3d626-b1d8-4140-83e2-92db90a4eae4\") " pod="openshift-multus/multus-additional-cni-plugins-nw6hr"
Apr 16 18:17:37.768310 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765293 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e6e3d626-b1d8-4140-83e2-92db90a4eae4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nw6hr\" (UID: \"e6e3d626-b1d8-4140-83e2-92db90a4eae4\") " pod="openshift-multus/multus-additional-cni-plugins-nw6hr"
Apr 16 18:17:37.768310 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765325 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmbz6\" (UniqueName: \"kubernetes.io/projected/e6e3d626-b1d8-4140-83e2-92db90a4eae4-kube-api-access-gmbz6\") pod \"multus-additional-cni-plugins-nw6hr\" (UID: \"e6e3d626-b1d8-4140-83e2-92db90a4eae4\") " pod="openshift-multus/multus-additional-cni-plugins-nw6hr"
Apr 16 18:17:37.768310 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765351 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5d102faf-ea74-4afa-95c1-4133f4d71f8b-cni-binary-copy\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.768310 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765380 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-host-var-lib-cni-bin\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.768310 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765411 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-host-var-lib-cni-multus\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt"
Apr 16 18:17:37.768310 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765438 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/140c4c84-432b-4ef8-b103-3d2b0694f222-device-dir\") pod \"aws-ebs-csi-driver-node-8hv5j\" (UID: \"140c4c84-432b-4ef8-b103-3d2b0694f222\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j"
Apr 16 18:17:37.768310 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765451 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.768310 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765463 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/140c4c84-432b-4ef8-b103-3d2b0694f222-sys-fs\") pod \"aws-ebs-csi-driver-node-8hv5j\" (UID: \"140c4c84-432b-4ef8-b103-3d2b0694f222\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j"
Apr 16 18:17:37.768310 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765490 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-etc-tuned\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh"
Apr 16 18:17:37.768310 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765542 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-var-lib-openvswitch\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:17:37.768310 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765566 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName:
\"kubernetes.io/host-path/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-etc-sysctl-d\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh" Apr 16 18:17:37.769103 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765570 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-etc-kubernetes\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt" Apr 16 18:17:37.769103 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765610 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b843015-ad80-4dc0-aad1-a22e5a3909f6-host\") pod \"node-ca-l4nhg\" (UID: \"2b843015-ad80-4dc0-aad1-a22e5a3909f6\") " pod="openshift-image-registry/node-ca-l4nhg" Apr 16 18:17:37.769103 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765627 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e6e3d626-b1d8-4140-83e2-92db90a4eae4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-nw6hr\" (UID: \"e6e3d626-b1d8-4140-83e2-92db90a4eae4\") " pod="openshift-multus/multus-additional-cni-plugins-nw6hr" Apr 16 18:17:37.769103 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765659 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/eec98933-e01a-4818-94e6-b087062491fa-iptables-alerter-script\") pod \"iptables-alerter-7b4wl\" (UID: \"eec98933-e01a-4818-94e6-b087062491fa\") " pod="openshift-network-operator/iptables-alerter-7b4wl" Apr 16 18:17:37.769103 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765705 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-log-socket\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:17:37.769103 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765724 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b843015-ad80-4dc0-aad1-a22e5a3909f6-host\") pod \"node-ca-l4nhg\" (UID: \"2b843015-ad80-4dc0-aad1-a22e5a3909f6\") " pod="openshift-image-registry/node-ca-l4nhg" Apr 16 18:17:37.769103 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765729 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e302771b-8af3-42dd-92e1-04faaff1c6e8-ovnkube-config\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:17:37.769103 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765749 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-host-run-multus-certs\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt" Apr 16 18:17:37.769103 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765800 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-host-var-lib-cni-multus\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt" Apr 16 18:17:37.769103 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765800 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-host-var-lib-cni-bin\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt" Apr 16 18:17:37.769103 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765814 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d102faf-ea74-4afa-95c1-4133f4d71f8b-etc-kubernetes\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt" Apr 16 18:17:37.769103 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765857 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e6e3d626-b1d8-4140-83e2-92db90a4eae4-system-cni-dir\") pod \"multus-additional-cni-plugins-nw6hr\" (UID: \"e6e3d626-b1d8-4140-83e2-92db90a4eae4\") " pod="openshift-multus/multus-additional-cni-plugins-nw6hr" Apr 16 18:17:37.769103 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765873 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/140c4c84-432b-4ef8-b103-3d2b0694f222-sys-fs\") pod \"aws-ebs-csi-driver-node-8hv5j\" (UID: \"140c4c84-432b-4ef8-b103-3d2b0694f222\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j" Apr 16 18:17:37.769103 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765923 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e302771b-8af3-42dd-92e1-04faaff1c6e8-var-lib-openvswitch\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:17:37.769103 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.765979 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/e302771b-8af3-42dd-92e1-04faaff1c6e8-env-overrides\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:17:37.769103 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.766039 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/140c4c84-432b-4ef8-b103-3d2b0694f222-device-dir\") pod \"aws-ebs-csi-driver-node-8hv5j\" (UID: \"140c4c84-432b-4ef8-b103-3d2b0694f222\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j" Apr 16 18:17:37.769103 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.766311 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e6e3d626-b1d8-4140-83e2-92db90a4eae4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nw6hr\" (UID: \"e6e3d626-b1d8-4140-83e2-92db90a4eae4\") " pod="openshift-multus/multus-additional-cni-plugins-nw6hr" Apr 16 18:17:37.769594 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.766438 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e6e3d626-b1d8-4140-83e2-92db90a4eae4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nw6hr\" (UID: \"e6e3d626-b1d8-4140-83e2-92db90a4eae4\") " pod="openshift-multus/multus-additional-cni-plugins-nw6hr" Apr 16 18:17:37.769594 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.766456 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e302771b-8af3-42dd-92e1-04faaff1c6e8-ovnkube-script-lib\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:17:37.769594 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.766708 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5d102faf-ea74-4afa-95c1-4133f4d71f8b-cni-binary-copy\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt" Apr 16 18:17:37.769594 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.768590 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-tmp\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh" Apr 16 18:17:37.769594 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.768634 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-etc-tuned\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") " pod="openshift-cluster-node-tuning-operator/tuned-g8hdh" Apr 16 18:17:37.769594 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.768683 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e302771b-8af3-42dd-92e1-04faaff1c6e8-ovn-node-metrics-cert\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:17:37.769594 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.768767 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3ea5a78e-da64-4bee-b206-3f22bfd07fbc-agent-certs\") pod \"konnectivity-agent-q8spv\" (UID: \"3ea5a78e-da64-4bee-b206-3f22bfd07fbc\") " pod="kube-system/konnectivity-agent-q8spv" Apr 16 18:17:37.782382 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:37.782363 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:17:37.782382 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:37.782384 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:17:37.782513 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:37.782393 2579 projected.go:194] Error preparing data for projected volume kube-api-access-5fgjj for pod openshift-network-diagnostics/network-check-target-sq4vb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:37.782513 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:37.782452 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b7bd66-ef40-4ff5-89de-2c7c3408fdc6-kube-api-access-5fgjj podName:17b7bd66-ef40-4ff5-89de-2c7c3408fdc6 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:38.282433139 +0000 UTC m=+3.129593607 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5fgjj" (UniqueName: "kubernetes.io/projected/17b7bd66-ef40-4ff5-89de-2c7c3408fdc6-kube-api-access-5fgjj") pod "network-check-target-sq4vb" (UID: "17b7bd66-ef40-4ff5-89de-2c7c3408fdc6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:37.783686 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.783664 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6sdw\" (UniqueName: \"kubernetes.io/projected/e2bb1680-b343-4014-bde1-6cc6bcd9805c-kube-api-access-q6sdw\") pod \"network-metrics-daemon-k6s9z\" (UID: \"e2bb1680-b343-4014-bde1-6cc6bcd9805c\") " pod="openshift-multus/network-metrics-daemon-k6s9z" Apr 16 18:17:37.787666 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.787612 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjssq\" (UniqueName: \"kubernetes.io/projected/5d102faf-ea74-4afa-95c1-4133f4d71f8b-kube-api-access-sjssq\") pod \"multus-mtzwt\" (UID: \"5d102faf-ea74-4afa-95c1-4133f4d71f8b\") " pod="openshift-multus/multus-mtzwt" Apr 16 18:17:37.788189 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.788172 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnqsq\" (UniqueName: \"kubernetes.io/projected/140c4c84-432b-4ef8-b103-3d2b0694f222-kube-api-access-vnqsq\") pod \"aws-ebs-csi-driver-node-8hv5j\" (UID: \"140c4c84-432b-4ef8-b103-3d2b0694f222\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j" Apr 16 18:17:37.788430 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.788405 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6zjr\" (UniqueName: \"kubernetes.io/projected/4dbfa418-d2c8-46b1-82c2-404a333f8ad9-kube-api-access-m6zjr\") pod \"tuned-g8hdh\" (UID: \"4dbfa418-d2c8-46b1-82c2-404a333f8ad9\") 
" pod="openshift-cluster-node-tuning-operator/tuned-g8hdh" Apr 16 18:17:37.789156 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.789130 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmbz6\" (UniqueName: \"kubernetes.io/projected/e6e3d626-b1d8-4140-83e2-92db90a4eae4-kube-api-access-gmbz6\") pod \"multus-additional-cni-plugins-nw6hr\" (UID: \"e6e3d626-b1d8-4140-83e2-92db90a4eae4\") " pod="openshift-multus/multus-additional-cni-plugins-nw6hr" Apr 16 18:17:37.789671 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.789520 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8stvz\" (UniqueName: \"kubernetes.io/projected/eec98933-e01a-4818-94e6-b087062491fa-kube-api-access-8stvz\") pod \"iptables-alerter-7b4wl\" (UID: \"eec98933-e01a-4818-94e6-b087062491fa\") " pod="openshift-network-operator/iptables-alerter-7b4wl" Apr 16 18:17:37.789671 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.789571 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssc9m\" (UniqueName: \"kubernetes.io/projected/e302771b-8af3-42dd-92e1-04faaff1c6e8-kube-api-access-ssc9m\") pod \"ovnkube-node-md54p\" (UID: \"e302771b-8af3-42dd-92e1-04faaff1c6e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:17:37.790207 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.790187 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xg5v\" (UniqueName: \"kubernetes.io/projected/2b843015-ad80-4dc0-aad1-a22e5a3909f6-kube-api-access-8xg5v\") pod \"node-ca-l4nhg\" (UID: \"2b843015-ad80-4dc0-aad1-a22e5a3909f6\") " pod="openshift-image-registry/node-ca-l4nhg" Apr 16 18:17:37.790661 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.790629 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcj62\" (UniqueName: 
\"kubernetes.io/projected/9e454ceb-b1f7-44d0-899f-7d0a1be98b35-kube-api-access-tcj62\") pod \"node-resolver-mt4jc\" (UID: \"9e454ceb-b1f7-44d0-899f-7d0a1be98b35\") " pod="openshift-dns/node-resolver-mt4jc" Apr 16 18:17:37.827743 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.827719 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:17:37.858483 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.858423 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-6zhb2"] Apr 16 18:17:37.861556 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.861537 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zhb2" Apr 16 18:17:37.861675 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:37.861615 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zhb2" podUID="530ccfcb-dfe5-440b-8750-09dd186b8702" Apr 16 18:17:37.947085 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.947018 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mt4jc" Apr 16 18:17:37.952775 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.952750 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nw6hr" Apr 16 18:17:37.961393 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.961370 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-mtzwt" Apr 16 18:17:37.967129 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.967107 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/530ccfcb-dfe5-440b-8750-09dd186b8702-kubelet-config\") pod \"global-pull-secret-syncer-6zhb2\" (UID: \"530ccfcb-dfe5-440b-8750-09dd186b8702\") " pod="kube-system/global-pull-secret-syncer-6zhb2" Apr 16 18:17:37.967206 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.967154 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/530ccfcb-dfe5-440b-8750-09dd186b8702-original-pull-secret\") pod \"global-pull-secret-syncer-6zhb2\" (UID: \"530ccfcb-dfe5-440b-8750-09dd186b8702\") " pod="kube-system/global-pull-secret-syncer-6zhb2" Apr 16 18:17:37.967279 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.967204 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/530ccfcb-dfe5-440b-8750-09dd186b8702-dbus\") pod \"global-pull-secret-syncer-6zhb2\" (UID: \"530ccfcb-dfe5-440b-8750-09dd186b8702\") " pod="kube-system/global-pull-secret-syncer-6zhb2" Apr 16 18:17:37.968030 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.968011 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7b4wl" Apr 16 18:17:37.973531 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.973513 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-l4nhg" Apr 16 18:17:37.981067 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.981048 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j" Apr 16 18:17:37.990659 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.990640 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-g8hdh" Apr 16 18:17:37.997233 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:37.997215 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:17:38.001772 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:38.001757 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-q8spv" Apr 16 18:17:38.068256 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:38.068233 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/530ccfcb-dfe5-440b-8750-09dd186b8702-kubelet-config\") pod \"global-pull-secret-syncer-6zhb2\" (UID: \"530ccfcb-dfe5-440b-8750-09dd186b8702\") " pod="kube-system/global-pull-secret-syncer-6zhb2" Apr 16 18:17:38.068396 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:38.068290 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/530ccfcb-dfe5-440b-8750-09dd186b8702-original-pull-secret\") pod \"global-pull-secret-syncer-6zhb2\" (UID: \"530ccfcb-dfe5-440b-8750-09dd186b8702\") " pod="kube-system/global-pull-secret-syncer-6zhb2" Apr 16 18:17:38.068456 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:38.068392 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:38.068456 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:38.068410 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/530ccfcb-dfe5-440b-8750-09dd186b8702-dbus\") pod \"global-pull-secret-syncer-6zhb2\" (UID: \"530ccfcb-dfe5-440b-8750-09dd186b8702\") " pod="kube-system/global-pull-secret-syncer-6zhb2" Apr 16 18:17:38.068456 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:38.068453 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/530ccfcb-dfe5-440b-8750-09dd186b8702-original-pull-secret podName:530ccfcb-dfe5-440b-8750-09dd186b8702 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:38.568433581 +0000 UTC m=+3.415594065 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/530ccfcb-dfe5-440b-8750-09dd186b8702-original-pull-secret") pod "global-pull-secret-syncer-6zhb2" (UID: "530ccfcb-dfe5-440b-8750-09dd186b8702") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:38.068590 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:38.068392 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/530ccfcb-dfe5-440b-8750-09dd186b8702-kubelet-config\") pod \"global-pull-secret-syncer-6zhb2\" (UID: \"530ccfcb-dfe5-440b-8750-09dd186b8702\") " pod="kube-system/global-pull-secret-syncer-6zhb2" Apr 16 18:17:38.068590 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:38.068526 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/530ccfcb-dfe5-440b-8750-09dd186b8702-dbus\") pod \"global-pull-secret-syncer-6zhb2\" (UID: \"530ccfcb-dfe5-440b-8750-09dd186b8702\") " pod="kube-system/global-pull-secret-syncer-6zhb2" Apr 16 18:17:38.153728 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:38.153704 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:17:38.270891 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:38.270827 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs\") pod \"network-metrics-daemon-k6s9z\" (UID: \"e2bb1680-b343-4014-bde1-6cc6bcd9805c\") " pod="openshift-multus/network-metrics-daemon-k6s9z" Apr 16 18:17:38.271007 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:38.270984 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:38.271071 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:38.271060 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs podName:e2bb1680-b343-4014-bde1-6cc6bcd9805c nodeName:}" failed. No retries permitted until 2026-04-16 18:17:39.27103872 +0000 UTC m=+4.118199194 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs") pod "network-metrics-daemon-k6s9z" (UID: "e2bb1680-b343-4014-bde1-6cc6bcd9805c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:38.371816 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:38.371793 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fgjj\" (UniqueName: \"kubernetes.io/projected/17b7bd66-ef40-4ff5-89de-2c7c3408fdc6-kube-api-access-5fgjj\") pod \"network-check-target-sq4vb\" (UID: \"17b7bd66-ef40-4ff5-89de-2c7c3408fdc6\") " pod="openshift-network-diagnostics/network-check-target-sq4vb" Apr 16 18:17:38.371990 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:38.371917 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:17:38.371990 ip-10-0-134-167 kubenswrapper[2579]: E0416 
18:17:38.371934 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:17:38.371990 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:38.371942 2579 projected.go:194] Error preparing data for projected volume kube-api-access-5fgjj for pod openshift-network-diagnostics/network-check-target-sq4vb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:38.371990 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:38.371991 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b7bd66-ef40-4ff5-89de-2c7c3408fdc6-kube-api-access-5fgjj podName:17b7bd66-ef40-4ff5-89de-2c7c3408fdc6 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:39.371978252 +0000 UTC m=+4.219138721 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-5fgjj" (UniqueName: "kubernetes.io/projected/17b7bd66-ef40-4ff5-89de-2c7c3408fdc6-kube-api-access-5fgjj") pod "network-check-target-sq4vb" (UID: "17b7bd66-ef40-4ff5-89de-2c7c3408fdc6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:38.437789 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:38.437763 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e454ceb_b1f7_44d0_899f_7d0a1be98b35.slice/crio-7b2e9171628305b673be8e6c6a6a661ceefbbbe19bbef20a9dd18df8e9133980 WatchSource:0}: Error finding container 7b2e9171628305b673be8e6c6a6a661ceefbbbe19bbef20a9dd18df8e9133980: Status 404 returned error can't find the container with id 7b2e9171628305b673be8e6c6a6a661ceefbbbe19bbef20a9dd18df8e9133980
Apr 16 18:17:38.439177 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:38.439132 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dbfa418_d2c8_46b1_82c2_404a333f8ad9.slice/crio-1df658632691bb3a2fb09eea7350a75460250996452d9b3fbb35c866bf3d6a82 WatchSource:0}: Error finding container 1df658632691bb3a2fb09eea7350a75460250996452d9b3fbb35c866bf3d6a82: Status 404 returned error can't find the container with id 1df658632691bb3a2fb09eea7350a75460250996452d9b3fbb35c866bf3d6a82
Apr 16 18:17:38.440217 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:38.440191 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeec98933_e01a_4818_94e6_b087062491fa.slice/crio-4259bc258b2ba57dbe02177efdb0119105480e721c77b6a4d915d938ea1ebc9d WatchSource:0}: Error finding container 4259bc258b2ba57dbe02177efdb0119105480e721c77b6a4d915d938ea1ebc9d: Status 404 returned error can't find the container with id 4259bc258b2ba57dbe02177efdb0119105480e721c77b6a4d915d938ea1ebc9d
Apr 16 18:17:38.442966 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:38.442940 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d102faf_ea74_4afa_95c1_4133f4d71f8b.slice/crio-587c9e1a7d7758e689a1205133a8628bd41f18a885042640f93ab642ce61845d WatchSource:0}: Error finding container 587c9e1a7d7758e689a1205133a8628bd41f18a885042640f93ab642ce61845d: Status 404 returned error can't find the container with id 587c9e1a7d7758e689a1205133a8628bd41f18a885042640f93ab642ce61845d
Apr 16 18:17:38.443970 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:38.443947 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6e3d626_b1d8_4140_83e2_92db90a4eae4.slice/crio-8c5b4ffd6039d8ea43c901107c71bf0a71c1cad10912b77423d8d5c883d3f6b4 WatchSource:0}: Error finding container 8c5b4ffd6039d8ea43c901107c71bf0a71c1cad10912b77423d8d5c883d3f6b4: Status 404 returned error can't find the container with id 8c5b4ffd6039d8ea43c901107c71bf0a71c1cad10912b77423d8d5c883d3f6b4
Apr 16 18:17:38.446209 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:38.446186 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod140c4c84_432b_4ef8_b103_3d2b0694f222.slice/crio-1d31837df4de937d35e6931488b3b2c5c028a4dd363b8ba2dc694fff4021894b WatchSource:0}: Error finding container 1d31837df4de937d35e6931488b3b2c5c028a4dd363b8ba2dc694fff4021894b: Status 404 returned error can't find the container with id 1d31837df4de937d35e6931488b3b2c5c028a4dd363b8ba2dc694fff4021894b
Apr 16 18:17:38.447024 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:38.446991 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode302771b_8af3_42dd_92e1_04faaff1c6e8.slice/crio-abc4af53ec2bd136a3b125f4dfa0a7a69b1195fef75ccc764f19398c70e639a5 WatchSource:0}: Error finding container abc4af53ec2bd136a3b125f4dfa0a7a69b1195fef75ccc764f19398c70e639a5: Status 404 returned error can't find the container with id abc4af53ec2bd136a3b125f4dfa0a7a69b1195fef75ccc764f19398c70e639a5
Apr 16 18:17:38.448442 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:38.448404 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ea5a78e_da64_4bee_b206_3f22bfd07fbc.slice/crio-bfedd375d7626ce7f6724f64d5f5a9bedb4491c7fcd3acbaec3547d2d47b137e WatchSource:0}: Error finding container bfedd375d7626ce7f6724f64d5f5a9bedb4491c7fcd3acbaec3547d2d47b137e: Status 404 returned error can't find the container with id bfedd375d7626ce7f6724f64d5f5a9bedb4491c7fcd3acbaec3547d2d47b137e
Apr 16 18:17:38.448683 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:17:38.448660 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b843015_ad80_4dc0_aad1_a22e5a3909f6.slice/crio-7e69499376100becd943f47a80e38614cf5df594eb9c1af1580e73878b6762a3 WatchSource:0}: Error finding container 7e69499376100becd943f47a80e38614cf5df594eb9c1af1580e73878b6762a3: Status 404 returned error can't find the container with id 7e69499376100becd943f47a80e38614cf5df594eb9c1af1580e73878b6762a3
Apr 16 18:17:38.573877 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:38.573752 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/530ccfcb-dfe5-440b-8750-09dd186b8702-original-pull-secret\") pod \"global-pull-secret-syncer-6zhb2\" (UID: \"530ccfcb-dfe5-440b-8750-09dd186b8702\") " pod="kube-system/global-pull-secret-syncer-6zhb2"
Apr 16 18:17:38.573961 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:38.573877 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:17:38.573961 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:38.573942 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/530ccfcb-dfe5-440b-8750-09dd186b8702-original-pull-secret podName:530ccfcb-dfe5-440b-8750-09dd186b8702 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:39.573922835 +0000 UTC m=+4.421083309 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/530ccfcb-dfe5-440b-8750-09dd186b8702-original-pull-secret") pod "global-pull-secret-syncer-6zhb2" (UID: "530ccfcb-dfe5-440b-8750-09dd186b8702") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:17:38.699036 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:38.699009 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:12:36 +0000 UTC" deadline="2027-10-05 05:11:41.669489344 +0000 UTC"
Apr 16 18:17:38.699036 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:38.699033 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12874h54m2.970458822s"
Apr 16 18:17:38.757842 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:38.757815 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nw6hr" event={"ID":"e6e3d626-b1d8-4140-83e2-92db90a4eae4","Type":"ContainerStarted","Data":"8c5b4ffd6039d8ea43c901107c71bf0a71c1cad10912b77423d8d5c883d3f6b4"}
Apr 16 18:17:38.760746 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:38.760721 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j" event={"ID":"140c4c84-432b-4ef8-b103-3d2b0694f222","Type":"ContainerStarted","Data":"1d31837df4de937d35e6931488b3b2c5c028a4dd363b8ba2dc694fff4021894b"}
Apr 16 18:17:38.761659 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:38.761639 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l4nhg" event={"ID":"2b843015-ad80-4dc0-aad1-a22e5a3909f6","Type":"ContainerStarted","Data":"7e69499376100becd943f47a80e38614cf5df594eb9c1af1580e73878b6762a3"}
Apr 16 18:17:38.762602 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:38.762575 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mtzwt" event={"ID":"5d102faf-ea74-4afa-95c1-4133f4d71f8b","Type":"ContainerStarted","Data":"587c9e1a7d7758e689a1205133a8628bd41f18a885042640f93ab642ce61845d"}
Apr 16 18:17:38.763463 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:38.763440 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-g8hdh" event={"ID":"4dbfa418-d2c8-46b1-82c2-404a333f8ad9","Type":"ContainerStarted","Data":"1df658632691bb3a2fb09eea7350a75460250996452d9b3fbb35c866bf3d6a82"}
Apr 16 18:17:38.764564 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:38.764545 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mt4jc" event={"ID":"9e454ceb-b1f7-44d0-899f-7d0a1be98b35","Type":"ContainerStarted","Data":"7b2e9171628305b673be8e6c6a6a661ceefbbbe19bbef20a9dd18df8e9133980"}
Apr 16 18:17:38.765965 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:38.765945 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-167.ec2.internal" event={"ID":"69e28dff0cb5be54806a6f5d3d910a4e","Type":"ContainerStarted","Data":"cf0ccb54035655f0a3932b684c672011c768eabcf92b4f9bd11f51857cb7718d"}
Apr 16 18:17:38.766964 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:38.766938 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-md54p" event={"ID":"e302771b-8af3-42dd-92e1-04faaff1c6e8","Type":"ContainerStarted","Data":"abc4af53ec2bd136a3b125f4dfa0a7a69b1195fef75ccc764f19398c70e639a5"}
Apr 16 18:17:38.767850 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:38.767830 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q8spv" event={"ID":"3ea5a78e-da64-4bee-b206-3f22bfd07fbc","Type":"ContainerStarted","Data":"bfedd375d7626ce7f6724f64d5f5a9bedb4491c7fcd3acbaec3547d2d47b137e"}
Apr 16 18:17:38.772641 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:38.772613 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7b4wl" event={"ID":"eec98933-e01a-4818-94e6-b087062491fa","Type":"ContainerStarted","Data":"4259bc258b2ba57dbe02177efdb0119105480e721c77b6a4d915d938ea1ebc9d"}
Apr 16 18:17:38.783842 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:38.783757 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-167.ec2.internal" podStartSLOduration=1.783742614 podStartE2EDuration="1.783742614s" podCreationTimestamp="2026-04-16 18:17:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:17:38.782940513 +0000 UTC m=+3.630101006" watchObservedRunningTime="2026-04-16 18:17:38.783742614 +0000 UTC m=+3.630903105"
Apr 16 18:17:39.280560 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:39.280465 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs\") pod \"network-metrics-daemon-k6s9z\" (UID: \"e2bb1680-b343-4014-bde1-6cc6bcd9805c\") " pod="openshift-multus/network-metrics-daemon-k6s9z"
Apr 16 18:17:39.280721 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:39.280619 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:39.280721 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:39.280685 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs podName:e2bb1680-b343-4014-bde1-6cc6bcd9805c nodeName:}" failed. No retries permitted until 2026-04-16 18:17:41.280665343 +0000 UTC m=+6.127825835 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs") pod "network-metrics-daemon-k6s9z" (UID: "e2bb1680-b343-4014-bde1-6cc6bcd9805c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:39.382304 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:39.381861 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fgjj\" (UniqueName: \"kubernetes.io/projected/17b7bd66-ef40-4ff5-89de-2c7c3408fdc6-kube-api-access-5fgjj\") pod \"network-check-target-sq4vb\" (UID: \"17b7bd66-ef40-4ff5-89de-2c7c3408fdc6\") " pod="openshift-network-diagnostics/network-check-target-sq4vb"
Apr 16 18:17:39.382304 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:39.382091 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:17:39.382304 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:39.382109 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:17:39.382304 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:39.382121 2579 projected.go:194] Error preparing data for projected volume kube-api-access-5fgjj for pod openshift-network-diagnostics/network-check-target-sq4vb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:39.382304 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:39.382176 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b7bd66-ef40-4ff5-89de-2c7c3408fdc6-kube-api-access-5fgjj podName:17b7bd66-ef40-4ff5-89de-2c7c3408fdc6 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:41.382158026 +0000 UTC m=+6.229318498 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-5fgjj" (UniqueName: "kubernetes.io/projected/17b7bd66-ef40-4ff5-89de-2c7c3408fdc6-kube-api-access-5fgjj") pod "network-check-target-sq4vb" (UID: "17b7bd66-ef40-4ff5-89de-2c7c3408fdc6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:39.583686 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:39.583073 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/530ccfcb-dfe5-440b-8750-09dd186b8702-original-pull-secret\") pod \"global-pull-secret-syncer-6zhb2\" (UID: \"530ccfcb-dfe5-440b-8750-09dd186b8702\") " pod="kube-system/global-pull-secret-syncer-6zhb2"
Apr 16 18:17:39.583686 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:39.583279 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:17:39.583686 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:39.583341 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/530ccfcb-dfe5-440b-8750-09dd186b8702-original-pull-secret podName:530ccfcb-dfe5-440b-8750-09dd186b8702 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:41.583321915 +0000 UTC m=+6.430482387 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/530ccfcb-dfe5-440b-8750-09dd186b8702-original-pull-secret") pod "global-pull-secret-syncer-6zhb2" (UID: "530ccfcb-dfe5-440b-8750-09dd186b8702") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:17:39.751920 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:39.751457 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zhb2"
Apr 16 18:17:39.751920 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:39.751576 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6s9z"
Apr 16 18:17:39.751920 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:39.751586 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zhb2" podUID="530ccfcb-dfe5-440b-8750-09dd186b8702"
Apr 16 18:17:39.751920 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:39.751706 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6s9z" podUID="e2bb1680-b343-4014-bde1-6cc6bcd9805c"
Apr 16 18:17:39.751920 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:39.751758 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq4vb"
Apr 16 18:17:39.751920 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:39.751824 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq4vb" podUID="17b7bd66-ef40-4ff5-89de-2c7c3408fdc6"
Apr 16 18:17:39.796848 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:39.796325 2579 generic.go:358] "Generic (PLEG): container finished" podID="69d25570a0cabb0e86b33b5d502c3716" containerID="5f4c19c1fae3ba77262f24ff099d9f070a4915f03724d2291e223fe716cce0a1" exitCode=0
Apr 16 18:17:39.796848 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:39.796420 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-167.ec2.internal" event={"ID":"69d25570a0cabb0e86b33b5d502c3716","Type":"ContainerDied","Data":"5f4c19c1fae3ba77262f24ff099d9f070a4915f03724d2291e223fe716cce0a1"}
Apr 16 18:17:40.806650 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:40.805945 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-167.ec2.internal" event={"ID":"69d25570a0cabb0e86b33b5d502c3716","Type":"ContainerStarted","Data":"95791bd0000341bad81d489db734aa22a80529253781de28306310ff2fb45c05"}
Apr 16 18:17:41.298189 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:41.297621 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs\") pod \"network-metrics-daemon-k6s9z\" (UID: \"e2bb1680-b343-4014-bde1-6cc6bcd9805c\") " pod="openshift-multus/network-metrics-daemon-k6s9z"
Apr 16 18:17:41.298189 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:41.297775 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:41.298189 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:41.297830 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs podName:e2bb1680-b343-4014-bde1-6cc6bcd9805c nodeName:}" failed. No retries permitted until 2026-04-16 18:17:45.297812207 +0000 UTC m=+10.144972679 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs") pod "network-metrics-daemon-k6s9z" (UID: "e2bb1680-b343-4014-bde1-6cc6bcd9805c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:41.398287 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:41.398238 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fgjj\" (UniqueName: \"kubernetes.io/projected/17b7bd66-ef40-4ff5-89de-2c7c3408fdc6-kube-api-access-5fgjj\") pod \"network-check-target-sq4vb\" (UID: \"17b7bd66-ef40-4ff5-89de-2c7c3408fdc6\") " pod="openshift-network-diagnostics/network-check-target-sq4vb"
Apr 16 18:17:41.398432 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:41.398405 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:17:41.398432 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:41.398423 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:17:41.398432 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:41.398435 2579 projected.go:194] Error preparing data for projected volume kube-api-access-5fgjj for pod openshift-network-diagnostics/network-check-target-sq4vb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:41.398583 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:41.398521 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b7bd66-ef40-4ff5-89de-2c7c3408fdc6-kube-api-access-5fgjj podName:17b7bd66-ef40-4ff5-89de-2c7c3408fdc6 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:45.398502451 +0000 UTC m=+10.245662923 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-5fgjj" (UniqueName: "kubernetes.io/projected/17b7bd66-ef40-4ff5-89de-2c7c3408fdc6-kube-api-access-5fgjj") pod "network-check-target-sq4vb" (UID: "17b7bd66-ef40-4ff5-89de-2c7c3408fdc6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:41.598936 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:41.598856 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/530ccfcb-dfe5-440b-8750-09dd186b8702-original-pull-secret\") pod \"global-pull-secret-syncer-6zhb2\" (UID: \"530ccfcb-dfe5-440b-8750-09dd186b8702\") " pod="kube-system/global-pull-secret-syncer-6zhb2"
Apr 16 18:17:41.599096 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:41.599039 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:17:41.599166 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:41.599107 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/530ccfcb-dfe5-440b-8750-09dd186b8702-original-pull-secret podName:530ccfcb-dfe5-440b-8750-09dd186b8702 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:45.599088843 +0000 UTC m=+10.446249314 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/530ccfcb-dfe5-440b-8750-09dd186b8702-original-pull-secret") pod "global-pull-secret-syncer-6zhb2" (UID: "530ccfcb-dfe5-440b-8750-09dd186b8702") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:17:41.750285 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:41.750244 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zhb2"
Apr 16 18:17:41.750460 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:41.750244 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq4vb"
Apr 16 18:17:41.750460 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:41.750385 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zhb2" podUID="530ccfcb-dfe5-440b-8750-09dd186b8702"
Apr 16 18:17:41.750575 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:41.750505 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq4vb" podUID="17b7bd66-ef40-4ff5-89de-2c7c3408fdc6"
Apr 16 18:17:41.750628 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:41.750586 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6s9z"
Apr 16 18:17:41.750704 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:41.750676 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6s9z" podUID="e2bb1680-b343-4014-bde1-6cc6bcd9805c"
Apr 16 18:17:43.750424 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:43.750392 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq4vb"
Apr 16 18:17:43.750833 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:43.750436 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zhb2"
Apr 16 18:17:43.750833 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:43.750502 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6s9z"
Apr 16 18:17:43.750833 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:43.750513 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq4vb" podUID="17b7bd66-ef40-4ff5-89de-2c7c3408fdc6"
Apr 16 18:17:43.750833 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:43.750638 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6s9z" podUID="e2bb1680-b343-4014-bde1-6cc6bcd9805c"
Apr 16 18:17:43.750833 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:43.750734 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zhb2" podUID="530ccfcb-dfe5-440b-8750-09dd186b8702"
Apr 16 18:17:45.324235 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:45.324199 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs\") pod \"network-metrics-daemon-k6s9z\" (UID: \"e2bb1680-b343-4014-bde1-6cc6bcd9805c\") " pod="openshift-multus/network-metrics-daemon-k6s9z"
Apr 16 18:17:45.324725 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:45.324354 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:45.324725 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:45.324414 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs podName:e2bb1680-b343-4014-bde1-6cc6bcd9805c nodeName:}" failed. No retries permitted until 2026-04-16 18:17:53.324396225 +0000 UTC m=+18.171556708 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs") pod "network-metrics-daemon-k6s9z" (UID: "e2bb1680-b343-4014-bde1-6cc6bcd9805c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:45.424984 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:45.424939 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fgjj\" (UniqueName: \"kubernetes.io/projected/17b7bd66-ef40-4ff5-89de-2c7c3408fdc6-kube-api-access-5fgjj\") pod \"network-check-target-sq4vb\" (UID: \"17b7bd66-ef40-4ff5-89de-2c7c3408fdc6\") " pod="openshift-network-diagnostics/network-check-target-sq4vb"
Apr 16 18:17:45.425145 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:45.425074 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:17:45.425145 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:45.425096 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:17:45.425145 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:45.425105 2579 projected.go:194] Error preparing data for projected volume kube-api-access-5fgjj for pod openshift-network-diagnostics/network-check-target-sq4vb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:45.425305 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:45.425157 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b7bd66-ef40-4ff5-89de-2c7c3408fdc6-kube-api-access-5fgjj podName:17b7bd66-ef40-4ff5-89de-2c7c3408fdc6 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:53.425139117 +0000 UTC m=+18.272299586 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-5fgjj" (UniqueName: "kubernetes.io/projected/17b7bd66-ef40-4ff5-89de-2c7c3408fdc6-kube-api-access-5fgjj") pod "network-check-target-sq4vb" (UID: "17b7bd66-ef40-4ff5-89de-2c7c3408fdc6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:45.626647 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:45.626143 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/530ccfcb-dfe5-440b-8750-09dd186b8702-original-pull-secret\") pod \"global-pull-secret-syncer-6zhb2\" (UID: \"530ccfcb-dfe5-440b-8750-09dd186b8702\") " pod="kube-system/global-pull-secret-syncer-6zhb2"
Apr 16 18:17:45.626647 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:45.626328 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:17:45.626647 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:45.626403 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/530ccfcb-dfe5-440b-8750-09dd186b8702-original-pull-secret podName:530ccfcb-dfe5-440b-8750-09dd186b8702 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:53.626383215 +0000 UTC m=+18.473543688 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/530ccfcb-dfe5-440b-8750-09dd186b8702-original-pull-secret") pod "global-pull-secret-syncer-6zhb2" (UID: "530ccfcb-dfe5-440b-8750-09dd186b8702") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:17:45.751737 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:45.751703 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zhb2"
Apr 16 18:17:45.751896 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:45.751810 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zhb2" podUID="530ccfcb-dfe5-440b-8750-09dd186b8702"
Apr 16 18:17:45.752179 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:45.752022 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6s9z"
Apr 16 18:17:45.752179 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:45.752136 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6s9z" podUID="e2bb1680-b343-4014-bde1-6cc6bcd9805c"
Apr 16 18:17:45.752179 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:45.752167 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq4vb"
Apr 16 18:17:45.752404 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:45.752263 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq4vb" podUID="17b7bd66-ef40-4ff5-89de-2c7c3408fdc6"
Apr 16 18:17:47.750882 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:47.750847 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6s9z"
Apr 16 18:17:47.751356 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:47.751022 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6s9z" podUID="e2bb1680-b343-4014-bde1-6cc6bcd9805c"
Apr 16 18:17:47.751356 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:47.751082 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq4vb"
Apr 16 18:17:47.751356 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:47.751082 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zhb2"
Apr 16 18:17:47.751356 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:47.751190 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq4vb" podUID="17b7bd66-ef40-4ff5-89de-2c7c3408fdc6"
Apr 16 18:17:47.751356 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:47.751301 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zhb2" podUID="530ccfcb-dfe5-440b-8750-09dd186b8702"
Apr 16 18:17:49.750144 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:49.750112 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zhb2"
Apr 16 18:17:49.750576 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:49.750112 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq4vb"
Apr 16 18:17:49.750576 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:49.750229 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zhb2" podUID="530ccfcb-dfe5-440b-8750-09dd186b8702"
Apr 16 18:17:49.750576 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:49.750346 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq4vb" podUID="17b7bd66-ef40-4ff5-89de-2c7c3408fdc6"
Apr 16 18:17:49.750576 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:49.750389 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6s9z"
Apr 16 18:17:49.750576 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:49.750493 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6s9z" podUID="e2bb1680-b343-4014-bde1-6cc6bcd9805c"
Apr 16 18:17:51.750693 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:51.750651 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zhb2"
Apr 16 18:17:51.751178 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:51.750651 2579 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq4vb" Apr 16 18:17:51.751178 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:51.750776 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zhb2" podUID="530ccfcb-dfe5-440b-8750-09dd186b8702" Apr 16 18:17:51.751178 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:51.750651 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6s9z" Apr 16 18:17:51.751178 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:51.750872 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq4vb" podUID="17b7bd66-ef40-4ff5-89de-2c7c3408fdc6" Apr 16 18:17:51.751178 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:51.750985 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k6s9z" podUID="e2bb1680-b343-4014-bde1-6cc6bcd9805c" Apr 16 18:17:53.385702 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:53.385670 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs\") pod \"network-metrics-daemon-k6s9z\" (UID: \"e2bb1680-b343-4014-bde1-6cc6bcd9805c\") " pod="openshift-multus/network-metrics-daemon-k6s9z" Apr 16 18:17:53.386129 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:53.385831 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:53.386129 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:53.385907 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs podName:e2bb1680-b343-4014-bde1-6cc6bcd9805c nodeName:}" failed. No retries permitted until 2026-04-16 18:18:09.385886886 +0000 UTC m=+34.233047364 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs") pod "network-metrics-daemon-k6s9z" (UID: "e2bb1680-b343-4014-bde1-6cc6bcd9805c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:53.486457 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:53.486422 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fgjj\" (UniqueName: \"kubernetes.io/projected/17b7bd66-ef40-4ff5-89de-2c7c3408fdc6-kube-api-access-5fgjj\") pod \"network-check-target-sq4vb\" (UID: \"17b7bd66-ef40-4ff5-89de-2c7c3408fdc6\") " pod="openshift-network-diagnostics/network-check-target-sq4vb" Apr 16 18:17:53.486589 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:53.486563 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:17:53.486589 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:53.486577 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:17:53.486589 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:53.486585 2579 projected.go:194] Error preparing data for projected volume kube-api-access-5fgjj for pod openshift-network-diagnostics/network-check-target-sq4vb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:53.486718 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:53.486636 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b7bd66-ef40-4ff5-89de-2c7c3408fdc6-kube-api-access-5fgjj podName:17b7bd66-ef40-4ff5-89de-2c7c3408fdc6 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:18:09.486620363 +0000 UTC m=+34.333780833 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-5fgjj" (UniqueName: "kubernetes.io/projected/17b7bd66-ef40-4ff5-89de-2c7c3408fdc6-kube-api-access-5fgjj") pod "network-check-target-sq4vb" (UID: "17b7bd66-ef40-4ff5-89de-2c7c3408fdc6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:53.687766 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:53.687693 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/530ccfcb-dfe5-440b-8750-09dd186b8702-original-pull-secret\") pod \"global-pull-secret-syncer-6zhb2\" (UID: \"530ccfcb-dfe5-440b-8750-09dd186b8702\") " pod="kube-system/global-pull-secret-syncer-6zhb2" Apr 16 18:17:53.687902 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:53.687830 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:53.687902 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:53.687889 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/530ccfcb-dfe5-440b-8750-09dd186b8702-original-pull-secret podName:530ccfcb-dfe5-440b-8750-09dd186b8702 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:09.687872559 +0000 UTC m=+34.535033028 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/530ccfcb-dfe5-440b-8750-09dd186b8702-original-pull-secret") pod "global-pull-secret-syncer-6zhb2" (UID: "530ccfcb-dfe5-440b-8750-09dd186b8702") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:53.751032 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:53.751004 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zhb2" Apr 16 18:17:53.751032 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:53.751019 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq4vb" Apr 16 18:17:53.751250 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:53.751016 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6s9z" Apr 16 18:17:53.751250 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:53.751115 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zhb2" podUID="530ccfcb-dfe5-440b-8750-09dd186b8702" Apr 16 18:17:53.751250 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:53.751221 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6s9z" podUID="e2bb1680-b343-4014-bde1-6cc6bcd9805c" Apr 16 18:17:53.751424 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:53.751316 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sq4vb" podUID="17b7bd66-ef40-4ff5-89de-2c7c3408fdc6" Apr 16 18:17:55.751498 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:55.751336 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zhb2" Apr 16 18:17:55.752031 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:55.751422 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6s9z" Apr 16 18:17:55.752031 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:55.751583 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zhb2" podUID="530ccfcb-dfe5-440b-8750-09dd186b8702" Apr 16 18:17:55.752031 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:55.751705 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6s9z" podUID="e2bb1680-b343-4014-bde1-6cc6bcd9805c" Apr 16 18:17:55.752031 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:55.751449 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq4vb" Apr 16 18:17:55.752031 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:55.751798 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq4vb" podUID="17b7bd66-ef40-4ff5-89de-2c7c3408fdc6" Apr 16 18:17:55.829084 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:55.829047 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j" event={"ID":"140c4c84-432b-4ef8-b103-3d2b0694f222","Type":"ContainerStarted","Data":"4eccd1c09252b0c47c0567377bdced42a6e31b07b97a720618215fd322c826b5"} Apr 16 18:17:55.830477 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:55.830437 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l4nhg" event={"ID":"2b843015-ad80-4dc0-aad1-a22e5a3909f6","Type":"ContainerStarted","Data":"a0e8c3ec2f09ae219b38aa0efa4bc61ec40706336a29e338b72f981978436ba1"} Apr 16 18:17:55.831906 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:55.831876 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mtzwt" event={"ID":"5d102faf-ea74-4afa-95c1-4133f4d71f8b","Type":"ContainerStarted","Data":"636c33ca655cfa9e247e24cb74b46259893a5c4e19a4af17e69eeb3015b992c8"} Apr 16 18:17:55.834043 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:55.833613 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-g8hdh" event={"ID":"4dbfa418-d2c8-46b1-82c2-404a333f8ad9","Type":"ContainerStarted","Data":"d576e75bd2388a2b0364de1f3616844a9ed44c7d0a5786b79dac300d5db2c1c9"} Apr 16 18:17:55.839643 ip-10-0-134-167 kubenswrapper[2579]: 
I0416 18:17:55.839611 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mt4jc" event={"ID":"9e454ceb-b1f7-44d0-899f-7d0a1be98b35","Type":"ContainerStarted","Data":"1042e23e9065181a0f663c9daf8e7719630714c599462ac320f865e768cef356"} Apr 16 18:17:55.841591 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:55.841573 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/0.log" Apr 16 18:17:55.841894 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:55.841876 2579 generic.go:358] "Generic (PLEG): container finished" podID="e302771b-8af3-42dd-92e1-04faaff1c6e8" containerID="4f3fae68e07cb565982a22cfbf11a1f52bdeb8f6bf1a4a2ef83bd871abf0e561" exitCode=1 Apr 16 18:17:55.841968 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:55.841923 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-md54p" event={"ID":"e302771b-8af3-42dd-92e1-04faaff1c6e8","Type":"ContainerStarted","Data":"c57dfbc2695f89bce2830e70155c67cd210657169cebe0b375433523f6259d59"} Apr 16 18:17:55.841968 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:55.841939 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-md54p" event={"ID":"e302771b-8af3-42dd-92e1-04faaff1c6e8","Type":"ContainerStarted","Data":"099aab9a95ed87b4be89d255b347f38440aeb79ca7c7848d21e77fbbb8ba97de"} Apr 16 18:17:55.841968 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:55.841948 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-md54p" event={"ID":"e302771b-8af3-42dd-92e1-04faaff1c6e8","Type":"ContainerDied","Data":"4f3fae68e07cb565982a22cfbf11a1f52bdeb8f6bf1a4a2ef83bd871abf0e561"} Apr 16 18:17:55.841968 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:55.841958 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-md54p" 
event={"ID":"e302771b-8af3-42dd-92e1-04faaff1c6e8","Type":"ContainerStarted","Data":"d6b50ca640bc97a0b23a129b769ad8b69425789172701d7d21dd3ec140e30e35"} Apr 16 18:17:55.843091 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:55.843070 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q8spv" event={"ID":"3ea5a78e-da64-4bee-b206-3f22bfd07fbc","Type":"ContainerStarted","Data":"6fb7f83bea4b034733a6b12b927ec66731189ef70b203cd01ef74dec950dc22c"} Apr 16 18:17:55.844683 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:55.844664 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nw6hr" event={"ID":"e6e3d626-b1d8-4140-83e2-92db90a4eae4","Type":"ContainerStarted","Data":"1ade3d6895bcb239826e1e9501791ee44a2086ee76d6170f7aa4b2c22ac9066c"} Apr 16 18:17:55.847186 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:55.847138 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-l4nhg" podStartSLOduration=8.679527059 podStartE2EDuration="20.847122568s" podCreationTimestamp="2026-04-16 18:17:35 +0000 UTC" firstStartedPulling="2026-04-16 18:17:38.450981121 +0000 UTC m=+3.298141603" lastFinishedPulling="2026-04-16 18:17:50.618576628 +0000 UTC m=+15.465737112" observedRunningTime="2026-04-16 18:17:55.846155413 +0000 UTC m=+20.693315925" watchObservedRunningTime="2026-04-16 18:17:55.847122568 +0000 UTC m=+20.694283061" Apr 16 18:17:55.847315 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:55.847233 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-167.ec2.internal" podStartSLOduration=18.847226916 podStartE2EDuration="18.847226916s" podCreationTimestamp="2026-04-16 18:17:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:17:40.833392317 +0000 
UTC m=+5.680552811" watchObservedRunningTime="2026-04-16 18:17:55.847226916 +0000 UTC m=+20.694387408" Apr 16 18:17:55.859918 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:55.859876 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-mt4jc" podStartSLOduration=4.076980511 podStartE2EDuration="20.859866042s" podCreationTimestamp="2026-04-16 18:17:35 +0000 UTC" firstStartedPulling="2026-04-16 18:17:38.439434803 +0000 UTC m=+3.286595275" lastFinishedPulling="2026-04-16 18:17:55.222320327 +0000 UTC m=+20.069480806" observedRunningTime="2026-04-16 18:17:55.859794922 +0000 UTC m=+20.706955431" watchObservedRunningTime="2026-04-16 18:17:55.859866042 +0000 UTC m=+20.707026533" Apr 16 18:17:55.894747 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:55.894572 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-q8spv" podStartSLOduration=3.122836321 podStartE2EDuration="19.894552502s" podCreationTimestamp="2026-04-16 18:17:36 +0000 UTC" firstStartedPulling="2026-04-16 18:17:38.450832971 +0000 UTC m=+3.297993453" lastFinishedPulling="2026-04-16 18:17:55.222549156 +0000 UTC m=+20.069709634" observedRunningTime="2026-04-16 18:17:55.894205493 +0000 UTC m=+20.741365995" watchObservedRunningTime="2026-04-16 18:17:55.894552502 +0000 UTC m=+20.741712994" Apr 16 18:17:55.909820 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:55.909784 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mtzwt" podStartSLOduration=4.0952317560000004 podStartE2EDuration="20.909771761s" podCreationTimestamp="2026-04-16 18:17:35 +0000 UTC" firstStartedPulling="2026-04-16 18:17:38.444653251 +0000 UTC m=+3.291813721" lastFinishedPulling="2026-04-16 18:17:55.259193252 +0000 UTC m=+20.106353726" observedRunningTime="2026-04-16 18:17:55.909691456 +0000 UTC m=+20.756851948" watchObservedRunningTime="2026-04-16 18:17:55.909771761 +0000 UTC 
m=+20.756932253" Apr 16 18:17:56.807583 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:56.807545 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 18:17:56.849525 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:56.849496 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/0.log" Apr 16 18:17:56.849867 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:56.849844 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-md54p" event={"ID":"e302771b-8af3-42dd-92e1-04faaff1c6e8","Type":"ContainerStarted","Data":"30f94a260c3b3a907083ae722b98a989942f1abeb6a2725df3c12b03cbbe47c2"} Apr 16 18:17:56.849945 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:56.849879 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-md54p" event={"ID":"e302771b-8af3-42dd-92e1-04faaff1c6e8","Type":"ContainerStarted","Data":"6756d31404251e928a6dad0031ce0c7942117093893342fff40f7b54e60a66bb"} Apr 16 18:17:56.851182 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:56.851150 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7b4wl" event={"ID":"eec98933-e01a-4818-94e6-b087062491fa","Type":"ContainerStarted","Data":"0a569b6ebc367fe83d1638986d35d06d8362cb027ccc976c4120fdb217188580"} Apr 16 18:17:56.852583 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:56.852563 2579 generic.go:358] "Generic (PLEG): container finished" podID="e6e3d626-b1d8-4140-83e2-92db90a4eae4" containerID="1ade3d6895bcb239826e1e9501791ee44a2086ee76d6170f7aa4b2c22ac9066c" exitCode=0 Apr 16 18:17:56.852674 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:56.852621 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-nw6hr" event={"ID":"e6e3d626-b1d8-4140-83e2-92db90a4eae4","Type":"ContainerDied","Data":"1ade3d6895bcb239826e1e9501791ee44a2086ee76d6170f7aa4b2c22ac9066c"} Apr 16 18:17:56.854192 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:56.854122 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j" event={"ID":"140c4c84-432b-4ef8-b103-3d2b0694f222","Type":"ContainerStarted","Data":"054ea21f3be2924e68c22a192c5a70f3c5ca9113df006a24a3f87a2dd45d0bc4"} Apr 16 18:17:56.869469 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:56.869424 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-7b4wl" podStartSLOduration=5.088889445 podStartE2EDuration="21.869409317s" podCreationTimestamp="2026-04-16 18:17:35 +0000 UTC" firstStartedPulling="2026-04-16 18:17:38.441766634 +0000 UTC m=+3.288927110" lastFinishedPulling="2026-04-16 18:17:55.2222865 +0000 UTC m=+20.069446982" observedRunningTime="2026-04-16 18:17:56.868533364 +0000 UTC m=+21.715693854" watchObservedRunningTime="2026-04-16 18:17:56.869409317 +0000 UTC m=+21.716569809" Apr 16 18:17:56.870195 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:56.870162 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-g8hdh" podStartSLOduration=4.089553193 podStartE2EDuration="20.870153319s" podCreationTimestamp="2026-04-16 18:17:36 +0000 UTC" firstStartedPulling="2026-04-16 18:17:38.44162787 +0000 UTC m=+3.288788342" lastFinishedPulling="2026-04-16 18:17:55.222227986 +0000 UTC m=+20.069388468" observedRunningTime="2026-04-16 18:17:55.923427485 +0000 UTC m=+20.770588000" watchObservedRunningTime="2026-04-16 18:17:56.870153319 +0000 UTC m=+21.717313813" Apr 16 18:17:57.717607 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:57.717504 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin 
started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:17:56.807565647Z","UUID":"3cae1aaa-bbfd-41ce-b5b7-5d2d5c294c7f","Handler":null,"Name":"","Endpoint":""} Apr 16 18:17:57.720525 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:57.720492 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 18:17:57.720525 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:57.720522 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 18:17:57.750201 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:57.750143 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq4vb" Apr 16 18:17:57.750201 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:57.750157 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6s9z" Apr 16 18:17:57.750201 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:57.750167 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zhb2" Apr 16 18:17:57.750428 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:57.750257 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sq4vb" podUID="17b7bd66-ef40-4ff5-89de-2c7c3408fdc6" Apr 16 18:17:57.750704 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:57.750682 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6s9z" podUID="e2bb1680-b343-4014-bde1-6cc6bcd9805c" Apr 16 18:17:57.750789 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:57.750770 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zhb2" podUID="530ccfcb-dfe5-440b-8750-09dd186b8702" Apr 16 18:17:58.859906 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:58.859881 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/0.log" Apr 16 18:17:58.860358 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:58.860252 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-md54p" event={"ID":"e302771b-8af3-42dd-92e1-04faaff1c6e8","Type":"ContainerStarted","Data":"9923982b6cb39b2ad067b2cd6cebd8df8919b4b3b38e612cf058933ddd41621c"} Apr 16 18:17:58.862332 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:58.862293 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j" 
event={"ID":"140c4c84-432b-4ef8-b103-3d2b0694f222","Type":"ContainerStarted","Data":"208177453d640d7da88df1ad7572f5fb36e87773f318f9e8cecb9df6d2faaa88"} Apr 16 18:17:58.881893 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:58.881857 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8hv5j" podStartSLOduration=4.308999645 podStartE2EDuration="23.881845068s" podCreationTimestamp="2026-04-16 18:17:35 +0000 UTC" firstStartedPulling="2026-04-16 18:17:38.448630564 +0000 UTC m=+3.295791049" lastFinishedPulling="2026-04-16 18:17:58.021475988 +0000 UTC m=+22.868636472" observedRunningTime="2026-04-16 18:17:58.881754386 +0000 UTC m=+23.728914879" watchObservedRunningTime="2026-04-16 18:17:58.881845068 +0000 UTC m=+23.729005560" Apr 16 18:17:59.750294 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:59.750119 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6s9z" Apr 16 18:17:59.750436 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:59.750118 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq4vb" Apr 16 18:17:59.750436 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:59.750378 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6s9z" podUID="e2bb1680-b343-4014-bde1-6cc6bcd9805c" Apr 16 18:17:59.750436 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:17:59.750171 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zhb2" Apr 16 18:17:59.750592 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:59.750425 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq4vb" podUID="17b7bd66-ef40-4ff5-89de-2c7c3408fdc6" Apr 16 18:17:59.750592 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:17:59.750533 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zhb2" podUID="530ccfcb-dfe5-440b-8750-09dd186b8702" Apr 16 18:18:00.445168 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:00.445136 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-q8spv" Apr 16 18:18:00.445789 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:00.445691 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-q8spv" Apr 16 18:18:00.869011 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:00.868984 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/0.log" Apr 16 18:18:00.869409 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:00.869383 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-md54p" 
event={"ID":"e302771b-8af3-42dd-92e1-04faaff1c6e8","Type":"ContainerStarted","Data":"11069fc4e31e620519348db3d3e169f6215f75876a288f7e33b5994556c64c70"} Apr 16 18:18:00.869813 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:00.869761 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:18:00.869923 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:00.869903 2579 scope.go:117] "RemoveContainer" containerID="4f3fae68e07cb565982a22cfbf11a1f52bdeb8f6bf1a4a2ef83bd871abf0e561" Apr 16 18:18:00.871544 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:00.871383 2579 generic.go:358] "Generic (PLEG): container finished" podID="e6e3d626-b1d8-4140-83e2-92db90a4eae4" containerID="86e8a7fa9e87953a351ab7aecd2dfca94d756d348f454c364591a3c639c5126c" exitCode=0 Apr 16 18:18:00.871699 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:00.871476 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nw6hr" event={"ID":"e6e3d626-b1d8-4140-83e2-92db90a4eae4","Type":"ContainerDied","Data":"86e8a7fa9e87953a351ab7aecd2dfca94d756d348f454c364591a3c639c5126c"} Apr 16 18:18:00.871839 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:00.871813 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-q8spv" Apr 16 18:18:00.872412 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:00.872367 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-q8spv" Apr 16 18:18:00.886790 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:00.886769 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:18:01.750711 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:01.750687 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zhb2" Apr 16 18:18:01.751004 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:01.750690 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq4vb" Apr 16 18:18:01.751004 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:01.750809 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zhb2" podUID="530ccfcb-dfe5-440b-8750-09dd186b8702" Apr 16 18:18:01.751004 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:01.750690 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6s9z" Apr 16 18:18:01.751004 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:01.750886 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq4vb" podUID="17b7bd66-ef40-4ff5-89de-2c7c3408fdc6" Apr 16 18:18:01.751004 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:01.750963 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k6s9z" podUID="e2bb1680-b343-4014-bde1-6cc6bcd9805c" Apr 16 18:18:01.876098 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:01.876032 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/0.log" Apr 16 18:18:01.876434 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:01.876401 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-md54p" event={"ID":"e302771b-8af3-42dd-92e1-04faaff1c6e8","Type":"ContainerStarted","Data":"15a99f56448474ad1e51d35af1a703f7fa3c6bd7d8925693c925d1ffd58461b5"} Apr 16 18:18:01.876770 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:01.876752 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:18:01.876869 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:01.876781 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:18:01.878385 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:01.878363 2579 generic.go:358] "Generic (PLEG): container finished" podID="e6e3d626-b1d8-4140-83e2-92db90a4eae4" containerID="eb93d485966432b2d569eb4b626b2a51dd46b03eebf961bc3d02863170ae250c" exitCode=0 Apr 16 18:18:01.878475 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:01.878445 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nw6hr" event={"ID":"e6e3d626-b1d8-4140-83e2-92db90a4eae4","Type":"ContainerDied","Data":"eb93d485966432b2d569eb4b626b2a51dd46b03eebf961bc3d02863170ae250c"} Apr 16 18:18:01.889810 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:01.889792 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-md54p" Apr 16 18:18:01.907676 ip-10-0-134-167 kubenswrapper[2579]: I0416 
18:18:01.907619 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-md54p" podStartSLOduration=8.900201301 podStartE2EDuration="25.907601477s" podCreationTimestamp="2026-04-16 18:17:36 +0000 UTC" firstStartedPulling="2026-04-16 18:17:38.450806086 +0000 UTC m=+3.297966570" lastFinishedPulling="2026-04-16 18:17:55.458206255 +0000 UTC m=+20.305366746" observedRunningTime="2026-04-16 18:18:01.906056542 +0000 UTC m=+26.753217035" watchObservedRunningTime="2026-04-16 18:18:01.907601477 +0000 UTC m=+26.754762172" Apr 16 18:18:02.312621 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:02.312545 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6zhb2"] Apr 16 18:18:02.312779 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:02.312628 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zhb2" Apr 16 18:18:02.312779 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:02.312701 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zhb2" podUID="530ccfcb-dfe5-440b-8750-09dd186b8702" Apr 16 18:18:02.315123 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:02.315092 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sq4vb"] Apr 16 18:18:02.315242 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:02.315161 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq4vb" Apr 16 18:18:02.315315 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:02.315239 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq4vb" podUID="17b7bd66-ef40-4ff5-89de-2c7c3408fdc6" Apr 16 18:18:02.318002 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:02.317976 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-k6s9z"] Apr 16 18:18:02.318120 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:02.318071 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6s9z" Apr 16 18:18:02.318193 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:02.318173 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k6s9z" podUID="e2bb1680-b343-4014-bde1-6cc6bcd9805c" Apr 16 18:18:02.881882 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:02.881852 2579 generic.go:358] "Generic (PLEG): container finished" podID="e6e3d626-b1d8-4140-83e2-92db90a4eae4" containerID="ab561f400dad7d88693719b1480df94ac0a646322b49799a04428f95316d72a8" exitCode=0 Apr 16 18:18:02.882229 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:02.881887 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nw6hr" event={"ID":"e6e3d626-b1d8-4140-83e2-92db90a4eae4","Type":"ContainerDied","Data":"ab561f400dad7d88693719b1480df94ac0a646322b49799a04428f95316d72a8"} Apr 16 18:18:03.750651 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:03.750622 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6s9z" Apr 16 18:18:03.750852 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:03.750733 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6s9z" podUID="e2bb1680-b343-4014-bde1-6cc6bcd9805c" Apr 16 18:18:03.750852 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:03.750742 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq4vb" Apr 16 18:18:03.750852 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:03.750773 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zhb2" Apr 16 18:18:03.750852 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:03.750828 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq4vb" podUID="17b7bd66-ef40-4ff5-89de-2c7c3408fdc6" Apr 16 18:18:03.751048 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:03.750899 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zhb2" podUID="530ccfcb-dfe5-440b-8750-09dd186b8702" Apr 16 18:18:05.751505 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:05.751469 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zhb2" Apr 16 18:18:05.752044 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:05.751481 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6s9z" Apr 16 18:18:05.752044 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:05.751585 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq4vb" Apr 16 18:18:05.752044 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:05.751592 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6zhb2" podUID="530ccfcb-dfe5-440b-8750-09dd186b8702" Apr 16 18:18:05.752044 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:05.751679 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6s9z" podUID="e2bb1680-b343-4014-bde1-6cc6bcd9805c" Apr 16 18:18:05.752044 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:05.751758 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq4vb" podUID="17b7bd66-ef40-4ff5-89de-2c7c3408fdc6" Apr 16 18:18:07.750848 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:07.750643 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq4vb" Apr 16 18:18:07.751290 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:07.750643 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6s9z" Apr 16 18:18:07.751290 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:07.750922 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq4vb" podUID="17b7bd66-ef40-4ff5-89de-2c7c3408fdc6" Apr 16 18:18:07.751290 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:07.750653 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zhb2" Apr 16 18:18:07.751290 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:07.750991 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6s9z" podUID="e2bb1680-b343-4014-bde1-6cc6bcd9805c" Apr 16 18:18:07.751290 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:07.751101 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6zhb2" podUID="530ccfcb-dfe5-440b-8750-09dd186b8702" Apr 16 18:18:08.503074 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.503048 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-167.ec2.internal" event="NodeReady" Apr 16 18:18:08.503287 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.503150 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 18:18:08.539792 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.539764 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-574bddd967-6xntm"] Apr 16 18:18:08.559537 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.559512 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-xcpp6"] Apr 16 18:18:08.559678 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.559662 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-574bddd967-6xntm" Apr 16 18:18:08.562125 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.562067 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 18:18:08.562362 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.562204 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 18:18:08.563326 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.562935 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-vbnkr\"" Apr 16 18:18:08.563326 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.563021 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 18:18:08.563326 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.563112 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 18:18:08.577522 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.577502 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-669d6599bc-t8sg5"] Apr 16 18:18:08.577663 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.577645 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-xcpp6" Apr 16 18:18:08.579953 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.579930 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 18:18:08.580042 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.579943 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-jxdfl\"" Apr 16 18:18:08.580042 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.580028 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 18:18:08.596842 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.596824 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb"] Apr 16 18:18:08.596930 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.596911 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669d6599bc-t8sg5" Apr 16 18:18:08.599030 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.599013 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 18:18:08.611415 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.611397 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-67744d6d54-rrhs6"] Apr 16 18:18:08.611526 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.611513 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" Apr 16 18:18:08.613848 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.613824 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 18:18:08.613848 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.613835 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 18:18:08.613995 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.613935 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 18:18:08.614076 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.614063 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 18:18:08.620886 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.620870 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-574bddd967-6xntm"] Apr 16 18:18:08.620965 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.620892 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-db7l8"] Apr 16 18:18:08.621022 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.621010 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:08.623184 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.623168 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 18:18:08.623284 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.623172 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 18:18:08.623284 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.623205 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-sb5br\"" Apr 16 18:18:08.623471 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.623459 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 18:18:08.630999 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.630979 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 18:18:08.636234 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.636212 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-g7fm9"] Apr 16 18:18:08.636342 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.636332 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-db7l8" Apr 16 18:18:08.638575 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.638555 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 18:18:08.638662 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.638579 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 18:18:08.638662 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.638611 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8xm9h\"" Apr 16 18:18:08.652999 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.652935 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-xcpp6"] Apr 16 18:18:08.652999 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.652963 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-669d6599bc-t8sg5"] Apr 16 18:18:08.652999 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.652977 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb"] Apr 16 18:18:08.652999 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.652990 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-67744d6d54-rrhs6"] Apr 16 18:18:08.653233 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.653006 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-db7l8"] Apr 16 18:18:08.653233 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.653019 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-g7fm9"] Apr 16 18:18:08.653233 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.653070 2579 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-g7fm9" Apr 16 18:18:08.655478 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.655458 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 18:18:08.655583 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.655548 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rxx2m\"" Apr 16 18:18:08.655675 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.655662 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 18:18:08.655743 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.655682 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 18:18:08.707686 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.707667 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/832eb3b9-2a4f-434a-b186-de137df079bb-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5d9c75dd55-f2hwb\" (UID: \"832eb3b9-2a4f-434a-b186-de137df079bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" Apr 16 18:18:08.707793 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.707694 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:08.707793 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.707715 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmnn7\" (UniqueName: \"kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-kube-api-access-mmnn7\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:08.707793 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.707734 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/832eb3b9-2a4f-434a-b186-de137df079bb-ca\") pod \"cluster-proxy-proxy-agent-5d9c75dd55-f2hwb\" (UID: \"832eb3b9-2a4f-434a-b186-de137df079bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" Apr 16 18:18:08.707793 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.707749 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/832eb3b9-2a4f-434a-b186-de137df079bb-hub\") pod \"cluster-proxy-proxy-agent-5d9c75dd55-f2hwb\" (UID: \"832eb3b9-2a4f-434a-b186-de137df079bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" Apr 16 18:18:08.707965 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.707823 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e8a62d11-c318-4e87-bc46-0ef49c451c49-ca-trust-extracted\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:08.707965 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.707847 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-certificates\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:08.707965 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.707863 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e377b43d-00d0-4346-bd5a-c60942b18f82-tmp\") pod \"klusterlet-addon-workmgr-669d6599bc-t8sg5\" (UID: \"e377b43d-00d0-4346-bd5a-c60942b18f82\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669d6599bc-t8sg5" Apr 16 18:18:08.707965 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.707880 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/832eb3b9-2a4f-434a-b186-de137df079bb-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5d9c75dd55-f2hwb\" (UID: \"832eb3b9-2a4f-434a-b186-de137df079bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" Apr 16 18:18:08.707965 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.707909 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0464fc65-3816-4d16-840a-ffda5744de6c-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-xcpp6\" (UID: \"0464fc65-3816-4d16-840a-ffda5744de6c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-xcpp6" Apr 16 18:18:08.708215 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.708015 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e8a62d11-c318-4e87-bc46-0ef49c451c49-image-registry-private-configuration\") pod 
\"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:08.708215 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.708050 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/832eb3b9-2a4f-434a-b186-de137df079bb-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5d9c75dd55-f2hwb\" (UID: \"832eb3b9-2a4f-434a-b186-de137df079bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" Apr 16 18:18:08.708215 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.708083 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkqjk\" (UniqueName: \"kubernetes.io/projected/832eb3b9-2a4f-434a-b186-de137df079bb-kube-api-access-gkqjk\") pod \"cluster-proxy-proxy-agent-5d9c75dd55-f2hwb\" (UID: \"832eb3b9-2a4f-434a-b186-de137df079bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" Apr 16 18:18:08.708215 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.708112 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/eb1d7266-b90b-456f-9e72-104117e32970-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-574bddd967-6xntm\" (UID: \"eb1d7266-b90b-456f-9e72-104117e32970\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-574bddd967-6xntm" Apr 16 18:18:08.708215 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.708188 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e8a62d11-c318-4e87-bc46-0ef49c451c49-trusted-ca\") pod \"image-registry-67744d6d54-rrhs6\" (UID: 
\"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:08.708500 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.708219 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e377b43d-00d0-4346-bd5a-c60942b18f82-klusterlet-config\") pod \"klusterlet-addon-workmgr-669d6599bc-t8sg5\" (UID: \"e377b43d-00d0-4346-bd5a-c60942b18f82\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669d6599bc-t8sg5" Apr 16 18:18:08.708500 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.708239 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dbkz\" (UniqueName: \"kubernetes.io/projected/e377b43d-00d0-4346-bd5a-c60942b18f82-kube-api-access-8dbkz\") pod \"klusterlet-addon-workmgr-669d6599bc-t8sg5\" (UID: \"e377b43d-00d0-4346-bd5a-c60942b18f82\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669d6599bc-t8sg5" Apr 16 18:18:08.708500 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.708285 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e8a62d11-c318-4e87-bc46-0ef49c451c49-installation-pull-secrets\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:08.708500 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.708308 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-bound-sa-token\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" 
Apr 16 18:18:08.708500 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.708339 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-xcpp6\" (UID: \"0464fc65-3816-4d16-840a-ffda5744de6c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-xcpp6" Apr 16 18:18:08.708500 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.708365 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhftf\" (UniqueName: \"kubernetes.io/projected/eb1d7266-b90b-456f-9e72-104117e32970-kube-api-access-xhftf\") pod \"managed-serviceaccount-addon-agent-574bddd967-6xntm\" (UID: \"eb1d7266-b90b-456f-9e72-104117e32970\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-574bddd967-6xntm" Apr 16 18:18:08.809685 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.809653 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/832eb3b9-2a4f-434a-b186-de137df079bb-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5d9c75dd55-f2hwb\" (UID: \"832eb3b9-2a4f-434a-b186-de137df079bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" Apr 16 18:18:08.810357 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.809691 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e8a62d11-c318-4e87-bc46-0ef49c451c49-ca-trust-extracted\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:08.810357 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.809719 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e377b43d-00d0-4346-bd5a-c60942b18f82-tmp\") pod \"klusterlet-addon-workmgr-669d6599bc-t8sg5\" (UID: \"e377b43d-00d0-4346-bd5a-c60942b18f82\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669d6599bc-t8sg5" Apr 16 18:18:08.810357 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.809753 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e8a62d11-c318-4e87-bc46-0ef49c451c49-image-registry-private-configuration\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:08.810357 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.809780 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-bound-sa-token\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:08.810357 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.809804 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/832eb3b9-2a4f-434a-b186-de137df079bb-ca\") pod \"cluster-proxy-proxy-agent-5d9c75dd55-f2hwb\" (UID: \"832eb3b9-2a4f-434a-b186-de137df079bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" Apr 16 18:18:08.810357 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.809827 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/832eb3b9-2a4f-434a-b186-de137df079bb-hub\") pod \"cluster-proxy-proxy-agent-5d9c75dd55-f2hwb\" (UID: 
\"832eb3b9-2a4f-434a-b186-de137df079bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" Apr 16 18:18:08.810357 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.809855 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/53b4818f-dde2-45a8-a8aa-831951359360-config-volume\") pod \"dns-default-db7l8\" (UID: \"53b4818f-dde2-45a8-a8aa-831951359360\") " pod="openshift-dns/dns-default-db7l8" Apr 16 18:18:08.810357 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.809891 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e377b43d-00d0-4346-bd5a-c60942b18f82-klusterlet-config\") pod \"klusterlet-addon-workmgr-669d6599bc-t8sg5\" (UID: \"e377b43d-00d0-4346-bd5a-c60942b18f82\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669d6599bc-t8sg5" Apr 16 18:18:08.810357 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.809937 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-xcpp6\" (UID: \"0464fc65-3816-4d16-840a-ffda5744de6c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-xcpp6" Apr 16 18:18:08.810357 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.809965 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/53b4818f-dde2-45a8-a8aa-831951359360-tmp-dir\") pod \"dns-default-db7l8\" (UID: \"53b4818f-dde2-45a8-a8aa-831951359360\") " pod="openshift-dns/dns-default-db7l8" Apr 16 18:18:08.810357 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.809989 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:08.810357 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.810016 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b294h\" (UniqueName: \"kubernetes.io/projected/9d741d54-0257-44ff-8680-6b59a49600e3-kube-api-access-b294h\") pod \"ingress-canary-g7fm9\" (UID: \"9d741d54-0257-44ff-8680-6b59a49600e3\") " pod="openshift-ingress-canary/ingress-canary-g7fm9" Apr 16 18:18:08.810357 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.810042 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-certificates\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:08.810357 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.810063 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert\") pod \"ingress-canary-g7fm9\" (UID: \"9d741d54-0257-44ff-8680-6b59a49600e3\") " pod="openshift-ingress-canary/ingress-canary-g7fm9" Apr 16 18:18:08.810357 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.810089 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/832eb3b9-2a4f-434a-b186-de137df079bb-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5d9c75dd55-f2hwb\" (UID: \"832eb3b9-2a4f-434a-b186-de137df079bb\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" Apr 16 18:18:08.810357 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.810098 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e8a62d11-c318-4e87-bc46-0ef49c451c49-ca-trust-extracted\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:08.811301 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.810123 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls\") pod \"dns-default-db7l8\" (UID: \"53b4818f-dde2-45a8-a8aa-831951359360\") " pod="openshift-dns/dns-default-db7l8" Apr 16 18:18:08.811301 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.810151 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0464fc65-3816-4d16-840a-ffda5744de6c-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-xcpp6\" (UID: \"0464fc65-3816-4d16-840a-ffda5744de6c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-xcpp6" Apr 16 18:18:08.811301 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.810200 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e377b43d-00d0-4346-bd5a-c60942b18f82-tmp\") pod \"klusterlet-addon-workmgr-669d6599bc-t8sg5\" (UID: \"e377b43d-00d0-4346-bd5a-c60942b18f82\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669d6599bc-t8sg5" Apr 16 18:18:08.811301 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:08.810344 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 
16 18:18:08.811301 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:08.810362 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67744d6d54-rrhs6: secret "image-registry-tls" not found Apr 16 18:18:08.811301 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:08.810430 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls podName:e8a62d11-c318-4e87-bc46-0ef49c451c49 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:09.310410785 +0000 UTC m=+34.157571300 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls") pod "image-registry-67744d6d54-rrhs6" (UID: "e8a62d11-c318-4e87-bc46-0ef49c451c49") : secret "image-registry-tls" not found Apr 16 18:18:08.811301 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:08.810540 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:18:08.811301 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:08.810605 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert podName:0464fc65-3816-4d16-840a-ffda5744de6c nodeName:}" failed. No retries permitted until 2026-04-16 18:18:09.310589341 +0000 UTC m=+34.157749827 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-xcpp6" (UID: "0464fc65-3816-4d16-840a-ffda5744de6c") : secret "networking-console-plugin-cert" not found Apr 16 18:18:08.811301 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.810781 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-certificates\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:08.811301 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.810827 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/832eb3b9-2a4f-434a-b186-de137df079bb-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5d9c75dd55-f2hwb\" (UID: \"832eb3b9-2a4f-434a-b186-de137df079bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" Apr 16 18:18:08.811301 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.810900 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0464fc65-3816-4d16-840a-ffda5744de6c-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-xcpp6\" (UID: \"0464fc65-3816-4d16-840a-ffda5744de6c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-xcpp6" Apr 16 18:18:08.811301 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.810954 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkqjk\" (UniqueName: \"kubernetes.io/projected/832eb3b9-2a4f-434a-b186-de137df079bb-kube-api-access-gkqjk\") pod 
\"cluster-proxy-proxy-agent-5d9c75dd55-f2hwb\" (UID: \"832eb3b9-2a4f-434a-b186-de137df079bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" Apr 16 18:18:08.811301 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.810958 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/832eb3b9-2a4f-434a-b186-de137df079bb-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5d9c75dd55-f2hwb\" (UID: \"832eb3b9-2a4f-434a-b186-de137df079bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" Apr 16 18:18:08.811301 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.811016 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/eb1d7266-b90b-456f-9e72-104117e32970-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-574bddd967-6xntm\" (UID: \"eb1d7266-b90b-456f-9e72-104117e32970\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-574bddd967-6xntm" Apr 16 18:18:08.811301 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.811085 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e8a62d11-c318-4e87-bc46-0ef49c451c49-installation-pull-secrets\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:08.811301 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.811110 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmnn7\" (UniqueName: \"kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-kube-api-access-mmnn7\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " 
pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:08.812086 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.811138 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e8a62d11-c318-4e87-bc46-0ef49c451c49-trusted-ca\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:08.812086 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.811162 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dbkz\" (UniqueName: \"kubernetes.io/projected/e377b43d-00d0-4346-bd5a-c60942b18f82-kube-api-access-8dbkz\") pod \"klusterlet-addon-workmgr-669d6599bc-t8sg5\" (UID: \"e377b43d-00d0-4346-bd5a-c60942b18f82\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669d6599bc-t8sg5" Apr 16 18:18:08.812086 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.811215 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrwb7\" (UniqueName: \"kubernetes.io/projected/53b4818f-dde2-45a8-a8aa-831951359360-kube-api-access-mrwb7\") pod \"dns-default-db7l8\" (UID: \"53b4818f-dde2-45a8-a8aa-831951359360\") " pod="openshift-dns/dns-default-db7l8" Apr 16 18:18:08.812086 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.811245 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhftf\" (UniqueName: \"kubernetes.io/projected/eb1d7266-b90b-456f-9e72-104117e32970-kube-api-access-xhftf\") pod \"managed-serviceaccount-addon-agent-574bddd967-6xntm\" (UID: \"eb1d7266-b90b-456f-9e72-104117e32970\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-574bddd967-6xntm" Apr 16 18:18:08.815582 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.815506 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/832eb3b9-2a4f-434a-b186-de137df079bb-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5d9c75dd55-f2hwb\" (UID: \"832eb3b9-2a4f-434a-b186-de137df079bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" Apr 16 18:18:08.816019 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.815890 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/832eb3b9-2a4f-434a-b186-de137df079bb-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5d9c75dd55-f2hwb\" (UID: \"832eb3b9-2a4f-434a-b186-de137df079bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" Apr 16 18:18:08.816019 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.815952 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/832eb3b9-2a4f-434a-b186-de137df079bb-hub\") pod \"cluster-proxy-proxy-agent-5d9c75dd55-f2hwb\" (UID: \"832eb3b9-2a4f-434a-b186-de137df079bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" Apr 16 18:18:08.816178 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.816052 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e8a62d11-c318-4e87-bc46-0ef49c451c49-image-registry-private-configuration\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:08.816178 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.816072 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e8a62d11-c318-4e87-bc46-0ef49c451c49-installation-pull-secrets\") pod 
\"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:08.816178 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.816067 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e377b43d-00d0-4346-bd5a-c60942b18f82-klusterlet-config\") pod \"klusterlet-addon-workmgr-669d6599bc-t8sg5\" (UID: \"e377b43d-00d0-4346-bd5a-c60942b18f82\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669d6599bc-t8sg5" Apr 16 18:18:08.816621 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.816597 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e8a62d11-c318-4e87-bc46-0ef49c451c49-trusted-ca\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:08.816721 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.816675 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/eb1d7266-b90b-456f-9e72-104117e32970-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-574bddd967-6xntm\" (UID: \"eb1d7266-b90b-456f-9e72-104117e32970\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-574bddd967-6xntm" Apr 16 18:18:08.818343 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.818324 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-bound-sa-token\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:08.820433 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.820410 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmnn7\" (UniqueName: \"kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-kube-api-access-mmnn7\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:08.820735 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.820714 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhftf\" (UniqueName: \"kubernetes.io/projected/eb1d7266-b90b-456f-9e72-104117e32970-kube-api-access-xhftf\") pod \"managed-serviceaccount-addon-agent-574bddd967-6xntm\" (UID: \"eb1d7266-b90b-456f-9e72-104117e32970\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-574bddd967-6xntm" Apr 16 18:18:08.821011 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.820996 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dbkz\" (UniqueName: \"kubernetes.io/projected/e377b43d-00d0-4346-bd5a-c60942b18f82-kube-api-access-8dbkz\") pod \"klusterlet-addon-workmgr-669d6599bc-t8sg5\" (UID: \"e377b43d-00d0-4346-bd5a-c60942b18f82\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669d6599bc-t8sg5" Apr 16 18:18:08.828754 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.828731 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/832eb3b9-2a4f-434a-b186-de137df079bb-ca\") pod \"cluster-proxy-proxy-agent-5d9c75dd55-f2hwb\" (UID: \"832eb3b9-2a4f-434a-b186-de137df079bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" Apr 16 18:18:08.830343 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.830327 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkqjk\" (UniqueName: 
\"kubernetes.io/projected/832eb3b9-2a4f-434a-b186-de137df079bb-kube-api-access-gkqjk\") pod \"cluster-proxy-proxy-agent-5d9c75dd55-f2hwb\" (UID: \"832eb3b9-2a4f-434a-b186-de137df079bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" Apr 16 18:18:08.881190 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.881167 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-574bddd967-6xntm" Apr 16 18:18:08.895502 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.895473 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nw6hr" event={"ID":"e6e3d626-b1d8-4140-83e2-92db90a4eae4","Type":"ContainerStarted","Data":"efeb976a043cacd9d3445806d17720c06300566106748d9f25fff1a5d80e663c"} Apr 16 18:18:08.904619 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.904597 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669d6599bc-t8sg5" Apr 16 18:18:08.912353 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.912335 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrwb7\" (UniqueName: \"kubernetes.io/projected/53b4818f-dde2-45a8-a8aa-831951359360-kube-api-access-mrwb7\") pod \"dns-default-db7l8\" (UID: \"53b4818f-dde2-45a8-a8aa-831951359360\") " pod="openshift-dns/dns-default-db7l8" Apr 16 18:18:08.912440 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.912387 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/53b4818f-dde2-45a8-a8aa-831951359360-config-volume\") pod \"dns-default-db7l8\" (UID: \"53b4818f-dde2-45a8-a8aa-831951359360\") " pod="openshift-dns/dns-default-db7l8" Apr 16 18:18:08.912497 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.912457 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/53b4818f-dde2-45a8-a8aa-831951359360-tmp-dir\") pod \"dns-default-db7l8\" (UID: \"53b4818f-dde2-45a8-a8aa-831951359360\") " pod="openshift-dns/dns-default-db7l8" Apr 16 18:18:08.912550 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.912505 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b294h\" (UniqueName: \"kubernetes.io/projected/9d741d54-0257-44ff-8680-6b59a49600e3-kube-api-access-b294h\") pod \"ingress-canary-g7fm9\" (UID: \"9d741d54-0257-44ff-8680-6b59a49600e3\") " pod="openshift-ingress-canary/ingress-canary-g7fm9" Apr 16 18:18:08.912550 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.912537 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert\") pod \"ingress-canary-g7fm9\" (UID: 
\"9d741d54-0257-44ff-8680-6b59a49600e3\") " pod="openshift-ingress-canary/ingress-canary-g7fm9" Apr 16 18:18:08.912648 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.912589 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls\") pod \"dns-default-db7l8\" (UID: \"53b4818f-dde2-45a8-a8aa-831951359360\") " pod="openshift-dns/dns-default-db7l8" Apr 16 18:18:08.912705 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:08.912677 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:18:08.912751 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:08.912708 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:18:08.912751 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:08.912736 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert podName:9d741d54-0257-44ff-8680-6b59a49600e3 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:09.412715327 +0000 UTC m=+34.259875813 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert") pod "ingress-canary-g7fm9" (UID: "9d741d54-0257-44ff-8680-6b59a49600e3") : secret "canary-serving-cert" not found Apr 16 18:18:08.912843 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:08.912760 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls podName:53b4818f-dde2-45a8-a8aa-831951359360 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:09.412743673 +0000 UTC m=+34.259904161 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls") pod "dns-default-db7l8" (UID: "53b4818f-dde2-45a8-a8aa-831951359360") : secret "dns-default-metrics-tls" not found Apr 16 18:18:08.913056 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.913035 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/53b4818f-dde2-45a8-a8aa-831951359360-config-volume\") pod \"dns-default-db7l8\" (UID: \"53b4818f-dde2-45a8-a8aa-831951359360\") " pod="openshift-dns/dns-default-db7l8" Apr 16 18:18:08.917284 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.917247 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/53b4818f-dde2-45a8-a8aa-831951359360-tmp-dir\") pod \"dns-default-db7l8\" (UID: \"53b4818f-dde2-45a8-a8aa-831951359360\") " pod="openshift-dns/dns-default-db7l8" Apr 16 18:18:08.919336 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.919320 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" Apr 16 18:18:08.926657 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.926638 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrwb7\" (UniqueName: \"kubernetes.io/projected/53b4818f-dde2-45a8-a8aa-831951359360-kube-api-access-mrwb7\") pod \"dns-default-db7l8\" (UID: \"53b4818f-dde2-45a8-a8aa-831951359360\") " pod="openshift-dns/dns-default-db7l8" Apr 16 18:18:08.927224 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:08.927180 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b294h\" (UniqueName: \"kubernetes.io/projected/9d741d54-0257-44ff-8680-6b59a49600e3-kube-api-access-b294h\") pod \"ingress-canary-g7fm9\" (UID: \"9d741d54-0257-44ff-8680-6b59a49600e3\") " pod="openshift-ingress-canary/ingress-canary-g7fm9" Apr 16 18:18:09.070195 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:09.070025 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-574bddd967-6xntm"] Apr 16 18:18:09.073368 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:18:09.073340 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb1d7266_b90b_456f_9e72_104117e32970.slice/crio-a74b45ace695105d5fdfead568d021a1fccaf9d9db7cbf9f8a984b35a9b88b03 WatchSource:0}: Error finding container a74b45ace695105d5fdfead568d021a1fccaf9d9db7cbf9f8a984b35a9b88b03: Status 404 returned error can't find the container with id a74b45ace695105d5fdfead568d021a1fccaf9d9db7cbf9f8a984b35a9b88b03 Apr 16 18:18:09.076728 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:09.076701 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-669d6599bc-t8sg5"] Apr 16 18:18:09.085492 ip-10-0-134-167 kubenswrapper[2579]: I0416 
18:18:09.077630 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb"] Apr 16 18:18:09.315771 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:09.315692 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-xcpp6\" (UID: \"0464fc65-3816-4d16-840a-ffda5744de6c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-xcpp6" Apr 16 18:18:09.315771 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:09.315723 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:09.315896 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:09.315831 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:18:09.315896 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:09.315841 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67744d6d54-rrhs6: secret "image-registry-tls" not found Apr 16 18:18:09.315896 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:09.315838 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:18:09.315896 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:09.315883 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls 
podName:e8a62d11-c318-4e87-bc46-0ef49c451c49 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:10.315869687 +0000 UTC m=+35.163030156 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls") pod "image-registry-67744d6d54-rrhs6" (UID: "e8a62d11-c318-4e87-bc46-0ef49c451c49") : secret "image-registry-tls" not found Apr 16 18:18:09.315896 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:09.315896 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert podName:0464fc65-3816-4d16-840a-ffda5744de6c nodeName:}" failed. No retries permitted until 2026-04-16 18:18:10.315889925 +0000 UTC m=+35.163050393 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-xcpp6" (UID: "0464fc65-3816-4d16-840a-ffda5744de6c") : secret "networking-console-plugin-cert" not found Apr 16 18:18:09.416874 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:09.416847 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs\") pod \"network-metrics-daemon-k6s9z\" (UID: \"e2bb1680-b343-4014-bde1-6cc6bcd9805c\") " pod="openshift-multus/network-metrics-daemon-k6s9z" Apr 16 18:18:09.416972 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:09.416906 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert\") pod \"ingress-canary-g7fm9\" (UID: \"9d741d54-0257-44ff-8680-6b59a49600e3\") " pod="openshift-ingress-canary/ingress-canary-g7fm9" Apr 16 18:18:09.416972 
ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:09.416947 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls\") pod \"dns-default-db7l8\" (UID: \"53b4818f-dde2-45a8-a8aa-831951359360\") " pod="openshift-dns/dns-default-db7l8" Apr 16 18:18:09.417079 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:09.416984 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:18:09.417079 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:09.417039 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs podName:e2bb1680-b343-4014-bde1-6cc6bcd9805c nodeName:}" failed. No retries permitted until 2026-04-16 18:18:41.417020293 +0000 UTC m=+66.264180766 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs") pod "network-metrics-daemon-k6s9z" (UID: "e2bb1680-b343-4014-bde1-6cc6bcd9805c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:18:09.417079 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:09.417046 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:18:09.417079 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:09.417069 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:18:09.417229 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:09.417082 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert podName:9d741d54-0257-44ff-8680-6b59a49600e3 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:18:10.417070543 +0000 UTC m=+35.264231021 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert") pod "ingress-canary-g7fm9" (UID: "9d741d54-0257-44ff-8680-6b59a49600e3") : secret "canary-serving-cert" not found Apr 16 18:18:09.417229 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:09.417113 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls podName:53b4818f-dde2-45a8-a8aa-831951359360 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:10.417097914 +0000 UTC m=+35.264258383 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls") pod "dns-default-db7l8" (UID: "53b4818f-dde2-45a8-a8aa-831951359360") : secret "dns-default-metrics-tls" not found Apr 16 18:18:09.517634 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:09.517612 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fgjj\" (UniqueName: \"kubernetes.io/projected/17b7bd66-ef40-4ff5-89de-2c7c3408fdc6-kube-api-access-5fgjj\") pod \"network-check-target-sq4vb\" (UID: \"17b7bd66-ef40-4ff5-89de-2c7c3408fdc6\") " pod="openshift-network-diagnostics/network-check-target-sq4vb" Apr 16 18:18:09.518138 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:09.518124 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:18:09.518178 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:09.518141 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:18:09.518178 ip-10-0-134-167 
kubenswrapper[2579]: E0416 18:18:09.518150 2579 projected.go:194] Error preparing data for projected volume kube-api-access-5fgjj for pod openshift-network-diagnostics/network-check-target-sq4vb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:18:09.518238 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:09.518188 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b7bd66-ef40-4ff5-89de-2c7c3408fdc6-kube-api-access-5fgjj podName:17b7bd66-ef40-4ff5-89de-2c7c3408fdc6 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:41.518178847 +0000 UTC m=+66.365339317 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-5fgjj" (UniqueName: "kubernetes.io/projected/17b7bd66-ef40-4ff5-89de-2c7c3408fdc6-kube-api-access-5fgjj") pod "network-check-target-sq4vb" (UID: "17b7bd66-ef40-4ff5-89de-2c7c3408fdc6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:18:09.718958 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:09.718928 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/530ccfcb-dfe5-440b-8750-09dd186b8702-original-pull-secret\") pod \"global-pull-secret-syncer-6zhb2\" (UID: \"530ccfcb-dfe5-440b-8750-09dd186b8702\") " pod="kube-system/global-pull-secret-syncer-6zhb2" Apr 16 18:18:09.719415 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:09.719389 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:18:09.719512 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:09.719498 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/530ccfcb-dfe5-440b-8750-09dd186b8702-original-pull-secret podName:530ccfcb-dfe5-440b-8750-09dd186b8702 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:41.719479061 +0000 UTC m=+66.566639530 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/530ccfcb-dfe5-440b-8750-09dd186b8702-original-pull-secret") pod "global-pull-secret-syncer-6zhb2" (UID: "530ccfcb-dfe5-440b-8750-09dd186b8702") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:18:09.750893 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:09.750815 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6s9z" Apr 16 18:18:09.750893 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:09.750835 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zhb2" Apr 16 18:18:09.751171 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:09.751155 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq4vb" Apr 16 18:18:09.753448 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:09.753237 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:18:09.753448 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:09.753337 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kzxcx\"" Apr 16 18:18:09.753642 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:09.753465 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:18:09.753642 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:09.753581 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:18:09.753717 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:09.753702 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-k7xtv\"" Apr 16 18:18:09.753896 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:09.753774 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:18:09.901028 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:09.900994 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669d6599bc-t8sg5" event={"ID":"e377b43d-00d0-4346-bd5a-c60942b18f82","Type":"ContainerStarted","Data":"98ede86bca17bb31c6e5ca6e4f319ad4e50bff21494f91a3a3eff3986ed02130"} Apr 16 18:18:09.903915 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:09.903888 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-574bddd967-6xntm" 
event={"ID":"eb1d7266-b90b-456f-9e72-104117e32970","Type":"ContainerStarted","Data":"a74b45ace695105d5fdfead568d021a1fccaf9d9db7cbf9f8a984b35a9b88b03"} Apr 16 18:18:09.907920 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:09.907895 2579 generic.go:358] "Generic (PLEG): container finished" podID="e6e3d626-b1d8-4140-83e2-92db90a4eae4" containerID="efeb976a043cacd9d3445806d17720c06300566106748d9f25fff1a5d80e663c" exitCode=0 Apr 16 18:18:09.908017 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:09.907988 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nw6hr" event={"ID":"e6e3d626-b1d8-4140-83e2-92db90a4eae4","Type":"ContainerDied","Data":"efeb976a043cacd9d3445806d17720c06300566106748d9f25fff1a5d80e663c"} Apr 16 18:18:09.915334 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:09.914932 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" event={"ID":"832eb3b9-2a4f-434a-b186-de137df079bb","Type":"ContainerStarted","Data":"eda0bee00ce92053bc799d8a57eaa888423bea0809125c3b612a86a3984d32ac"} Apr 16 18:18:10.324242 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:10.324149 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-xcpp6\" (UID: \"0464fc65-3816-4d16-840a-ffda5744de6c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-xcpp6" Apr 16 18:18:10.324242 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:10.324198 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " 
pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:10.324484 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:10.324378 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:18:10.324484 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:10.324394 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67744d6d54-rrhs6: secret "image-registry-tls" not found Apr 16 18:18:10.324484 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:10.324454 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls podName:e8a62d11-c318-4e87-bc46-0ef49c451c49 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:12.324435195 +0000 UTC m=+37.171595686 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls") pod "image-registry-67744d6d54-rrhs6" (UID: "e8a62d11-c318-4e87-bc46-0ef49c451c49") : secret "image-registry-tls" not found Apr 16 18:18:10.324858 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:10.324841 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:18:10.324933 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:10.324897 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert podName:0464fc65-3816-4d16-840a-ffda5744de6c nodeName:}" failed. No retries permitted until 2026-04-16 18:18:12.324881147 +0000 UTC m=+37.172041618 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-xcpp6" (UID: "0464fc65-3816-4d16-840a-ffda5744de6c") : secret "networking-console-plugin-cert" not found Apr 16 18:18:10.425094 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:10.425059 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert\") pod \"ingress-canary-g7fm9\" (UID: \"9d741d54-0257-44ff-8680-6b59a49600e3\") " pod="openshift-ingress-canary/ingress-canary-g7fm9" Apr 16 18:18:10.425727 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:10.425123 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls\") pod \"dns-default-db7l8\" (UID: \"53b4818f-dde2-45a8-a8aa-831951359360\") " pod="openshift-dns/dns-default-db7l8" Apr 16 18:18:10.425727 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:10.425385 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:18:10.425727 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:10.425417 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:18:10.425727 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:10.425451 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert podName:9d741d54-0257-44ff-8680-6b59a49600e3 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:12.425433123 +0000 UTC m=+37.272593605 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert") pod "ingress-canary-g7fm9" (UID: "9d741d54-0257-44ff-8680-6b59a49600e3") : secret "canary-serving-cert" not found Apr 16 18:18:10.425727 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:10.425474 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls podName:53b4818f-dde2-45a8-a8aa-831951359360 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:12.425457788 +0000 UTC m=+37.272618260 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls") pod "dns-default-db7l8" (UID: "53b4818f-dde2-45a8-a8aa-831951359360") : secret "dns-default-metrics-tls" not found Apr 16 18:18:10.925337 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:10.925301 2579 generic.go:358] "Generic (PLEG): container finished" podID="e6e3d626-b1d8-4140-83e2-92db90a4eae4" containerID="d8529fd15b4dd60685e02657a9f3d4d2ffe5fa247e85134fdbbeca821431848c" exitCode=0 Apr 16 18:18:10.926031 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:10.925376 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nw6hr" event={"ID":"e6e3d626-b1d8-4140-83e2-92db90a4eae4","Type":"ContainerDied","Data":"d8529fd15b4dd60685e02657a9f3d4d2ffe5fa247e85134fdbbeca821431848c"} Apr 16 18:18:12.341208 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:12.341169 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-xcpp6\" (UID: \"0464fc65-3816-4d16-840a-ffda5744de6c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-xcpp6" Apr 16 
18:18:12.341208 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:12.341212 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:18:12.341651 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:12.341292 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:18:12.341651 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:12.341314 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:18:12.341651 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:12.341324 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67744d6d54-rrhs6: secret "image-registry-tls" not found Apr 16 18:18:12.341651 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:12.341375 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert podName:0464fc65-3816-4d16-840a-ffda5744de6c nodeName:}" failed. No retries permitted until 2026-04-16 18:18:16.341351976 +0000 UTC m=+41.188512460 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-xcpp6" (UID: "0464fc65-3816-4d16-840a-ffda5744de6c") : secret "networking-console-plugin-cert" not found Apr 16 18:18:12.341651 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:12.341502 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls podName:e8a62d11-c318-4e87-bc46-0ef49c451c49 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:16.341480546 +0000 UTC m=+41.188641022 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls") pod "image-registry-67744d6d54-rrhs6" (UID: "e8a62d11-c318-4e87-bc46-0ef49c451c49") : secret "image-registry-tls" not found Apr 16 18:18:12.442459 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:12.442429 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert\") pod \"ingress-canary-g7fm9\" (UID: \"9d741d54-0257-44ff-8680-6b59a49600e3\") " pod="openshift-ingress-canary/ingress-canary-g7fm9" Apr 16 18:18:12.442610 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:12.442482 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls\") pod \"dns-default-db7l8\" (UID: \"53b4818f-dde2-45a8-a8aa-831951359360\") " pod="openshift-dns/dns-default-db7l8" Apr 16 18:18:12.442610 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:12.442607 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:18:12.442718 
ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:12.442606 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:18:12.442774 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:12.442696 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls podName:53b4818f-dde2-45a8-a8aa-831951359360 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:16.44267826 +0000 UTC m=+41.289838747 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls") pod "dns-default-db7l8" (UID: "53b4818f-dde2-45a8-a8aa-831951359360") : secret "dns-default-metrics-tls" not found Apr 16 18:18:12.442842 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:12.442787 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert podName:9d741d54-0257-44ff-8680-6b59a49600e3 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:16.442767151 +0000 UTC m=+41.289927624 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert") pod "ingress-canary-g7fm9" (UID: "9d741d54-0257-44ff-8680-6b59a49600e3") : secret "canary-serving-cert" not found Apr 16 18:18:14.937323 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:14.937289 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nw6hr" event={"ID":"e6e3d626-b1d8-4140-83e2-92db90a4eae4","Type":"ContainerStarted","Data":"bd0a40577e6365dc54dc3d05f41b4a5c1b3d04dc52a18e438eebfd702b31d883"} Apr 16 18:18:14.938632 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:14.938603 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" event={"ID":"832eb3b9-2a4f-434a-b186-de137df079bb","Type":"ContainerStarted","Data":"abe4a33fc3592329629d81e2b3e77fcc26b4b9721fd650d77b97c5098314b335"} Apr 16 18:18:14.939899 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:14.939873 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669d6599bc-t8sg5" event={"ID":"e377b43d-00d0-4346-bd5a-c60942b18f82","Type":"ContainerStarted","Data":"fbab726a6d9ea9acf30496bb4cc257e54144f1ed778e9abd4df21a2eca010569"} Apr 16 18:18:14.940116 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:14.940097 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669d6599bc-t8sg5" Apr 16 18:18:14.941861 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:14.941841 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669d6599bc-t8sg5" Apr 16 18:18:14.964184 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:14.964136 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-additional-cni-plugins-nw6hr" podStartSLOduration=9.800223863 podStartE2EDuration="39.964125365s" podCreationTimestamp="2026-04-16 18:17:35 +0000 UTC" firstStartedPulling="2026-04-16 18:17:38.445689853 +0000 UTC m=+3.292850321" lastFinishedPulling="2026-04-16 18:18:08.609591352 +0000 UTC m=+33.456751823" observedRunningTime="2026-04-16 18:18:14.962860321 +0000 UTC m=+39.810020831" watchObservedRunningTime="2026-04-16 18:18:14.964125365 +0000 UTC m=+39.811285855" Apr 16 18:18:14.980786 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:14.980741 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669d6599bc-t8sg5" podStartSLOduration=16.028325741 podStartE2EDuration="20.980725713s" podCreationTimestamp="2026-04-16 18:17:54 +0000 UTC" firstStartedPulling="2026-04-16 18:18:09.09996492 +0000 UTC m=+33.947125395" lastFinishedPulling="2026-04-16 18:18:14.052364894 +0000 UTC m=+38.899525367" observedRunningTime="2026-04-16 18:18:14.980130238 +0000 UTC m=+39.827290728" watchObservedRunningTime="2026-04-16 18:18:14.980725713 +0000 UTC m=+39.827886207" Apr 16 18:18:16.373308 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:16.373258 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-xcpp6\" (UID: \"0464fc65-3816-4d16-840a-ffda5744de6c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-xcpp6" Apr 16 18:18:16.373669 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:16.373316 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls\") pod \"image-registry-67744d6d54-rrhs6\" (UID: 
\"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6"
Apr 16 18:18:16.373669 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:16.373378 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 18:18:16.373669 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:16.373427 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:18:16.373669 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:16.373437 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67744d6d54-rrhs6: secret "image-registry-tls" not found
Apr 16 18:18:16.373669 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:16.373452 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert podName:0464fc65-3816-4d16-840a-ffda5744de6c nodeName:}" failed. No retries permitted until 2026-04-16 18:18:24.373436805 +0000 UTC m=+49.220597274 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-xcpp6" (UID: "0464fc65-3816-4d16-840a-ffda5744de6c") : secret "networking-console-plugin-cert" not found
Apr 16 18:18:16.373669 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:16.373474 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls podName:e8a62d11-c318-4e87-bc46-0ef49c451c49 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:24.373462146 +0000 UTC m=+49.220622615 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls") pod "image-registry-67744d6d54-rrhs6" (UID: "e8a62d11-c318-4e87-bc46-0ef49c451c49") : secret "image-registry-tls" not found
Apr 16 18:18:16.474156 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:16.474131 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert\") pod \"ingress-canary-g7fm9\" (UID: \"9d741d54-0257-44ff-8680-6b59a49600e3\") " pod="openshift-ingress-canary/ingress-canary-g7fm9"
Apr 16 18:18:16.474311 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:16.474169 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls\") pod \"dns-default-db7l8\" (UID: \"53b4818f-dde2-45a8-a8aa-831951359360\") " pod="openshift-dns/dns-default-db7l8"
Apr 16 18:18:16.474311 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:16.474248 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:18:16.474311 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:16.474304 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:18:16.474423 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:16.474315 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert podName:9d741d54-0257-44ff-8680-6b59a49600e3 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:24.474303187 +0000 UTC m=+49.321463660 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert") pod "ingress-canary-g7fm9" (UID: "9d741d54-0257-44ff-8680-6b59a49600e3") : secret "canary-serving-cert" not found
Apr 16 18:18:16.474423 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:16.474338 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls podName:53b4818f-dde2-45a8-a8aa-831951359360 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:24.474328424 +0000 UTC m=+49.321488894 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls") pod "dns-default-db7l8" (UID: "53b4818f-dde2-45a8-a8aa-831951359360") : secret "dns-default-metrics-tls" not found
Apr 16 18:18:16.945559 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:16.945488 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" event={"ID":"832eb3b9-2a4f-434a-b186-de137df079bb","Type":"ContainerStarted","Data":"283da32cd7c1cddf193b855cf72d03d8f3f627bdc702d6b1cf8aca7963357e14"}
Apr 16 18:18:16.945559 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:16.945525 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" event={"ID":"832eb3b9-2a4f-434a-b186-de137df079bb","Type":"ContainerStarted","Data":"555fe69249ce6482697a60041b044a884d27636cdbe971af82259c8fd63d0ed5"}
Apr 16 18:18:16.969347 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:16.969305 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" podStartSLOduration=14.550732398 podStartE2EDuration="21.969291784s" podCreationTimestamp="2026-04-16 18:17:55 +0000 UTC" firstStartedPulling="2026-04-16 18:18:09.099934087 +0000 UTC m=+33.947094558" lastFinishedPulling="2026-04-16 18:18:16.518493462 +0000 UTC m=+41.365653944" observedRunningTime="2026-04-16 18:18:16.967723231 +0000 UTC m=+41.814883722" watchObservedRunningTime="2026-04-16 18:18:16.969291784 +0000 UTC m=+41.816452275"
Apr 16 18:18:20.954307 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:20.954255 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-574bddd967-6xntm" event={"ID":"eb1d7266-b90b-456f-9e72-104117e32970","Type":"ContainerStarted","Data":"753849dded6b9cd448bed0a12196da9e2daa2addd4c463d58cc3118441215944"}
Apr 16 18:18:20.973953 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:20.973881 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-574bddd967-6xntm" podStartSLOduration=15.336172615 podStartE2EDuration="26.973869312s" podCreationTimestamp="2026-04-16 18:17:54 +0000 UTC" firstStartedPulling="2026-04-16 18:18:09.075404792 +0000 UTC m=+33.922565261" lastFinishedPulling="2026-04-16 18:18:20.713101489 +0000 UTC m=+45.560261958" observedRunningTime="2026-04-16 18:18:20.973049556 +0000 UTC m=+45.820210048" watchObservedRunningTime="2026-04-16 18:18:20.973869312 +0000 UTC m=+45.821029781"
Apr 16 18:18:24.429618 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:24.429581 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-xcpp6\" (UID: \"0464fc65-3816-4d16-840a-ffda5744de6c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-xcpp6"
Apr 16 18:18:24.429618 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:24.429620 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6"
Apr 16 18:18:24.430130 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:24.429732 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 18:18:24.430130 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:24.429760 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:18:24.430130 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:24.429772 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67744d6d54-rrhs6: secret "image-registry-tls" not found
Apr 16 18:18:24.430130 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:24.429804 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert podName:0464fc65-3816-4d16-840a-ffda5744de6c nodeName:}" failed. No retries permitted until 2026-04-16 18:18:40.429786741 +0000 UTC m=+65.276947210 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-xcpp6" (UID: "0464fc65-3816-4d16-840a-ffda5744de6c") : secret "networking-console-plugin-cert" not found
Apr 16 18:18:24.430130 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:24.429821 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls podName:e8a62d11-c318-4e87-bc46-0ef49c451c49 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:40.429813073 +0000 UTC m=+65.276973542 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls") pod "image-registry-67744d6d54-rrhs6" (UID: "e8a62d11-c318-4e87-bc46-0ef49c451c49") : secret "image-registry-tls" not found
Apr 16 18:18:24.530508 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:24.530482 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert\") pod \"ingress-canary-g7fm9\" (UID: \"9d741d54-0257-44ff-8680-6b59a49600e3\") " pod="openshift-ingress-canary/ingress-canary-g7fm9"
Apr 16 18:18:24.530606 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:24.530520 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls\") pod \"dns-default-db7l8\" (UID: \"53b4818f-dde2-45a8-a8aa-831951359360\") " pod="openshift-dns/dns-default-db7l8"
Apr 16 18:18:24.530643 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:24.530606 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:18:24.530643 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:24.530627 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:18:24.530706 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:24.530662 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert podName:9d741d54-0257-44ff-8680-6b59a49600e3 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:40.530648325 +0000 UTC m=+65.377808794 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert") pod "ingress-canary-g7fm9" (UID: "9d741d54-0257-44ff-8680-6b59a49600e3") : secret "canary-serving-cert" not found
Apr 16 18:18:24.530706 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:24.530678 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls podName:53b4818f-dde2-45a8-a8aa-831951359360 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:40.530671328 +0000 UTC m=+65.377831797 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls") pod "dns-default-db7l8" (UID: "53b4818f-dde2-45a8-a8aa-831951359360") : secret "dns-default-metrics-tls" not found
Apr 16 18:18:33.895767 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:33.895735 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-md54p"
Apr 16 18:18:40.448313 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:40.448251 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-xcpp6\" (UID: \"0464fc65-3816-4d16-840a-ffda5744de6c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-xcpp6"
Apr 16 18:18:40.448313 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:40.448313 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6"
Apr 16 18:18:40.448887 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:40.448411 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 18:18:40.448887 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:40.448474 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:18:40.448887 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:40.448495 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67744d6d54-rrhs6: secret "image-registry-tls" not found
Apr 16 18:18:40.448887 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:40.448522 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert podName:0464fc65-3816-4d16-840a-ffda5744de6c nodeName:}" failed. No retries permitted until 2026-04-16 18:19:12.448499123 +0000 UTC m=+97.295659607 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-xcpp6" (UID: "0464fc65-3816-4d16-840a-ffda5744de6c") : secret "networking-console-plugin-cert" not found
Apr 16 18:18:40.448887 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:40.448555 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls podName:e8a62d11-c318-4e87-bc46-0ef49c451c49 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:12.448540498 +0000 UTC m=+97.295700970 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls") pod "image-registry-67744d6d54-rrhs6" (UID: "e8a62d11-c318-4e87-bc46-0ef49c451c49") : secret "image-registry-tls" not found
Apr 16 18:18:40.549521 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:40.549493 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert\") pod \"ingress-canary-g7fm9\" (UID: \"9d741d54-0257-44ff-8680-6b59a49600e3\") " pod="openshift-ingress-canary/ingress-canary-g7fm9"
Apr 16 18:18:40.549653 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:40.549538 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls\") pod \"dns-default-db7l8\" (UID: \"53b4818f-dde2-45a8-a8aa-831951359360\") " pod="openshift-dns/dns-default-db7l8"
Apr 16 18:18:40.549653 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:40.549626 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:18:40.549724 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:40.549691 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert podName:9d741d54-0257-44ff-8680-6b59a49600e3 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:12.549678541 +0000 UTC m=+97.396839013 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert") pod "ingress-canary-g7fm9" (UID: "9d741d54-0257-44ff-8680-6b59a49600e3") : secret "canary-serving-cert" not found
Apr 16 18:18:40.549724 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:40.549708 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:18:40.549797 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:40.549762 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls podName:53b4818f-dde2-45a8-a8aa-831951359360 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:12.549742838 +0000 UTC m=+97.396903314 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls") pod "dns-default-db7l8" (UID: "53b4818f-dde2-45a8-a8aa-831951359360") : secret "dns-default-metrics-tls" not found
Apr 16 18:18:41.456991 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:41.456957 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs\") pod \"network-metrics-daemon-k6s9z\" (UID: \"e2bb1680-b343-4014-bde1-6cc6bcd9805c\") " pod="openshift-multus/network-metrics-daemon-k6s9z"
Apr 16 18:18:41.459536 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:41.459517 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 18:18:41.467583 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:41.467565 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:18:41.467635 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:18:41.467617 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs podName:e2bb1680-b343-4014-bde1-6cc6bcd9805c nodeName:}" failed. No retries permitted until 2026-04-16 18:19:45.467602817 +0000 UTC m=+130.314763286 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs") pod "network-metrics-daemon-k6s9z" (UID: "e2bb1680-b343-4014-bde1-6cc6bcd9805c") : secret "metrics-daemon-secret" not found
Apr 16 18:18:41.557503 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:41.557479 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fgjj\" (UniqueName: \"kubernetes.io/projected/17b7bd66-ef40-4ff5-89de-2c7c3408fdc6-kube-api-access-5fgjj\") pod \"network-check-target-sq4vb\" (UID: \"17b7bd66-ef40-4ff5-89de-2c7c3408fdc6\") " pod="openshift-network-diagnostics/network-check-target-sq4vb"
Apr 16 18:18:41.560239 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:41.560222 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 18:18:41.570704 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:41.570684 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 18:18:41.582675 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:41.582643 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fgjj\" (UniqueName: \"kubernetes.io/projected/17b7bd66-ef40-4ff5-89de-2c7c3408fdc6-kube-api-access-5fgjj\") pod \"network-check-target-sq4vb\" (UID: \"17b7bd66-ef40-4ff5-89de-2c7c3408fdc6\") " pod="openshift-network-diagnostics/network-check-target-sq4vb"
Apr 16 18:18:41.759489 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:41.759426 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/530ccfcb-dfe5-440b-8750-09dd186b8702-original-pull-secret\") pod \"global-pull-secret-syncer-6zhb2\" (UID: \"530ccfcb-dfe5-440b-8750-09dd186b8702\") " pod="kube-system/global-pull-secret-syncer-6zhb2"
Apr 16 18:18:41.761980 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:41.761964 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 18:18:41.772002 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:41.771981 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/530ccfcb-dfe5-440b-8750-09dd186b8702-original-pull-secret\") pod \"global-pull-secret-syncer-6zhb2\" (UID: \"530ccfcb-dfe5-440b-8750-09dd186b8702\") " pod="kube-system/global-pull-secret-syncer-6zhb2"
Apr 16 18:18:41.864507 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:41.864474 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6zhb2"
Apr 16 18:18:41.883713 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:41.883689 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kzxcx\""
Apr 16 18:18:41.892061 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:41.892029 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq4vb"
Apr 16 18:18:41.996351 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:41.996331 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6zhb2"]
Apr 16 18:18:41.999013 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:18:41.998988 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod530ccfcb_dfe5_440b_8750_09dd186b8702.slice/crio-e430601575cbfcc1b2c2fff396de09d60cf3d64a48fe1e8d96d491a752cbab61 WatchSource:0}: Error finding container e430601575cbfcc1b2c2fff396de09d60cf3d64a48fe1e8d96d491a752cbab61: Status 404 returned error can't find the container with id e430601575cbfcc1b2c2fff396de09d60cf3d64a48fe1e8d96d491a752cbab61
Apr 16 18:18:42.015733 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:42.015687 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sq4vb"]
Apr 16 18:18:42.019926 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:18:42.019891 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17b7bd66_ef40_4ff5_89de_2c7c3408fdc6.slice/crio-000744e3a6d14c36dbec4ca158f09bab6d796e31f20df89341e5e7c5714f2e74 WatchSource:0}: Error finding container 000744e3a6d14c36dbec4ca158f09bab6d796e31f20df89341e5e7c5714f2e74: Status 404 returned error can't find the container with id 000744e3a6d14c36dbec4ca158f09bab6d796e31f20df89341e5e7c5714f2e74
Apr 16 18:18:42.996216 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:42.996181 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6zhb2" event={"ID":"530ccfcb-dfe5-440b-8750-09dd186b8702","Type":"ContainerStarted","Data":"e430601575cbfcc1b2c2fff396de09d60cf3d64a48fe1e8d96d491a752cbab61"}
Apr 16 18:18:42.998165 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:42.998127 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sq4vb" event={"ID":"17b7bd66-ef40-4ff5-89de-2c7c3408fdc6","Type":"ContainerStarted","Data":"000744e3a6d14c36dbec4ca158f09bab6d796e31f20df89341e5e7c5714f2e74"}
Apr 16 18:18:47.007133 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:47.007108 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sq4vb" event={"ID":"17b7bd66-ef40-4ff5-89de-2c7c3408fdc6","Type":"ContainerStarted","Data":"f37b95b86217f723560e196f0b7fbab88e356e65a94a210c887c316b19bdca20"}
Apr 16 18:18:47.007438 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:47.007324 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-sq4vb"
Apr 16 18:18:47.030942 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:47.030892 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-sq4vb" podStartSLOduration=66.215327923 podStartE2EDuration="1m11.030873028s" podCreationTimestamp="2026-04-16 18:17:36 +0000 UTC" firstStartedPulling="2026-04-16 18:18:42.021941616 +0000 UTC m=+66.869102085" lastFinishedPulling="2026-04-16 18:18:46.837486669 +0000 UTC m=+71.684647190" observedRunningTime="2026-04-16 18:18:47.029718833 +0000 UTC m=+71.876879328" watchObservedRunningTime="2026-04-16 18:18:47.030873028 +0000 UTC m=+71.878033521"
Apr 16 18:18:48.011222 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:18:48.011179 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6zhb2" event={"ID":"530ccfcb-dfe5-440b-8750-09dd186b8702","Type":"ContainerStarted","Data":"d8b3933e1258feeb5b1a2d925d0389b7ebbed66b4bfdee2455bc7b3ec3e4c49f"}
Apr 16 18:19:12.490008 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:19:12.489963 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-xcpp6\" (UID: \"0464fc65-3816-4d16-840a-ffda5744de6c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-xcpp6"
Apr 16 18:19:12.490008 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:19:12.490009 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6"
Apr 16 18:19:12.490543 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:19:12.490103 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 18:19:12.490543 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:19:12.490160 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:19:12.490543 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:19:12.490176 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67744d6d54-rrhs6: secret "image-registry-tls" not found
Apr 16 18:19:12.490543 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:19:12.490164 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert podName:0464fc65-3816-4d16-840a-ffda5744de6c nodeName:}" failed. No retries permitted until 2026-04-16 18:20:16.490148678 +0000 UTC m=+161.337309147 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-xcpp6" (UID: "0464fc65-3816-4d16-840a-ffda5744de6c") : secret "networking-console-plugin-cert" not found
Apr 16 18:19:12.490543 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:19:12.490246 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls podName:e8a62d11-c318-4e87-bc46-0ef49c451c49 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:16.490229367 +0000 UTC m=+161.337389839 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls") pod "image-registry-67744d6d54-rrhs6" (UID: "e8a62d11-c318-4e87-bc46-0ef49c451c49") : secret "image-registry-tls" not found
Apr 16 18:19:12.591094 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:19:12.591065 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert\") pod \"ingress-canary-g7fm9\" (UID: \"9d741d54-0257-44ff-8680-6b59a49600e3\") " pod="openshift-ingress-canary/ingress-canary-g7fm9"
Apr 16 18:19:12.591239 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:19:12.591102 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls\") pod \"dns-default-db7l8\" (UID: \"53b4818f-dde2-45a8-a8aa-831951359360\") " pod="openshift-dns/dns-default-db7l8"
Apr 16 18:19:12.591239 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:19:12.591202 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:19:12.591239 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:19:12.591217 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:19:12.591434 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:19:12.591258 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert podName:9d741d54-0257-44ff-8680-6b59a49600e3 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:16.591244036 +0000 UTC m=+161.438404504 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert") pod "ingress-canary-g7fm9" (UID: "9d741d54-0257-44ff-8680-6b59a49600e3") : secret "canary-serving-cert" not found
Apr 16 18:19:12.591434 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:19:12.591299 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls podName:53b4818f-dde2-45a8-a8aa-831951359360 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:16.59128488 +0000 UTC m=+161.438445353 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls") pod "dns-default-db7l8" (UID: "53b4818f-dde2-45a8-a8aa-831951359360") : secret "dns-default-metrics-tls" not found
Apr 16 18:19:18.013660 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:19:18.013636 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-sq4vb"
Apr 16 18:19:18.032414 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:19:18.032373 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-6zhb2" podStartSLOduration=96.184758067 podStartE2EDuration="1m41.032360601s" podCreationTimestamp="2026-04-16 18:17:37 +0000 UTC" firstStartedPulling="2026-04-16 18:18:42.000523389 +0000 UTC m=+66.847683858" lastFinishedPulling="2026-04-16 18:18:46.848125922 +0000 UTC m=+71.695286392" observedRunningTime="2026-04-16 18:18:48.027253261 +0000 UTC m=+72.874413751" watchObservedRunningTime="2026-04-16 18:19:18.032360601 +0000 UTC m=+102.879521091"
Apr 16 18:19:45.521223 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:19:45.521185 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs\") pod \"network-metrics-daemon-k6s9z\" (UID: \"e2bb1680-b343-4014-bde1-6cc6bcd9805c\") " pod="openshift-multus/network-metrics-daemon-k6s9z"
Apr 16 18:19:45.521667 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:19:45.521328 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:19:45.521667 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:19:45.521392 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs podName:e2bb1680-b343-4014-bde1-6cc6bcd9805c nodeName:}" failed. No retries permitted until 2026-04-16 18:21:47.521377368 +0000 UTC m=+252.368537837 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs") pod "network-metrics-daemon-k6s9z" (UID: "e2bb1680-b343-4014-bde1-6cc6bcd9805c") : secret "metrics-daemon-secret" not found
Apr 16 18:20:03.398395 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:03.398365 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mt4jc_9e454ceb-b1f7-44d0-899f-7d0a1be98b35/dns-node-resolver/0.log"
Apr 16 18:20:04.395356 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:04.395324 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-l4nhg_2b843015-ad80-4dc0-aad1-a22e5a3909f6/node-ca/0.log"
Apr 16 18:20:11.588016 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:20:11.587976 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-xcpp6" podUID="0464fc65-3816-4d16-840a-ffda5744de6c"
Apr 16 18:20:11.644850 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:20:11.644821 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" podUID="e8a62d11-c318-4e87-bc46-0ef49c451c49"
Apr 16 18:20:11.652065 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:20:11.652049 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-db7l8" podUID="53b4818f-dde2-45a8-a8aa-831951359360"
Apr 16 18:20:11.663206 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:20:11.663185 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-g7fm9" podUID="9d741d54-0257-44ff-8680-6b59a49600e3"
Apr 16 18:20:12.206478 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:12.206451 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-xcpp6"
Apr 16 18:20:12.206478 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:12.206465 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67744d6d54-rrhs6"
Apr 16 18:20:12.206640 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:12.206465 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-db7l8"
Apr 16 18:20:12.206640 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:12.206465 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-g7fm9"
Apr 16 18:20:12.773763 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:20:12.773727 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-k6s9z" podUID="e2bb1680-b343-4014-bde1-6cc6bcd9805c"
Apr 16 18:20:14.940823 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:14.940765 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669d6599bc-t8sg5" podUID="e377b43d-00d0-4346-bd5a-c60942b18f82" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.8:8000/readyz\": dial tcp 10.132.0.8:8000: connect: connection refused"
Apr 16 18:20:15.215311 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:15.215248 2579 generic.go:358] "Generic (PLEG): container finished" podID="e377b43d-00d0-4346-bd5a-c60942b18f82" containerID="fbab726a6d9ea9acf30496bb4cc257e54144f1ed778e9abd4df21a2eca010569" exitCode=1
Apr 16 18:20:15.215422 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:15.215323 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669d6599bc-t8sg5" event={"ID":"e377b43d-00d0-4346-bd5a-c60942b18f82","Type":"ContainerDied","Data":"fbab726a6d9ea9acf30496bb4cc257e54144f1ed778e9abd4df21a2eca010569"}
Apr 16 18:20:15.215682 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:15.215666 2579 scope.go:117] "RemoveContainer" containerID="fbab726a6d9ea9acf30496bb4cc257e54144f1ed778e9abd4df21a2eca010569"
Apr 16 18:20:16.218538 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:16.218507 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669d6599bc-t8sg5"
event={"ID":"e377b43d-00d0-4346-bd5a-c60942b18f82","Type":"ContainerStarted","Data":"3ab0f9941e6346b196e2240d5dbe6ef3d30959b8d0e9f29b0bce797065809e6d"} Apr 16 18:20:16.218980 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:16.218790 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669d6599bc-t8sg5" Apr 16 18:20:16.219353 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:16.219336 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669d6599bc-t8sg5" Apr 16 18:20:16.550160 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:16.550076 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-xcpp6\" (UID: \"0464fc65-3816-4d16-840a-ffda5744de6c\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-xcpp6" Apr 16 18:20:16.550160 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:16.550111 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:20:16.552559 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:16.552531 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0464fc65-3816-4d16-840a-ffda5744de6c-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-xcpp6\" (UID: \"0464fc65-3816-4d16-840a-ffda5744de6c\") " 
pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-xcpp6" Apr 16 18:20:16.552669 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:16.552611 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls\") pod \"image-registry-67744d6d54-rrhs6\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:20:16.650554 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:16.650527 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls\") pod \"dns-default-db7l8\" (UID: \"53b4818f-dde2-45a8-a8aa-831951359360\") " pod="openshift-dns/dns-default-db7l8" Apr 16 18:20:16.650683 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:16.650593 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert\") pod \"ingress-canary-g7fm9\" (UID: \"9d741d54-0257-44ff-8680-6b59a49600e3\") " pod="openshift-ingress-canary/ingress-canary-g7fm9" Apr 16 18:20:16.652830 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:16.652803 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d741d54-0257-44ff-8680-6b59a49600e3-cert\") pod \"ingress-canary-g7fm9\" (UID: \"9d741d54-0257-44ff-8680-6b59a49600e3\") " pod="openshift-ingress-canary/ingress-canary-g7fm9" Apr 16 18:20:16.652946 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:16.652891 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53b4818f-dde2-45a8-a8aa-831951359360-metrics-tls\") pod \"dns-default-db7l8\" (UID: \"53b4818f-dde2-45a8-a8aa-831951359360\") " 
pod="openshift-dns/dns-default-db7l8" Apr 16 18:20:16.710449 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:16.710424 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rxx2m\"" Apr 16 18:20:16.710544 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:16.710426 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8xm9h\"" Apr 16 18:20:16.710544 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:16.710474 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-jxdfl\"" Apr 16 18:20:16.710544 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:16.710462 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-sb5br\"" Apr 16 18:20:16.717467 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:16.717446 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-xcpp6" Apr 16 18:20:16.717505 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:16.717468 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-db7l8" Apr 16 18:20:16.717505 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:16.717498 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-g7fm9" Apr 16 18:20:16.717584 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:16.717514 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:20:16.880007 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:16.879982 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-g7fm9"] Apr 16 18:20:16.881515 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:20:16.881486 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d741d54_0257_44ff_8680_6b59a49600e3.slice/crio-beb6792f01fd5c45061a7eae7bc36895c2a22543039864b33c253cddeb690eac WatchSource:0}: Error finding container beb6792f01fd5c45061a7eae7bc36895c2a22543039864b33c253cddeb690eac: Status 404 returned error can't find the container with id beb6792f01fd5c45061a7eae7bc36895c2a22543039864b33c253cddeb690eac Apr 16 18:20:17.099853 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:17.099787 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-xcpp6"] Apr 16 18:20:17.103377 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:20:17.103352 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0464fc65_3816_4d16_840a_ffda5744de6c.slice/crio-5dfbc34e3c00e258f142b7878a9aac72a59d99b2c8bb636e563aa618f0fd41ef WatchSource:0}: Error finding container 5dfbc34e3c00e258f142b7878a9aac72a59d99b2c8bb636e563aa618f0fd41ef: Status 404 returned error can't find the container with id 5dfbc34e3c00e258f142b7878a9aac72a59d99b2c8bb636e563aa618f0fd41ef Apr 16 18:20:17.105541 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:17.105497 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-db7l8"] Apr 16 18:20:17.106533 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:17.106496 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-67744d6d54-rrhs6"] Apr 16 
18:20:17.109226 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:20:17.109206 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8a62d11_c318_4e87_bc46_0ef49c451c49.slice/crio-5e39f3ce6324d69da068d002a43d65da3e0d3ed7023bca1b0f039456d809742d WatchSource:0}: Error finding container 5e39f3ce6324d69da068d002a43d65da3e0d3ed7023bca1b0f039456d809742d: Status 404 returned error can't find the container with id 5e39f3ce6324d69da068d002a43d65da3e0d3ed7023bca1b0f039456d809742d Apr 16 18:20:17.109707 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:20:17.109685 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53b4818f_dde2_45a8_a8aa_831951359360.slice/crio-3aa6b045546eff1b9a0dfa1903d998b607969ca37026a8074a8f5598bc58951c WatchSource:0}: Error finding container 3aa6b045546eff1b9a0dfa1903d998b607969ca37026a8074a8f5598bc58951c: Status 404 returned error can't find the container with id 3aa6b045546eff1b9a0dfa1903d998b607969ca37026a8074a8f5598bc58951c Apr 16 18:20:17.222436 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:17.222399 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-xcpp6" event={"ID":"0464fc65-3816-4d16-840a-ffda5744de6c","Type":"ContainerStarted","Data":"5dfbc34e3c00e258f142b7878a9aac72a59d99b2c8bb636e563aa618f0fd41ef"} Apr 16 18:20:17.223764 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:17.223734 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-g7fm9" event={"ID":"9d741d54-0257-44ff-8680-6b59a49600e3","Type":"ContainerStarted","Data":"beb6792f01fd5c45061a7eae7bc36895c2a22543039864b33c253cddeb690eac"} Apr 16 18:20:17.229028 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:17.225010 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-db7l8" 
event={"ID":"53b4818f-dde2-45a8-a8aa-831951359360","Type":"ContainerStarted","Data":"3aa6b045546eff1b9a0dfa1903d998b607969ca37026a8074a8f5598bc58951c"} Apr 16 18:20:17.230902 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:17.230863 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" event={"ID":"e8a62d11-c318-4e87-bc46-0ef49c451c49","Type":"ContainerStarted","Data":"37462c6b7db76c7a50e1894107ef5801b60902a270526c2db12862b64cb7b703"} Apr 16 18:20:17.231059 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:17.230911 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" event={"ID":"e8a62d11-c318-4e87-bc46-0ef49c451c49","Type":"ContainerStarted","Data":"5e39f3ce6324d69da068d002a43d65da3e0d3ed7023bca1b0f039456d809742d"} Apr 16 18:20:17.253092 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:17.253042 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" podStartSLOduration=161.253027566 podStartE2EDuration="2m41.253027566s" podCreationTimestamp="2026-04-16 18:17:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:20:17.25242043 +0000 UTC m=+162.099580922" watchObservedRunningTime="2026-04-16 18:20:17.253027566 +0000 UTC m=+162.100188059" Apr 16 18:20:18.234796 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:18.234758 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:20:19.239754 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:19.239728 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-xcpp6" 
event={"ID":"0464fc65-3816-4d16-840a-ffda5744de6c","Type":"ContainerStarted","Data":"2bee8b7ab97b4f8164302d01d097a90dbee0c7cc99ad77ffa573105c3c75fe25"} Apr 16 18:20:19.241496 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:19.241467 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-g7fm9" event={"ID":"9d741d54-0257-44ff-8680-6b59a49600e3","Type":"ContainerStarted","Data":"b7748586e83d2246b01fa7abb72f614df9a533e19d7ec1213649897ef8abb4b4"} Apr 16 18:20:19.243054 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:19.243031 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-db7l8" event={"ID":"53b4818f-dde2-45a8-a8aa-831951359360","Type":"ContainerStarted","Data":"7f9a96c338393aab9f10c29a3e14eb2f923b7a4675fa376d0bbcf1a17a0a8d78"} Apr 16 18:20:19.243131 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:19.243063 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-db7l8" event={"ID":"53b4818f-dde2-45a8-a8aa-831951359360","Type":"ContainerStarted","Data":"bc8ff1b86bde8f76cb1be46fa03b775ff6dd789fbc2c52909f201310db4b2bfb"} Apr 16 18:20:19.258530 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:19.258488 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-xcpp6" podStartSLOduration=157.380111506 podStartE2EDuration="2m39.258474799s" podCreationTimestamp="2026-04-16 18:17:40 +0000 UTC" firstStartedPulling="2026-04-16 18:20:17.106113112 +0000 UTC m=+161.953273581" lastFinishedPulling="2026-04-16 18:20:18.984476391 +0000 UTC m=+163.831636874" observedRunningTime="2026-04-16 18:20:19.257559587 +0000 UTC m=+164.104720072" watchObservedRunningTime="2026-04-16 18:20:19.258474799 +0000 UTC m=+164.105635280" Apr 16 18:20:19.277162 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:19.277116 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/dns-default-db7l8" podStartSLOduration=129.400271987 podStartE2EDuration="2m11.277100724s" podCreationTimestamp="2026-04-16 18:18:08 +0000 UTC" firstStartedPulling="2026-04-16 18:20:17.111427913 +0000 UTC m=+161.958588382" lastFinishedPulling="2026-04-16 18:20:18.988256651 +0000 UTC m=+163.835417119" observedRunningTime="2026-04-16 18:20:19.275608266 +0000 UTC m=+164.122768774" watchObservedRunningTime="2026-04-16 18:20:19.277100724 +0000 UTC m=+164.124261215" Apr 16 18:20:19.293219 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:19.293183 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-g7fm9" podStartSLOduration=129.190230252 podStartE2EDuration="2m11.293171333s" podCreationTimestamp="2026-04-16 18:18:08 +0000 UTC" firstStartedPulling="2026-04-16 18:20:16.883377038 +0000 UTC m=+161.730537506" lastFinishedPulling="2026-04-16 18:20:18.986318102 +0000 UTC m=+163.833478587" observedRunningTime="2026-04-16 18:20:19.291332166 +0000 UTC m=+164.138492676" watchObservedRunningTime="2026-04-16 18:20:19.293171333 +0000 UTC m=+164.140331824" Apr 16 18:20:20.247092 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:20.247025 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-db7l8" Apr 16 18:20:21.251659 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:21.251628 2579 generic.go:358] "Generic (PLEG): container finished" podID="eb1d7266-b90b-456f-9e72-104117e32970" containerID="753849dded6b9cd448bed0a12196da9e2daa2addd4c463d58cc3118441215944" exitCode=255 Apr 16 18:20:21.252004 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:21.251715 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-574bddd967-6xntm" event={"ID":"eb1d7266-b90b-456f-9e72-104117e32970","Type":"ContainerDied","Data":"753849dded6b9cd448bed0a12196da9e2daa2addd4c463d58cc3118441215944"} Apr 16 
18:20:21.252147 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:21.252125 2579 scope.go:117] "RemoveContainer" containerID="753849dded6b9cd448bed0a12196da9e2daa2addd4c463d58cc3118441215944" Apr 16 18:20:22.259709 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:22.259670 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-574bddd967-6xntm" event={"ID":"eb1d7266-b90b-456f-9e72-104117e32970","Type":"ContainerStarted","Data":"ab46880d4ebb54679155dc1978193e753e970b848deceb1698f8d86716b08a63"} Apr 16 18:20:23.569504 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:23.569470 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-cgsl2"] Apr 16 18:20:23.574019 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:23.574003 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cgsl2" Apr 16 18:20:23.580584 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:23.580558 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 18:20:23.580780 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:23.580761 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 18:20:23.581072 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:23.581053 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4fhn9\"" Apr 16 18:20:23.581204 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:23.581188 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 18:20:23.583801 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:23.583787 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 18:20:23.594632 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:23.594605 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cgsl2"] Apr 16 18:20:23.702910 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:23.702887 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8d567708-c8b6-4dd6-a935-bbba6f557f09-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cgsl2\" (UID: \"8d567708-c8b6-4dd6-a935-bbba6f557f09\") " pod="openshift-insights/insights-runtime-extractor-cgsl2" Apr 16 18:20:23.703005 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:23.702921 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8d567708-c8b6-4dd6-a935-bbba6f557f09-data-volume\") pod \"insights-runtime-extractor-cgsl2\" (UID: \"8d567708-c8b6-4dd6-a935-bbba6f557f09\") " pod="openshift-insights/insights-runtime-extractor-cgsl2" Apr 16 18:20:23.703005 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:23.702940 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shcdn\" (UniqueName: \"kubernetes.io/projected/8d567708-c8b6-4dd6-a935-bbba6f557f09-kube-api-access-shcdn\") pod \"insights-runtime-extractor-cgsl2\" (UID: \"8d567708-c8b6-4dd6-a935-bbba6f557f09\") " pod="openshift-insights/insights-runtime-extractor-cgsl2" Apr 16 18:20:23.703005 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:23.702956 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8d567708-c8b6-4dd6-a935-bbba6f557f09-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cgsl2\" (UID: 
\"8d567708-c8b6-4dd6-a935-bbba6f557f09\") " pod="openshift-insights/insights-runtime-extractor-cgsl2" Apr 16 18:20:23.703005 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:23.702996 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8d567708-c8b6-4dd6-a935-bbba6f557f09-crio-socket\") pod \"insights-runtime-extractor-cgsl2\" (UID: \"8d567708-c8b6-4dd6-a935-bbba6f557f09\") " pod="openshift-insights/insights-runtime-extractor-cgsl2" Apr 16 18:20:23.803334 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:23.803306 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8d567708-c8b6-4dd6-a935-bbba6f557f09-data-volume\") pod \"insights-runtime-extractor-cgsl2\" (UID: \"8d567708-c8b6-4dd6-a935-bbba6f557f09\") " pod="openshift-insights/insights-runtime-extractor-cgsl2" Apr 16 18:20:23.803449 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:23.803342 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shcdn\" (UniqueName: \"kubernetes.io/projected/8d567708-c8b6-4dd6-a935-bbba6f557f09-kube-api-access-shcdn\") pod \"insights-runtime-extractor-cgsl2\" (UID: \"8d567708-c8b6-4dd6-a935-bbba6f557f09\") " pod="openshift-insights/insights-runtime-extractor-cgsl2" Apr 16 18:20:23.803449 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:23.803370 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8d567708-c8b6-4dd6-a935-bbba6f557f09-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cgsl2\" (UID: \"8d567708-c8b6-4dd6-a935-bbba6f557f09\") " pod="openshift-insights/insights-runtime-extractor-cgsl2" Apr 16 18:20:23.803449 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:23.803405 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8d567708-c8b6-4dd6-a935-bbba6f557f09-crio-socket\") pod \"insights-runtime-extractor-cgsl2\" (UID: \"8d567708-c8b6-4dd6-a935-bbba6f557f09\") " pod="openshift-insights/insights-runtime-extractor-cgsl2" Apr 16 18:20:23.803603 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:23.803490 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8d567708-c8b6-4dd6-a935-bbba6f557f09-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cgsl2\" (UID: \"8d567708-c8b6-4dd6-a935-bbba6f557f09\") " pod="openshift-insights/insights-runtime-extractor-cgsl2" Apr 16 18:20:23.803655 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:23.803600 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8d567708-c8b6-4dd6-a935-bbba6f557f09-crio-socket\") pod \"insights-runtime-extractor-cgsl2\" (UID: \"8d567708-c8b6-4dd6-a935-bbba6f557f09\") " pod="openshift-insights/insights-runtime-extractor-cgsl2" Apr 16 18:20:23.803715 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:23.803666 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8d567708-c8b6-4dd6-a935-bbba6f557f09-data-volume\") pod \"insights-runtime-extractor-cgsl2\" (UID: \"8d567708-c8b6-4dd6-a935-bbba6f557f09\") " pod="openshift-insights/insights-runtime-extractor-cgsl2" Apr 16 18:20:23.804025 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:23.804004 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8d567708-c8b6-4dd6-a935-bbba6f557f09-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cgsl2\" (UID: \"8d567708-c8b6-4dd6-a935-bbba6f557f09\") " pod="openshift-insights/insights-runtime-extractor-cgsl2" Apr 16 18:20:23.805828 ip-10-0-134-167 
kubenswrapper[2579]: I0416 18:20:23.805807 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8d567708-c8b6-4dd6-a935-bbba6f557f09-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cgsl2\" (UID: \"8d567708-c8b6-4dd6-a935-bbba6f557f09\") " pod="openshift-insights/insights-runtime-extractor-cgsl2" Apr 16 18:20:23.814703 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:23.814679 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-shcdn\" (UniqueName: \"kubernetes.io/projected/8d567708-c8b6-4dd6-a935-bbba6f557f09-kube-api-access-shcdn\") pod \"insights-runtime-extractor-cgsl2\" (UID: \"8d567708-c8b6-4dd6-a935-bbba6f557f09\") " pod="openshift-insights/insights-runtime-extractor-cgsl2" Apr 16 18:20:23.882990 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:23.882967 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cgsl2" Apr 16 18:20:24.000489 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:24.000458 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cgsl2"] Apr 16 18:20:24.003375 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:20:24.003349 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d567708_c8b6_4dd6_a935_bbba6f557f09.slice/crio-3279de8f749ba3fd6468f5833f03f42a9215b5ab4d67d79f267253cfdbb9be49 WatchSource:0}: Error finding container 3279de8f749ba3fd6468f5833f03f42a9215b5ab4d67d79f267253cfdbb9be49: Status 404 returned error can't find the container with id 3279de8f749ba3fd6468f5833f03f42a9215b5ab4d67d79f267253cfdbb9be49 Apr 16 18:20:24.267581 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:24.267515 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-cgsl2" event={"ID":"8d567708-c8b6-4dd6-a935-bbba6f557f09","Type":"ContainerStarted","Data":"e1630a133677c97239df2d1a682e0666ae7eeddeea253da0346a815e3b8af866"} Apr 16 18:20:24.267581 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:24.267552 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cgsl2" event={"ID":"8d567708-c8b6-4dd6-a935-bbba6f557f09","Type":"ContainerStarted","Data":"3279de8f749ba3fd6468f5833f03f42a9215b5ab4d67d79f267253cfdbb9be49"} Apr 16 18:20:24.750461 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:24.750441 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6s9z" Apr 16 18:20:25.272515 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:25.272483 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cgsl2" event={"ID":"8d567708-c8b6-4dd6-a935-bbba6f557f09","Type":"ContainerStarted","Data":"c378f6b5bed0efd00f2be57214704539f017459c0b1ac7b51aac121b4e2a4bab"} Apr 16 18:20:26.277439 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:26.277374 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cgsl2" event={"ID":"8d567708-c8b6-4dd6-a935-bbba6f557f09","Type":"ContainerStarted","Data":"b28a8891e6343ab7e42f26fb80c056a5d68ce87d90d418583b8b35b05595a49c"} Apr 16 18:20:26.298378 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:26.298335 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-cgsl2" podStartSLOduration=1.420433176 podStartE2EDuration="3.298321144s" podCreationTimestamp="2026-04-16 18:20:23 +0000 UTC" firstStartedPulling="2026-04-16 18:20:24.058642311 +0000 UTC m=+168.905802784" lastFinishedPulling="2026-04-16 18:20:25.936530268 +0000 UTC m=+170.783690752" observedRunningTime="2026-04-16 
18:20:26.296993249 +0000 UTC m=+171.144153740" watchObservedRunningTime="2026-04-16 18:20:26.298321144 +0000 UTC m=+171.145481634" Apr 16 18:20:30.254103 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:30.254073 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-db7l8" Apr 16 18:20:36.721591 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:36.721551 2579 patch_prober.go:28] interesting pod/image-registry-67744d6d54-rrhs6 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 18:20:36.722082 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:36.721610 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" podUID="e8a62d11-c318-4e87-bc46-0ef49c451c49" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:20:38.140787 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.140753 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-5sn5t"] Apr 16 18:20:38.144770 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.144750 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.146963 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.146933 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 18:20:38.147126 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.146936 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 18:20:38.147126 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.147005 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 18:20:38.147126 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.147007 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-w7vqp\"" Apr 16 18:20:38.147334 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.147219 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 18:20:38.147334 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.147307 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 18:20:38.147710 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.147693 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 18:20:38.304746 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.304702 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0835cb09-27df-467b-b5a7-67f20c2fce38-root\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " 
pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.304746 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.304755 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0835cb09-27df-467b-b5a7-67f20c2fce38-node-exporter-accelerators-collector-config\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.304951 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.304810 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0835cb09-27df-467b-b5a7-67f20c2fce38-node-exporter-textfile\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.304951 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.304872 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0835cb09-27df-467b-b5a7-67f20c2fce38-node-exporter-wtmp\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.304951 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.304935 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0835cb09-27df-467b-b5a7-67f20c2fce38-sys\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.305101 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.304966 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" 
(UniqueName: \"kubernetes.io/secret/0835cb09-27df-467b-b5a7-67f20c2fce38-node-exporter-tls\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.305101 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.305012 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0835cb09-27df-467b-b5a7-67f20c2fce38-metrics-client-ca\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.305101 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.305077 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0835cb09-27df-467b-b5a7-67f20c2fce38-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.305209 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.305170 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnddf\" (UniqueName: \"kubernetes.io/projected/0835cb09-27df-467b-b5a7-67f20c2fce38-kube-api-access-nnddf\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.405927 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.405856 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0835cb09-27df-467b-b5a7-67f20c2fce38-root\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.405927 ip-10-0-134-167 kubenswrapper[2579]: I0416 
18:20:38.405894 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0835cb09-27df-467b-b5a7-67f20c2fce38-node-exporter-accelerators-collector-config\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.405927 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.405914 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0835cb09-27df-467b-b5a7-67f20c2fce38-node-exporter-textfile\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.405927 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.405931 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0835cb09-27df-467b-b5a7-67f20c2fce38-node-exporter-wtmp\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.406198 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.405952 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0835cb09-27df-467b-b5a7-67f20c2fce38-sys\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.406198 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.405969 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0835cb09-27df-467b-b5a7-67f20c2fce38-node-exporter-tls\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " 
pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.406198 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.405999 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0835cb09-27df-467b-b5a7-67f20c2fce38-sys\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.406198 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.405970 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0835cb09-27df-467b-b5a7-67f20c2fce38-root\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.406198 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.406007 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0835cb09-27df-467b-b5a7-67f20c2fce38-metrics-client-ca\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.406198 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.406071 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0835cb09-27df-467b-b5a7-67f20c2fce38-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.406198 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.406121 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0835cb09-27df-467b-b5a7-67f20c2fce38-node-exporter-wtmp\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " 
pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.406198 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.406126 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nnddf\" (UniqueName: \"kubernetes.io/projected/0835cb09-27df-467b-b5a7-67f20c2fce38-kube-api-access-nnddf\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.406639 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.406323 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0835cb09-27df-467b-b5a7-67f20c2fce38-node-exporter-textfile\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.406757 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.406731 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0835cb09-27df-467b-b5a7-67f20c2fce38-node-exporter-accelerators-collector-config\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.406839 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.406812 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0835cb09-27df-467b-b5a7-67f20c2fce38-metrics-client-ca\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.408657 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.408637 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/0835cb09-27df-467b-b5a7-67f20c2fce38-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.408919 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.408893 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0835cb09-27df-467b-b5a7-67f20c2fce38-node-exporter-tls\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.415212 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.415182 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnddf\" (UniqueName: \"kubernetes.io/projected/0835cb09-27df-467b-b5a7-67f20c2fce38-kube-api-access-nnddf\") pod \"node-exporter-5sn5t\" (UID: \"0835cb09-27df-467b-b5a7-67f20c2fce38\") " pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.453962 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:38.453941 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-5sn5t" Apr 16 18:20:38.461987 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:20:38.461964 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0835cb09_27df_467b_b5a7_67f20c2fce38.slice/crio-d5de34d0c4607ce3ba5ebf73629766d5d9fe4f295d8d1d9a335f3d7d21494f14 WatchSource:0}: Error finding container d5de34d0c4607ce3ba5ebf73629766d5d9fe4f295d8d1d9a335f3d7d21494f14: Status 404 returned error can't find the container with id d5de34d0c4607ce3ba5ebf73629766d5d9fe4f295d8d1d9a335f3d7d21494f14 Apr 16 18:20:39.247264 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:39.247238 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:20:39.313742 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:39.313720 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5sn5t" event={"ID":"0835cb09-27df-467b-b5a7-67f20c2fce38","Type":"ContainerStarted","Data":"5a402c6ae94c006cae03db6676e6d062d7bd8f8acb5d363f980990285fa2e558"} Apr 16 18:20:39.313868 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:39.313752 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5sn5t" event={"ID":"0835cb09-27df-467b-b5a7-67f20c2fce38","Type":"ContainerStarted","Data":"d5de34d0c4607ce3ba5ebf73629766d5d9fe4f295d8d1d9a335f3d7d21494f14"} Apr 16 18:20:40.321414 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:40.321381 2579 generic.go:358] "Generic (PLEG): container finished" podID="0835cb09-27df-467b-b5a7-67f20c2fce38" containerID="5a402c6ae94c006cae03db6676e6d062d7bd8f8acb5d363f980990285fa2e558" exitCode=0 Apr 16 18:20:40.321765 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:40.321429 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5sn5t" 
event={"ID":"0835cb09-27df-467b-b5a7-67f20c2fce38","Type":"ContainerDied","Data":"5a402c6ae94c006cae03db6676e6d062d7bd8f8acb5d363f980990285fa2e558"} Apr 16 18:20:41.326092 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:41.326054 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5sn5t" event={"ID":"0835cb09-27df-467b-b5a7-67f20c2fce38","Type":"ContainerStarted","Data":"9d2e727deea9781223cee945c9a34db615d32357efd107dbf7d8c7e9de97ca84"} Apr 16 18:20:41.326092 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:41.326089 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5sn5t" event={"ID":"0835cb09-27df-467b-b5a7-67f20c2fce38","Type":"ContainerStarted","Data":"68d9db5a5484715f1895ad5c9b9c29d9429a6d007d92ec5e19d5feb68e30be67"} Apr 16 18:20:41.347279 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:41.347224 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-5sn5t" podStartSLOduration=2.596105101 podStartE2EDuration="3.347210338s" podCreationTimestamp="2026-04-16 18:20:38 +0000 UTC" firstStartedPulling="2026-04-16 18:20:38.463832983 +0000 UTC m=+183.310993464" lastFinishedPulling="2026-04-16 18:20:39.214938218 +0000 UTC m=+184.062098701" observedRunningTime="2026-04-16 18:20:41.345423221 +0000 UTC m=+186.192583714" watchObservedRunningTime="2026-04-16 18:20:41.347210338 +0000 UTC m=+186.194370829" Apr 16 18:20:45.242724 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:45.242692 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-67744d6d54-rrhs6"] Apr 16 18:20:48.920965 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:48.920921 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" podUID="832eb3b9-2a4f-434a-b186-de137df079bb" containerName="service-proxy" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Apr 16 18:20:58.920434 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:20:58.920397 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" podUID="832eb3b9-2a4f-434a-b186-de137df079bb" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 18:21:08.921139 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:08.921101 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" podUID="832eb3b9-2a4f-434a-b186-de137df079bb" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 18:21:08.921517 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:08.921176 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" Apr 16 18:21:08.921724 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:08.921690 2579 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"283da32cd7c1cddf193b855cf72d03d8f3f627bdc702d6b1cf8aca7963357e14"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 18:21:08.921764 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:08.921747 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" podUID="832eb3b9-2a4f-434a-b186-de137df079bb" containerName="service-proxy" containerID="cri-o://283da32cd7c1cddf193b855cf72d03d8f3f627bdc702d6b1cf8aca7963357e14" gracePeriod=30 Apr 16 18:21:09.396871 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:09.396836 2579 generic.go:358] 
"Generic (PLEG): container finished" podID="832eb3b9-2a4f-434a-b186-de137df079bb" containerID="283da32cd7c1cddf193b855cf72d03d8f3f627bdc702d6b1cf8aca7963357e14" exitCode=2 Apr 16 18:21:09.396982 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:09.396890 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" event={"ID":"832eb3b9-2a4f-434a-b186-de137df079bb","Type":"ContainerDied","Data":"283da32cd7c1cddf193b855cf72d03d8f3f627bdc702d6b1cf8aca7963357e14"} Apr 16 18:21:09.396982 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:09.396919 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d9c75dd55-f2hwb" event={"ID":"832eb3b9-2a4f-434a-b186-de137df079bb","Type":"ContainerStarted","Data":"5a2653bfdfc53b00dd047a0e20c2b8fdd8fc794399c4c33c9a6803ae9dc8b85d"} Apr 16 18:21:10.264170 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.264115 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" podUID="e8a62d11-c318-4e87-bc46-0ef49c451c49" containerName="registry" containerID="cri-o://37462c6b7db76c7a50e1894107ef5801b60902a270526c2db12862b64cb7b703" gracePeriod=30 Apr 16 18:21:10.401156 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.401126 2579 generic.go:358] "Generic (PLEG): container finished" podID="e8a62d11-c318-4e87-bc46-0ef49c451c49" containerID="37462c6b7db76c7a50e1894107ef5801b60902a270526c2db12862b64cb7b703" exitCode=0 Apr 16 18:21:10.401313 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.401177 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" event={"ID":"e8a62d11-c318-4e87-bc46-0ef49c451c49","Type":"ContainerDied","Data":"37462c6b7db76c7a50e1894107ef5801b60902a270526c2db12862b64cb7b703"} Apr 16 18:21:10.518900 ip-10-0-134-167 kubenswrapper[2579]: I0416 
18:21:10.518853 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:21:10.619974 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.619950 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e8a62d11-c318-4e87-bc46-0ef49c451c49-trusted-ca\") pod \"e8a62d11-c318-4e87-bc46-0ef49c451c49\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " Apr 16 18:21:10.620097 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.619983 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmnn7\" (UniqueName: \"kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-kube-api-access-mmnn7\") pod \"e8a62d11-c318-4e87-bc46-0ef49c451c49\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " Apr 16 18:21:10.620097 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.620003 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e8a62d11-c318-4e87-bc46-0ef49c451c49-image-registry-private-configuration\") pod \"e8a62d11-c318-4e87-bc46-0ef49c451c49\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " Apr 16 18:21:10.620097 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.620037 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e8a62d11-c318-4e87-bc46-0ef49c451c49-ca-trust-extracted\") pod \"e8a62d11-c318-4e87-bc46-0ef49c451c49\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " Apr 16 18:21:10.620097 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.620064 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/e8a62d11-c318-4e87-bc46-0ef49c451c49-installation-pull-secrets\") pod \"e8a62d11-c318-4e87-bc46-0ef49c451c49\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " Apr 16 18:21:10.620333 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.620104 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-certificates\") pod \"e8a62d11-c318-4e87-bc46-0ef49c451c49\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " Apr 16 18:21:10.620333 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.620149 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls\") pod \"e8a62d11-c318-4e87-bc46-0ef49c451c49\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " Apr 16 18:21:10.620333 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.620176 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-bound-sa-token\") pod \"e8a62d11-c318-4e87-bc46-0ef49c451c49\" (UID: \"e8a62d11-c318-4e87-bc46-0ef49c451c49\") " Apr 16 18:21:10.620579 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.620520 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e8a62d11-c318-4e87-bc46-0ef49c451c49" (UID: "e8a62d11-c318-4e87-bc46-0ef49c451c49"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:21:10.620735 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.620706 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8a62d11-c318-4e87-bc46-0ef49c451c49-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e8a62d11-c318-4e87-bc46-0ef49c451c49" (UID: "e8a62d11-c318-4e87-bc46-0ef49c451c49"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:21:10.622561 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.622483 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e8a62d11-c318-4e87-bc46-0ef49c451c49" (UID: "e8a62d11-c318-4e87-bc46-0ef49c451c49"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:21:10.622668 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.622590 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-kube-api-access-mmnn7" (OuterVolumeSpecName: "kube-api-access-mmnn7") pod "e8a62d11-c318-4e87-bc46-0ef49c451c49" (UID: "e8a62d11-c318-4e87-bc46-0ef49c451c49"). InnerVolumeSpecName "kube-api-access-mmnn7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:21:10.622782 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.622714 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8a62d11-c318-4e87-bc46-0ef49c451c49-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e8a62d11-c318-4e87-bc46-0ef49c451c49" (UID: "e8a62d11-c318-4e87-bc46-0ef49c451c49"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:21:10.622826 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.622775 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e8a62d11-c318-4e87-bc46-0ef49c451c49" (UID: "e8a62d11-c318-4e87-bc46-0ef49c451c49"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:21:10.622826 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.622797 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8a62d11-c318-4e87-bc46-0ef49c451c49-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "e8a62d11-c318-4e87-bc46-0ef49c451c49" (UID: "e8a62d11-c318-4e87-bc46-0ef49c451c49"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:21:10.629874 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.629849 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8a62d11-c318-4e87-bc46-0ef49c451c49-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e8a62d11-c318-4e87-bc46-0ef49c451c49" (UID: "e8a62d11-c318-4e87-bc46-0ef49c451c49"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:21:10.721039 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.721018 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e8a62d11-c318-4e87-bc46-0ef49c451c49-trusted-ca\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\"" Apr 16 18:21:10.721039 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.721036 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mmnn7\" (UniqueName: \"kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-kube-api-access-mmnn7\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\"" Apr 16 18:21:10.721152 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.721047 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e8a62d11-c318-4e87-bc46-0ef49c451c49-image-registry-private-configuration\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\"" Apr 16 18:21:10.721152 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.721057 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e8a62d11-c318-4e87-bc46-0ef49c451c49-ca-trust-extracted\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\"" Apr 16 18:21:10.721152 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.721066 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e8a62d11-c318-4e87-bc46-0ef49c451c49-installation-pull-secrets\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\"" Apr 16 18:21:10.721152 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.721074 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-certificates\") on node \"ip-10-0-134-167.ec2.internal\" 
DevicePath \"\"" Apr 16 18:21:10.721152 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.721083 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-registry-tls\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\"" Apr 16 18:21:10.721152 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:10.721091 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e8a62d11-c318-4e87-bc46-0ef49c451c49-bound-sa-token\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\"" Apr 16 18:21:11.405681 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:11.405639 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" event={"ID":"e8a62d11-c318-4e87-bc46-0ef49c451c49","Type":"ContainerDied","Data":"5e39f3ce6324d69da068d002a43d65da3e0d3ed7023bca1b0f039456d809742d"} Apr 16 18:21:11.405681 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:11.405681 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-67744d6d54-rrhs6" Apr 16 18:21:11.406197 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:11.405701 2579 scope.go:117] "RemoveContainer" containerID="37462c6b7db76c7a50e1894107ef5801b60902a270526c2db12862b64cb7b703" Apr 16 18:21:11.428397 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:11.428350 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-67744d6d54-rrhs6"] Apr 16 18:21:11.433019 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:11.432996 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-67744d6d54-rrhs6"] Apr 16 18:21:11.754304 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:11.754206 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8a62d11-c318-4e87-bc46-0ef49c451c49" path="/var/lib/kubelet/pods/e8a62d11-c318-4e87-bc46-0ef49c451c49/volumes" Apr 16 18:21:47.573502 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:47.573463 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs\") pod \"network-metrics-daemon-k6s9z\" (UID: \"e2bb1680-b343-4014-bde1-6cc6bcd9805c\") " pod="openshift-multus/network-metrics-daemon-k6s9z" Apr 16 18:21:47.575945 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:47.575922 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2bb1680-b343-4014-bde1-6cc6bcd9805c-metrics-certs\") pod \"network-metrics-daemon-k6s9z\" (UID: \"e2bb1680-b343-4014-bde1-6cc6bcd9805c\") " pod="openshift-multus/network-metrics-daemon-k6s9z" Apr 16 18:21:47.853349 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:47.853329 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-k7xtv\"" Apr 16 
18:21:47.861418 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:47.861398 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6s9z" Apr 16 18:21:47.980369 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:47.980337 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-k6s9z"] Apr 16 18:21:47.983256 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:21:47.983229 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2bb1680_b343_4014_bde1_6cc6bcd9805c.slice/crio-000999be6d26184e2488891d09287cbba3c327c68e24392ef2491cee5578059b WatchSource:0}: Error finding container 000999be6d26184e2488891d09287cbba3c327c68e24392ef2491cee5578059b: Status 404 returned error can't find the container with id 000999be6d26184e2488891d09287cbba3c327c68e24392ef2491cee5578059b Apr 16 18:21:48.504195 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:48.504165 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k6s9z" event={"ID":"e2bb1680-b343-4014-bde1-6cc6bcd9805c","Type":"ContainerStarted","Data":"000999be6d26184e2488891d09287cbba3c327c68e24392ef2491cee5578059b"} Apr 16 18:21:49.508704 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:49.508670 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k6s9z" event={"ID":"e2bb1680-b343-4014-bde1-6cc6bcd9805c","Type":"ContainerStarted","Data":"797565e8e891077a082194ef4c4af6bb5f45097346840d2f6e67ff5b1b03d7e7"} Apr 16 18:21:49.508704 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:49.508708 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k6s9z" event={"ID":"e2bb1680-b343-4014-bde1-6cc6bcd9805c","Type":"ContainerStarted","Data":"ff8311211625777de87d1082ab9fa9036ed10b8747eb3f0279cb5e8e6e89af13"} Apr 16 18:21:49.525056 
ip-10-0-134-167 kubenswrapper[2579]: I0416 18:21:49.525017 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-k6s9z" podStartSLOduration=253.568918475 podStartE2EDuration="4m14.525001622s" podCreationTimestamp="2026-04-16 18:17:35 +0000 UTC" firstStartedPulling="2026-04-16 18:21:47.984926058 +0000 UTC m=+252.832086526" lastFinishedPulling="2026-04-16 18:21:48.9410092 +0000 UTC m=+253.788169673" observedRunningTime="2026-04-16 18:21:49.524676687 +0000 UTC m=+254.371837184" watchObservedRunningTime="2026-04-16 18:21:49.525001622 +0000 UTC m=+254.372162114" Apr 16 18:22:35.671132 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:22:35.671098 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/0.log" Apr 16 18:22:35.671677 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:22:35.671206 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/0.log" Apr 16 18:22:35.676153 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:22:35.676130 2579 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:23:18.590620 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:18.590582 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-n9xsw"] Apr 16 18:23:18.591061 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:18.590894 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8a62d11-c318-4e87-bc46-0ef49c451c49" containerName="registry" Apr 16 18:23:18.591061 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:18.590910 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a62d11-c318-4e87-bc46-0ef49c451c49" containerName="registry" Apr 16 18:23:18.591061 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:18.590989 2579 
memory_manager.go:356] "RemoveStaleState removing state" podUID="e8a62d11-c318-4e87-bc46-0ef49c451c49" containerName="registry" Apr 16 18:23:18.593687 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:18.593672 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-n9xsw" Apr 16 18:23:18.595922 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:18.595892 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 18:23:18.595922 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:18.595911 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 18:23:18.595922 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:18.595892 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 18:23:18.596533 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:18.596507 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-pffcp\"" Apr 16 18:23:18.596533 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:18.596515 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 18:23:18.602398 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:18.602374 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-n9xsw"] Apr 16 18:23:18.768512 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:18.768488 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/65b09ec5-8db8-46d8-a225-468c63b5a2d9-certificates\") pod \"keda-admission-cf49989db-n9xsw\" (UID: \"65b09ec5-8db8-46d8-a225-468c63b5a2d9\") " 
pod="openshift-keda/keda-admission-cf49989db-n9xsw" Apr 16 18:23:18.768634 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:18.768518 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnlbg\" (UniqueName: \"kubernetes.io/projected/65b09ec5-8db8-46d8-a225-468c63b5a2d9-kube-api-access-qnlbg\") pod \"keda-admission-cf49989db-n9xsw\" (UID: \"65b09ec5-8db8-46d8-a225-468c63b5a2d9\") " pod="openshift-keda/keda-admission-cf49989db-n9xsw" Apr 16 18:23:18.868976 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:18.868955 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/65b09ec5-8db8-46d8-a225-468c63b5a2d9-certificates\") pod \"keda-admission-cf49989db-n9xsw\" (UID: \"65b09ec5-8db8-46d8-a225-468c63b5a2d9\") " pod="openshift-keda/keda-admission-cf49989db-n9xsw" Apr 16 18:23:18.869085 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:18.868983 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qnlbg\" (UniqueName: \"kubernetes.io/projected/65b09ec5-8db8-46d8-a225-468c63b5a2d9-kube-api-access-qnlbg\") pod \"keda-admission-cf49989db-n9xsw\" (UID: \"65b09ec5-8db8-46d8-a225-468c63b5a2d9\") " pod="openshift-keda/keda-admission-cf49989db-n9xsw" Apr 16 18:23:18.871547 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:18.871522 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/65b09ec5-8db8-46d8-a225-468c63b5a2d9-certificates\") pod \"keda-admission-cf49989db-n9xsw\" (UID: \"65b09ec5-8db8-46d8-a225-468c63b5a2d9\") " pod="openshift-keda/keda-admission-cf49989db-n9xsw" Apr 16 18:23:18.881336 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:18.881312 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnlbg\" (UniqueName: 
\"kubernetes.io/projected/65b09ec5-8db8-46d8-a225-468c63b5a2d9-kube-api-access-qnlbg\") pod \"keda-admission-cf49989db-n9xsw\" (UID: \"65b09ec5-8db8-46d8-a225-468c63b5a2d9\") " pod="openshift-keda/keda-admission-cf49989db-n9xsw" Apr 16 18:23:18.904377 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:18.904353 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-n9xsw" Apr 16 18:23:19.017647 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:19.017629 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-n9xsw"] Apr 16 18:23:19.020091 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:23:19.020064 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65b09ec5_8db8_46d8_a225_468c63b5a2d9.slice/crio-c24ea4d3fe1efb0bdf688440cd48e35d294c1875471b1117f912a1bc2da02713 WatchSource:0}: Error finding container c24ea4d3fe1efb0bdf688440cd48e35d294c1875471b1117f912a1bc2da02713: Status 404 returned error can't find the container with id c24ea4d3fe1efb0bdf688440cd48e35d294c1875471b1117f912a1bc2da02713 Apr 16 18:23:19.021259 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:19.021243 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:23:19.767909 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:19.767861 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-n9xsw" event={"ID":"65b09ec5-8db8-46d8-a225-468c63b5a2d9","Type":"ContainerStarted","Data":"c24ea4d3fe1efb0bdf688440cd48e35d294c1875471b1117f912a1bc2da02713"} Apr 16 18:23:21.774954 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:21.774921 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-n9xsw" 
event={"ID":"65b09ec5-8db8-46d8-a225-468c63b5a2d9","Type":"ContainerStarted","Data":"b10cb74b5b6c94c6201f3c91d7cf6ebf7881ee2bd08bb4d9260f06f9c5c8e9f2"} Apr 16 18:23:21.775320 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:21.775066 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-n9xsw" Apr 16 18:23:21.792494 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:21.792444 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-n9xsw" podStartSLOduration=1.47978313 podStartE2EDuration="3.79240859s" podCreationTimestamp="2026-04-16 18:23:18 +0000 UTC" firstStartedPulling="2026-04-16 18:23:19.021399755 +0000 UTC m=+343.868560223" lastFinishedPulling="2026-04-16 18:23:21.334025198 +0000 UTC m=+346.181185683" observedRunningTime="2026-04-16 18:23:21.790734747 +0000 UTC m=+346.637895238" watchObservedRunningTime="2026-04-16 18:23:21.79240859 +0000 UTC m=+346.639569081" Apr 16 18:23:42.780700 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:23:42.780669 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-n9xsw" Apr 16 18:24:26.959519 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:24:26.959490 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7668d57578-pls42"] Apr 16 18:24:26.961486 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:24:26.961462 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7668d57578-pls42" Apr 16 18:24:26.963805 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:24:26.963784 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 18:24:26.965796 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:24:26.965778 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 18:24:26.965899 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:24:26.965847 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-k6bsp\"" Apr 16 18:24:26.965962 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:24:26.965905 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 18:24:26.981900 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:24:26.981877 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7668d57578-pls42"] Apr 16 18:24:26.998943 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:24:26.998920 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd5f0449-dcf1-4855-8563-6fa72c56fa6a-cert\") pod \"kserve-controller-manager-7668d57578-pls42\" (UID: \"cd5f0449-dcf1-4855-8563-6fa72c56fa6a\") " pod="kserve/kserve-controller-manager-7668d57578-pls42" Apr 16 18:24:26.999028 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:24:26.998971 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkf2s\" (UniqueName: \"kubernetes.io/projected/cd5f0449-dcf1-4855-8563-6fa72c56fa6a-kube-api-access-hkf2s\") pod \"kserve-controller-manager-7668d57578-pls42\" (UID: \"cd5f0449-dcf1-4855-8563-6fa72c56fa6a\") " pod="kserve/kserve-controller-manager-7668d57578-pls42" Apr 16 
18:24:27.099975 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:24:27.099951 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd5f0449-dcf1-4855-8563-6fa72c56fa6a-cert\") pod \"kserve-controller-manager-7668d57578-pls42\" (UID: \"cd5f0449-dcf1-4855-8563-6fa72c56fa6a\") " pod="kserve/kserve-controller-manager-7668d57578-pls42" Apr 16 18:24:27.100065 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:24:27.099993 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkf2s\" (UniqueName: \"kubernetes.io/projected/cd5f0449-dcf1-4855-8563-6fa72c56fa6a-kube-api-access-hkf2s\") pod \"kserve-controller-manager-7668d57578-pls42\" (UID: \"cd5f0449-dcf1-4855-8563-6fa72c56fa6a\") " pod="kserve/kserve-controller-manager-7668d57578-pls42" Apr 16 18:24:27.102209 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:24:27.102191 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd5f0449-dcf1-4855-8563-6fa72c56fa6a-cert\") pod \"kserve-controller-manager-7668d57578-pls42\" (UID: \"cd5f0449-dcf1-4855-8563-6fa72c56fa6a\") " pod="kserve/kserve-controller-manager-7668d57578-pls42" Apr 16 18:24:27.110677 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:24:27.110651 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkf2s\" (UniqueName: \"kubernetes.io/projected/cd5f0449-dcf1-4855-8563-6fa72c56fa6a-kube-api-access-hkf2s\") pod \"kserve-controller-manager-7668d57578-pls42\" (UID: \"cd5f0449-dcf1-4855-8563-6fa72c56fa6a\") " pod="kserve/kserve-controller-manager-7668d57578-pls42" Apr 16 18:24:27.271577 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:24:27.271490 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7668d57578-pls42" Apr 16 18:24:27.384789 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:24:27.384765 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7668d57578-pls42"] Apr 16 18:24:27.387191 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:24:27.387164 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd5f0449_dcf1_4855_8563_6fa72c56fa6a.slice/crio-e1e429f12d6782b58b74a7d6290f9495a959ffc7e8f19f253e730548a5da4006 WatchSource:0}: Error finding container e1e429f12d6782b58b74a7d6290f9495a959ffc7e8f19f253e730548a5da4006: Status 404 returned error can't find the container with id e1e429f12d6782b58b74a7d6290f9495a959ffc7e8f19f253e730548a5da4006 Apr 16 18:24:27.951049 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:24:27.951005 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7668d57578-pls42" event={"ID":"cd5f0449-dcf1-4855-8563-6fa72c56fa6a","Type":"ContainerStarted","Data":"e1e429f12d6782b58b74a7d6290f9495a959ffc7e8f19f253e730548a5da4006"} Apr 16 18:24:30.960956 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:24:30.960918 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7668d57578-pls42" event={"ID":"cd5f0449-dcf1-4855-8563-6fa72c56fa6a","Type":"ContainerStarted","Data":"52d80090c9bb492a160f965119bc9ab92a3f20db691375197a11ef1da698ba8c"} Apr 16 18:24:30.961369 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:24:30.960977 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7668d57578-pls42" Apr 16 18:24:30.978500 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:24:30.978454 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7668d57578-pls42" podStartSLOduration=2.180145091 
podStartE2EDuration="4.978439834s" podCreationTimestamp="2026-04-16 18:24:26 +0000 UTC" firstStartedPulling="2026-04-16 18:24:27.38848939 +0000 UTC m=+412.235649859" lastFinishedPulling="2026-04-16 18:24:30.186784128 +0000 UTC m=+415.033944602" observedRunningTime="2026-04-16 18:24:30.977581108 +0000 UTC m=+415.824741599" watchObservedRunningTime="2026-04-16 18:24:30.978439834 +0000 UTC m=+415.825600326" Apr 16 18:25:01.969425 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:01.969389 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7668d57578-pls42" Apr 16 18:25:03.383685 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:03.383648 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7668d57578-pls42"] Apr 16 18:25:03.384109 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:03.383843 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-7668d57578-pls42" podUID="cd5f0449-dcf1-4855-8563-6fa72c56fa6a" containerName="manager" containerID="cri-o://52d80090c9bb492a160f965119bc9ab92a3f20db691375197a11ef1da698ba8c" gracePeriod=10 Apr 16 18:25:03.407494 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:03.407468 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7668d57578-rgt4s"] Apr 16 18:25:03.409522 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:03.409506 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7668d57578-rgt4s" Apr 16 18:25:03.418961 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:03.418932 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7668d57578-rgt4s"] Apr 16 18:25:03.542666 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:03.542630 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a287e58a-ddbe-4e6d-9fc0-b96bb20515af-cert\") pod \"kserve-controller-manager-7668d57578-rgt4s\" (UID: \"a287e58a-ddbe-4e6d-9fc0-b96bb20515af\") " pod="kserve/kserve-controller-manager-7668d57578-rgt4s" Apr 16 18:25:03.542778 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:03.542681 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjbl4\" (UniqueName: \"kubernetes.io/projected/a287e58a-ddbe-4e6d-9fc0-b96bb20515af-kube-api-access-jjbl4\") pod \"kserve-controller-manager-7668d57578-rgt4s\" (UID: \"a287e58a-ddbe-4e6d-9fc0-b96bb20515af\") " pod="kserve/kserve-controller-manager-7668d57578-rgt4s" Apr 16 18:25:03.609685 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:03.609665 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7668d57578-pls42" Apr 16 18:25:03.643929 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:03.643870 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjbl4\" (UniqueName: \"kubernetes.io/projected/a287e58a-ddbe-4e6d-9fc0-b96bb20515af-kube-api-access-jjbl4\") pod \"kserve-controller-manager-7668d57578-rgt4s\" (UID: \"a287e58a-ddbe-4e6d-9fc0-b96bb20515af\") " pod="kserve/kserve-controller-manager-7668d57578-rgt4s" Apr 16 18:25:03.644046 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:03.643946 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a287e58a-ddbe-4e6d-9fc0-b96bb20515af-cert\") pod \"kserve-controller-manager-7668d57578-rgt4s\" (UID: \"a287e58a-ddbe-4e6d-9fc0-b96bb20515af\") " pod="kserve/kserve-controller-manager-7668d57578-rgt4s" Apr 16 18:25:03.646226 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:03.646195 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a287e58a-ddbe-4e6d-9fc0-b96bb20515af-cert\") pod \"kserve-controller-manager-7668d57578-rgt4s\" (UID: \"a287e58a-ddbe-4e6d-9fc0-b96bb20515af\") " pod="kserve/kserve-controller-manager-7668d57578-rgt4s" Apr 16 18:25:03.653737 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:03.653711 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjbl4\" (UniqueName: \"kubernetes.io/projected/a287e58a-ddbe-4e6d-9fc0-b96bb20515af-kube-api-access-jjbl4\") pod \"kserve-controller-manager-7668d57578-rgt4s\" (UID: \"a287e58a-ddbe-4e6d-9fc0-b96bb20515af\") " pod="kserve/kserve-controller-manager-7668d57578-rgt4s" Apr 16 18:25:03.744440 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:03.744415 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/cd5f0449-dcf1-4855-8563-6fa72c56fa6a-cert\") pod \"cd5f0449-dcf1-4855-8563-6fa72c56fa6a\" (UID: \"cd5f0449-dcf1-4855-8563-6fa72c56fa6a\") " Apr 16 18:25:03.744523 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:03.744454 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkf2s\" (UniqueName: \"kubernetes.io/projected/cd5f0449-dcf1-4855-8563-6fa72c56fa6a-kube-api-access-hkf2s\") pod \"cd5f0449-dcf1-4855-8563-6fa72c56fa6a\" (UID: \"cd5f0449-dcf1-4855-8563-6fa72c56fa6a\") " Apr 16 18:25:03.746326 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:03.746303 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd5f0449-dcf1-4855-8563-6fa72c56fa6a-cert" (OuterVolumeSpecName: "cert") pod "cd5f0449-dcf1-4855-8563-6fa72c56fa6a" (UID: "cd5f0449-dcf1-4855-8563-6fa72c56fa6a"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:25:03.746410 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:03.746390 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd5f0449-dcf1-4855-8563-6fa72c56fa6a-kube-api-access-hkf2s" (OuterVolumeSpecName: "kube-api-access-hkf2s") pod "cd5f0449-dcf1-4855-8563-6fa72c56fa6a" (UID: "cd5f0449-dcf1-4855-8563-6fa72c56fa6a"). InnerVolumeSpecName "kube-api-access-hkf2s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:25:03.760166 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:03.760148 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7668d57578-rgt4s" Apr 16 18:25:03.845791 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:03.845763 2579 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd5f0449-dcf1-4855-8563-6fa72c56fa6a-cert\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\"" Apr 16 18:25:03.845907 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:03.845796 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hkf2s\" (UniqueName: \"kubernetes.io/projected/cd5f0449-dcf1-4855-8563-6fa72c56fa6a-kube-api-access-hkf2s\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\"" Apr 16 18:25:03.875321 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:03.875254 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7668d57578-rgt4s"] Apr 16 18:25:03.878054 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:25:03.878027 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda287e58a_ddbe_4e6d_9fc0_b96bb20515af.slice/crio-6de95bb98a1cbd8af72c73b1a4569dcab4fcd70dad82b41356c91b44ad439078 WatchSource:0}: Error finding container 6de95bb98a1cbd8af72c73b1a4569dcab4fcd70dad82b41356c91b44ad439078: Status 404 returned error can't find the container with id 6de95bb98a1cbd8af72c73b1a4569dcab4fcd70dad82b41356c91b44ad439078 Apr 16 18:25:04.050323 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:04.050235 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7668d57578-rgt4s" event={"ID":"a287e58a-ddbe-4e6d-9fc0-b96bb20515af","Type":"ContainerStarted","Data":"6de95bb98a1cbd8af72c73b1a4569dcab4fcd70dad82b41356c91b44ad439078"} Apr 16 18:25:04.051313 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:04.051290 2579 generic.go:358] "Generic (PLEG): container finished" podID="cd5f0449-dcf1-4855-8563-6fa72c56fa6a" 
containerID="52d80090c9bb492a160f965119bc9ab92a3f20db691375197a11ef1da698ba8c" exitCode=0 Apr 16 18:25:04.051423 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:04.051338 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7668d57578-pls42" event={"ID":"cd5f0449-dcf1-4855-8563-6fa72c56fa6a","Type":"ContainerDied","Data":"52d80090c9bb492a160f965119bc9ab92a3f20db691375197a11ef1da698ba8c"} Apr 16 18:25:04.051423 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:04.051351 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7668d57578-pls42" Apr 16 18:25:04.051423 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:04.051367 2579 scope.go:117] "RemoveContainer" containerID="52d80090c9bb492a160f965119bc9ab92a3f20db691375197a11ef1da698ba8c" Apr 16 18:25:04.051552 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:04.051357 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7668d57578-pls42" event={"ID":"cd5f0449-dcf1-4855-8563-6fa72c56fa6a","Type":"ContainerDied","Data":"e1e429f12d6782b58b74a7d6290f9495a959ffc7e8f19f253e730548a5da4006"} Apr 16 18:25:04.058643 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:04.058625 2579 scope.go:117] "RemoveContainer" containerID="52d80090c9bb492a160f965119bc9ab92a3f20db691375197a11ef1da698ba8c" Apr 16 18:25:04.058919 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:25:04.058898 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52d80090c9bb492a160f965119bc9ab92a3f20db691375197a11ef1da698ba8c\": container with ID starting with 52d80090c9bb492a160f965119bc9ab92a3f20db691375197a11ef1da698ba8c not found: ID does not exist" containerID="52d80090c9bb492a160f965119bc9ab92a3f20db691375197a11ef1da698ba8c" Apr 16 18:25:04.058980 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:04.058927 2579 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52d80090c9bb492a160f965119bc9ab92a3f20db691375197a11ef1da698ba8c"} err="failed to get container status \"52d80090c9bb492a160f965119bc9ab92a3f20db691375197a11ef1da698ba8c\": rpc error: code = NotFound desc = could not find container \"52d80090c9bb492a160f965119bc9ab92a3f20db691375197a11ef1da698ba8c\": container with ID starting with 52d80090c9bb492a160f965119bc9ab92a3f20db691375197a11ef1da698ba8c not found: ID does not exist" Apr 16 18:25:04.068622 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:04.068599 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7668d57578-pls42"] Apr 16 18:25:04.073242 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:04.073221 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-7668d57578-pls42"] Apr 16 18:25:05.055675 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:05.055620 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7668d57578-rgt4s" event={"ID":"a287e58a-ddbe-4e6d-9fc0-b96bb20515af","Type":"ContainerStarted","Data":"987cbfb65777bbd3d89f70c16ba47dcba181fe16725f52897077234a06d5f6eb"} Apr 16 18:25:05.056336 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:05.055720 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7668d57578-rgt4s" Apr 16 18:25:05.075817 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:05.075772 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7668d57578-rgt4s" podStartSLOduration=1.802962593 podStartE2EDuration="2.075756916s" podCreationTimestamp="2026-04-16 18:25:03 +0000 UTC" firstStartedPulling="2026-04-16 18:25:03.879246207 +0000 UTC m=+448.726406676" lastFinishedPulling="2026-04-16 18:25:04.15204053 +0000 UTC m=+448.999200999" observedRunningTime="2026-04-16 18:25:05.074694787 +0000 UTC 
m=+449.921855280" watchObservedRunningTime="2026-04-16 18:25:05.075756916 +0000 UTC m=+449.922917407" Apr 16 18:25:05.753875 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:05.753846 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd5f0449-dcf1-4855-8563-6fa72c56fa6a" path="/var/lib/kubelet/pods/cd5f0449-dcf1-4855-8563-6fa72c56fa6a/volumes" Apr 16 18:25:36.064593 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:25:36.064507 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7668d57578-rgt4s" Apr 16 18:26:09.627166 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:09.627129 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt"] Apr 16 18:26:09.627837 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:09.627481 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd5f0449-dcf1-4855-8563-6fa72c56fa6a" containerName="manager" Apr 16 18:26:09.627837 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:09.627500 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd5f0449-dcf1-4855-8563-6fa72c56fa6a" containerName="manager" Apr 16 18:26:09.627837 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:09.627584 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd5f0449-dcf1-4855-8563-6fa72c56fa6a" containerName="manager" Apr 16 18:26:09.629603 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:09.629581 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt"
Apr 16 18:26:09.632040 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:09.632018 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-kt2gs\""
Apr 16 18:26:09.638200 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:09.638178 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt"]
Apr 16 18:26:09.677676 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:09.677654 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca1a972f-21c5-48f9-a64f-1cec62e34e77-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-84f646df87-bffbt\" (UID: \"ca1a972f-21c5-48f9-a64f-1cec62e34e77\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt"
Apr 16 18:26:09.778509 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:09.778481 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca1a972f-21c5-48f9-a64f-1cec62e34e77-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-84f646df87-bffbt\" (UID: \"ca1a972f-21c5-48f9-a64f-1cec62e34e77\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt"
Apr 16 18:26:09.778877 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:09.778857 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca1a972f-21c5-48f9-a64f-1cec62e34e77-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-84f646df87-bffbt\" (UID: \"ca1a972f-21c5-48f9-a64f-1cec62e34e77\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt"
Apr 16 18:26:09.868764 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:09.868739 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj"]
Apr 16 18:26:09.870977 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:09.870963 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj"
Apr 16 18:26:09.879413 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:09.879351 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e3c6afc8-1f23-4511-9ae7-0a13e296333e-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-kl7jj\" (UID: \"e3c6afc8-1f23-4511-9ae7-0a13e296333e\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj"
Apr 16 18:26:09.884567 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:09.884546 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj"]
Apr 16 18:26:09.940500 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:09.940472 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt"
Apr 16 18:26:09.980178 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:09.980150 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e3c6afc8-1f23-4511-9ae7-0a13e296333e-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-kl7jj\" (UID: \"e3c6afc8-1f23-4511-9ae7-0a13e296333e\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj"
Apr 16 18:26:09.980476 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:09.980458 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e3c6afc8-1f23-4511-9ae7-0a13e296333e-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-kl7jj\" (UID: \"e3c6afc8-1f23-4511-9ae7-0a13e296333e\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj"
Apr 16 18:26:10.061478 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:10.061437 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt"]
Apr 16 18:26:10.064736 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:26:10.064705 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca1a972f_21c5_48f9_a64f_1cec62e34e77.slice/crio-7d3e0592c625c574182ee7c7fe5033ad1c431c7499c8e71c55eaa8d5c72510ee WatchSource:0}: Error finding container 7d3e0592c625c574182ee7c7fe5033ad1c431c7499c8e71c55eaa8d5c72510ee: Status 404 returned error can't find the container with id 7d3e0592c625c574182ee7c7fe5033ad1c431c7499c8e71c55eaa8d5c72510ee
Apr 16 18:26:10.179623 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:10.179549 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj"
Apr 16 18:26:10.236291 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:10.236186 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt" event={"ID":"ca1a972f-21c5-48f9-a64f-1cec62e34e77","Type":"ContainerStarted","Data":"7d3e0592c625c574182ee7c7fe5033ad1c431c7499c8e71c55eaa8d5c72510ee"}
Apr 16 18:26:10.299110 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:10.299077 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj"]
Apr 16 18:26:10.301909 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:26:10.301867 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3c6afc8_1f23_4511_9ae7_0a13e296333e.slice/crio-ed09c9b38532c05e3aace205883a0487c307eb7d4dd666489df577c01079fcd6 WatchSource:0}: Error finding container ed09c9b38532c05e3aace205883a0487c307eb7d4dd666489df577c01079fcd6: Status 404 returned error can't find the container with id ed09c9b38532c05e3aace205883a0487c307eb7d4dd666489df577c01079fcd6
Apr 16 18:26:11.241998 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:11.241955 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj" event={"ID":"e3c6afc8-1f23-4511-9ae7-0a13e296333e","Type":"ContainerStarted","Data":"ed09c9b38532c05e3aace205883a0487c307eb7d4dd666489df577c01079fcd6"}
Apr 16 18:26:14.252113 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:14.252085 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt" event={"ID":"ca1a972f-21c5-48f9-a64f-1cec62e34e77","Type":"ContainerStarted","Data":"5e104643e8fc70c0f6cd7bc04ccdedde755b870c10ee776d550e89cc03024009"}
Apr 16 18:26:14.253456 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:14.253430 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj" event={"ID":"e3c6afc8-1f23-4511-9ae7-0a13e296333e","Type":"ContainerStarted","Data":"6136b566bd1ba3d328cbc61b8da25f5600ae8706328699117ba86cdc7120e5a0"}
Apr 16 18:26:18.267055 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:18.267023 2579 generic.go:358] "Generic (PLEG): container finished" podID="ca1a972f-21c5-48f9-a64f-1cec62e34e77" containerID="5e104643e8fc70c0f6cd7bc04ccdedde755b870c10ee776d550e89cc03024009" exitCode=0
Apr 16 18:26:18.267429 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:18.267101 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt" event={"ID":"ca1a972f-21c5-48f9-a64f-1cec62e34e77","Type":"ContainerDied","Data":"5e104643e8fc70c0f6cd7bc04ccdedde755b870c10ee776d550e89cc03024009"}
Apr 16 18:26:18.268605 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:18.268587 2579 generic.go:358] "Generic (PLEG): container finished" podID="e3c6afc8-1f23-4511-9ae7-0a13e296333e" containerID="6136b566bd1ba3d328cbc61b8da25f5600ae8706328699117ba86cdc7120e5a0" exitCode=0
Apr 16 18:26:18.268660 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:18.268629 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj" event={"ID":"e3c6afc8-1f23-4511-9ae7-0a13e296333e","Type":"ContainerDied","Data":"6136b566bd1ba3d328cbc61b8da25f5600ae8706328699117ba86cdc7120e5a0"}
Apr 16 18:26:42.362954 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:42.362919 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj" event={"ID":"e3c6afc8-1f23-4511-9ae7-0a13e296333e","Type":"ContainerStarted","Data":"52cd897d04bd8d7f567d8228bd106c54e476d32878c563525b815a703bff4061"}
Apr 16 18:26:42.363433 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:42.363236 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj"
Apr 16 18:26:42.364381 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:42.364349 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj" podUID="e3c6afc8-1f23-4511-9ae7-0a13e296333e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 16 18:26:42.364465 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:42.364425 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt" event={"ID":"ca1a972f-21c5-48f9-a64f-1cec62e34e77","Type":"ContainerStarted","Data":"fd2e7c4752abf5e95ec9add84be356dd91505cd8dbc654eccaf158e77ece441d"}
Apr 16 18:26:42.364687 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:42.364674 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt"
Apr 16 18:26:42.365528 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:42.365505 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt" podUID="ca1a972f-21c5-48f9-a64f-1cec62e34e77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused"
Apr 16 18:26:42.379855 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:42.379820 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj" podStartSLOduration=1.712514412 podStartE2EDuration="33.379811159s" podCreationTimestamp="2026-04-16 18:26:09 +0000 UTC" firstStartedPulling="2026-04-16 18:26:10.303681898 +0000 UTC m=+515.150842367" lastFinishedPulling="2026-04-16 18:26:41.970978637 +0000 UTC m=+546.818139114" observedRunningTime="2026-04-16 18:26:42.379668931 +0000 UTC m=+547.226829422" watchObservedRunningTime="2026-04-16 18:26:42.379811159 +0000 UTC m=+547.226971694"
Apr 16 18:26:42.402086 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:42.402047 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt" podStartSLOduration=1.878044768 podStartE2EDuration="33.402036548s" podCreationTimestamp="2026-04-16 18:26:09 +0000 UTC" firstStartedPulling="2026-04-16 18:26:10.066553989 +0000 UTC m=+514.913714473" lastFinishedPulling="2026-04-16 18:26:41.590545781 +0000 UTC m=+546.437706253" observedRunningTime="2026-04-16 18:26:42.400198213 +0000 UTC m=+547.247358704" watchObservedRunningTime="2026-04-16 18:26:42.402036548 +0000 UTC m=+547.249197039"
Apr 16 18:26:43.367537 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:43.367502 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj" podUID="e3c6afc8-1f23-4511-9ae7-0a13e296333e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 16 18:26:43.367941 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:43.367505 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt" podUID="ca1a972f-21c5-48f9-a64f-1cec62e34e77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused"
Apr 16 18:26:53.367515 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:53.367471 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj" podUID="e3c6afc8-1f23-4511-9ae7-0a13e296333e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 16 18:26:53.367908 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:26:53.367488 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt" podUID="ca1a972f-21c5-48f9-a64f-1cec62e34e77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused"
Apr 16 18:27:03.367731 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:03.367684 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt" podUID="ca1a972f-21c5-48f9-a64f-1cec62e34e77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused"
Apr 16 18:27:03.368091 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:03.367684 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj" podUID="e3c6afc8-1f23-4511-9ae7-0a13e296333e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 16 18:27:13.368172 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:13.368130 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj" podUID="e3c6afc8-1f23-4511-9ae7-0a13e296333e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 16 18:27:13.368172 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:13.368141 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt" podUID="ca1a972f-21c5-48f9-a64f-1cec62e34e77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused"
Apr 16 18:27:23.368026 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:23.367980 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj" podUID="e3c6afc8-1f23-4511-9ae7-0a13e296333e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 16 18:27:23.368026 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:23.367992 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt" podUID="ca1a972f-21c5-48f9-a64f-1cec62e34e77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused"
Apr 16 18:27:29.618895 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:29.618861 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c"]
Apr 16 18:27:29.621813 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:29.621789 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c"
Apr 16 18:27:29.624352 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:29.624329 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-3596a-serving-cert\""
Apr 16 18:27:29.624485 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:29.624357 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 18:27:29.624485 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:29.624343 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-3596a-kube-rbac-proxy-sar-config\""
Apr 16 18:27:29.629103 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:29.629082 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c"]
Apr 16 18:27:29.727747 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:29.727723 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/441d2dc0-7718-4f9d-ace6-9b35dae5182a-openshift-service-ca-bundle\") pod \"switch-graph-3596a-5c45cb7657-nxb8c\" (UID: \"441d2dc0-7718-4f9d-ace6-9b35dae5182a\") " pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c"
Apr 16 18:27:29.727880 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:29.727787 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/441d2dc0-7718-4f9d-ace6-9b35dae5182a-proxy-tls\") pod \"switch-graph-3596a-5c45cb7657-nxb8c\" (UID: \"441d2dc0-7718-4f9d-ace6-9b35dae5182a\") " pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c"
Apr 16 18:27:29.828986 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:29.828960 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/441d2dc0-7718-4f9d-ace6-9b35dae5182a-proxy-tls\") pod \"switch-graph-3596a-5c45cb7657-nxb8c\" (UID: \"441d2dc0-7718-4f9d-ace6-9b35dae5182a\") " pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c"
Apr 16 18:27:29.829110 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:29.829029 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/441d2dc0-7718-4f9d-ace6-9b35dae5182a-openshift-service-ca-bundle\") pod \"switch-graph-3596a-5c45cb7657-nxb8c\" (UID: \"441d2dc0-7718-4f9d-ace6-9b35dae5182a\") " pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c"
Apr 16 18:27:29.829159 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:27:29.829107 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-3596a-serving-cert: secret "switch-graph-3596a-serving-cert" not found
Apr 16 18:27:29.829193 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:27:29.829184 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/441d2dc0-7718-4f9d-ace6-9b35dae5182a-proxy-tls podName:441d2dc0-7718-4f9d-ace6-9b35dae5182a nodeName:}" failed. No retries permitted until 2026-04-16 18:27:30.329162592 +0000 UTC m=+595.176323079 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/441d2dc0-7718-4f9d-ace6-9b35dae5182a-proxy-tls") pod "switch-graph-3596a-5c45cb7657-nxb8c" (UID: "441d2dc0-7718-4f9d-ace6-9b35dae5182a") : secret "switch-graph-3596a-serving-cert" not found
Apr 16 18:27:29.829681 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:29.829663 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/441d2dc0-7718-4f9d-ace6-9b35dae5182a-openshift-service-ca-bundle\") pod \"switch-graph-3596a-5c45cb7657-nxb8c\" (UID: \"441d2dc0-7718-4f9d-ace6-9b35dae5182a\") " pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c"
Apr 16 18:27:30.332483 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:30.332439 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/441d2dc0-7718-4f9d-ace6-9b35dae5182a-proxy-tls\") pod \"switch-graph-3596a-5c45cb7657-nxb8c\" (UID: \"441d2dc0-7718-4f9d-ace6-9b35dae5182a\") " pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c"
Apr 16 18:27:30.334841 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:30.334811 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/441d2dc0-7718-4f9d-ace6-9b35dae5182a-proxy-tls\") pod \"switch-graph-3596a-5c45cb7657-nxb8c\" (UID: \"441d2dc0-7718-4f9d-ace6-9b35dae5182a\") " pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c"
Apr 16 18:27:30.534881 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:30.534856 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c"
Apr 16 18:27:30.651847 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:30.651819 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c"]
Apr 16 18:27:30.654717 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:27:30.654694 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod441d2dc0_7718_4f9d_ace6_9b35dae5182a.slice/crio-e0837fc8ea7fa9459b5c9e5cc4be594fa5b91d03dbb7a6761d550e95ff3c7dda WatchSource:0}: Error finding container e0837fc8ea7fa9459b5c9e5cc4be594fa5b91d03dbb7a6761d550e95ff3c7dda: Status 404 returned error can't find the container with id e0837fc8ea7fa9459b5c9e5cc4be594fa5b91d03dbb7a6761d550e95ff3c7dda
Apr 16 18:27:31.502848 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:31.502813 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c" event={"ID":"441d2dc0-7718-4f9d-ace6-9b35dae5182a","Type":"ContainerStarted","Data":"e0837fc8ea7fa9459b5c9e5cc4be594fa5b91d03dbb7a6761d550e95ff3c7dda"}
Apr 16 18:27:33.367851 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:33.367767 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt" podUID="ca1a972f-21c5-48f9-a64f-1cec62e34e77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused"
Apr 16 18:27:33.368154 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:33.367767 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj" podUID="e3c6afc8-1f23-4511-9ae7-0a13e296333e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 16 18:27:33.509546 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:33.509515 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c" event={"ID":"441d2dc0-7718-4f9d-ace6-9b35dae5182a","Type":"ContainerStarted","Data":"9abea88bc2f87dafa70808e4ed10a7782600f9e3be7f94544ee70bf61ad7fe38"}
Apr 16 18:27:33.509677 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:33.509627 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c"
Apr 16 18:27:33.526146 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:33.525855 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c" podStartSLOduration=2.061299957 podStartE2EDuration="4.525839494s" podCreationTimestamp="2026-04-16 18:27:29 +0000 UTC" firstStartedPulling="2026-04-16 18:27:30.656545016 +0000 UTC m=+595.503705493" lastFinishedPulling="2026-04-16 18:27:33.121084546 +0000 UTC m=+597.968245030" observedRunningTime="2026-04-16 18:27:33.525674546 +0000 UTC m=+598.372835038" watchObservedRunningTime="2026-04-16 18:27:33.525839494 +0000 UTC m=+598.372999993"
Apr 16 18:27:35.691918 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:35.691886 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/0.log"
Apr 16 18:27:35.692496 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:35.691886 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/0.log"
Apr 16 18:27:39.517670 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:39.517641 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c"
Apr 16 18:27:39.846179 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:39.846090 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c"]
Apr 16 18:27:39.846515 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:39.846461 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c" podUID="441d2dc0-7718-4f9d-ace6-9b35dae5182a" containerName="switch-graph-3596a" containerID="cri-o://9abea88bc2f87dafa70808e4ed10a7782600f9e3be7f94544ee70bf61ad7fe38" gracePeriod=30
Apr 16 18:27:43.367949 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:43.367914 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt" podUID="ca1a972f-21c5-48f9-a64f-1cec62e34e77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused"
Apr 16 18:27:43.368844 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:43.368827 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj"
Apr 16 18:27:44.516959 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:44.516917 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c" podUID="441d2dc0-7718-4f9d-ace6-9b35dae5182a" containerName="switch-graph-3596a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:27:49.515997 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:49.515949 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c" podUID="441d2dc0-7718-4f9d-ace6-9b35dae5182a" containerName="switch-graph-3596a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:27:53.369032 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:53.368941 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt"
Apr 16 18:27:54.516353 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:54.516314 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c" podUID="441d2dc0-7718-4f9d-ace6-9b35dae5182a" containerName="switch-graph-3596a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:27:54.516754 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:54.516423 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c"
Apr 16 18:27:59.516641 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:27:59.516599 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c" podUID="441d2dc0-7718-4f9d-ace6-9b35dae5182a" containerName="switch-graph-3596a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:28:04.516834 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:04.516791 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c" podUID="441d2dc0-7718-4f9d-ace6-9b35dae5182a" containerName="switch-graph-3596a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:28:09.516029 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:09.515989 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c" podUID="441d2dc0-7718-4f9d-ace6-9b35dae5182a" containerName="switch-graph-3596a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:28:10.026508 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:10.026484 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c"
Apr 16 18:28:10.190633 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:10.190601 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/441d2dc0-7718-4f9d-ace6-9b35dae5182a-proxy-tls\") pod \"441d2dc0-7718-4f9d-ace6-9b35dae5182a\" (UID: \"441d2dc0-7718-4f9d-ace6-9b35dae5182a\") "
Apr 16 18:28:10.190821 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:10.190690 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/441d2dc0-7718-4f9d-ace6-9b35dae5182a-openshift-service-ca-bundle\") pod \"441d2dc0-7718-4f9d-ace6-9b35dae5182a\" (UID: \"441d2dc0-7718-4f9d-ace6-9b35dae5182a\") "
Apr 16 18:28:10.191036 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:10.191004 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/441d2dc0-7718-4f9d-ace6-9b35dae5182a-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "441d2dc0-7718-4f9d-ace6-9b35dae5182a" (UID: "441d2dc0-7718-4f9d-ace6-9b35dae5182a"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:28:10.192839 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:10.192816 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/441d2dc0-7718-4f9d-ace6-9b35dae5182a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "441d2dc0-7718-4f9d-ace6-9b35dae5182a" (UID: "441d2dc0-7718-4f9d-ace6-9b35dae5182a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:28:10.291303 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:10.291249 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/441d2dc0-7718-4f9d-ace6-9b35dae5182a-openshift-service-ca-bundle\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\""
Apr 16 18:28:10.291303 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:10.291301 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/441d2dc0-7718-4f9d-ace6-9b35dae5182a-proxy-tls\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\""
Apr 16 18:28:10.614234 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:10.614202 2579 generic.go:358] "Generic (PLEG): container finished" podID="441d2dc0-7718-4f9d-ace6-9b35dae5182a" containerID="9abea88bc2f87dafa70808e4ed10a7782600f9e3be7f94544ee70bf61ad7fe38" exitCode=0
Apr 16 18:28:10.614606 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:10.614295 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c"
Apr 16 18:28:10.614606 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:10.614298 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c" event={"ID":"441d2dc0-7718-4f9d-ace6-9b35dae5182a","Type":"ContainerDied","Data":"9abea88bc2f87dafa70808e4ed10a7782600f9e3be7f94544ee70bf61ad7fe38"}
Apr 16 18:28:10.614606 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:10.614342 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c" event={"ID":"441d2dc0-7718-4f9d-ace6-9b35dae5182a","Type":"ContainerDied","Data":"e0837fc8ea7fa9459b5c9e5cc4be594fa5b91d03dbb7a6761d550e95ff3c7dda"}
Apr 16 18:28:10.614606 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:10.614361 2579 scope.go:117] "RemoveContainer" containerID="9abea88bc2f87dafa70808e4ed10a7782600f9e3be7f94544ee70bf61ad7fe38"
Apr 16 18:28:10.622946 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:10.622925 2579 scope.go:117] "RemoveContainer" containerID="9abea88bc2f87dafa70808e4ed10a7782600f9e3be7f94544ee70bf61ad7fe38"
Apr 16 18:28:10.623285 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:28:10.623248 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9abea88bc2f87dafa70808e4ed10a7782600f9e3be7f94544ee70bf61ad7fe38\": container with ID starting with 9abea88bc2f87dafa70808e4ed10a7782600f9e3be7f94544ee70bf61ad7fe38 not found: ID does not exist" containerID="9abea88bc2f87dafa70808e4ed10a7782600f9e3be7f94544ee70bf61ad7fe38"
Apr 16 18:28:10.623354 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:10.623299 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9abea88bc2f87dafa70808e4ed10a7782600f9e3be7f94544ee70bf61ad7fe38"} err="failed to get container status \"9abea88bc2f87dafa70808e4ed10a7782600f9e3be7f94544ee70bf61ad7fe38\": rpc error: code = NotFound desc = could not find container \"9abea88bc2f87dafa70808e4ed10a7782600f9e3be7f94544ee70bf61ad7fe38\": container with ID starting with 9abea88bc2f87dafa70808e4ed10a7782600f9e3be7f94544ee70bf61ad7fe38 not found: ID does not exist"
Apr 16 18:28:10.637385 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:10.637356 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c"]
Apr 16 18:28:10.641633 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:10.641600 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-3596a-5c45cb7657-nxb8c"]
Apr 16 18:28:11.753834 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:11.753799 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="441d2dc0-7718-4f9d-ace6-9b35dae5182a" path="/var/lib/kubelet/pods/441d2dc0-7718-4f9d-ace6-9b35dae5182a/volumes"
Apr 16 18:28:19.632156 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:19.632124 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-79c585885d-665ck"]
Apr 16 18:28:19.634500 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:19.632416 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="441d2dc0-7718-4f9d-ace6-9b35dae5182a" containerName="switch-graph-3596a"
Apr 16 18:28:19.634500 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:19.632429 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="441d2dc0-7718-4f9d-ace6-9b35dae5182a" containerName="switch-graph-3596a"
Apr 16 18:28:19.634500 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:19.632478 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="441d2dc0-7718-4f9d-ace6-9b35dae5182a" containerName="switch-graph-3596a"
Apr 16 18:28:19.635360 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:19.635346 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck"
Apr 16 18:28:19.637547 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:19.637518 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\""
Apr 16 18:28:19.637677 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:19.637556 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 18:28:19.637677 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:19.637556 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\""
Apr 16 18:28:19.646136 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:19.646108 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-79c585885d-665ck"]
Apr 16 18:28:19.750570 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:19.750545 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39df0897-8c7c-479b-8316-1650efd08fe8-openshift-service-ca-bundle\") pod \"model-chainer-79c585885d-665ck\" (UID: \"39df0897-8c7c-479b-8316-1650efd08fe8\") " pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck"
Apr 16 18:28:19.750707 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:19.750597 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/39df0897-8c7c-479b-8316-1650efd08fe8-proxy-tls\") pod \"model-chainer-79c585885d-665ck\" (UID: \"39df0897-8c7c-479b-8316-1650efd08fe8\") " pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck"
Apr 16 18:28:19.851105 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:19.851078 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/39df0897-8c7c-479b-8316-1650efd08fe8-proxy-tls\") pod \"model-chainer-79c585885d-665ck\" (UID: \"39df0897-8c7c-479b-8316-1650efd08fe8\") " pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck"
Apr 16 18:28:19.851429 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:19.851403 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39df0897-8c7c-479b-8316-1650efd08fe8-openshift-service-ca-bundle\") pod \"model-chainer-79c585885d-665ck\" (UID: \"39df0897-8c7c-479b-8316-1650efd08fe8\") " pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck"
Apr 16 18:28:19.851945 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:19.851927 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39df0897-8c7c-479b-8316-1650efd08fe8-openshift-service-ca-bundle\") pod \"model-chainer-79c585885d-665ck\" (UID: \"39df0897-8c7c-479b-8316-1650efd08fe8\") " pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck"
Apr 16 18:28:19.853433 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:19.853416 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/39df0897-8c7c-479b-8316-1650efd08fe8-proxy-tls\") pod \"model-chainer-79c585885d-665ck\" (UID: \"39df0897-8c7c-479b-8316-1650efd08fe8\") " pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck"
Apr 16 18:28:19.947674 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:19.947623 2579 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck" Apr 16 18:28:20.063876 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:20.063848 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-79c585885d-665ck"] Apr 16 18:28:20.066884 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:28:20.066859 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39df0897_8c7c_479b_8316_1650efd08fe8.slice/crio-ef92e07414ec254038241152ba01a50655ef38195704c40be352b781f3d00cbb WatchSource:0}: Error finding container ef92e07414ec254038241152ba01a50655ef38195704c40be352b781f3d00cbb: Status 404 returned error can't find the container with id ef92e07414ec254038241152ba01a50655ef38195704c40be352b781f3d00cbb Apr 16 18:28:20.068752 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:20.068734 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:28:20.645480 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:20.645441 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck" event={"ID":"39df0897-8c7c-479b-8316-1650efd08fe8","Type":"ContainerStarted","Data":"7347ee5b9b1e7ca4cb0523743718d083827607570f7e8c426bad26244252d8a8"} Apr 16 18:28:20.645921 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:20.645485 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck" event={"ID":"39df0897-8c7c-479b-8316-1650efd08fe8","Type":"ContainerStarted","Data":"ef92e07414ec254038241152ba01a50655ef38195704c40be352b781f3d00cbb"} Apr 16 18:28:20.645921 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:20.645574 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck" Apr 16 18:28:26.654436 ip-10-0-134-167 kubenswrapper[2579]: I0416 
18:28:26.654400 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck" Apr 16 18:28:26.671966 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:26.671918 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck" podStartSLOduration=7.671901113 podStartE2EDuration="7.671901113s" podCreationTimestamp="2026-04-16 18:28:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:28:20.663606007 +0000 UTC m=+645.510766499" watchObservedRunningTime="2026-04-16 18:28:26.671901113 +0000 UTC m=+651.519061610" Apr 16 18:28:29.701292 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:29.701249 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-79c585885d-665ck"] Apr 16 18:28:29.701685 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:29.701482 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck" podUID="39df0897-8c7c-479b-8316-1650efd08fe8" containerName="model-chainer" containerID="cri-o://7347ee5b9b1e7ca4cb0523743718d083827607570f7e8c426bad26244252d8a8" gracePeriod=30 Apr 16 18:28:29.859237 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:29.859204 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt"] Apr 16 18:28:29.859502 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:29.859478 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt" podUID="ca1a972f-21c5-48f9-a64f-1cec62e34e77" containerName="kserve-container" containerID="cri-o://fd2e7c4752abf5e95ec9add84be356dd91505cd8dbc654eccaf158e77ece441d" gracePeriod=30 Apr 16 18:28:29.958644 ip-10-0-134-167 
kubenswrapper[2579]: I0416 18:28:29.958573 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj"] Apr 16 18:28:29.958852 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:29.958817 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj" podUID="e3c6afc8-1f23-4511-9ae7-0a13e296333e" containerName="kserve-container" containerID="cri-o://52cd897d04bd8d7f567d8228bd106c54e476d32878c563525b815a703bff4061" gracePeriod=30 Apr 16 18:28:31.652183 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:31.652143 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck" podUID="39df0897-8c7c-479b-8316-1650efd08fe8" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:28:33.188287 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.188246 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj" Apr 16 18:28:33.345282 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.345206 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e3c6afc8-1f23-4511-9ae7-0a13e296333e-kserve-provision-location\") pod \"e3c6afc8-1f23-4511-9ae7-0a13e296333e\" (UID: \"e3c6afc8-1f23-4511-9ae7-0a13e296333e\") " Apr 16 18:28:33.345516 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.345493 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3c6afc8-1f23-4511-9ae7-0a13e296333e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e3c6afc8-1f23-4511-9ae7-0a13e296333e" (UID: "e3c6afc8-1f23-4511-9ae7-0a13e296333e"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:28:33.367642 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.367615 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt" podUID="ca1a972f-21c5-48f9-a64f-1cec62e34e77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 16 18:28:33.445881 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.445858 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e3c6afc8-1f23-4511-9ae7-0a13e296333e-kserve-provision-location\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\"" Apr 16 18:28:33.593433 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.593414 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt" Apr 16 18:28:33.682972 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.682939 2579 generic.go:358] "Generic (PLEG): container finished" podID="ca1a972f-21c5-48f9-a64f-1cec62e34e77" containerID="fd2e7c4752abf5e95ec9add84be356dd91505cd8dbc654eccaf158e77ece441d" exitCode=0 Apr 16 18:28:33.683080 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.683017 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt" Apr 16 18:28:33.683149 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.683016 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt" event={"ID":"ca1a972f-21c5-48f9-a64f-1cec62e34e77","Type":"ContainerDied","Data":"fd2e7c4752abf5e95ec9add84be356dd91505cd8dbc654eccaf158e77ece441d"} Apr 16 18:28:33.683149 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.683136 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt" event={"ID":"ca1a972f-21c5-48f9-a64f-1cec62e34e77","Type":"ContainerDied","Data":"7d3e0592c625c574182ee7c7fe5033ad1c431c7499c8e71c55eaa8d5c72510ee"} Apr 16 18:28:33.683245 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.683167 2579 scope.go:117] "RemoveContainer" containerID="fd2e7c4752abf5e95ec9add84be356dd91505cd8dbc654eccaf158e77ece441d" Apr 16 18:28:33.684395 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.684372 2579 generic.go:358] "Generic (PLEG): container finished" podID="e3c6afc8-1f23-4511-9ae7-0a13e296333e" containerID="52cd897d04bd8d7f567d8228bd106c54e476d32878c563525b815a703bff4061" exitCode=0 Apr 16 18:28:33.684509 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.684412 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj" event={"ID":"e3c6afc8-1f23-4511-9ae7-0a13e296333e","Type":"ContainerDied","Data":"52cd897d04bd8d7f567d8228bd106c54e476d32878c563525b815a703bff4061"} Apr 16 18:28:33.684509 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.684428 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj" 
event={"ID":"e3c6afc8-1f23-4511-9ae7-0a13e296333e","Type":"ContainerDied","Data":"ed09c9b38532c05e3aace205883a0487c307eb7d4dd666489df577c01079fcd6"} Apr 16 18:28:33.684509 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.684440 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj" Apr 16 18:28:33.690844 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.690829 2579 scope.go:117] "RemoveContainer" containerID="5e104643e8fc70c0f6cd7bc04ccdedde755b870c10ee776d550e89cc03024009" Apr 16 18:28:33.697521 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.697507 2579 scope.go:117] "RemoveContainer" containerID="fd2e7c4752abf5e95ec9add84be356dd91505cd8dbc654eccaf158e77ece441d" Apr 16 18:28:33.697766 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:28:33.697749 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd2e7c4752abf5e95ec9add84be356dd91505cd8dbc654eccaf158e77ece441d\": container with ID starting with fd2e7c4752abf5e95ec9add84be356dd91505cd8dbc654eccaf158e77ece441d not found: ID does not exist" containerID="fd2e7c4752abf5e95ec9add84be356dd91505cd8dbc654eccaf158e77ece441d" Apr 16 18:28:33.697829 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.697772 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd2e7c4752abf5e95ec9add84be356dd91505cd8dbc654eccaf158e77ece441d"} err="failed to get container status \"fd2e7c4752abf5e95ec9add84be356dd91505cd8dbc654eccaf158e77ece441d\": rpc error: code = NotFound desc = could not find container \"fd2e7c4752abf5e95ec9add84be356dd91505cd8dbc654eccaf158e77ece441d\": container with ID starting with fd2e7c4752abf5e95ec9add84be356dd91505cd8dbc654eccaf158e77ece441d not found: ID does not exist" Apr 16 18:28:33.697829 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.697788 2579 scope.go:117] "RemoveContainer" 
containerID="5e104643e8fc70c0f6cd7bc04ccdedde755b870c10ee776d550e89cc03024009" Apr 16 18:28:33.697993 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:28:33.697976 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e104643e8fc70c0f6cd7bc04ccdedde755b870c10ee776d550e89cc03024009\": container with ID starting with 5e104643e8fc70c0f6cd7bc04ccdedde755b870c10ee776d550e89cc03024009 not found: ID does not exist" containerID="5e104643e8fc70c0f6cd7bc04ccdedde755b870c10ee776d550e89cc03024009" Apr 16 18:28:33.698028 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.698000 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e104643e8fc70c0f6cd7bc04ccdedde755b870c10ee776d550e89cc03024009"} err="failed to get container status \"5e104643e8fc70c0f6cd7bc04ccdedde755b870c10ee776d550e89cc03024009\": rpc error: code = NotFound desc = could not find container \"5e104643e8fc70c0f6cd7bc04ccdedde755b870c10ee776d550e89cc03024009\": container with ID starting with 5e104643e8fc70c0f6cd7bc04ccdedde755b870c10ee776d550e89cc03024009 not found: ID does not exist" Apr 16 18:28:33.698028 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.698015 2579 scope.go:117] "RemoveContainer" containerID="52cd897d04bd8d7f567d8228bd106c54e476d32878c563525b815a703bff4061" Apr 16 18:28:33.704131 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.704118 2579 scope.go:117] "RemoveContainer" containerID="6136b566bd1ba3d328cbc61b8da25f5600ae8706328699117ba86cdc7120e5a0" Apr 16 18:28:33.707040 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.707019 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj"] Apr 16 18:28:33.711149 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.711093 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-kl7jj"] Apr 16 
18:28:33.711212 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.711172 2579 scope.go:117] "RemoveContainer" containerID="52cd897d04bd8d7f567d8228bd106c54e476d32878c563525b815a703bff4061" Apr 16 18:28:33.711484 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:28:33.711465 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52cd897d04bd8d7f567d8228bd106c54e476d32878c563525b815a703bff4061\": container with ID starting with 52cd897d04bd8d7f567d8228bd106c54e476d32878c563525b815a703bff4061 not found: ID does not exist" containerID="52cd897d04bd8d7f567d8228bd106c54e476d32878c563525b815a703bff4061" Apr 16 18:28:33.711535 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.711491 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52cd897d04bd8d7f567d8228bd106c54e476d32878c563525b815a703bff4061"} err="failed to get container status \"52cd897d04bd8d7f567d8228bd106c54e476d32878c563525b815a703bff4061\": rpc error: code = NotFound desc = could not find container \"52cd897d04bd8d7f567d8228bd106c54e476d32878c563525b815a703bff4061\": container with ID starting with 52cd897d04bd8d7f567d8228bd106c54e476d32878c563525b815a703bff4061 not found: ID does not exist" Apr 16 18:28:33.711535 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.711507 2579 scope.go:117] "RemoveContainer" containerID="6136b566bd1ba3d328cbc61b8da25f5600ae8706328699117ba86cdc7120e5a0" Apr 16 18:28:33.711765 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:28:33.711748 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6136b566bd1ba3d328cbc61b8da25f5600ae8706328699117ba86cdc7120e5a0\": container with ID starting with 6136b566bd1ba3d328cbc61b8da25f5600ae8706328699117ba86cdc7120e5a0 not found: ID does not exist" containerID="6136b566bd1ba3d328cbc61b8da25f5600ae8706328699117ba86cdc7120e5a0" Apr 16 18:28:33.711821 
ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.711767 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6136b566bd1ba3d328cbc61b8da25f5600ae8706328699117ba86cdc7120e5a0"} err="failed to get container status \"6136b566bd1ba3d328cbc61b8da25f5600ae8706328699117ba86cdc7120e5a0\": rpc error: code = NotFound desc = could not find container \"6136b566bd1ba3d328cbc61b8da25f5600ae8706328699117ba86cdc7120e5a0\": container with ID starting with 6136b566bd1ba3d328cbc61b8da25f5600ae8706328699117ba86cdc7120e5a0 not found: ID does not exist" Apr 16 18:28:33.748003 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.747984 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca1a972f-21c5-48f9-a64f-1cec62e34e77-kserve-provision-location\") pod \"ca1a972f-21c5-48f9-a64f-1cec62e34e77\" (UID: \"ca1a972f-21c5-48f9-a64f-1cec62e34e77\") " Apr 16 18:28:33.748260 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.748242 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca1a972f-21c5-48f9-a64f-1cec62e34e77-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ca1a972f-21c5-48f9-a64f-1cec62e34e77" (UID: "ca1a972f-21c5-48f9-a64f-1cec62e34e77"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:28:33.754597 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.754579 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3c6afc8-1f23-4511-9ae7-0a13e296333e" path="/var/lib/kubelet/pods/e3c6afc8-1f23-4511-9ae7-0a13e296333e/volumes" Apr 16 18:28:33.849178 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:33.849151 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca1a972f-21c5-48f9-a64f-1cec62e34e77-kserve-provision-location\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\"" Apr 16 18:28:34.002118 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:34.002049 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt"] Apr 16 18:28:34.007796 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:34.007772 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-84f646df87-bffbt"] Apr 16 18:28:35.753775 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:35.753746 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca1a972f-21c5-48f9-a64f-1cec62e34e77" path="/var/lib/kubelet/pods/ca1a972f-21c5-48f9-a64f-1cec62e34e77/volumes" Apr 16 18:28:36.652428 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:36.652388 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck" podUID="39df0897-8c7c-479b-8316-1650efd08fe8" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:28:41.651905 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:41.651863 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck" podUID="39df0897-8c7c-479b-8316-1650efd08fe8" containerName="model-chainer" probeResult="failure" output="HTTP 
probe failed with statuscode: 503" Apr 16 18:28:41.652336 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:41.651972 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck" Apr 16 18:28:46.652500 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:46.652459 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck" podUID="39df0897-8c7c-479b-8316-1650efd08fe8" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:28:50.124779 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.124741 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj"] Apr 16 18:28:50.125226 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.125089 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca1a972f-21c5-48f9-a64f-1cec62e34e77" containerName="kserve-container" Apr 16 18:28:50.125226 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.125108 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1a972f-21c5-48f9-a64f-1cec62e34e77" containerName="kserve-container" Apr 16 18:28:50.125226 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.125124 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e3c6afc8-1f23-4511-9ae7-0a13e296333e" containerName="kserve-container" Apr 16 18:28:50.125226 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.125133 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c6afc8-1f23-4511-9ae7-0a13e296333e" containerName="kserve-container" Apr 16 18:28:50.125226 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.125154 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e3c6afc8-1f23-4511-9ae7-0a13e296333e" containerName="storage-initializer" Apr 16 18:28:50.125226 ip-10-0-134-167 kubenswrapper[2579]: I0416 
18:28:50.125162 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c6afc8-1f23-4511-9ae7-0a13e296333e" containerName="storage-initializer" Apr 16 18:28:50.125226 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.125170 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca1a972f-21c5-48f9-a64f-1cec62e34e77" containerName="storage-initializer" Apr 16 18:28:50.125226 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.125179 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1a972f-21c5-48f9-a64f-1cec62e34e77" containerName="storage-initializer" Apr 16 18:28:50.125647 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.125312 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca1a972f-21c5-48f9-a64f-1cec62e34e77" containerName="kserve-container" Apr 16 18:28:50.125647 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.125327 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="e3c6afc8-1f23-4511-9ae7-0a13e296333e" containerName="kserve-container" Apr 16 18:28:50.129842 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.129820 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj" Apr 16 18:28:50.132012 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.131991 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-6138a-serving-cert\"" Apr 16 18:28:50.132315 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.132297 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-6138a-kube-rbac-proxy-sar-config\"" Apr 16 18:28:50.136619 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.136598 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj"] Apr 16 18:28:50.154699 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.154676 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/429e922a-5118-4d2a-8a11-1db6a2def094-openshift-service-ca-bundle\") pod \"switch-graph-6138a-58844c49d4-gm2wj\" (UID: \"429e922a-5118-4d2a-8a11-1db6a2def094\") " pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj" Apr 16 18:28:50.154787 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.154712 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/429e922a-5118-4d2a-8a11-1db6a2def094-proxy-tls\") pod \"switch-graph-6138a-58844c49d4-gm2wj\" (UID: \"429e922a-5118-4d2a-8a11-1db6a2def094\") " pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj" Apr 16 18:28:50.255265 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.255238 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/429e922a-5118-4d2a-8a11-1db6a2def094-openshift-service-ca-bundle\") pod 
\"switch-graph-6138a-58844c49d4-gm2wj\" (UID: \"429e922a-5118-4d2a-8a11-1db6a2def094\") " pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj" Apr 16 18:28:50.255389 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.255307 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/429e922a-5118-4d2a-8a11-1db6a2def094-proxy-tls\") pod \"switch-graph-6138a-58844c49d4-gm2wj\" (UID: \"429e922a-5118-4d2a-8a11-1db6a2def094\") " pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj" Apr 16 18:28:50.255939 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.255918 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/429e922a-5118-4d2a-8a11-1db6a2def094-openshift-service-ca-bundle\") pod \"switch-graph-6138a-58844c49d4-gm2wj\" (UID: \"429e922a-5118-4d2a-8a11-1db6a2def094\") " pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj" Apr 16 18:28:50.257535 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.257517 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/429e922a-5118-4d2a-8a11-1db6a2def094-proxy-tls\") pod \"switch-graph-6138a-58844c49d4-gm2wj\" (UID: \"429e922a-5118-4d2a-8a11-1db6a2def094\") " pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj" Apr 16 18:28:50.440227 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.440172 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj" Apr 16 18:28:50.571496 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.571469 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj"] Apr 16 18:28:50.737319 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.737236 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj" event={"ID":"429e922a-5118-4d2a-8a11-1db6a2def094","Type":"ContainerStarted","Data":"d96eb5e86552dff28026972ac7e1ffff0b432937fe01d1f820c54d71d7118d52"} Apr 16 18:28:50.737319 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.737290 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj" event={"ID":"429e922a-5118-4d2a-8a11-1db6a2def094","Type":"ContainerStarted","Data":"de693f5aa6f92e8f3231aff578b7d7a6dffa1c692f6475818c58475be2fff508"} Apr 16 18:28:50.737452 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.737410 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj" Apr 16 18:28:50.756252 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:50.756205 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj" podStartSLOduration=0.756193576 podStartE2EDuration="756.193576ms" podCreationTimestamp="2026-04-16 18:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:28:50.754417239 +0000 UTC m=+675.601577729" watchObservedRunningTime="2026-04-16 18:28:50.756193576 +0000 UTC m=+675.603354089" Apr 16 18:28:51.652029 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:51.651995 2579 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck" podUID="39df0897-8c7c-479b-8316-1650efd08fe8" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:28:56.652854 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:56.652806 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck" podUID="39df0897-8c7c-479b-8316-1650efd08fe8" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:28:56.745078 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:56.745044 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj"
Apr 16 18:28:59.717190 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:28:59.717157 2579 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39df0897_8c7c_479b_8316_1650efd08fe8.slice/crio-7347ee5b9b1e7ca4cb0523743718d083827607570f7e8c426bad26244252d8a8.scope/cpu.max": read /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39df0897_8c7c_479b_8316_1650efd08fe8.slice/crio-7347ee5b9b1e7ca4cb0523743718d083827607570f7e8c426bad26244252d8a8.scope/cpu.max: no such device
Apr 16 18:28:59.731516 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:28:59.731483 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39df0897_8c7c_479b_8316_1650efd08fe8.slice/crio-7347ee5b9b1e7ca4cb0523743718d083827607570f7e8c426bad26244252d8a8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39df0897_8c7c_479b_8316_1650efd08fe8.slice/crio-conmon-7347ee5b9b1e7ca4cb0523743718d083827607570f7e8c426bad26244252d8a8.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 18:28:59.731637 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:28:59.731597 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39df0897_8c7c_479b_8316_1650efd08fe8.slice/crio-conmon-7347ee5b9b1e7ca4cb0523743718d083827607570f7e8c426bad26244252d8a8.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 18:28:59.763478 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:59.763451 2579 generic.go:358] "Generic (PLEG): container finished" podID="39df0897-8c7c-479b-8316-1650efd08fe8" containerID="7347ee5b9b1e7ca4cb0523743718d083827607570f7e8c426bad26244252d8a8" exitCode=0
Apr 16 18:28:59.763597 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:59.763517 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck" event={"ID":"39df0897-8c7c-479b-8316-1650efd08fe8","Type":"ContainerDied","Data":"7347ee5b9b1e7ca4cb0523743718d083827607570f7e8c426bad26244252d8a8"}
Apr 16 18:28:59.842191 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:59.842173 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck"
Apr 16 18:28:59.918963 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:59.918932 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/39df0897-8c7c-479b-8316-1650efd08fe8-proxy-tls\") pod \"39df0897-8c7c-479b-8316-1650efd08fe8\" (UID: \"39df0897-8c7c-479b-8316-1650efd08fe8\") "
Apr 16 18:28:59.919072 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:59.918978 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39df0897-8c7c-479b-8316-1650efd08fe8-openshift-service-ca-bundle\") pod \"39df0897-8c7c-479b-8316-1650efd08fe8\" (UID: \"39df0897-8c7c-479b-8316-1650efd08fe8\") "
Apr 16 18:28:59.919342 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:59.919319 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39df0897-8c7c-479b-8316-1650efd08fe8-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "39df0897-8c7c-479b-8316-1650efd08fe8" (UID: "39df0897-8c7c-479b-8316-1650efd08fe8"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:28:59.920848 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:28:59.920828 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39df0897-8c7c-479b-8316-1650efd08fe8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "39df0897-8c7c-479b-8316-1650efd08fe8" (UID: "39df0897-8c7c-479b-8316-1650efd08fe8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:29:00.019660 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:00.019612 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/39df0897-8c7c-479b-8316-1650efd08fe8-proxy-tls\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\""
Apr 16 18:29:00.019660 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:00.019637 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39df0897-8c7c-479b-8316-1650efd08fe8-openshift-service-ca-bundle\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\""
Apr 16 18:29:00.768764 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:00.768733 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck"
Apr 16 18:29:00.769142 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:00.768731 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-79c585885d-665ck" event={"ID":"39df0897-8c7c-479b-8316-1650efd08fe8","Type":"ContainerDied","Data":"ef92e07414ec254038241152ba01a50655ef38195704c40be352b781f3d00cbb"}
Apr 16 18:29:00.769142 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:00.768864 2579 scope.go:117] "RemoveContainer" containerID="7347ee5b9b1e7ca4cb0523743718d083827607570f7e8c426bad26244252d8a8"
Apr 16 18:29:00.794918 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:00.794888 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-79c585885d-665ck"]
Apr 16 18:29:00.799514 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:00.799491 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-79c585885d-665ck"]
Apr 16 18:29:01.757431 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:01.757399 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39df0897-8c7c-479b-8316-1650efd08fe8" path="/var/lib/kubelet/pods/39df0897-8c7c-479b-8316-1650efd08fe8/volumes"
Apr 16 18:29:39.907051 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:39.907018 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs"]
Apr 16 18:29:39.907478 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:39.907283 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39df0897-8c7c-479b-8316-1650efd08fe8" containerName="model-chainer"
Apr 16 18:29:39.907478 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:39.907298 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="39df0897-8c7c-479b-8316-1650efd08fe8" containerName="model-chainer"
Apr 16 18:29:39.907478 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:39.907347 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="39df0897-8c7c-479b-8316-1650efd08fe8" containerName="model-chainer"
Apr 16 18:29:39.910428 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:39.910410 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs"
Apr 16 18:29:39.912481 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:39.912460 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-b16e7-kube-rbac-proxy-sar-config\""
Apr 16 18:29:39.912481 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:39.912473 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-b16e7-serving-cert\""
Apr 16 18:29:39.918830 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:39.918809 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs"]
Apr 16 18:29:39.976107 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:39.976085 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4faf4db5-9379-4b3d-aa7f-b94616705992-openshift-service-ca-bundle\") pod \"sequence-graph-b16e7-649b6f987b-vl7zs\" (UID: \"4faf4db5-9379-4b3d-aa7f-b94616705992\") " pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs"
Apr 16 18:29:39.976208 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:39.976113 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4faf4db5-9379-4b3d-aa7f-b94616705992-proxy-tls\") pod \"sequence-graph-b16e7-649b6f987b-vl7zs\" (UID: \"4faf4db5-9379-4b3d-aa7f-b94616705992\") " pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs"
Apr 16 18:29:40.076414 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:40.076390 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4faf4db5-9379-4b3d-aa7f-b94616705992-openshift-service-ca-bundle\") pod \"sequence-graph-b16e7-649b6f987b-vl7zs\" (UID: \"4faf4db5-9379-4b3d-aa7f-b94616705992\") " pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs"
Apr 16 18:29:40.076513 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:40.076420 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4faf4db5-9379-4b3d-aa7f-b94616705992-proxy-tls\") pod \"sequence-graph-b16e7-649b6f987b-vl7zs\" (UID: \"4faf4db5-9379-4b3d-aa7f-b94616705992\") " pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs"
Apr 16 18:29:40.076567 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:29:40.076525 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-b16e7-serving-cert: secret "sequence-graph-b16e7-serving-cert" not found
Apr 16 18:29:40.076604 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:29:40.076572 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4faf4db5-9379-4b3d-aa7f-b94616705992-proxy-tls podName:4faf4db5-9379-4b3d-aa7f-b94616705992 nodeName:}" failed. No retries permitted until 2026-04-16 18:29:40.576556202 +0000 UTC m=+725.423716670 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4faf4db5-9379-4b3d-aa7f-b94616705992-proxy-tls") pod "sequence-graph-b16e7-649b6f987b-vl7zs" (UID: "4faf4db5-9379-4b3d-aa7f-b94616705992") : secret "sequence-graph-b16e7-serving-cert" not found
Apr 16 18:29:40.076945 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:40.076928 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4faf4db5-9379-4b3d-aa7f-b94616705992-openshift-service-ca-bundle\") pod \"sequence-graph-b16e7-649b6f987b-vl7zs\" (UID: \"4faf4db5-9379-4b3d-aa7f-b94616705992\") " pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs"
Apr 16 18:29:40.578992 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:40.578964 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4faf4db5-9379-4b3d-aa7f-b94616705992-proxy-tls\") pod \"sequence-graph-b16e7-649b6f987b-vl7zs\" (UID: \"4faf4db5-9379-4b3d-aa7f-b94616705992\") " pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs"
Apr 16 18:29:40.581452 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:40.581433 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4faf4db5-9379-4b3d-aa7f-b94616705992-proxy-tls\") pod \"sequence-graph-b16e7-649b6f987b-vl7zs\" (UID: \"4faf4db5-9379-4b3d-aa7f-b94616705992\") " pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs"
Apr 16 18:29:40.820415 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:40.820392 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs"
Apr 16 18:29:40.940761 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:40.940733 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs"]
Apr 16 18:29:40.943528 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:29:40.943498 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4faf4db5_9379_4b3d_aa7f_b94616705992.slice/crio-fef3d205517df8d44413fb71b425991e2a1d3c3b460296e5c41851a609233e80 WatchSource:0}: Error finding container fef3d205517df8d44413fb71b425991e2a1d3c3b460296e5c41851a609233e80: Status 404 returned error can't find the container with id fef3d205517df8d44413fb71b425991e2a1d3c3b460296e5c41851a609233e80
Apr 16 18:29:41.888748 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:41.888711 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs" event={"ID":"4faf4db5-9379-4b3d-aa7f-b94616705992","Type":"ContainerStarted","Data":"cda769d350cf0f2aa740ffa1978e60f13f09eea2df22b2527ccd6fc5a50e1822"}
Apr 16 18:29:41.888748 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:41.888754 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs" event={"ID":"4faf4db5-9379-4b3d-aa7f-b94616705992","Type":"ContainerStarted","Data":"fef3d205517df8d44413fb71b425991e2a1d3c3b460296e5c41851a609233e80"}
Apr 16 18:29:41.889016 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:41.888895 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs"
Apr 16 18:29:41.905538 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:41.905498 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs" podStartSLOduration=2.90548311 podStartE2EDuration="2.90548311s" podCreationTimestamp="2026-04-16 18:29:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:29:41.905484186 +0000 UTC m=+726.752644669" watchObservedRunningTime="2026-04-16 18:29:41.90548311 +0000 UTC m=+726.752643602"
Apr 16 18:29:47.898834 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:29:47.898803 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs"
Apr 16 18:32:35.712574 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:32:35.712488 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/0.log"
Apr 16 18:32:35.713084 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:32:35.712917 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/0.log"
Apr 16 18:37:04.843774 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:04.843742 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj"]
Apr 16 18:37:04.846168 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:04.843996 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj" podUID="429e922a-5118-4d2a-8a11-1db6a2def094" containerName="switch-graph-6138a" containerID="cri-o://d96eb5e86552dff28026972ac7e1ffff0b432937fe01d1f820c54d71d7118d52" gracePeriod=30
Apr 16 18:37:06.744493 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:06.744451 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj" podUID="429e922a-5118-4d2a-8a11-1db6a2def094" containerName="switch-graph-6138a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:37:11.743905 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:11.743863 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj" podUID="429e922a-5118-4d2a-8a11-1db6a2def094" containerName="switch-graph-6138a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:37:16.744114 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:16.744065 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj" podUID="429e922a-5118-4d2a-8a11-1db6a2def094" containerName="switch-graph-6138a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:37:16.744526 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:16.744187 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj"
Apr 16 18:37:21.744377 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:21.744327 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj" podUID="429e922a-5118-4d2a-8a11-1db6a2def094" containerName="switch-graph-6138a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:37:26.743916 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:26.743875 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj" podUID="429e922a-5118-4d2a-8a11-1db6a2def094" containerName="switch-graph-6138a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:37:31.743783 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:31.743744 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj" podUID="429e922a-5118-4d2a-8a11-1db6a2def094" containerName="switch-graph-6138a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:37:34.986084 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:34.986062 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj"
Apr 16 18:37:35.093136 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:35.093113 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/429e922a-5118-4d2a-8a11-1db6a2def094-openshift-service-ca-bundle\") pod \"429e922a-5118-4d2a-8a11-1db6a2def094\" (UID: \"429e922a-5118-4d2a-8a11-1db6a2def094\") "
Apr 16 18:37:35.093256 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:35.093161 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/429e922a-5118-4d2a-8a11-1db6a2def094-proxy-tls\") pod \"429e922a-5118-4d2a-8a11-1db6a2def094\" (UID: \"429e922a-5118-4d2a-8a11-1db6a2def094\") "
Apr 16 18:37:35.093485 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:35.093463 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/429e922a-5118-4d2a-8a11-1db6a2def094-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "429e922a-5118-4d2a-8a11-1db6a2def094" (UID: "429e922a-5118-4d2a-8a11-1db6a2def094"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:37:35.095060 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:35.095033 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/429e922a-5118-4d2a-8a11-1db6a2def094-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "429e922a-5118-4d2a-8a11-1db6a2def094" (UID: "429e922a-5118-4d2a-8a11-1db6a2def094"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:37:35.190089 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:35.190057 2579 generic.go:358] "Generic (PLEG): container finished" podID="429e922a-5118-4d2a-8a11-1db6a2def094" containerID="d96eb5e86552dff28026972ac7e1ffff0b432937fe01d1f820c54d71d7118d52" exitCode=0
Apr 16 18:37:35.190190 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:35.190143 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj"
Apr 16 18:37:35.190190 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:35.190142 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj" event={"ID":"429e922a-5118-4d2a-8a11-1db6a2def094","Type":"ContainerDied","Data":"d96eb5e86552dff28026972ac7e1ffff0b432937fe01d1f820c54d71d7118d52"}
Apr 16 18:37:35.190190 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:35.190184 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj" event={"ID":"429e922a-5118-4d2a-8a11-1db6a2def094","Type":"ContainerDied","Data":"de693f5aa6f92e8f3231aff578b7d7a6dffa1c692f6475818c58475be2fff508"}
Apr 16 18:37:35.190320 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:35.190204 2579 scope.go:117] "RemoveContainer" containerID="d96eb5e86552dff28026972ac7e1ffff0b432937fe01d1f820c54d71d7118d52"
Apr 16 18:37:35.194005 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:35.193988 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/429e922a-5118-4d2a-8a11-1db6a2def094-openshift-service-ca-bundle\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\""
Apr 16 18:37:35.194092 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:35.194008 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/429e922a-5118-4d2a-8a11-1db6a2def094-proxy-tls\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\""
Apr 16 18:37:35.198430 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:35.198412 2579 scope.go:117] "RemoveContainer" containerID="d96eb5e86552dff28026972ac7e1ffff0b432937fe01d1f820c54d71d7118d52"
Apr 16 18:37:35.198669 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:37:35.198652 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d96eb5e86552dff28026972ac7e1ffff0b432937fe01d1f820c54d71d7118d52\": container with ID starting with d96eb5e86552dff28026972ac7e1ffff0b432937fe01d1f820c54d71d7118d52 not found: ID does not exist" containerID="d96eb5e86552dff28026972ac7e1ffff0b432937fe01d1f820c54d71d7118d52"
Apr 16 18:37:35.198727 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:35.198675 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d96eb5e86552dff28026972ac7e1ffff0b432937fe01d1f820c54d71d7118d52"} err="failed to get container status \"d96eb5e86552dff28026972ac7e1ffff0b432937fe01d1f820c54d71d7118d52\": rpc error: code = NotFound desc = could not find container \"d96eb5e86552dff28026972ac7e1ffff0b432937fe01d1f820c54d71d7118d52\": container with ID starting with d96eb5e86552dff28026972ac7e1ffff0b432937fe01d1f820c54d71d7118d52 not found: ID does not exist"
Apr 16 18:37:35.212864 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:35.212844 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj"]
Apr 16 18:37:35.217678 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:35.217661 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-6138a-58844c49d4-gm2wj"]
Apr 16 18:37:35.732263 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:35.732245 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/0.log"
Apr 16 18:37:35.732906 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:35.732887 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/0.log"
Apr 16 18:37:35.754126 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:35.754041 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="429e922a-5118-4d2a-8a11-1db6a2def094" path="/var/lib/kubelet/pods/429e922a-5118-4d2a-8a11-1db6a2def094/volumes"
Apr 16 18:37:54.562063 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:54.562026 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs"]
Apr 16 18:37:54.562556 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:54.562296 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs" podUID="4faf4db5-9379-4b3d-aa7f-b94616705992" containerName="sequence-graph-b16e7" containerID="cri-o://cda769d350cf0f2aa740ffa1978e60f13f09eea2df22b2527ccd6fc5a50e1822" gracePeriod=30
Apr 16 18:37:57.896149 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:37:57.896108 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs" podUID="4faf4db5-9379-4b3d-aa7f-b94616705992" containerName="sequence-graph-b16e7" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:38:02.896542 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:02.896492 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs" podUID="4faf4db5-9379-4b3d-aa7f-b94616705992" containerName="sequence-graph-b16e7" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:38:07.896732 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:07.896677 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs" podUID="4faf4db5-9379-4b3d-aa7f-b94616705992" containerName="sequence-graph-b16e7" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:38:07.897347 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:07.896829 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs"
Apr 16 18:38:12.896484 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:12.896431 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs" podUID="4faf4db5-9379-4b3d-aa7f-b94616705992" containerName="sequence-graph-b16e7" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:38:15.155525 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:15.155493 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc"]
Apr 16 18:38:15.155884 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:15.155758 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="429e922a-5118-4d2a-8a11-1db6a2def094" containerName="switch-graph-6138a"
Apr 16 18:38:15.155884 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:15.155771 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="429e922a-5118-4d2a-8a11-1db6a2def094" containerName="switch-graph-6138a"
Apr 16 18:38:15.155884 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:15.155814 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="429e922a-5118-4d2a-8a11-1db6a2def094" containerName="switch-graph-6138a"
Apr 16 18:38:15.158472 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:15.158456 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc"
Apr 16 18:38:15.161283 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:15.161247 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-091ad-kube-rbac-proxy-sar-config\""
Apr 16 18:38:15.161392 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:15.161247 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-091ad-serving-cert\""
Apr 16 18:38:15.168365 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:15.168343 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc"]
Apr 16 18:38:15.269086 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:15.269060 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/003dfaa6-16d1-4f32-85d6-5d114a54c1fa-proxy-tls\") pod \"ensemble-graph-091ad-8999dbf5b-s5rpc\" (UID: \"003dfaa6-16d1-4f32-85d6-5d114a54c1fa\") " pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc"
Apr 16 18:38:15.269176 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:15.269098 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/003dfaa6-16d1-4f32-85d6-5d114a54c1fa-openshift-service-ca-bundle\") pod \"ensemble-graph-091ad-8999dbf5b-s5rpc\" (UID: \"003dfaa6-16d1-4f32-85d6-5d114a54c1fa\") " pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc"
Apr 16 18:38:15.370408 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:15.370386 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/003dfaa6-16d1-4f32-85d6-5d114a54c1fa-proxy-tls\") pod \"ensemble-graph-091ad-8999dbf5b-s5rpc\" (UID: \"003dfaa6-16d1-4f32-85d6-5d114a54c1fa\") " pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc"
Apr 16 18:38:15.370504 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:15.370427 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/003dfaa6-16d1-4f32-85d6-5d114a54c1fa-openshift-service-ca-bundle\") pod \"ensemble-graph-091ad-8999dbf5b-s5rpc\" (UID: \"003dfaa6-16d1-4f32-85d6-5d114a54c1fa\") " pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc"
Apr 16 18:38:15.370958 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:15.370940 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/003dfaa6-16d1-4f32-85d6-5d114a54c1fa-openshift-service-ca-bundle\") pod \"ensemble-graph-091ad-8999dbf5b-s5rpc\" (UID: \"003dfaa6-16d1-4f32-85d6-5d114a54c1fa\") " pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc"
Apr 16 18:38:15.372609 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:15.372593 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/003dfaa6-16d1-4f32-85d6-5d114a54c1fa-proxy-tls\") pod \"ensemble-graph-091ad-8999dbf5b-s5rpc\" (UID: \"003dfaa6-16d1-4f32-85d6-5d114a54c1fa\") " pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc"
Apr 16 18:38:15.468001 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:15.467953 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc"
Apr 16 18:38:15.592374 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:15.592349 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc"]
Apr 16 18:38:15.594917 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:38:15.594892 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod003dfaa6_16d1_4f32_85d6_5d114a54c1fa.slice/crio-791ac73d32922b7bd07bdfd12cdb7bd410f92bf4097ad159bcef7dcbf9f04f8f WatchSource:0}: Error finding container 791ac73d32922b7bd07bdfd12cdb7bd410f92bf4097ad159bcef7dcbf9f04f8f: Status 404 returned error can't find the container with id 791ac73d32922b7bd07bdfd12cdb7bd410f92bf4097ad159bcef7dcbf9f04f8f
Apr 16 18:38:15.596642 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:15.596623 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:38:16.304049 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:16.304014 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc" event={"ID":"003dfaa6-16d1-4f32-85d6-5d114a54c1fa","Type":"ContainerStarted","Data":"1225ea3a65f56acfe9727b4f9b99900220784508d685e2ef92aeb9617105ff0d"}
Apr 16 18:38:16.304049 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:16.304048 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc" event={"ID":"003dfaa6-16d1-4f32-85d6-5d114a54c1fa","Type":"ContainerStarted","Data":"791ac73d32922b7bd07bdfd12cdb7bd410f92bf4097ad159bcef7dcbf9f04f8f"}
Apr 16 18:38:16.304471 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:16.304159 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc"
Apr 16 18:38:16.322171 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:16.322126 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc" podStartSLOduration=1.322065438 podStartE2EDuration="1.322065438s" podCreationTimestamp="2026-04-16 18:38:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:38:16.321872743 +0000 UTC m=+1241.169033234" watchObservedRunningTime="2026-04-16 18:38:16.322065438 +0000 UTC m=+1241.169225929"
Apr 16 18:38:17.896942 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:17.896901 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs" podUID="4faf4db5-9379-4b3d-aa7f-b94616705992" containerName="sequence-graph-b16e7" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:38:22.313486 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:22.313400 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc"
Apr 16 18:38:22.896680 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:22.896637 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs" podUID="4faf4db5-9379-4b3d-aa7f-b94616705992" containerName="sequence-graph-b16e7" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:38:24.703124 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:24.703103 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs"
Apr 16 18:38:24.833142 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:24.833075 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4faf4db5-9379-4b3d-aa7f-b94616705992-openshift-service-ca-bundle\") pod \"4faf4db5-9379-4b3d-aa7f-b94616705992\" (UID: \"4faf4db5-9379-4b3d-aa7f-b94616705992\") "
Apr 16 18:38:24.833257 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:24.833144 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4faf4db5-9379-4b3d-aa7f-b94616705992-proxy-tls\") pod \"4faf4db5-9379-4b3d-aa7f-b94616705992\" (UID: \"4faf4db5-9379-4b3d-aa7f-b94616705992\") "
Apr 16 18:38:24.833426 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:24.833401 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4faf4db5-9379-4b3d-aa7f-b94616705992-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "4faf4db5-9379-4b3d-aa7f-b94616705992" (UID: "4faf4db5-9379-4b3d-aa7f-b94616705992"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:38:24.835025 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:24.834999 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4faf4db5-9379-4b3d-aa7f-b94616705992-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4faf4db5-9379-4b3d-aa7f-b94616705992" (UID: "4faf4db5-9379-4b3d-aa7f-b94616705992"). InnerVolumeSpecName "proxy-tls".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:38:24.934473 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:24.934448 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4faf4db5-9379-4b3d-aa7f-b94616705992-proxy-tls\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\"" Apr 16 18:38:24.934473 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:24.934469 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4faf4db5-9379-4b3d-aa7f-b94616705992-openshift-service-ca-bundle\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\"" Apr 16 18:38:25.203832 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:25.203804 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc"] Apr 16 18:38:25.204025 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:25.204004 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc" podUID="003dfaa6-16d1-4f32-85d6-5d114a54c1fa" containerName="ensemble-graph-091ad" containerID="cri-o://1225ea3a65f56acfe9727b4f9b99900220784508d685e2ef92aeb9617105ff0d" gracePeriod=30 Apr 16 18:38:25.332619 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:25.332588 2579 generic.go:358] "Generic (PLEG): container finished" podID="4faf4db5-9379-4b3d-aa7f-b94616705992" containerID="cda769d350cf0f2aa740ffa1978e60f13f09eea2df22b2527ccd6fc5a50e1822" exitCode=0 Apr 16 18:38:25.332787 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:25.332633 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs" event={"ID":"4faf4db5-9379-4b3d-aa7f-b94616705992","Type":"ContainerDied","Data":"cda769d350cf0f2aa740ffa1978e60f13f09eea2df22b2527ccd6fc5a50e1822"} Apr 16 18:38:25.332878 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:25.332813 
2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs" event={"ID":"4faf4db5-9379-4b3d-aa7f-b94616705992","Type":"ContainerDied","Data":"fef3d205517df8d44413fb71b425991e2a1d3c3b460296e5c41851a609233e80"} Apr 16 18:38:25.332878 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:25.332841 2579 scope.go:117] "RemoveContainer" containerID="cda769d350cf0f2aa740ffa1978e60f13f09eea2df22b2527ccd6fc5a50e1822" Apr 16 18:38:25.332878 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:25.332667 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs" Apr 16 18:38:25.343178 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:25.343162 2579 scope.go:117] "RemoveContainer" containerID="cda769d350cf0f2aa740ffa1978e60f13f09eea2df22b2527ccd6fc5a50e1822" Apr 16 18:38:25.343468 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:38:25.343450 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cda769d350cf0f2aa740ffa1978e60f13f09eea2df22b2527ccd6fc5a50e1822\": container with ID starting with cda769d350cf0f2aa740ffa1978e60f13f09eea2df22b2527ccd6fc5a50e1822 not found: ID does not exist" containerID="cda769d350cf0f2aa740ffa1978e60f13f09eea2df22b2527ccd6fc5a50e1822" Apr 16 18:38:25.343537 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:25.343479 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cda769d350cf0f2aa740ffa1978e60f13f09eea2df22b2527ccd6fc5a50e1822"} err="failed to get container status \"cda769d350cf0f2aa740ffa1978e60f13f09eea2df22b2527ccd6fc5a50e1822\": rpc error: code = NotFound desc = could not find container \"cda769d350cf0f2aa740ffa1978e60f13f09eea2df22b2527ccd6fc5a50e1822\": container with ID starting with cda769d350cf0f2aa740ffa1978e60f13f09eea2df22b2527ccd6fc5a50e1822 not found: ID does not exist" Apr 16 
18:38:25.372467 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:25.372443 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs"] Apr 16 18:38:25.378261 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:25.378233 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-b16e7-649b6f987b-vl7zs"] Apr 16 18:38:25.753868 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:25.753842 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4faf4db5-9379-4b3d-aa7f-b94616705992" path="/var/lib/kubelet/pods/4faf4db5-9379-4b3d-aa7f-b94616705992/volumes" Apr 16 18:38:27.311316 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:27.311247 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc" podUID="003dfaa6-16d1-4f32-85d6-5d114a54c1fa" containerName="ensemble-graph-091ad" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:38:32.311288 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:32.311239 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc" podUID="003dfaa6-16d1-4f32-85d6-5d114a54c1fa" containerName="ensemble-graph-091ad" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:38:37.310856 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:37.310803 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc" podUID="003dfaa6-16d1-4f32-85d6-5d114a54c1fa" containerName="ensemble-graph-091ad" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:38:37.311292 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:37.310929 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc" Apr 16 18:38:42.311379 ip-10-0-134-167 
kubenswrapper[2579]: I0416 18:38:42.311335 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc" podUID="003dfaa6-16d1-4f32-85d6-5d114a54c1fa" containerName="ensemble-graph-091ad" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:38:47.311318 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:47.311255 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc" podUID="003dfaa6-16d1-4f32-85d6-5d114a54c1fa" containerName="ensemble-graph-091ad" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:38:52.311514 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:52.311473 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc" podUID="003dfaa6-16d1-4f32-85d6-5d114a54c1fa" containerName="ensemble-graph-091ad" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:38:55.230804 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:38:55.230774 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod003dfaa6_16d1_4f32_85d6_5d114a54c1fa.slice/crio-1225ea3a65f56acfe9727b4f9b99900220784508d685e2ef92aeb9617105ff0d.scope\": RecentStats: unable to find data in memory cache]" Apr 16 18:38:55.231091 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:38:55.230903 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod003dfaa6_16d1_4f32_85d6_5d114a54c1fa.slice/crio-1225ea3a65f56acfe9727b4f9b99900220784508d685e2ef92aeb9617105ff0d.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod003dfaa6_16d1_4f32_85d6_5d114a54c1fa.slice/crio-conmon-1225ea3a65f56acfe9727b4f9b99900220784508d685e2ef92aeb9617105ff0d.scope\": RecentStats: unable to find data in memory cache]" Apr 16 18:38:55.418258 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:55.418224 2579 generic.go:358] "Generic (PLEG): container finished" podID="003dfaa6-16d1-4f32-85d6-5d114a54c1fa" containerID="1225ea3a65f56acfe9727b4f9b99900220784508d685e2ef92aeb9617105ff0d" exitCode=0 Apr 16 18:38:55.418384 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:55.418262 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc" event={"ID":"003dfaa6-16d1-4f32-85d6-5d114a54c1fa","Type":"ContainerDied","Data":"1225ea3a65f56acfe9727b4f9b99900220784508d685e2ef92aeb9617105ff0d"} Apr 16 18:38:55.843436 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:55.843416 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc" Apr 16 18:38:55.934256 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:55.934226 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/003dfaa6-16d1-4f32-85d6-5d114a54c1fa-proxy-tls\") pod \"003dfaa6-16d1-4f32-85d6-5d114a54c1fa\" (UID: \"003dfaa6-16d1-4f32-85d6-5d114a54c1fa\") " Apr 16 18:38:55.934390 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:55.934263 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/003dfaa6-16d1-4f32-85d6-5d114a54c1fa-openshift-service-ca-bundle\") pod \"003dfaa6-16d1-4f32-85d6-5d114a54c1fa\" (UID: \"003dfaa6-16d1-4f32-85d6-5d114a54c1fa\") " Apr 16 18:38:55.934696 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:55.934673 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/003dfaa6-16d1-4f32-85d6-5d114a54c1fa-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "003dfaa6-16d1-4f32-85d6-5d114a54c1fa" (UID: "003dfaa6-16d1-4f32-85d6-5d114a54c1fa"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:38:55.936478 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:55.936451 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/003dfaa6-16d1-4f32-85d6-5d114a54c1fa-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "003dfaa6-16d1-4f32-85d6-5d114a54c1fa" (UID: "003dfaa6-16d1-4f32-85d6-5d114a54c1fa"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:38:56.034886 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:56.034832 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/003dfaa6-16d1-4f32-85d6-5d114a54c1fa-proxy-tls\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\"" Apr 16 18:38:56.034886 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:56.034853 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/003dfaa6-16d1-4f32-85d6-5d114a54c1fa-openshift-service-ca-bundle\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\"" Apr 16 18:38:56.422030 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:56.421998 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc" event={"ID":"003dfaa6-16d1-4f32-85d6-5d114a54c1fa","Type":"ContainerDied","Data":"791ac73d32922b7bd07bdfd12cdb7bd410f92bf4097ad159bcef7dcbf9f04f8f"} Apr 16 18:38:56.422416 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:56.422041 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc" Apr 16 18:38:56.422416 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:56.422050 2579 scope.go:117] "RemoveContainer" containerID="1225ea3a65f56acfe9727b4f9b99900220784508d685e2ef92aeb9617105ff0d" Apr 16 18:38:56.444752 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:56.444727 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc"] Apr 16 18:38:56.451373 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:56.451352 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-091ad-8999dbf5b-s5rpc"] Apr 16 18:38:57.758680 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:38:57.755755 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="003dfaa6-16d1-4f32-85d6-5d114a54c1fa" path="/var/lib/kubelet/pods/003dfaa6-16d1-4f32-85d6-5d114a54c1fa/volumes" Apr 16 18:39:04.740961 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:04.740928 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw"] Apr 16 18:39:04.741362 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:04.741176 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="003dfaa6-16d1-4f32-85d6-5d114a54c1fa" containerName="ensemble-graph-091ad" Apr 16 18:39:04.741362 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:04.741187 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="003dfaa6-16d1-4f32-85d6-5d114a54c1fa" containerName="ensemble-graph-091ad" Apr 16 18:39:04.741362 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:04.741205 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4faf4db5-9379-4b3d-aa7f-b94616705992" containerName="sequence-graph-b16e7" Apr 16 18:39:04.741362 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:04.741210 2579 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4faf4db5-9379-4b3d-aa7f-b94616705992" containerName="sequence-graph-b16e7" Apr 16 18:39:04.741362 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:04.741248 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="003dfaa6-16d1-4f32-85d6-5d114a54c1fa" containerName="ensemble-graph-091ad" Apr 16 18:39:04.741362 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:04.741256 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="4faf4db5-9379-4b3d-aa7f-b94616705992" containerName="sequence-graph-b16e7" Apr 16 18:39:04.743997 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:04.743979 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" Apr 16 18:39:04.746288 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:04.746246 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-0f5d9-kube-rbac-proxy-sar-config\"" Apr 16 18:39:04.746418 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:04.746296 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-0f5d9-serving-cert\"" Apr 16 18:39:04.746418 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:04.746318 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 18:39:04.746666 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:04.746643 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-kt2gs\"" Apr 16 18:39:04.752502 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:04.752477 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw"] Apr 16 18:39:04.892496 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:04.892471 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d54043a1-da58-445d-9389-11833807c48e-openshift-service-ca-bundle\") pod \"sequence-graph-0f5d9-59b945f668-6xvgw\" (UID: \"d54043a1-da58-445d-9389-11833807c48e\") " pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" Apr 16 18:39:04.892607 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:04.892500 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d54043a1-da58-445d-9389-11833807c48e-proxy-tls\") pod \"sequence-graph-0f5d9-59b945f668-6xvgw\" (UID: \"d54043a1-da58-445d-9389-11833807c48e\") " pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" Apr 16 18:39:04.993458 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:04.993395 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d54043a1-da58-445d-9389-11833807c48e-openshift-service-ca-bundle\") pod \"sequence-graph-0f5d9-59b945f668-6xvgw\" (UID: \"d54043a1-da58-445d-9389-11833807c48e\") " pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" Apr 16 18:39:04.993458 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:04.993427 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d54043a1-da58-445d-9389-11833807c48e-proxy-tls\") pod \"sequence-graph-0f5d9-59b945f668-6xvgw\" (UID: \"d54043a1-da58-445d-9389-11833807c48e\") " pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" Apr 16 18:39:04.993617 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:39:04.993598 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-0f5d9-serving-cert: secret "sequence-graph-0f5d9-serving-cert" not found Apr 16 18:39:04.993670 ip-10-0-134-167 kubenswrapper[2579]: E0416 
18:39:04.993661 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d54043a1-da58-445d-9389-11833807c48e-proxy-tls podName:d54043a1-da58-445d-9389-11833807c48e nodeName:}" failed. No retries permitted until 2026-04-16 18:39:05.493644149 +0000 UTC m=+1290.340804619 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d54043a1-da58-445d-9389-11833807c48e-proxy-tls") pod "sequence-graph-0f5d9-59b945f668-6xvgw" (UID: "d54043a1-da58-445d-9389-11833807c48e") : secret "sequence-graph-0f5d9-serving-cert" not found Apr 16 18:39:04.993978 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:04.993961 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d54043a1-da58-445d-9389-11833807c48e-openshift-service-ca-bundle\") pod \"sequence-graph-0f5d9-59b945f668-6xvgw\" (UID: \"d54043a1-da58-445d-9389-11833807c48e\") " pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" Apr 16 18:39:05.497375 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:05.497344 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d54043a1-da58-445d-9389-11833807c48e-proxy-tls\") pod \"sequence-graph-0f5d9-59b945f668-6xvgw\" (UID: \"d54043a1-da58-445d-9389-11833807c48e\") " pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" Apr 16 18:39:05.499983 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:05.499961 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d54043a1-da58-445d-9389-11833807c48e-proxy-tls\") pod \"sequence-graph-0f5d9-59b945f668-6xvgw\" (UID: \"d54043a1-da58-445d-9389-11833807c48e\") " pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" Apr 16 18:39:05.655127 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:05.655098 2579 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" Apr 16 18:39:05.775017 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:05.774952 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw"] Apr 16 18:39:05.777825 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:39:05.777803 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd54043a1_da58_445d_9389_11833807c48e.slice/crio-ab9befcabd509bef300893b5811b9f4e0acaa88ea1b453b1648296bf171fb31c WatchSource:0}: Error finding container ab9befcabd509bef300893b5811b9f4e0acaa88ea1b453b1648296bf171fb31c: Status 404 returned error can't find the container with id ab9befcabd509bef300893b5811b9f4e0acaa88ea1b453b1648296bf171fb31c Apr 16 18:39:06.453697 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:06.453658 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" event={"ID":"d54043a1-da58-445d-9389-11833807c48e","Type":"ContainerStarted","Data":"21dd2696e6b9655d8ae5f2c87ed130d72a38257091f426ce52f0cbedae428e7b"} Apr 16 18:39:06.453697 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:06.453700 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" event={"ID":"d54043a1-da58-445d-9389-11833807c48e","Type":"ContainerStarted","Data":"ab9befcabd509bef300893b5811b9f4e0acaa88ea1b453b1648296bf171fb31c"} Apr 16 18:39:06.453895 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:06.453799 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" Apr 16 18:39:06.472352 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:06.472307 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" podStartSLOduration=2.472291459 podStartE2EDuration="2.472291459s" podCreationTimestamp="2026-04-16 18:39:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:39:06.470715098 +0000 UTC m=+1291.317875589" watchObservedRunningTime="2026-04-16 18:39:06.472291459 +0000 UTC m=+1291.319451947" Apr 16 18:39:12.463153 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:12.463123 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" Apr 16 18:39:14.806958 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:14.806925 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw"] Apr 16 18:39:14.807352 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:14.807140 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" podUID="d54043a1-da58-445d-9389-11833807c48e" containerName="sequence-graph-0f5d9" containerID="cri-o://21dd2696e6b9655d8ae5f2c87ed130d72a38257091f426ce52f0cbedae428e7b" gracePeriod=30 Apr 16 18:39:17.461074 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:17.461037 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" podUID="d54043a1-da58-445d-9389-11833807c48e" containerName="sequence-graph-0f5d9" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:39:22.461402 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:22.461352 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" podUID="d54043a1-da58-445d-9389-11833807c48e" containerName="sequence-graph-0f5d9" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 
18:39:27.461330 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:27.461282 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" podUID="d54043a1-da58-445d-9389-11833807c48e" containerName="sequence-graph-0f5d9" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:39:27.461726 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:27.461407 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" Apr 16 18:39:32.461727 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:32.461677 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" podUID="d54043a1-da58-445d-9389-11833807c48e" containerName="sequence-graph-0f5d9" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:39:35.434779 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:35.434746 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z"] Apr 16 18:39:35.437736 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:35.437719 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" Apr 16 18:39:35.439865 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:35.439840 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-976d9-kube-rbac-proxy-sar-config\"" Apr 16 18:39:35.439865 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:35.439852 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-976d9-serving-cert\"" Apr 16 18:39:35.445783 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:35.445757 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z"] Apr 16 18:39:35.600197 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:35.600166 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b73e10f9-f7c9-4b71-8157-dcf5e6120e91-openshift-service-ca-bundle\") pod \"ensemble-graph-976d9-5dd8566c9d-lqx8z\" (UID: \"b73e10f9-f7c9-4b71-8157-dcf5e6120e91\") " pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" Apr 16 18:39:35.600341 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:35.600233 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b73e10f9-f7c9-4b71-8157-dcf5e6120e91-proxy-tls\") pod \"ensemble-graph-976d9-5dd8566c9d-lqx8z\" (UID: \"b73e10f9-f7c9-4b71-8157-dcf5e6120e91\") " pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" Apr 16 18:39:35.701060 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:35.700997 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b73e10f9-f7c9-4b71-8157-dcf5e6120e91-openshift-service-ca-bundle\") pod 
\"ensemble-graph-976d9-5dd8566c9d-lqx8z\" (UID: \"b73e10f9-f7c9-4b71-8157-dcf5e6120e91\") " pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" Apr 16 18:39:35.701167 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:35.701056 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b73e10f9-f7c9-4b71-8157-dcf5e6120e91-proxy-tls\") pod \"ensemble-graph-976d9-5dd8566c9d-lqx8z\" (UID: \"b73e10f9-f7c9-4b71-8157-dcf5e6120e91\") " pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" Apr 16 18:39:35.701574 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:35.701556 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b73e10f9-f7c9-4b71-8157-dcf5e6120e91-openshift-service-ca-bundle\") pod \"ensemble-graph-976d9-5dd8566c9d-lqx8z\" (UID: \"b73e10f9-f7c9-4b71-8157-dcf5e6120e91\") " pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" Apr 16 18:39:35.703609 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:35.703594 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-976d9-serving-cert\"" Apr 16 18:39:35.713424 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:35.713406 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b73e10f9-f7c9-4b71-8157-dcf5e6120e91-proxy-tls\") pod \"ensemble-graph-976d9-5dd8566c9d-lqx8z\" (UID: \"b73e10f9-f7c9-4b71-8157-dcf5e6120e91\") " pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" Apr 16 18:39:35.747887 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:35.747865 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" Apr 16 18:39:35.865891 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:35.865863 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z"] Apr 16 18:39:36.536780 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:36.536731 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" event={"ID":"b73e10f9-f7c9-4b71-8157-dcf5e6120e91","Type":"ContainerStarted","Data":"04fc75b2a67b06d7eb1342f85e1ffe2040a29c88c3f618c3c7608ddb6bd2b800"} Apr 16 18:39:36.536780 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:36.536776 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" event={"ID":"b73e10f9-f7c9-4b71-8157-dcf5e6120e91","Type":"ContainerStarted","Data":"637362ef955e26ba5c1c31f872f559384d8da20de7f18bcf1d17c9896f3dc483"} Apr 16 18:39:36.537217 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:36.536898 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" Apr 16 18:39:36.556006 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:36.555955 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" podStartSLOduration=1.555938849 podStartE2EDuration="1.555938849s" podCreationTimestamp="2026-04-16 18:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:39:36.554779946 +0000 UTC m=+1321.401940437" watchObservedRunningTime="2026-04-16 18:39:36.555938849 +0000 UTC m=+1321.403099341" Apr 16 18:39:37.461507 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:37.461462 2579 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" podUID="d54043a1-da58-445d-9389-11833807c48e" containerName="sequence-graph-0f5d9" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:39:42.461589 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:42.461549 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" podUID="d54043a1-da58-445d-9389-11833807c48e" containerName="sequence-graph-0f5d9" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:39:42.545635 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:42.545608 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" Apr 16 18:39:44.942450 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:44.942421 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" Apr 16 18:39:45.062145 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:45.062081 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d54043a1-da58-445d-9389-11833807c48e-openshift-service-ca-bundle\") pod \"d54043a1-da58-445d-9389-11833807c48e\" (UID: \"d54043a1-da58-445d-9389-11833807c48e\") " Apr 16 18:39:45.062145 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:45.062127 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d54043a1-da58-445d-9389-11833807c48e-proxy-tls\") pod \"d54043a1-da58-445d-9389-11833807c48e\" (UID: \"d54043a1-da58-445d-9389-11833807c48e\") " Apr 16 18:39:45.062468 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:45.062448 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d54043a1-da58-445d-9389-11833807c48e-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "d54043a1-da58-445d-9389-11833807c48e" (UID: "d54043a1-da58-445d-9389-11833807c48e"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:39:45.064116 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:45.064096 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d54043a1-da58-445d-9389-11833807c48e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d54043a1-da58-445d-9389-11833807c48e" (UID: "d54043a1-da58-445d-9389-11833807c48e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:39:45.163136 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:45.163109 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d54043a1-da58-445d-9389-11833807c48e-openshift-service-ca-bundle\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\"" Apr 16 18:39:45.163136 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:45.163133 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d54043a1-da58-445d-9389-11833807c48e-proxy-tls\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\"" Apr 16 18:39:45.567989 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:45.567958 2579 generic.go:358] "Generic (PLEG): container finished" podID="d54043a1-da58-445d-9389-11833807c48e" containerID="21dd2696e6b9655d8ae5f2c87ed130d72a38257091f426ce52f0cbedae428e7b" exitCode=0 Apr 16 18:39:45.568093 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:45.568013 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" Apr 16 18:39:45.568093 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:45.568043 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" event={"ID":"d54043a1-da58-445d-9389-11833807c48e","Type":"ContainerDied","Data":"21dd2696e6b9655d8ae5f2c87ed130d72a38257091f426ce52f0cbedae428e7b"} Apr 16 18:39:45.568093 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:45.568077 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw" event={"ID":"d54043a1-da58-445d-9389-11833807c48e","Type":"ContainerDied","Data":"ab9befcabd509bef300893b5811b9f4e0acaa88ea1b453b1648296bf171fb31c"} Apr 16 18:39:45.568208 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:45.568094 2579 scope.go:117] "RemoveContainer" containerID="21dd2696e6b9655d8ae5f2c87ed130d72a38257091f426ce52f0cbedae428e7b" Apr 16 18:39:45.580534 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:45.580513 2579 scope.go:117] "RemoveContainer" containerID="21dd2696e6b9655d8ae5f2c87ed130d72a38257091f426ce52f0cbedae428e7b" Apr 16 18:39:45.580803 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:39:45.580782 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21dd2696e6b9655d8ae5f2c87ed130d72a38257091f426ce52f0cbedae428e7b\": container with ID starting with 21dd2696e6b9655d8ae5f2c87ed130d72a38257091f426ce52f0cbedae428e7b not found: ID does not exist" containerID="21dd2696e6b9655d8ae5f2c87ed130d72a38257091f426ce52f0cbedae428e7b" Apr 16 18:39:45.580846 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:45.580811 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21dd2696e6b9655d8ae5f2c87ed130d72a38257091f426ce52f0cbedae428e7b"} err="failed to get container status 
\"21dd2696e6b9655d8ae5f2c87ed130d72a38257091f426ce52f0cbedae428e7b\": rpc error: code = NotFound desc = could not find container \"21dd2696e6b9655d8ae5f2c87ed130d72a38257091f426ce52f0cbedae428e7b\": container with ID starting with 21dd2696e6b9655d8ae5f2c87ed130d72a38257091f426ce52f0cbedae428e7b not found: ID does not exist" Apr 16 18:39:45.591242 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:45.591220 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw"] Apr 16 18:39:45.595819 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:45.595801 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-0f5d9-59b945f668-6xvgw"] Apr 16 18:39:45.754007 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:39:45.753970 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d54043a1-da58-445d-9389-11833807c48e" path="/var/lib/kubelet/pods/d54043a1-da58-445d-9389-11833807c48e/volumes" Apr 16 18:40:25.022201 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:40:25.022170 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs"] Apr 16 18:40:25.022705 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:40:25.022524 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d54043a1-da58-445d-9389-11833807c48e" containerName="sequence-graph-0f5d9" Apr 16 18:40:25.022705 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:40:25.022541 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54043a1-da58-445d-9389-11833807c48e" containerName="sequence-graph-0f5d9" Apr 16 18:40:25.022705 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:40:25.022628 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="d54043a1-da58-445d-9389-11833807c48e" containerName="sequence-graph-0f5d9" Apr 16 18:40:25.025427 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:40:25.025406 2579 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs" Apr 16 18:40:25.027641 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:40:25.027613 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-5aeb1-serving-cert\"" Apr 16 18:40:25.027736 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:40:25.027701 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-5aeb1-kube-rbac-proxy-sar-config\"" Apr 16 18:40:25.033229 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:40:25.033207 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs"] Apr 16 18:40:25.131298 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:40:25.131263 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d32adf8-83ce-4e65-b767-0421700df8b6-proxy-tls\") pod \"sequence-graph-5aeb1-658b497c45-rh6fs\" (UID: \"5d32adf8-83ce-4e65-b767-0421700df8b6\") " pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs" Apr 16 18:40:25.131422 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:40:25.131313 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d32adf8-83ce-4e65-b767-0421700df8b6-openshift-service-ca-bundle\") pod \"sequence-graph-5aeb1-658b497c45-rh6fs\" (UID: \"5d32adf8-83ce-4e65-b767-0421700df8b6\") " pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs" Apr 16 18:40:25.231735 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:40:25.231714 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d32adf8-83ce-4e65-b767-0421700df8b6-proxy-tls\") pod \"sequence-graph-5aeb1-658b497c45-rh6fs\" (UID: 
\"5d32adf8-83ce-4e65-b767-0421700df8b6\") " pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs" Apr 16 18:40:25.231848 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:40:25.231743 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d32adf8-83ce-4e65-b767-0421700df8b6-openshift-service-ca-bundle\") pod \"sequence-graph-5aeb1-658b497c45-rh6fs\" (UID: \"5d32adf8-83ce-4e65-b767-0421700df8b6\") " pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs" Apr 16 18:40:25.232326 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:40:25.232307 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d32adf8-83ce-4e65-b767-0421700df8b6-openshift-service-ca-bundle\") pod \"sequence-graph-5aeb1-658b497c45-rh6fs\" (UID: \"5d32adf8-83ce-4e65-b767-0421700df8b6\") " pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs" Apr 16 18:40:25.233998 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:40:25.233980 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d32adf8-83ce-4e65-b767-0421700df8b6-proxy-tls\") pod \"sequence-graph-5aeb1-658b497c45-rh6fs\" (UID: \"5d32adf8-83ce-4e65-b767-0421700df8b6\") " pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs" Apr 16 18:40:25.335199 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:40:25.335147 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs" Apr 16 18:40:25.455601 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:40:25.455578 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs"] Apr 16 18:40:25.457863 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:40:25.457839 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d32adf8_83ce_4e65_b767_0421700df8b6.slice/crio-46c4e81b6766c11759efc4902fbd58136c602103ad66b174d89c48de05908d81 WatchSource:0}: Error finding container 46c4e81b6766c11759efc4902fbd58136c602103ad66b174d89c48de05908d81: Status 404 returned error can't find the container with id 46c4e81b6766c11759efc4902fbd58136c602103ad66b174d89c48de05908d81 Apr 16 18:40:25.678994 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:40:25.678961 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs" event={"ID":"5d32adf8-83ce-4e65-b767-0421700df8b6","Type":"ContainerStarted","Data":"d51a6a469ccbb75fc221dd595f5de02854265085a49ec8f50ff887d64be3cede"} Apr 16 18:40:25.679115 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:40:25.679004 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs" event={"ID":"5d32adf8-83ce-4e65-b767-0421700df8b6","Type":"ContainerStarted","Data":"46c4e81b6766c11759efc4902fbd58136c602103ad66b174d89c48de05908d81"} Apr 16 18:40:25.679115 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:40:25.679042 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs" Apr 16 18:40:25.697619 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:40:25.697586 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs" 
podStartSLOduration=0.697572641 podStartE2EDuration="697.572641ms" podCreationTimestamp="2026-04-16 18:40:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:40:25.696247014 +0000 UTC m=+1370.543407521" watchObservedRunningTime="2026-04-16 18:40:25.697572641 +0000 UTC m=+1370.544733132" Apr 16 18:40:31.688957 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:40:31.688920 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs" Apr 16 18:42:35.751404 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:42:35.751371 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/0.log" Apr 16 18:42:35.753923 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:42:35.753442 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/0.log" Apr 16 18:47:35.773362 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:47:35.773231 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/0.log" Apr 16 18:47:35.779326 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:47:35.775830 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/0.log" Apr 16 18:47:50.099896 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:47:50.099863 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z"] Apr 16 18:47:50.100361 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:47:50.100158 2579 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" podUID="b73e10f9-f7c9-4b71-8157-dcf5e6120e91" containerName="ensemble-graph-976d9" containerID="cri-o://04fc75b2a67b06d7eb1342f85e1ffe2040a29c88c3f618c3c7608ddb6bd2b800" gracePeriod=30 Apr 16 18:47:52.543486 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:47:52.543446 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" podUID="b73e10f9-f7c9-4b71-8157-dcf5e6120e91" containerName="ensemble-graph-976d9" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:47:57.543788 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:47:57.543738 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" podUID="b73e10f9-f7c9-4b71-8157-dcf5e6120e91" containerName="ensemble-graph-976d9" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:48:02.544331 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:02.544260 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" podUID="b73e10f9-f7c9-4b71-8157-dcf5e6120e91" containerName="ensemble-graph-976d9" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:48:02.544721 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:02.544405 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" Apr 16 18:48:07.544161 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:07.544107 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" podUID="b73e10f9-f7c9-4b71-8157-dcf5e6120e91" containerName="ensemble-graph-976d9" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:48:12.543772 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:12.543721 2579 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" podUID="b73e10f9-f7c9-4b71-8157-dcf5e6120e91" containerName="ensemble-graph-976d9" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:48:17.543733 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:17.543673 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" podUID="b73e10f9-f7c9-4b71-8157-dcf5e6120e91" containerName="ensemble-graph-976d9" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:48:20.243145 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:20.243124 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" Apr 16 18:48:20.346557 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:20.346527 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b73e10f9-f7c9-4b71-8157-dcf5e6120e91-openshift-service-ca-bundle\") pod \"b73e10f9-f7c9-4b71-8157-dcf5e6120e91\" (UID: \"b73e10f9-f7c9-4b71-8157-dcf5e6120e91\") " Apr 16 18:48:20.346669 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:20.346573 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b73e10f9-f7c9-4b71-8157-dcf5e6120e91-proxy-tls\") pod \"b73e10f9-f7c9-4b71-8157-dcf5e6120e91\" (UID: \"b73e10f9-f7c9-4b71-8157-dcf5e6120e91\") " Apr 16 18:48:20.346882 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:20.346855 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b73e10f9-f7c9-4b71-8157-dcf5e6120e91-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "b73e10f9-f7c9-4b71-8157-dcf5e6120e91" (UID: "b73e10f9-f7c9-4b71-8157-dcf5e6120e91"). 
InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:48:20.348467 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:20.348440 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b73e10f9-f7c9-4b71-8157-dcf5e6120e91-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b73e10f9-f7c9-4b71-8157-dcf5e6120e91" (UID: "b73e10f9-f7c9-4b71-8157-dcf5e6120e91"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:48:20.446988 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:20.446932 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b73e10f9-f7c9-4b71-8157-dcf5e6120e91-openshift-service-ca-bundle\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\"" Apr 16 18:48:20.446988 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:20.446953 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b73e10f9-f7c9-4b71-8157-dcf5e6120e91-proxy-tls\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\"" Apr 16 18:48:21.010138 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:21.010108 2579 generic.go:358] "Generic (PLEG): container finished" podID="b73e10f9-f7c9-4b71-8157-dcf5e6120e91" containerID="04fc75b2a67b06d7eb1342f85e1ffe2040a29c88c3f618c3c7608ddb6bd2b800" exitCode=0 Apr 16 18:48:21.010317 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:21.010170 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" Apr 16 18:48:21.010317 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:21.010176 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" event={"ID":"b73e10f9-f7c9-4b71-8157-dcf5e6120e91","Type":"ContainerDied","Data":"04fc75b2a67b06d7eb1342f85e1ffe2040a29c88c3f618c3c7608ddb6bd2b800"} Apr 16 18:48:21.010317 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:21.010201 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z" event={"ID":"b73e10f9-f7c9-4b71-8157-dcf5e6120e91","Type":"ContainerDied","Data":"637362ef955e26ba5c1c31f872f559384d8da20de7f18bcf1d17c9896f3dc483"} Apr 16 18:48:21.010317 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:21.010214 2579 scope.go:117] "RemoveContainer" containerID="04fc75b2a67b06d7eb1342f85e1ffe2040a29c88c3f618c3c7608ddb6bd2b800" Apr 16 18:48:21.017817 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:21.017795 2579 scope.go:117] "RemoveContainer" containerID="04fc75b2a67b06d7eb1342f85e1ffe2040a29c88c3f618c3c7608ddb6bd2b800" Apr 16 18:48:21.018081 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:48:21.018062 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04fc75b2a67b06d7eb1342f85e1ffe2040a29c88c3f618c3c7608ddb6bd2b800\": container with ID starting with 04fc75b2a67b06d7eb1342f85e1ffe2040a29c88c3f618c3c7608ddb6bd2b800 not found: ID does not exist" containerID="04fc75b2a67b06d7eb1342f85e1ffe2040a29c88c3f618c3c7608ddb6bd2b800" Apr 16 18:48:21.018135 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:21.018089 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04fc75b2a67b06d7eb1342f85e1ffe2040a29c88c3f618c3c7608ddb6bd2b800"} err="failed to get container status 
\"04fc75b2a67b06d7eb1342f85e1ffe2040a29c88c3f618c3c7608ddb6bd2b800\": rpc error: code = NotFound desc = could not find container \"04fc75b2a67b06d7eb1342f85e1ffe2040a29c88c3f618c3c7608ddb6bd2b800\": container with ID starting with 04fc75b2a67b06d7eb1342f85e1ffe2040a29c88c3f618c3c7608ddb6bd2b800 not found: ID does not exist" Apr 16 18:48:21.030674 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:21.030649 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z"] Apr 16 18:48:21.034890 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:21.034873 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-976d9-5dd8566c9d-lqx8z"] Apr 16 18:48:21.754978 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:21.754944 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b73e10f9-f7c9-4b71-8157-dcf5e6120e91" path="/var/lib/kubelet/pods/b73e10f9-f7c9-4b71-8157-dcf5e6120e91/volumes" Apr 16 18:48:39.638000 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:39.637960 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs"] Apr 16 18:48:39.638539 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:39.638318 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs" podUID="5d32adf8-83ce-4e65-b767-0421700df8b6" containerName="sequence-graph-5aeb1" containerID="cri-o://d51a6a469ccbb75fc221dd595f5de02854265085a49ec8f50ff887d64be3cede" gracePeriod=30 Apr 16 18:48:41.686931 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:41.686881 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs" podUID="5d32adf8-83ce-4e65-b767-0421700df8b6" containerName="sequence-graph-5aeb1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:48:46.687492 ip-10-0-134-167 
kubenswrapper[2579]: I0416 18:48:46.687443 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs" podUID="5d32adf8-83ce-4e65-b767-0421700df8b6" containerName="sequence-graph-5aeb1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:48:51.687261 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:51.687168 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs" podUID="5d32adf8-83ce-4e65-b767-0421700df8b6" containerName="sequence-graph-5aeb1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:48:51.687647 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:51.687307 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs" Apr 16 18:48:56.686569 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:48:56.686520 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs" podUID="5d32adf8-83ce-4e65-b767-0421700df8b6" containerName="sequence-graph-5aeb1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:49:00.338075 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:00.338004 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2"] Apr 16 18:49:00.338421 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:00.338238 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b73e10f9-f7c9-4b71-8157-dcf5e6120e91" containerName="ensemble-graph-976d9" Apr 16 18:49:00.338421 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:00.338248 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="b73e10f9-f7c9-4b71-8157-dcf5e6120e91" containerName="ensemble-graph-976d9" Apr 16 18:49:00.338421 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:00.338318 2579 
memory_manager.go:356] "RemoveStaleState removing state" podUID="b73e10f9-f7c9-4b71-8157-dcf5e6120e91" containerName="ensemble-graph-976d9"
Apr 16 18:49:00.342356 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:00.342329 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2"
Apr 16 18:49:00.344461 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:00.344441 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-ebde8-kube-rbac-proxy-sar-config\""
Apr 16 18:49:00.344579 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:00.344482 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-ebde8-serving-cert\""
Apr 16 18:49:00.349021 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:00.348998 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2"]
Apr 16 18:49:00.512623 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:00.512599 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48fd4505-0055-468b-911d-1f6feecd7faa-openshift-service-ca-bundle\") pod \"splitter-graph-ebde8-7c98c9c67-6zzq2\" (UID: \"48fd4505-0055-468b-911d-1f6feecd7faa\") " pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2"
Apr 16 18:49:00.512740 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:00.512649 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48fd4505-0055-468b-911d-1f6feecd7faa-proxy-tls\") pod \"splitter-graph-ebde8-7c98c9c67-6zzq2\" (UID: \"48fd4505-0055-468b-911d-1f6feecd7faa\") " pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2"
Apr 16 18:49:00.613657 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:00.613637 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48fd4505-0055-468b-911d-1f6feecd7faa-proxy-tls\") pod \"splitter-graph-ebde8-7c98c9c67-6zzq2\" (UID: \"48fd4505-0055-468b-911d-1f6feecd7faa\") " pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2"
Apr 16 18:49:00.613752 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:00.613670 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48fd4505-0055-468b-911d-1f6feecd7faa-openshift-service-ca-bundle\") pod \"splitter-graph-ebde8-7c98c9c67-6zzq2\" (UID: \"48fd4505-0055-468b-911d-1f6feecd7faa\") " pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2"
Apr 16 18:49:00.614207 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:00.614190 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48fd4505-0055-468b-911d-1f6feecd7faa-openshift-service-ca-bundle\") pod \"splitter-graph-ebde8-7c98c9c67-6zzq2\" (UID: \"48fd4505-0055-468b-911d-1f6feecd7faa\") " pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2"
Apr 16 18:49:00.616113 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:00.616096 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48fd4505-0055-468b-911d-1f6feecd7faa-proxy-tls\") pod \"splitter-graph-ebde8-7c98c9c67-6zzq2\" (UID: \"48fd4505-0055-468b-911d-1f6feecd7faa\") " pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2"
Apr 16 18:49:00.653587 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:00.653562 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2"
Apr 16 18:49:00.768053 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:00.768008 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2"]
Apr 16 18:49:00.770474 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:49:00.770450 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48fd4505_0055_468b_911d_1f6feecd7faa.slice/crio-4eaafe4883cd566789f5995d0b282e704b612ff5a583458c2618d60879afba3c WatchSource:0}: Error finding container 4eaafe4883cd566789f5995d0b282e704b612ff5a583458c2618d60879afba3c: Status 404 returned error can't find the container with id 4eaafe4883cd566789f5995d0b282e704b612ff5a583458c2618d60879afba3c
Apr 16 18:49:00.772326 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:00.772307 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:49:01.118443 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:01.118414 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2" event={"ID":"48fd4505-0055-468b-911d-1f6feecd7faa","Type":"ContainerStarted","Data":"ce22aff075805f6f2df8c68d8c7ed6d8c200a3615d6cd84418e189ecd5d1dee9"}
Apr 16 18:49:01.118574 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:01.118446 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2" event={"ID":"48fd4505-0055-468b-911d-1f6feecd7faa","Type":"ContainerStarted","Data":"4eaafe4883cd566789f5995d0b282e704b612ff5a583458c2618d60879afba3c"}
Apr 16 18:49:01.118574 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:01.118552 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2"
Apr 16 18:49:01.135932 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:01.135884 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2" podStartSLOduration=1.135865718 podStartE2EDuration="1.135865718s" podCreationTimestamp="2026-04-16 18:49:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:49:01.1346991 +0000 UTC m=+1885.981859590" watchObservedRunningTime="2026-04-16 18:49:01.135865718 +0000 UTC m=+1885.983026209"
Apr 16 18:49:01.686670 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:01.686637 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs" podUID="5d32adf8-83ce-4e65-b767-0421700df8b6" containerName="sequence-graph-5aeb1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:49:06.686878 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:06.686834 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs" podUID="5d32adf8-83ce-4e65-b767-0421700df8b6" containerName="sequence-graph-5aeb1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:49:07.127030 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:07.127000 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2"
Apr 16 18:49:09.775366 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:09.775343 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs"
Apr 16 18:49:09.873279 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:09.873242 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d32adf8-83ce-4e65-b767-0421700df8b6-openshift-service-ca-bundle\") pod \"5d32adf8-83ce-4e65-b767-0421700df8b6\" (UID: \"5d32adf8-83ce-4e65-b767-0421700df8b6\") "
Apr 16 18:49:09.873374 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:09.873335 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d32adf8-83ce-4e65-b767-0421700df8b6-proxy-tls\") pod \"5d32adf8-83ce-4e65-b767-0421700df8b6\" (UID: \"5d32adf8-83ce-4e65-b767-0421700df8b6\") "
Apr 16 18:49:09.873480 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:09.873458 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d32adf8-83ce-4e65-b767-0421700df8b6-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "5d32adf8-83ce-4e65-b767-0421700df8b6" (UID: "5d32adf8-83ce-4e65-b767-0421700df8b6"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:49:09.875164 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:09.875144 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d32adf8-83ce-4e65-b767-0421700df8b6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5d32adf8-83ce-4e65-b767-0421700df8b6" (UID: "5d32adf8-83ce-4e65-b767-0421700df8b6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:49:09.974622 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:09.974571 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d32adf8-83ce-4e65-b767-0421700df8b6-openshift-service-ca-bundle\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\""
Apr 16 18:49:09.974622 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:09.974593 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d32adf8-83ce-4e65-b767-0421700df8b6-proxy-tls\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\""
Apr 16 18:49:10.144165 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:10.144133 2579 generic.go:358] "Generic (PLEG): container finished" podID="5d32adf8-83ce-4e65-b767-0421700df8b6" containerID="d51a6a469ccbb75fc221dd595f5de02854265085a49ec8f50ff887d64be3cede" exitCode=0
Apr 16 18:49:10.144287 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:10.144203 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs"
Apr 16 18:49:10.144287 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:10.144223 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs" event={"ID":"5d32adf8-83ce-4e65-b767-0421700df8b6","Type":"ContainerDied","Data":"d51a6a469ccbb75fc221dd595f5de02854265085a49ec8f50ff887d64be3cede"}
Apr 16 18:49:10.144287 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:10.144284 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs" event={"ID":"5d32adf8-83ce-4e65-b767-0421700df8b6","Type":"ContainerDied","Data":"46c4e81b6766c11759efc4902fbd58136c602103ad66b174d89c48de05908d81"}
Apr 16 18:49:10.144402 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:10.144303 2579 scope.go:117] "RemoveContainer" containerID="d51a6a469ccbb75fc221dd595f5de02854265085a49ec8f50ff887d64be3cede"
Apr 16 18:49:10.152344 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:10.152317 2579 scope.go:117] "RemoveContainer" containerID="d51a6a469ccbb75fc221dd595f5de02854265085a49ec8f50ff887d64be3cede"
Apr 16 18:49:10.152629 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:49:10.152605 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d51a6a469ccbb75fc221dd595f5de02854265085a49ec8f50ff887d64be3cede\": container with ID starting with d51a6a469ccbb75fc221dd595f5de02854265085a49ec8f50ff887d64be3cede not found: ID does not exist" containerID="d51a6a469ccbb75fc221dd595f5de02854265085a49ec8f50ff887d64be3cede"
Apr 16 18:49:10.152698 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:10.152640 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d51a6a469ccbb75fc221dd595f5de02854265085a49ec8f50ff887d64be3cede"} err="failed to get container status \"d51a6a469ccbb75fc221dd595f5de02854265085a49ec8f50ff887d64be3cede\": rpc error: code = NotFound desc = could not find container \"d51a6a469ccbb75fc221dd595f5de02854265085a49ec8f50ff887d64be3cede\": container with ID starting with d51a6a469ccbb75fc221dd595f5de02854265085a49ec8f50ff887d64be3cede not found: ID does not exist"
Apr 16 18:49:10.166036 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:10.166010 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs"]
Apr 16 18:49:10.169348 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:10.169325 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-5aeb1-658b497c45-rh6fs"]
Apr 16 18:49:10.391863 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:10.391836 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2"]
Apr 16 18:49:10.392071 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:10.392051 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2" podUID="48fd4505-0055-468b-911d-1f6feecd7faa" containerName="splitter-graph-ebde8" containerID="cri-o://ce22aff075805f6f2df8c68d8c7ed6d8c200a3615d6cd84418e189ecd5d1dee9" gracePeriod=30
Apr 16 18:49:11.754420 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:11.754381 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d32adf8-83ce-4e65-b767-0421700df8b6" path="/var/lib/kubelet/pods/5d32adf8-83ce-4e65-b767-0421700df8b6/volumes"
Apr 16 18:49:12.125849 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:12.125806 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2" podUID="48fd4505-0055-468b-911d-1f6feecd7faa" containerName="splitter-graph-ebde8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:49:17.125228 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:17.125185 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2" podUID="48fd4505-0055-468b-911d-1f6feecd7faa" containerName="splitter-graph-ebde8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:49:22.125903 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:22.125860 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2" podUID="48fd4505-0055-468b-911d-1f6feecd7faa" containerName="splitter-graph-ebde8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:49:22.126367 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:22.125969 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2"
Apr 16 18:49:27.125421 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:27.125373 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2" podUID="48fd4505-0055-468b-911d-1f6feecd7faa" containerName="splitter-graph-ebde8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:49:32.125057 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:32.125024 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2" podUID="48fd4505-0055-468b-911d-1f6feecd7faa" containerName="splitter-graph-ebde8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:49:37.125359 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:37.125312 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2" podUID="48fd4505-0055-468b-911d-1f6feecd7faa" containerName="splitter-graph-ebde8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:49:40.431440 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:49:40.431406 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48fd4505_0055_468b_911d_1f6feecd7faa.slice/crio-conmon-ce22aff075805f6f2df8c68d8c7ed6d8c200a3615d6cd84418e189ecd5d1dee9.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 18:49:40.531783 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:40.531762 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2"
Apr 16 18:49:40.569106 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:40.569076 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48fd4505-0055-468b-911d-1f6feecd7faa-proxy-tls\") pod \"48fd4505-0055-468b-911d-1f6feecd7faa\" (UID: \"48fd4505-0055-468b-911d-1f6feecd7faa\") "
Apr 16 18:49:40.569221 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:40.569135 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48fd4505-0055-468b-911d-1f6feecd7faa-openshift-service-ca-bundle\") pod \"48fd4505-0055-468b-911d-1f6feecd7faa\" (UID: \"48fd4505-0055-468b-911d-1f6feecd7faa\") "
Apr 16 18:49:40.569482 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:40.569460 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48fd4505-0055-468b-911d-1f6feecd7faa-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "48fd4505-0055-468b-911d-1f6feecd7faa" (UID: "48fd4505-0055-468b-911d-1f6feecd7faa"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:49:40.570945 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:40.570919 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48fd4505-0055-468b-911d-1f6feecd7faa-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "48fd4505-0055-468b-911d-1f6feecd7faa" (UID: "48fd4505-0055-468b-911d-1f6feecd7faa"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:49:40.669982 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:40.669929 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48fd4505-0055-468b-911d-1f6feecd7faa-proxy-tls\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\""
Apr 16 18:49:40.669982 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:40.669950 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48fd4505-0055-468b-911d-1f6feecd7faa-openshift-service-ca-bundle\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\""
Apr 16 18:49:41.229820 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:41.229784 2579 generic.go:358] "Generic (PLEG): container finished" podID="48fd4505-0055-468b-911d-1f6feecd7faa" containerID="ce22aff075805f6f2df8c68d8c7ed6d8c200a3615d6cd84418e189ecd5d1dee9" exitCode=0
Apr 16 18:49:41.230018 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:41.229885 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2"
Apr 16 18:49:41.230018 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:41.229884 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2" event={"ID":"48fd4505-0055-468b-911d-1f6feecd7faa","Type":"ContainerDied","Data":"ce22aff075805f6f2df8c68d8c7ed6d8c200a3615d6cd84418e189ecd5d1dee9"}
Apr 16 18:49:41.230018 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:41.229932 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2" event={"ID":"48fd4505-0055-468b-911d-1f6feecd7faa","Type":"ContainerDied","Data":"4eaafe4883cd566789f5995d0b282e704b612ff5a583458c2618d60879afba3c"}
Apr 16 18:49:41.230018 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:41.229949 2579 scope.go:117] "RemoveContainer" containerID="ce22aff075805f6f2df8c68d8c7ed6d8c200a3615d6cd84418e189ecd5d1dee9"
Apr 16 18:49:41.238314 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:41.238262 2579 scope.go:117] "RemoveContainer" containerID="ce22aff075805f6f2df8c68d8c7ed6d8c200a3615d6cd84418e189ecd5d1dee9"
Apr 16 18:49:41.238626 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:49:41.238594 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce22aff075805f6f2df8c68d8c7ed6d8c200a3615d6cd84418e189ecd5d1dee9\": container with ID starting with ce22aff075805f6f2df8c68d8c7ed6d8c200a3615d6cd84418e189ecd5d1dee9 not found: ID does not exist" containerID="ce22aff075805f6f2df8c68d8c7ed6d8c200a3615d6cd84418e189ecd5d1dee9"
Apr 16 18:49:41.238750 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:41.238631 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce22aff075805f6f2df8c68d8c7ed6d8c200a3615d6cd84418e189ecd5d1dee9"} err="failed to get container status \"ce22aff075805f6f2df8c68d8c7ed6d8c200a3615d6cd84418e189ecd5d1dee9\": rpc error: code = NotFound desc = could not find container \"ce22aff075805f6f2df8c68d8c7ed6d8c200a3615d6cd84418e189ecd5d1dee9\": container with ID starting with ce22aff075805f6f2df8c68d8c7ed6d8c200a3615d6cd84418e189ecd5d1dee9 not found: ID does not exist"
Apr 16 18:49:41.252351 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:41.252323 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2"]
Apr 16 18:49:41.257182 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:41.257154 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-ebde8-7c98c9c67-6zzq2"]
Apr 16 18:49:41.754471 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:41.754446 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48fd4505-0055-468b-911d-1f6feecd7faa" path="/var/lib/kubelet/pods/48fd4505-0055-468b-911d-1f6feecd7faa/volumes"
Apr 16 18:49:49.847648 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:49.847613 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v"]
Apr 16 18:49:49.848210 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:49.847909 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d32adf8-83ce-4e65-b767-0421700df8b6" containerName="sequence-graph-5aeb1"
Apr 16 18:49:49.848210 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:49.847923 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d32adf8-83ce-4e65-b767-0421700df8b6" containerName="sequence-graph-5aeb1"
Apr 16 18:49:49.848210 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:49.847934 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48fd4505-0055-468b-911d-1f6feecd7faa" containerName="splitter-graph-ebde8"
Apr 16 18:49:49.848210 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:49.847939 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="48fd4505-0055-468b-911d-1f6feecd7faa" containerName="splitter-graph-ebde8"
Apr 16 18:49:49.848210 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:49.847987 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="48fd4505-0055-468b-911d-1f6feecd7faa" containerName="splitter-graph-ebde8"
Apr 16 18:49:49.848210 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:49.847997 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d32adf8-83ce-4e65-b767-0421700df8b6" containerName="sequence-graph-5aeb1"
Apr 16 18:49:49.852047 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:49.852026 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v"
Apr 16 18:49:49.854231 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:49.854204 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-kt2gs\""
Apr 16 18:49:49.854381 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:49.854204 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-979d7-serving-cert\""
Apr 16 18:49:49.854381 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:49.854248 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-979d7-kube-rbac-proxy-sar-config\""
Apr 16 18:49:49.854873 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:49.854855 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 18:49:49.859203 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:49.859181 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v"]
Apr 16 18:49:49.925519 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:49.925494 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15bf30bd-7b7e-4f3e-b09b-59fc968da5d7-openshift-service-ca-bundle\") pod \"switch-graph-979d7-565d6b4454-s4t4v\" (UID: \"15bf30bd-7b7e-4f3e-b09b-59fc968da5d7\") " pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v"
Apr 16 18:49:49.925608 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:49.925540 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15bf30bd-7b7e-4f3e-b09b-59fc968da5d7-proxy-tls\") pod \"switch-graph-979d7-565d6b4454-s4t4v\" (UID: \"15bf30bd-7b7e-4f3e-b09b-59fc968da5d7\") " pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v"
Apr 16 18:49:50.025784 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:50.025759 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15bf30bd-7b7e-4f3e-b09b-59fc968da5d7-proxy-tls\") pod \"switch-graph-979d7-565d6b4454-s4t4v\" (UID: \"15bf30bd-7b7e-4f3e-b09b-59fc968da5d7\") " pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v"
Apr 16 18:49:50.025873 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:50.025802 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15bf30bd-7b7e-4f3e-b09b-59fc968da5d7-openshift-service-ca-bundle\") pod \"switch-graph-979d7-565d6b4454-s4t4v\" (UID: \"15bf30bd-7b7e-4f3e-b09b-59fc968da5d7\") " pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v"
Apr 16 18:49:50.025917 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:49:50.025892 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-979d7-serving-cert: secret "switch-graph-979d7-serving-cert" not found
Apr 16 18:49:50.025987 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:49:50.025977 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15bf30bd-7b7e-4f3e-b09b-59fc968da5d7-proxy-tls podName:15bf30bd-7b7e-4f3e-b09b-59fc968da5d7 nodeName:}" failed. No retries permitted until 2026-04-16 18:49:50.525958055 +0000 UTC m=+1935.373118527 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/15bf30bd-7b7e-4f3e-b09b-59fc968da5d7-proxy-tls") pod "switch-graph-979d7-565d6b4454-s4t4v" (UID: "15bf30bd-7b7e-4f3e-b09b-59fc968da5d7") : secret "switch-graph-979d7-serving-cert" not found
Apr 16 18:49:50.026398 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:50.026381 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15bf30bd-7b7e-4f3e-b09b-59fc968da5d7-openshift-service-ca-bundle\") pod \"switch-graph-979d7-565d6b4454-s4t4v\" (UID: \"15bf30bd-7b7e-4f3e-b09b-59fc968da5d7\") " pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v"
Apr 16 18:49:50.529020 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:50.528990 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15bf30bd-7b7e-4f3e-b09b-59fc968da5d7-proxy-tls\") pod \"switch-graph-979d7-565d6b4454-s4t4v\" (UID: \"15bf30bd-7b7e-4f3e-b09b-59fc968da5d7\") " pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v"
Apr 16 18:49:50.531548 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:50.531525 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15bf30bd-7b7e-4f3e-b09b-59fc968da5d7-proxy-tls\") pod \"switch-graph-979d7-565d6b4454-s4t4v\" (UID: \"15bf30bd-7b7e-4f3e-b09b-59fc968da5d7\") " pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v"
Apr 16 18:49:50.762029 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:50.762003 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v"
Apr 16 18:49:50.880029 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:50.880007 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v"]
Apr 16 18:49:50.883328 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:49:50.883300 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15bf30bd_7b7e_4f3e_b09b_59fc968da5d7.slice/crio-3b66bb986a711da122e0a3a594711bbe38d1a2ccf258c5b6d65e2f8375135735 WatchSource:0}: Error finding container 3b66bb986a711da122e0a3a594711bbe38d1a2ccf258c5b6d65e2f8375135735: Status 404 returned error can't find the container with id 3b66bb986a711da122e0a3a594711bbe38d1a2ccf258c5b6d65e2f8375135735
Apr 16 18:49:51.260343 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:51.260306 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v" event={"ID":"15bf30bd-7b7e-4f3e-b09b-59fc968da5d7","Type":"ContainerStarted","Data":"58267955b5a1d040e640a8fad928cf954debff27d65f179d011783393b9d1b0c"}
Apr 16 18:49:51.260506 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:51.260356 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v" event={"ID":"15bf30bd-7b7e-4f3e-b09b-59fc968da5d7","Type":"ContainerStarted","Data":"3b66bb986a711da122e0a3a594711bbe38d1a2ccf258c5b6d65e2f8375135735"}
Apr 16 18:49:51.260506 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:51.260424 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v"
Apr 16 18:49:51.277370 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:51.277325 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v" podStartSLOduration=2.277310178 podStartE2EDuration="2.277310178s" podCreationTimestamp="2026-04-16 18:49:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:49:51.275385467 +0000 UTC m=+1936.122545958" watchObservedRunningTime="2026-04-16 18:49:51.277310178 +0000 UTC m=+1936.124470669"
Apr 16 18:49:57.268529 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:49:57.268495 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v"
Apr 16 18:50:20.625242 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:50:20.625125 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq"]
Apr 16 18:50:20.628104 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:50:20.628083 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq"
Apr 16 18:50:20.630085 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:50:20.630069 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-56ec1-serving-cert\""
Apr 16 18:50:20.630340 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:50:20.630319 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-56ec1-kube-rbac-proxy-sar-config\""
Apr 16 18:50:20.635627 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:50:20.635608 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq"]
Apr 16 18:50:20.722803 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:50:20.722778 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d67a998f-616d-44a6-b8df-ff46590a864e-openshift-service-ca-bundle\") pod \"splitter-graph-56ec1-696b76c44b-v9dvq\" (UID: \"d67a998f-616d-44a6-b8df-ff46590a864e\") " pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq"
Apr 16 18:50:20.722888 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:50:20.722808 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d67a998f-616d-44a6-b8df-ff46590a864e-proxy-tls\") pod \"splitter-graph-56ec1-696b76c44b-v9dvq\" (UID: \"d67a998f-616d-44a6-b8df-ff46590a864e\") " pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq"
Apr 16 18:50:20.823179 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:50:20.823158 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d67a998f-616d-44a6-b8df-ff46590a864e-openshift-service-ca-bundle\") pod \"splitter-graph-56ec1-696b76c44b-v9dvq\" (UID: \"d67a998f-616d-44a6-b8df-ff46590a864e\") " pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq"
Apr 16 18:50:20.823309 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:50:20.823188 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d67a998f-616d-44a6-b8df-ff46590a864e-proxy-tls\") pod \"splitter-graph-56ec1-696b76c44b-v9dvq\" (UID: \"d67a998f-616d-44a6-b8df-ff46590a864e\") " pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq"
Apr 16 18:50:20.823309 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:50:20.823293 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-56ec1-serving-cert: secret "splitter-graph-56ec1-serving-cert" not found
Apr 16 18:50:20.823437 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:50:20.823346 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d67a998f-616d-44a6-b8df-ff46590a864e-proxy-tls podName:d67a998f-616d-44a6-b8df-ff46590a864e nodeName:}" failed. No retries permitted until 2026-04-16 18:50:21.323328972 +0000 UTC m=+1966.170489445 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d67a998f-616d-44a6-b8df-ff46590a864e-proxy-tls") pod "splitter-graph-56ec1-696b76c44b-v9dvq" (UID: "d67a998f-616d-44a6-b8df-ff46590a864e") : secret "splitter-graph-56ec1-serving-cert" not found
Apr 16 18:50:20.823851 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:50:20.823826 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d67a998f-616d-44a6-b8df-ff46590a864e-openshift-service-ca-bundle\") pod \"splitter-graph-56ec1-696b76c44b-v9dvq\" (UID: \"d67a998f-616d-44a6-b8df-ff46590a864e\") " pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq"
Apr 16 18:50:21.326678 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:50:21.326634 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d67a998f-616d-44a6-b8df-ff46590a864e-proxy-tls\") pod \"splitter-graph-56ec1-696b76c44b-v9dvq\" (UID: \"d67a998f-616d-44a6-b8df-ff46590a864e\") " pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq"
Apr 16 18:50:21.333699 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:50:21.333670 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d67a998f-616d-44a6-b8df-ff46590a864e-proxy-tls\") pod \"splitter-graph-56ec1-696b76c44b-v9dvq\" (UID: \"d67a998f-616d-44a6-b8df-ff46590a864e\") " pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq"
Apr 16 18:50:21.538328 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:50:21.538284 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq"
Apr 16 18:50:21.661253 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:50:21.661222 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq"]
Apr 16 18:50:21.664188 ip-10-0-134-167 kubenswrapper[2579]: W0416 18:50:21.664162 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd67a998f_616d_44a6_b8df_ff46590a864e.slice/crio-ca28f829220fb76cce227c3a9f30cd8153945d202d0c9596ca741e086466e11a WatchSource:0}: Error finding container ca28f829220fb76cce227c3a9f30cd8153945d202d0c9596ca741e086466e11a: Status 404 returned error can't find the container with id ca28f829220fb76cce227c3a9f30cd8153945d202d0c9596ca741e086466e11a
Apr 16 18:50:22.350024 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:50:22.349988 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq" event={"ID":"d67a998f-616d-44a6-b8df-ff46590a864e","Type":"ContainerStarted","Data":"cd4a4620f6f3782a27780703555b92048fd18745e304cb965c510148de70c24e"}
Apr 16 18:50:22.350024 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:50:22.350031 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq" event={"ID":"d67a998f-616d-44a6-b8df-ff46590a864e","Type":"ContainerStarted","Data":"ca28f829220fb76cce227c3a9f30cd8153945d202d0c9596ca741e086466e11a"}
Apr 16 18:50:22.350281 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:50:22.350079 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq"
Apr 16 18:50:22.366872 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:50:22.366816 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq" podStartSLOduration=2.366800701 podStartE2EDuration="2.366800701s" podCreationTimestamp="2026-04-16 18:50:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:50:22.366140402 +0000 UTC m=+1967.213300893" watchObservedRunningTime="2026-04-16 18:50:22.366800701 +0000 UTC m=+1967.213961193"
Apr 16 18:50:28.358863 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:50:28.358828 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq"
Apr 16 18:52:35.793670 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:52:35.793563 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/0.log"
Apr 16 18:52:35.797535 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:52:35.796817 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/0.log"
Apr 16 18:57:35.812984 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:57:35.812869 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/0.log"
Apr 16 18:57:35.816860 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:57:35.813943 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/0.log"
Apr 16 18:58:35.290157 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:58:35.290125 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq"]
Apr 16 18:58:35.290680 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:58:35.290361 2579 kuberuntime_container.go:864] "Killing container with a grace period"
pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq" podUID="d67a998f-616d-44a6-b8df-ff46590a864e" containerName="splitter-graph-56ec1" containerID="cri-o://cd4a4620f6f3782a27780703555b92048fd18745e304cb965c510148de70c24e" gracePeriod=30 Apr 16 18:58:38.357177 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:58:38.357133 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq" podUID="d67a998f-616d-44a6-b8df-ff46590a864e" containerName="splitter-graph-56ec1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:58:43.356730 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:58:43.356682 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq" podUID="d67a998f-616d-44a6-b8df-ff46590a864e" containerName="splitter-graph-56ec1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:58:48.357701 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:58:48.357654 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq" podUID="d67a998f-616d-44a6-b8df-ff46590a864e" containerName="splitter-graph-56ec1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:58:48.358192 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:58:48.357776 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq" Apr 16 18:58:53.357372 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:58:53.357324 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq" podUID="d67a998f-616d-44a6-b8df-ff46590a864e" containerName="splitter-graph-56ec1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:58:58.357102 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:58:58.357060 2579 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq" podUID="d67a998f-616d-44a6-b8df-ff46590a864e" containerName="splitter-graph-56ec1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:59:03.356997 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:59:03.356951 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq" podUID="d67a998f-616d-44a6-b8df-ff46590a864e" containerName="splitter-graph-56ec1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:59:05.434410 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:59:05.434388 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq" Apr 16 18:59:05.572746 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:59:05.572685 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d67a998f-616d-44a6-b8df-ff46590a864e-proxy-tls\") pod \"d67a998f-616d-44a6-b8df-ff46590a864e\" (UID: \"d67a998f-616d-44a6-b8df-ff46590a864e\") " Apr 16 18:59:05.572868 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:59:05.572750 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d67a998f-616d-44a6-b8df-ff46590a864e-openshift-service-ca-bundle\") pod \"d67a998f-616d-44a6-b8df-ff46590a864e\" (UID: \"d67a998f-616d-44a6-b8df-ff46590a864e\") " Apr 16 18:59:05.573105 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:59:05.573079 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d67a998f-616d-44a6-b8df-ff46590a864e-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "d67a998f-616d-44a6-b8df-ff46590a864e" (UID: "d67a998f-616d-44a6-b8df-ff46590a864e"). 
InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:59:05.574531 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:59:05.574511 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d67a998f-616d-44a6-b8df-ff46590a864e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d67a998f-616d-44a6-b8df-ff46590a864e" (UID: "d67a998f-616d-44a6-b8df-ff46590a864e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:59:05.673147 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:59:05.673119 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d67a998f-616d-44a6-b8df-ff46590a864e-openshift-service-ca-bundle\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\"" Apr 16 18:59:05.673147 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:59:05.673145 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d67a998f-616d-44a6-b8df-ff46590a864e-proxy-tls\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\"" Apr 16 18:59:05.777498 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:59:05.777461 2579 generic.go:358] "Generic (PLEG): container finished" podID="d67a998f-616d-44a6-b8df-ff46590a864e" containerID="cd4a4620f6f3782a27780703555b92048fd18745e304cb965c510148de70c24e" exitCode=0 Apr 16 18:59:05.777613 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:59:05.777541 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq" Apr 16 18:59:05.777613 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:59:05.777551 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq" event={"ID":"d67a998f-616d-44a6-b8df-ff46590a864e","Type":"ContainerDied","Data":"cd4a4620f6f3782a27780703555b92048fd18745e304cb965c510148de70c24e"} Apr 16 18:59:05.777613 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:59:05.777585 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq" event={"ID":"d67a998f-616d-44a6-b8df-ff46590a864e","Type":"ContainerDied","Data":"ca28f829220fb76cce227c3a9f30cd8153945d202d0c9596ca741e086466e11a"} Apr 16 18:59:05.777613 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:59:05.777602 2579 scope.go:117] "RemoveContainer" containerID="cd4a4620f6f3782a27780703555b92048fd18745e304cb965c510148de70c24e" Apr 16 18:59:05.785760 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:59:05.785740 2579 scope.go:117] "RemoveContainer" containerID="cd4a4620f6f3782a27780703555b92048fd18745e304cb965c510148de70c24e" Apr 16 18:59:05.786027 ip-10-0-134-167 kubenswrapper[2579]: E0416 18:59:05.786008 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd4a4620f6f3782a27780703555b92048fd18745e304cb965c510148de70c24e\": container with ID starting with cd4a4620f6f3782a27780703555b92048fd18745e304cb965c510148de70c24e not found: ID does not exist" containerID="cd4a4620f6f3782a27780703555b92048fd18745e304cb965c510148de70c24e" Apr 16 18:59:05.786091 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:59:05.786048 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd4a4620f6f3782a27780703555b92048fd18745e304cb965c510148de70c24e"} err="failed to get container status 
\"cd4a4620f6f3782a27780703555b92048fd18745e304cb965c510148de70c24e\": rpc error: code = NotFound desc = could not find container \"cd4a4620f6f3782a27780703555b92048fd18745e304cb965c510148de70c24e\": container with ID starting with cd4a4620f6f3782a27780703555b92048fd18745e304cb965c510148de70c24e not found: ID does not exist" Apr 16 18:59:05.795507 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:59:05.795481 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq"] Apr 16 18:59:05.799623 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:59:05.799603 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-56ec1-696b76c44b-v9dvq"] Apr 16 18:59:07.754489 ip-10-0-134-167 kubenswrapper[2579]: I0416 18:59:07.754453 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d67a998f-616d-44a6-b8df-ff46590a864e" path="/var/lib/kubelet/pods/d67a998f-616d-44a6-b8df-ff46590a864e/volumes" Apr 16 19:02:35.832214 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:02:35.832106 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/0.log" Apr 16 19:02:35.836338 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:02:35.833647 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/0.log" Apr 16 19:06:09.324134 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:09.324104 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v"] Apr 16 19:06:09.324622 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:09.324338 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v" podUID="15bf30bd-7b7e-4f3e-b09b-59fc968da5d7" containerName="switch-graph-979d7" 
containerID="cri-o://58267955b5a1d040e640a8fad928cf954debff27d65f179d011783393b9d1b0c" gracePeriod=30 Apr 16 19:06:12.267745 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:12.267699 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v" podUID="15bf30bd-7b7e-4f3e-b09b-59fc968da5d7" containerName="switch-graph-979d7" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 19:06:17.267867 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:17.267823 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v" podUID="15bf30bd-7b7e-4f3e-b09b-59fc968da5d7" containerName="switch-graph-979d7" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 19:06:22.267198 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:22.267148 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v" podUID="15bf30bd-7b7e-4f3e-b09b-59fc968da5d7" containerName="switch-graph-979d7" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 19:06:22.267726 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:22.267360 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v" Apr 16 19:06:24.552875 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:24.552821 2579 ???:1] "http: TLS handshake error from 10.0.134.167:59206: EOF" Apr 16 19:06:24.554006 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:24.553982 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-979d7-565d6b4454-s4t4v_15bf30bd-7b7e-4f3e-b09b-59fc968da5d7/switch-graph-979d7/0.log" Apr 16 19:06:25.350017 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:25.349988 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_switch-graph-979d7-565d6b4454-s4t4v_15bf30bd-7b7e-4f3e-b09b-59fc968da5d7/switch-graph-979d7/0.log" Apr 16 19:06:26.129571 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:26.129534 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-979d7-565d6b4454-s4t4v_15bf30bd-7b7e-4f3e-b09b-59fc968da5d7/switch-graph-979d7/0.log" Apr 16 19:06:26.887623 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:26.887587 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-979d7-565d6b4454-s4t4v_15bf30bd-7b7e-4f3e-b09b-59fc968da5d7/switch-graph-979d7/0.log" Apr 16 19:06:27.267734 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:27.267656 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v" podUID="15bf30bd-7b7e-4f3e-b09b-59fc968da5d7" containerName="switch-graph-979d7" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 19:06:27.621690 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:27.621657 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-979d7-565d6b4454-s4t4v_15bf30bd-7b7e-4f3e-b09b-59fc968da5d7/switch-graph-979d7/0.log" Apr 16 19:06:28.388078 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:28.388045 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-979d7-565d6b4454-s4t4v_15bf30bd-7b7e-4f3e-b09b-59fc968da5d7/switch-graph-979d7/0.log" Apr 16 19:06:29.145869 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:29.145838 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-979d7-565d6b4454-s4t4v_15bf30bd-7b7e-4f3e-b09b-59fc968da5d7/switch-graph-979d7/0.log" Apr 16 19:06:29.931919 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:29.931898 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_switch-graph-979d7-565d6b4454-s4t4v_15bf30bd-7b7e-4f3e-b09b-59fc968da5d7/switch-graph-979d7/0.log" Apr 16 19:06:30.691889 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:30.691863 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-979d7-565d6b4454-s4t4v_15bf30bd-7b7e-4f3e-b09b-59fc968da5d7/switch-graph-979d7/0.log" Apr 16 19:06:31.449443 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:31.449411 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-979d7-565d6b4454-s4t4v_15bf30bd-7b7e-4f3e-b09b-59fc968da5d7/switch-graph-979d7/0.log" Apr 16 19:06:32.225142 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:32.225110 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-979d7-565d6b4454-s4t4v_15bf30bd-7b7e-4f3e-b09b-59fc968da5d7/switch-graph-979d7/0.log" Apr 16 19:06:32.266680 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:32.266640 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v" podUID="15bf30bd-7b7e-4f3e-b09b-59fc968da5d7" containerName="switch-graph-979d7" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 19:06:33.049528 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:33.049492 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-979d7-565d6b4454-s4t4v_15bf30bd-7b7e-4f3e-b09b-59fc968da5d7/switch-graph-979d7/0.log" Apr 16 19:06:37.267479 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:37.267427 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v" podUID="15bf30bd-7b7e-4f3e-b09b-59fc968da5d7" containerName="switch-graph-979d7" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 19:06:38.088083 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:38.088046 2579 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-6zhb2_530ccfcb-dfe5-440b-8750-09dd186b8702/global-pull-secret-syncer/0.log" Apr 16 19:06:38.250837 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:38.250799 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-q8spv_3ea5a78e-da64-4bee-b206-3f22bfd07fbc/konnectivity-agent/0.log" Apr 16 19:06:38.391298 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:38.391253 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-167.ec2.internal_69e28dff0cb5be54806a6f5d3d910a4e/haproxy/0.log" Apr 16 19:06:39.963704 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:39.963683 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v" Apr 16 19:06:40.018634 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:40.018603 2579 generic.go:358] "Generic (PLEG): container finished" podID="15bf30bd-7b7e-4f3e-b09b-59fc968da5d7" containerID="58267955b5a1d040e640a8fad928cf954debff27d65f179d011783393b9d1b0c" exitCode=0 Apr 16 19:06:40.018752 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:40.018685 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v" Apr 16 19:06:40.018752 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:40.018696 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v" event={"ID":"15bf30bd-7b7e-4f3e-b09b-59fc968da5d7","Type":"ContainerDied","Data":"58267955b5a1d040e640a8fad928cf954debff27d65f179d011783393b9d1b0c"} Apr 16 19:06:40.018752 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:40.018733 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v" event={"ID":"15bf30bd-7b7e-4f3e-b09b-59fc968da5d7","Type":"ContainerDied","Data":"3b66bb986a711da122e0a3a594711bbe38d1a2ccf258c5b6d65e2f8375135735"} Apr 16 19:06:40.018752 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:40.018752 2579 scope.go:117] "RemoveContainer" containerID="58267955b5a1d040e640a8fad928cf954debff27d65f179d011783393b9d1b0c" Apr 16 19:06:40.025860 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:40.025843 2579 scope.go:117] "RemoveContainer" containerID="58267955b5a1d040e640a8fad928cf954debff27d65f179d011783393b9d1b0c" Apr 16 19:06:40.026086 ip-10-0-134-167 kubenswrapper[2579]: E0416 19:06:40.026066 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58267955b5a1d040e640a8fad928cf954debff27d65f179d011783393b9d1b0c\": container with ID starting with 58267955b5a1d040e640a8fad928cf954debff27d65f179d011783393b9d1b0c not found: ID does not exist" containerID="58267955b5a1d040e640a8fad928cf954debff27d65f179d011783393b9d1b0c" Apr 16 19:06:40.026175 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:40.026093 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58267955b5a1d040e640a8fad928cf954debff27d65f179d011783393b9d1b0c"} err="failed to get container status 
\"58267955b5a1d040e640a8fad928cf954debff27d65f179d011783393b9d1b0c\": rpc error: code = NotFound desc = could not find container \"58267955b5a1d040e640a8fad928cf954debff27d65f179d011783393b9d1b0c\": container with ID starting with 58267955b5a1d040e640a8fad928cf954debff27d65f179d011783393b9d1b0c not found: ID does not exist" Apr 16 19:06:40.073886 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:40.073867 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15bf30bd-7b7e-4f3e-b09b-59fc968da5d7-openshift-service-ca-bundle\") pod \"15bf30bd-7b7e-4f3e-b09b-59fc968da5d7\" (UID: \"15bf30bd-7b7e-4f3e-b09b-59fc968da5d7\") " Apr 16 19:06:40.073968 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:40.073898 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15bf30bd-7b7e-4f3e-b09b-59fc968da5d7-proxy-tls\") pod \"15bf30bd-7b7e-4f3e-b09b-59fc968da5d7\" (UID: \"15bf30bd-7b7e-4f3e-b09b-59fc968da5d7\") " Apr 16 19:06:40.074186 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:40.074167 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15bf30bd-7b7e-4f3e-b09b-59fc968da5d7-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "15bf30bd-7b7e-4f3e-b09b-59fc968da5d7" (UID: "15bf30bd-7b7e-4f3e-b09b-59fc968da5d7"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:06:40.075727 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:40.075702 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15bf30bd-7b7e-4f3e-b09b-59fc968da5d7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "15bf30bd-7b7e-4f3e-b09b-59fc968da5d7" (UID: "15bf30bd-7b7e-4f3e-b09b-59fc968da5d7"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:06:40.174235 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:40.174213 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15bf30bd-7b7e-4f3e-b09b-59fc968da5d7-openshift-service-ca-bundle\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\"" Apr 16 19:06:40.174235 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:40.174233 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15bf30bd-7b7e-4f3e-b09b-59fc968da5d7-proxy-tls\") on node \"ip-10-0-134-167.ec2.internal\" DevicePath \"\"" Apr 16 19:06:40.339424 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:40.339400 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v"] Apr 16 19:06:40.344626 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:40.344603 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-979d7-565d6b4454-s4t4v"] Apr 16 19:06:41.755246 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:41.755207 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15bf30bd-7b7e-4f3e-b09b-59fc968da5d7" path="/var/lib/kubelet/pods/15bf30bd-7b7e-4f3e-b09b-59fc968da5d7/volumes" Apr 16 19:06:42.058137 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:42.058056 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5sn5t_0835cb09-27df-467b-b5a7-67f20c2fce38/node-exporter/0.log" Apr 16 19:06:42.081798 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:42.081772 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5sn5t_0835cb09-27df-467b-b5a7-67f20c2fce38/kube-rbac-proxy/0.log" Apr 16 19:06:42.106860 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:42.106841 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-5sn5t_0835cb09-27df-467b-b5a7-67f20c2fce38/init-textfile/0.log" Apr 16 19:06:44.014912 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.014884 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-xcpp6_0464fc65-3816-4d16-840a-ffda5744de6c/networking-console-plugin/0.log" Apr 16 19:06:44.650050 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.650008 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685"] Apr 16 19:06:44.650364 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.650346 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d67a998f-616d-44a6-b8df-ff46590a864e" containerName="splitter-graph-56ec1" Apr 16 19:06:44.650462 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.650367 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d67a998f-616d-44a6-b8df-ff46590a864e" containerName="splitter-graph-56ec1" Apr 16 19:06:44.650462 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.650402 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15bf30bd-7b7e-4f3e-b09b-59fc968da5d7" containerName="switch-graph-979d7" Apr 16 19:06:44.650462 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.650411 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="15bf30bd-7b7e-4f3e-b09b-59fc968da5d7" containerName="switch-graph-979d7" Apr 16 19:06:44.650610 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.650531 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="15bf30bd-7b7e-4f3e-b09b-59fc968da5d7" containerName="switch-graph-979d7" Apr 16 19:06:44.650610 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.650548 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="d67a998f-616d-44a6-b8df-ff46590a864e" containerName="splitter-graph-56ec1" Apr 16 19:06:44.654738 
ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.654712 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685" Apr 16 19:06:44.657439 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.657403 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6xqvt\"/\"default-dockercfg-4xt97\"" Apr 16 19:06:44.657568 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.657494 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6xqvt\"/\"openshift-service-ca.crt\"" Apr 16 19:06:44.657568 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.657494 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6xqvt\"/\"kube-root-ca.crt\"" Apr 16 19:06:44.663879 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.663851 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685"] Apr 16 19:06:44.705052 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.705025 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8ca68410-07f8-4732-9d9f-7ebae4261168-sys\") pod \"perf-node-gather-daemonset-k4685\" (UID: \"8ca68410-07f8-4732-9d9f-7ebae4261168\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685" Apr 16 19:06:44.705140 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.705072 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8ca68410-07f8-4732-9d9f-7ebae4261168-podres\") pod \"perf-node-gather-daemonset-k4685\" (UID: \"8ca68410-07f8-4732-9d9f-7ebae4261168\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685" Apr 16 19:06:44.705140 ip-10-0-134-167 kubenswrapper[2579]: 
I0416 19:06:44.705096 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96tzc\" (UniqueName: \"kubernetes.io/projected/8ca68410-07f8-4732-9d9f-7ebae4261168-kube-api-access-96tzc\") pod \"perf-node-gather-daemonset-k4685\" (UID: \"8ca68410-07f8-4732-9d9f-7ebae4261168\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685"
Apr 16 19:06:44.705208 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.705139 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8ca68410-07f8-4732-9d9f-7ebae4261168-proc\") pod \"perf-node-gather-daemonset-k4685\" (UID: \"8ca68410-07f8-4732-9d9f-7ebae4261168\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685"
Apr 16 19:06:44.705208 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.705157 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8ca68410-07f8-4732-9d9f-7ebae4261168-lib-modules\") pod \"perf-node-gather-daemonset-k4685\" (UID: \"8ca68410-07f8-4732-9d9f-7ebae4261168\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685"
Apr 16 19:06:44.805393 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.805357 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8ca68410-07f8-4732-9d9f-7ebae4261168-podres\") pod \"perf-node-gather-daemonset-k4685\" (UID: \"8ca68410-07f8-4732-9d9f-7ebae4261168\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685"
Apr 16 19:06:44.805498 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.805405 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96tzc\" (UniqueName: \"kubernetes.io/projected/8ca68410-07f8-4732-9d9f-7ebae4261168-kube-api-access-96tzc\") pod \"perf-node-gather-daemonset-k4685\" (UID: \"8ca68410-07f8-4732-9d9f-7ebae4261168\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685"
Apr 16 19:06:44.805498 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.805444 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8ca68410-07f8-4732-9d9f-7ebae4261168-proc\") pod \"perf-node-gather-daemonset-k4685\" (UID: \"8ca68410-07f8-4732-9d9f-7ebae4261168\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685"
Apr 16 19:06:44.805498 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.805468 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8ca68410-07f8-4732-9d9f-7ebae4261168-lib-modules\") pod \"perf-node-gather-daemonset-k4685\" (UID: \"8ca68410-07f8-4732-9d9f-7ebae4261168\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685"
Apr 16 19:06:44.805609 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.805509 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8ca68410-07f8-4732-9d9f-7ebae4261168-sys\") pod \"perf-node-gather-daemonset-k4685\" (UID: \"8ca68410-07f8-4732-9d9f-7ebae4261168\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685"
Apr 16 19:06:44.805609 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.805557 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8ca68410-07f8-4732-9d9f-7ebae4261168-podres\") pod \"perf-node-gather-daemonset-k4685\" (UID: \"8ca68410-07f8-4732-9d9f-7ebae4261168\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685"
Apr 16 19:06:44.805609 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.805586 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8ca68410-07f8-4732-9d9f-7ebae4261168-proc\") pod \"perf-node-gather-daemonset-k4685\" (UID: \"8ca68410-07f8-4732-9d9f-7ebae4261168\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685"
Apr 16 19:06:44.805702 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.805598 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8ca68410-07f8-4732-9d9f-7ebae4261168-sys\") pod \"perf-node-gather-daemonset-k4685\" (UID: \"8ca68410-07f8-4732-9d9f-7ebae4261168\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685"
Apr 16 19:06:44.805702 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.805645 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8ca68410-07f8-4732-9d9f-7ebae4261168-lib-modules\") pod \"perf-node-gather-daemonset-k4685\" (UID: \"8ca68410-07f8-4732-9d9f-7ebae4261168\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685"
Apr 16 19:06:44.818627 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.818595 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96tzc\" (UniqueName: \"kubernetes.io/projected/8ca68410-07f8-4732-9d9f-7ebae4261168-kube-api-access-96tzc\") pod \"perf-node-gather-daemonset-k4685\" (UID: \"8ca68410-07f8-4732-9d9f-7ebae4261168\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685"
Apr 16 19:06:44.966553 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:44.966496 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685"
Apr 16 19:06:45.084686 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:45.084660 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685"]
Apr 16 19:06:45.087821 ip-10-0-134-167 kubenswrapper[2579]: W0416 19:06:45.087790 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8ca68410_07f8_4732_9d9f_7ebae4261168.slice/crio-dbf459f6abee888bc41052016b54d5be011ae770488f3558693d748941f7f557 WatchSource:0}: Error finding container dbf459f6abee888bc41052016b54d5be011ae770488f3558693d748941f7f557: Status 404 returned error can't find the container with id dbf459f6abee888bc41052016b54d5be011ae770488f3558693d748941f7f557
Apr 16 19:06:45.089379 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:45.089361 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:06:46.036342 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:46.036305 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685" event={"ID":"8ca68410-07f8-4732-9d9f-7ebae4261168","Type":"ContainerStarted","Data":"3e0a1715d144b27e12f0afd4f37b6c33494b5e2708a596d25b8685e2b78039e0"}
Apr 16 19:06:46.036342 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:46.036343 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685" event={"ID":"8ca68410-07f8-4732-9d9f-7ebae4261168","Type":"ContainerStarted","Data":"dbf459f6abee888bc41052016b54d5be011ae770488f3558693d748941f7f557"}
Apr 16 19:06:46.036535 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:46.036439 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685"
Apr 16 19:06:46.056671 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:46.056627 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685" podStartSLOduration=2.056614792 podStartE2EDuration="2.056614792s" podCreationTimestamp="2026-04-16 19:06:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:06:46.054947576 +0000 UTC m=+2950.902108066" watchObservedRunningTime="2026-04-16 19:06:46.056614792 +0000 UTC m=+2950.903775283"
Apr 16 19:06:46.094220 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:46.094199 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-db7l8_53b4818f-dde2-45a8-a8aa-831951359360/dns/0.log"
Apr 16 19:06:46.119492 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:46.119469 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-db7l8_53b4818f-dde2-45a8-a8aa-831951359360/kube-rbac-proxy/0.log"
Apr 16 19:06:46.228898 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:46.228880 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mt4jc_9e454ceb-b1f7-44d0-899f-7d0a1be98b35/dns-node-resolver/0.log"
Apr 16 19:06:46.737338 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:46.737310 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-l4nhg_2b843015-ad80-4dc0-aad1-a22e5a3909f6/node-ca/0.log"
Apr 16 19:06:47.969695 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:47.969662 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-g7fm9_9d741d54-0257-44ff-8680-6b59a49600e3/serve-healthcheck-canary/0.log"
Apr 16 19:06:48.548693 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:48.548668 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cgsl2_8d567708-c8b6-4dd6-a935-bbba6f557f09/kube-rbac-proxy/0.log"
Apr 16 19:06:48.572830 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:48.572807 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cgsl2_8d567708-c8b6-4dd6-a935-bbba6f557f09/exporter/0.log"
Apr 16 19:06:48.600092 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:48.600064 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cgsl2_8d567708-c8b6-4dd6-a935-bbba6f557f09/extractor/0.log"
Apr 16 19:06:50.710241 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:50.710166 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-7668d57578-rgt4s_a287e58a-ddbe-4e6d-9fc0-b96bb20515af/manager/0.log"
Apr 16 19:06:52.047846 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:52.047813 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-k4685"
Apr 16 19:06:57.034866 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:57.034834 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nw6hr_e6e3d626-b1d8-4140-83e2-92db90a4eae4/kube-multus-additional-cni-plugins/0.log"
Apr 16 19:06:57.060171 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:57.060139 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nw6hr_e6e3d626-b1d8-4140-83e2-92db90a4eae4/egress-router-binary-copy/0.log"
Apr 16 19:06:57.085540 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:57.085520 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nw6hr_e6e3d626-b1d8-4140-83e2-92db90a4eae4/cni-plugins/0.log"
Apr 16 19:06:57.110801 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:57.110783 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nw6hr_e6e3d626-b1d8-4140-83e2-92db90a4eae4/bond-cni-plugin/0.log"
Apr 16 19:06:57.135309 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:57.135292 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nw6hr_e6e3d626-b1d8-4140-83e2-92db90a4eae4/routeoverride-cni/0.log"
Apr 16 19:06:57.164062 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:57.164041 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nw6hr_e6e3d626-b1d8-4140-83e2-92db90a4eae4/whereabouts-cni-bincopy/0.log"
Apr 16 19:06:57.186486 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:57.186471 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nw6hr_e6e3d626-b1d8-4140-83e2-92db90a4eae4/whereabouts-cni/0.log"
Apr 16 19:06:57.407112 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:57.407083 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mtzwt_5d102faf-ea74-4afa-95c1-4133f4d71f8b/kube-multus/0.log"
Apr 16 19:06:57.545839 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:57.545813 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-k6s9z_e2bb1680-b343-4014-bde1-6cc6bcd9805c/network-metrics-daemon/0.log"
Apr 16 19:06:57.568619 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:57.568596 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-k6s9z_e2bb1680-b343-4014-bde1-6cc6bcd9805c/kube-rbac-proxy/0.log"
Apr 16 19:06:58.347193 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:58.347160 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-controller/0.log"
Apr 16 19:06:58.367280 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:58.367253 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/0.log"
Apr 16 19:06:58.393107 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:58.393086 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovn-acl-logging/1.log"
Apr 16 19:06:58.421139 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:58.421118 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/kube-rbac-proxy-node/0.log"
Apr 16 19:06:58.447037 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:58.447018 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 19:06:58.466807 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:58.466773 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/northd/0.log"
Apr 16 19:06:58.490286 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:58.490246 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/nbdb/0.log"
Apr 16 19:06:58.514330 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:58.514314 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/sbdb/0.log"
Apr 16 19:06:58.697165 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:06:58.697144 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-md54p_e302771b-8af3-42dd-92e1-04faaff1c6e8/ovnkube-controller/0.log"
Apr 16 19:07:00.521849 ip-10-0-134-167 kubenswrapper[2579]: I0416 19:07:00.518155 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-sq4vb_17b7bd66-ef40-4ff5-89de-2c7c3408fdc6/network-check-target-container/0.log"